CN116720819B - Impregnated paper raw material management system integrating knowledge graph and neural network - Google Patents
- Publication number: CN116720819B (application CN202311005069.9A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06Q10/0875: Itemisation or classification of parts, supplies or services, e.g. bill of materials
- G06N3/042: Knowledge-based neural networks; logical representations of neural networks
- G06N3/0442: Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
- G06N3/048: Activation functions
- G06N3/084: Backpropagation, e.g. using gradient descent
- G06Q10/06315: Needs-based resource requirements planning or analysis
- G06Q10/06395: Quality analysis or management
- Y02P90/30: Computing systems specially adapted for manufacturing
Abstract
The application relates to an impregnated paper raw material management system integrating a knowledge graph and a neural network. The system comprises an inventory management unit, which automatically acquires supplier information and creates purchase orders through an interface with the supplier database, and associates the warehousing information of raw materials with the knowledge graph; a material analysis and comparison unit, which analyzes and compares the attributes of raw materials through the knowledge graph and finds raw materials related to specific attributes for formula selection or optimization; a quality management unit, which associates quality inspection results with the raw material knowledge graph through an interface to the quality inspection system; and a prediction optimization unit, which analyzes the historical data of raw materials through the knowledge graph, predicts raw material demand and inventory changes, and optimizes raw material purchasing plans and inventory management. The application realizes comprehensive visual management and monitoring of impregnated paper raw materials, improves production efficiency, and reduces cost.
Description
Technical Field
The application relates to the field of data management, in particular to an impregnated paper raw material management system integrating a knowledge graph and a neural network.
Background
At present, impregnated paper raw materials are mainly managed through manual recording, which has the following problems:
inaccurate data: manual operation is prone to errors and omissions, so the accuracy of raw material information cannot be guaranteed. This may lead to inaccurate impregnation proportioning during production, affecting product quality;
data are not real-time: manual recording and management require time and labor, so raw material information is not updated and transmitted promptly. This can make production planning and inventory management inaccurate, affecting production efficiency and inventory costs;
data are not traceable: manual recording and management cannot provide a complete data tracing function, so the sources, usage, and change trends of raw materials cannot be accurately known. This makes quality problems difficult to trace, affecting quality management and risk control;
lack of analysis and decision support: manual recording and management cannot provide effective data analysis and decision support, so the usage and benefits of raw materials cannot be further analyzed and evaluated. This can lead to unreasonable raw material procurement and use, affecting cost control and resource utilization efficiency;
with the development of the impregnated paper industry, production scale and complexity are increasing, and conventional manual recording and management is no longer adequate.
Disclosure of Invention
To solve these problems, the application provides an impregnated paper raw material management system integrating a knowledge graph and a neural network: a system for comprehensively managing and monitoring the quality control, inventory management, and related aspects of impregnated paper raw materials, which can improve production efficiency, reduce production cost, improve product quality, and realize visual management.
In order to achieve the above purpose, the present application adopts the following technical scheme:
An impregnated paper raw material management system integrating a knowledge graph and a neural network comprises a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit, and a prediction optimization unit; the inventory management unit, material analysis and comparison unit, quality management unit, and prediction optimization unit are each connected to the raw material knowledge graph library;
the inventory management unit automatically acquires supplier information and creates purchase orders through an interface with a supplier database, and associates the warehousing information of raw materials with the knowledge graph;
the material analysis and comparison unit analyzes and compares the attributes of raw materials through the knowledge graph and finds raw materials related to specific attributes for formula selection or optimization;
the quality management unit associates quality inspection results with the raw material knowledge graph through an interface to the quality inspection system;
the prediction optimization unit analyzes the historical data of raw materials through the knowledge graph, predicts raw material demand and inventory changes, and optimizes raw material purchasing plans and inventory management.
Further, the raw material knowledge graph is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
-raw material type: representing the type of raw material;
-raw material suppliers: vendor information representing raw materials;
-raw material properties: other properties representing raw materials;
attribute relationships: edges are used to represent attribute relationships between raw materials, including:
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
-supply relation: establishing edges according to the supply relation between suppliers and raw materials;
hierarchical structure of raw material properties: representing hierarchical relationships between raw material properties using a hierarchical structure, comprising:
-a parent attribute node;
-a child attribute node;
-attribute relationship edge: representing a hierarchical relationship between a parent attribute node and a child attribute node;
the knowledge graph comprises a two-level expansion: a first-level expansion schematic diagram shows the formula, composition components, and raw materials of the impregnated paper; a second-level expansion schematic diagram shows the kinds of materials used for the raw materials and the corresponding supplier information, inventory data, purchasing information, and attribute data.
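The node-and-edge scheme above can be sketched with plain Python structures (a minimal illustration; the identifiers, attribute keys, and helper function are assumptions for demonstration, not the patent's implementation):

```python
# Nodes keyed by the raw material's unique identifier (raw material number);
# each node carries name, type, and supplier attributes.
nodes = {
    "RM-001": {"name": "wood pulp paper", "type": "plant cellulose", "supplier": "Supplier A"},
    "RM-002": {"name": "bamboo pulp paper", "type": "plant cellulose", "supplier": "Supplier B"},
    "RM-010": {"name": "phenolic resin", "type": "resin", "supplier": "Supplier C"},
}

# Edges as (source, relation, target) triples, one per relationship type.
edges = [
    ("RM-001", "similar", "RM-002"),       # similar-attribute relationship
    ("RM-001", "depends_on", "RM-010"),    # dependency attribute relationship
    ("Supplier A", "supplies", "RM-001"),  # supply relation
]

def related(rm_id, relation):
    """Raw materials linked to rm_id by the given relation type."""
    return [t for s, r, t in edges if s == rm_id and r == relation]
```

A query such as `related("RM-001", "similar")` then supports the attribute analysis and comparison described below.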
Further, the inventory management unit comprises a raw material purchasing database and an inventory database;
during purchasing, the raw material purchasing database records the raw material information of each purchase and associates the material knowledge graph with the purchasing and warehousing data;
the raw materials in the knowledge graph are associated with the raw materials in the inventory database; when raw materials are warehoused, the warehousing information is recorded in the inventory database and the inventory quantity is updated.
Further, the material knowledge graph is associated with the purchasing and warehousing data as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
and matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to acquire the source and quality of the raw materials.
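The identifier-based association and consistency check can be sketched as follows (a minimal illustration; the record fields and helper name are assumptions, not the patent's implementation):

```python
# Knowledge-graph view of a raw material, keyed by its unique identifier.
kg_materials = {
    "RM-001": {"name": "wood pulp paper", "supplier": "Supplier A"},
}

def verify_receipt(receipt, kg):
    """Match a purchasing/warehousing record against the knowledge graph by the
    raw material's unique identifier; return the list of mismatched fields."""
    node = kg.get(receipt["rm_id"])
    if node is None:
        return ["unknown raw material id"]
    return [field for field in ("name", "supplier") if receipt[field] != node[field]]

ok = verify_receipt({"rm_id": "RM-001", "name": "wood pulp paper",
                     "supplier": "Supplier A"}, kg_materials)   # -> []
bad = verify_receipt({"rm_id": "RM-001", "name": "wood pulp paper",
                      "supplier": "Supplier X"}, kg_materials)  # -> ["supplier"]
```

An empty result confirms accuracy and consistency; a non-empty result flags the fields to reconcile before warehousing.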
Further, the material analysis and comparison unit predefines the attribute relations and rules of the raw materials, deduces the other attributes related to the strength of the impregnated paper from the reasoning and association functions of the raw material knowledge graph together with the known attribute relations and rules, and finds raw materials that are related to the strength of the impregnated paper and meet the user's requirements;
suppose n defined attributes are correlated with the strength attribute Y of the impregnated paper; the impregnated paper strength is then:
Y = β0 + ΣβiXi,
where Xi is the i-th raw material attribute, βi is the corresponding regression coefficient, β0 is the intercept, and 1 ≤ i ≤ n.
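Fitting the coefficients β0 and βi by ordinary least squares can be sketched as follows (the attribute values and observed strengths are made-up illustrative data, not from the patent):

```python
import numpy as np

# Rows: historical samples; columns: n = 2 raw material attributes X1, X2.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 2.0]])
# Observed impregnated paper strength, generated here from Y = 1 + 2*X1 + 0.5*X2.
Y = np.array([4.0, 5.5, 8.5, 10.0])

# Prepend a column of ones so beta[0] plays the role of the intercept β0.
A = np.hstack([np.ones((X.shape[0], 1)), X])
beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
# beta recovers [1.0, 2.0, 0.5], i.e. (β0, β1, β2)
```

Materials whose attributes Xi give a predicted Y in the required range can then be short-listed for the formula.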
Further, the prediction optimization unit comprises a word vector conversion module, a word embedding layer, an LSTM model, and an analysis optimization module, constructed as follows:
the word vector conversion module extracts the entities and relations of inventory-change-related data in the knowledge graph, converts them into word vectors, and converts unstructured and semi-structured data into structured data;
the word embedding layer takes the converted structured data as input to build its training data set, learns word vectors using a DBN (deep belief network), captures the semantic information of words, and enriches the features of the training set;
the LSTM model is then trained on the training set to obtain the raw material prediction model.
Further, the DBN (deep belief network) is constructed as follows:
the RBM (restricted Boltzmann machine) of each layer is trained with the contrastive divergence algorithm, using gradient descent to minimize the RBM's energy function during training;
for the first-layer RBM, the input is the word vector representation and the output is the activation values of its hidden layer; for each subsequent RBM, the input is the hidden-layer activation values of the previous RBM, and the output is the activation values of the current hidden layer;
after training, each RBM has learned a set of weights and biases.
Further, training with the contrastive divergence algorithm is specifically:
gradient descent is used to minimize the energy function of the RBM, defined as follows:
E(v, h) = -v^T W h - b_v^T v - b_h^T h
where v denotes the state of the visible layer, h the state of the hidden layer, W the weight matrix between the visible and hidden layers, b_v the bias vector of the visible layer, and b_h the bias vector of the hidden layer.
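The energy function can be written directly from this definition (a small numpy sketch; the example states and parameters are arbitrary):

```python
import numpy as np

def rbm_energy(v, h, W, b_v, b_h):
    """E(v, h) = -v^T W h - b_v^T v - b_h^T h for an RBM with
    visible state v, hidden state h, weights W, and biases b_v, b_h."""
    return -v @ W @ h - b_v @ v - b_h @ h

v = np.array([1.0, 0.0, 1.0])    # visible state (3 units)
h = np.array([1.0, 1.0])         # hidden state (2 units)
W = np.zeros((3, 2))             # no couplings yet
b_v = np.array([0.5, 0.0, 0.5])
b_h = np.array([0.25, 0.25])

# With W = 0 the energy reduces to -(b_v . v) - (b_h . h) = -1.0 - 0.5 = -1.5
```

Lower energy corresponds to more probable joint states, which is what the gradient-descent updates below drive toward for the training data.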
Further, the training process is as follows:
(1) Initialize the state v of the visible layer of the RBM to a training sample;
(2) Use Gibbs sampling for approximate inference to obtain the state h of the hidden layer:
- sample the state h of the hidden layer from the conditional probability P(h|v):
P(h|v) = sigmoid(b_h + W^T v)
where sigmoid is the logistic function, b_h the hidden-layer bias vector, W the weight matrix, and v the given visible state;
- sample the state v of the visible layer from the conditional probability P(v|h):
P(v|h) = sigmoid(b_v + W h)
where b_v is the visible-layer bias vector and h the given hidden state;
- repeat the two sampling steps several times until a steady state is reached;
(3) Compute the gradient of the RBM and update its parameters by gradient descent:
- compute the correlation between the visible and hidden layers: pos_corr = v h^T;
- compute the equilibrium correlation between the visible and hidden layers: neg_corr = v' h'^T;
where v' and h' are new samples obtained by Gibbs sampling, pos_corr is the data-driven correlation term, and neg_corr is the model-driven (equilibrium) correlation term;
- update the weight matrix W;
- update the bias vector b_v of the visible layer;
- update the bias vector b_h of the hidden layer;
- parameter updates: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate * (pos_corr - neg_corr); delta_b_v = learning_rate * (v - v'); delta_b_h = learning_rate * (h - h'), and learning_rate is the learning rate;
(4) Repeat (2) and (3) until a convergence condition or a predetermined number of training iterations is reached.
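Steps (1) to (4) can be sketched as one CD-1 update in numpy (a minimal illustration with binary units; using mean activations in place of repeated Gibbs steps is a common simplification and an assumption here, not stated in the patent):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_v, b_h, learning_rate=0.1, rng=None):
    """One contrastive-divergence (CD-1) parameter update for a binary RBM.
    W has shape (n_visible, n_hidden)."""
    if rng is None:
        rng = np.random.default_rng(0)
    # (2) Gibbs sampling: up (v0 -> h0), down (h0 -> v'), up (v' -> h').
    p_h0 = sigmoid(b_h + v0 @ W)                  # P(h|v0)
    h0 = (rng.random(p_h0.shape) < p_h0) * 1.0    # sampled hidden state
    p_v1 = sigmoid(b_v + h0 @ W.T)                # reconstruction v'
    p_h1 = sigmoid(b_h + p_v1 @ W)                # P(h|v')
    # (3) Correlations and gradient-descent parameter update.
    pos_corr = np.outer(v0, p_h0)                 # data-driven term
    neg_corr = np.outer(p_v1, p_h1)               # model-driven (equilibrium) term
    W = W + learning_rate * (pos_corr - neg_corr)
    b_v = b_v + learning_rate * (v0 - p_v1)
    b_h = b_h + learning_rate * (p_h0 - p_h1)
    return W, b_v, b_h

rng = np.random.default_rng(42)
W = 0.01 * rng.standard_normal((4, 2))
b_v, b_h = np.zeros(4), np.zeros(2)
v0 = np.array([1.0, 0.0, 1.0, 0.0])
W, b_v, b_h = cd1_update(v0, W, b_v, b_h, rng=rng)  # step (4) would loop this
```

Stacking RBMs as described above means feeding `p_h0` of a trained layer in as the `v0` of the next.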
Further, the raw material prediction model is constructed as follows:
the data processed by the word embedding layer is divided into a training set and a test set;
an LSTM model is built comprising three LSTM layers, a Dropout layer, and a fully connected layer;
during training, the model is compiled with the Adam optimizer and a mean squared error loss function, and is trained on the training set with a specified number of training iterations (epochs) and samples per batch (batch_size);
finally, the trained LSTM model predicts on the test set data, and the backpropagation algorithm updates the weights and biases of the LSTM network model to minimize the loss function.
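Each of the three LSTM layers applies the standard gated cell update at every time step; a single-cell forward step in numpy illustrates this (a sketch of the standard LSTM equations only, not the patent's full three-layer model or its training loop):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,);
    rows are stacked as [input gate, forget gate, candidate, output gate]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    g = np.tanh(z[2*H:3*H])    # candidate cell state
    o = sigmoid(z[3*H:4*H])    # output gate
    c = f * c_prev + i * g     # new cell state (the long-term memory)
    h = o * np.tanh(c)         # new hidden state (the layer's output)
    return h, c

D, H = 3, 2                    # input and hidden sizes (arbitrary for the demo)
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4 * H, D))
U = 0.1 * rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, U, b)
```

In the full model, the hidden state h of one layer feeds the next layer, Dropout is applied between layers, and the final fully connected layer maps the last hidden state to the predicted demand or inventory value.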
The application has the following beneficial effects:
1. The application manages impregnated paper raw materials based on knowledge graph technology, realizing full-life-cycle management of raw materials and providing more intelligent and efficient raw material management;
2. The application uses the knowledge graph to deeply analyze and compare the attributes of raw materials, helping users better understand the characteristics, strengths, and weaknesses of raw materials and supporting analysis and decision-making;
3. The application fuses the knowledge graph, DBN, and LSTM to analyze and predict the historical data of raw materials, helping users better understand raw material demand and inventory changes and make more accurate supply chain decisions; the weights and biases of the LSTM network model are updated with the backpropagation algorithm to minimize the loss function, and the training process is accelerated with an improved stochastic gradient descent method, effectively improving the computational efficiency of the LSTM network model and saving computational cost.
Drawings
FIG. 1 is a diagram of a system architecture of the present application;
FIG. 2 is a schematic diagram of an embodiment of the present application;
FIG. 3 is a first-level expansion schematic diagram of the knowledge graph according to an embodiment of the application;
FIG. 4 is a second-level expansion schematic diagram of the knowledge graph according to an embodiment of the application.
Detailed Description
The application is described in further detail below with reference to the attached drawings and specific examples:
Referring to FIG. 1, the application provides an impregnated paper raw material management system integrating a knowledge graph and a neural network, comprising a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit, and a prediction optimization unit; the inventory management unit, material analysis and comparison unit, quality management unit, and prediction optimization unit are each connected to the raw material knowledge graph library;
the inventory management unit automatically acquires supplier information and creates purchase orders through an interface with a supplier database, and associates the warehousing information of raw materials with the knowledge graph;
the material analysis and comparison unit analyzes and compares the attributes of raw materials through the knowledge graph and finds raw materials related to specific attributes for formula selection or optimization;
the quality management unit associates quality inspection results with the raw material knowledge graph through an interface to the quality inspection system;
the prediction optimization unit analyzes the historical data of raw materials through the knowledge graph, predicts raw material demand and inventory changes, and optimizes raw material purchasing plans and inventory management.
In this example, the impregnated paper preparation formulation specifically includes a paper substrate, an impregnating solution, and a solvent;
paper substrate: cellulose paper is generally used as the substrate, such as wood pulp paper or bamboo pulp paper.
Impregnating solution: the impregnating solution is a solution for impregnating the paper substrate, and different components can be selected according to different applications, and common impregnating solution components comprise:
resin: such as phenolic resin, cellulose acetate, etc., for increasing the strength and water resistance of the paper.
Flame retardant: such as magnesium chloride, ammonium phosphate, etc., for improving the flame retardant properties of paper.
Waterproof agent: such as silicone oil, wax, etc., for increasing the water resistance of the paper.
Antibacterial agent: such as silver ions, antimicrobial agents, etc., for increasing the antimicrobial properties of the paper.
Other additives: such as thickeners, pH adjusters, and the like, are used to adjust the properties and performance of the impregnating solution.
Solvent: solvents are used to dissolve and uniformly distribute the components of the impregnating solution in the paper substrate, and common solvents include water, organic solvents, and the like.
Preferably, referring to fig. 3 and 4, the raw material knowledge graph is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
- raw material type: the type of the raw material, such as plant cellulose, non-woven wood pulp, PE, etc.;
- raw material suppliers: supplier information of the raw material;
- raw material properties: other properties of the raw material, such as inventory quantity and supplier information;
attribute relationships: edges are used to represent attribute relationships between raw materials;
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
hierarchical structure of raw material properties: using a hierarchical structure to represent hierarchical relationships between raw material properties;
-a parent attribute node representing a higher level attribute;
-a sub-attribute node representing a lower level attribute;
-attribute relationship edge: representing a hierarchical relationship between parent attribute nodes and child attribute nodes.
In a preferred embodiment, FIG. 3 shows the first-level expansion schematic diagram of the knowledge graph of this example, showing the formulation, composition, and raw materials of the impregnated paper; FIG. 4 is the second-level expansion schematic diagram of the knowledge graph in this embodiment, from which the kinds of materials usable as raw materials, along with the corresponding supplier information, inventory data, purchasing information, and attribute data, can be read off directly.
In this embodiment, the stock management unit includes a raw material purchase database and an inventory database;
in the purchasing process, a raw material purchasing database records raw material information purchased each time, and correlates a material knowledge graph with purchasing and warehousing data;
and associating the raw materials in the knowledge graph with the raw materials in the stock database, recording the stock information into the stock database when the raw materials are put in stock, and updating the stock quantity.
In this embodiment, it is preferable to correlate the material knowledge graph with the purchase and warehouse-in data, specifically as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
and matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to acquire the source and quality of the raw materials.
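A minimal sketch of the matching just described, joining purchase/warehousing records to knowledge-graph nodes via the raw material's unique identifier; the record field names (`material_id`, `name`, `supplier`) are hypothetical:

```python
def verify_purchase_records(kg_nodes, purchase_records):
    """Match purchase/warehousing records to knowledge-graph nodes by the
    raw material's unique identifier and report attribute mismatches.

    kg_nodes: {material_id: {"name": ..., "supplier": ...}}
    purchase_records: dicts with 'material_id', 'name', 'supplier'.
    Returns a list of (material_id, problem) tuples.
    """
    problems = []
    for rec in purchase_records:
        node = kg_nodes.get(rec["material_id"])
        if node is None:
            problems.append((rec["material_id"], "unknown material"))
            continue
        for field in ("name", "supplier"):  # accuracy/consistency check
            if rec.get(field) != node.get(field):
                problems.append((rec["material_id"], f"{field} mismatch"))
    return problems
```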
In this embodiment, preferably, the attribute relationships and rules of the raw materials are predefined in the material analysis and comparison unit; based on the reasoning and association functions of the raw material knowledge graph and on the known attribute relationships and rules, other attributes related to the required attribute are deduced, and the optimal scheme for the required attribute is solved;
suppose n raw material attributes are defined that are correlated with the required attribute Y of the impregnated paper; the impregnated paper strength is then:
Y = β_0 + Σ_{i=1}^{n} β_i * f_i(x_i)
wherein x_i and f_i are, respectively, the parameter vector corresponding to the i-th raw material attribute and its preset correlation function, β_0 and β_i are regression coefficients, and 1 ≤ i ≤ n.
Preferably, in this embodiment, the material analysis and comparison unit solves an optimal solution for the required attribute, specifically as follows:
collecting a set of data samples containing the required attribute Y and the corresponding raw material attribute x;
selecting a mean square error to measure the difference between the model predicted value and the actual value;
the least squares method is used to estimate the parameter β in the model:
(1) Giving an initial value for beta;
(2) Randomly selecting a training sample (x_j, y_j);
(3) For the selected training sample, updating the parameter β by the gradient descent method:
β = β - η * ∇_β L_j(β)
wherein η is the learning coefficient and ∇_β L_j(β) is the gradient of the squared training error corresponding to training sample j;
(4) Repeating steps (2) and (3) until β reaches the convergence accuracy.
And substituting the raw material attribute x into the model by using the estimated optimal parameter beta, and calculating to obtain an optimal required attribute Y scheme.
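A minimal numpy sketch of steps (1)-(4), under the simplifying assumption that each correlation function f_i is the identity (plain linear regression); the function name and default values are illustrative only:

```python
import numpy as np

def fit_sgd(X, y, lr=0.01, n_iters=2000, seed=0):
    """Estimate beta for Y = beta_0 + sum_i beta_i * x_i by stochastic
    gradient descent on the per-sample squared error (steps (1)-(4))."""
    rng = np.random.default_rng(seed)
    n_samples = X.shape[0]
    Xb = np.hstack([np.ones((n_samples, 1)), X])  # intercept column for beta_0
    beta = np.zeros(Xb.shape[1])                  # (1) initial value for beta
    for _ in range(n_iters):
        j = rng.integers(n_samples)               # (2) random sample (x_j, y_j)
        err = Xb[j] @ beta - y[j]
        beta -= lr * 2.0 * err * Xb[j]            # (3) gradient step on squared error
    return beta                                   # (4) iterate until convergence
```

With the estimated β, substituting a candidate raw material attribute vector x into the model gives the predicted required attribute Y.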
In this embodiment, the prediction optimization unit includes a word vector conversion module, a word embedding layer, an LSTM model, and an analysis optimization module, which specifically includes the following steps:
the word vector conversion module extracts entities and relations of the inventory change related data in the knowledge graph to convert word vectors and convert unstructured and semi-structured data into structured data;
the word embedding layer takes the converted structured data as input to construct its training data set, learns word vectors using a DBN (deep belief network), captures the semantic information of words, and enriches the features of the training set;
and training the LSTM model based on a training set to obtain a raw material prediction model.
In this embodiment, the DBN deep belief network is preferably constructed as follows:
for the RBM of each layer, training is performed using the contrastive divergence algorithm, wherein a gradient descent method is used during training to minimize the energy function of the RBM;
for the first-layer RBM, the input is the word vector representation and the output is the activation values of its hidden layer; for each subsequent RBM, the input is the activation values of the hidden layer of the previous RBM, and the output is the activation values of the hidden layer of the current layer.
After training, each RBM learns a set of weights and biases.
In this embodiment, in the contrast divergence algorithm, a gradient descent method is used to minimize the energy function of the RBM, which is as follows:
E(v, h) = -sum(W * v * h) - sum(b_v * v) - sum(b_h * h)
wherein v represents the state of the visible layer, h represents the state of the hidden layer, W represents the weight matrix between the visible layer and the hidden layer, b_v represents the bias vector of the visible layer, b_h represents the bias vector of the hidden layer, and the training process is as follows:
(1) Initializing the state v of the visible layer of the RBM as a training sample;
(2) Using Gibbs sampling to perform approximate inference to obtain a state h of the hidden layer;
sampling the state h of the hidden layer by calculating the conditional probability P (h|v)
P(h|v) = sigmoid(b_h + W^T * v)
wherein sigmoid denotes the sigmoid function, b_h is the bias vector of the hidden layer, W^T is the transpose of the weight matrix W, and v is the given visible state;
sampling the state v of the visible layer by calculating the conditional probability P (v|h)
P(v|h) = sigmoid(b_v + W * h)
wherein b_v is the bias vector of the visible layer, W is the weight matrix, and h is the given hidden state;
-repeating the above two steps a number of times until a steady state is reached;
(3) Calculating the gradient of the RBM and updating the parameters of the RBM by using a gradient descent method:
-calculating the correlation between the visible layer and the hidden layer: pos_corr = v * h^T;
-calculating the balanced correlation between the visible layer and the hidden layer: neg_corr = v' * h'^T;
-where v' and h' are new samples obtained by Gibbs sampling, pos_corr is the correlation factor between the visible layer and the hidden layer, and neg_corr is the balanced correlation factor between the visible layer and the hidden layer;
-updating the weight matrix W;
-updating the bias vector b_v of the visible layer;
-updating the bias vector b_h of the hidden layer;
-updating parameters: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate * (pos_corr - neg_corr); delta_b_v = learning_rate * (v - v'); delta_b_h = learning_rate * (h - h'), learning_rate being the learning rate;
(4) Repeating the step (2) and the step (3) until reaching a convergence condition or reaching a predetermined training iteration number.
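The CD-1 update described in steps (1)-(4) can be sketched with numpy as follows; this is an illustrative sketch (function name and defaults are assumptions), using the convention W of shape [visible, hidden] so that pos_corr = v * h^T:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_v, b_h, learning_rate=0.1, rng=None):
    """One contrastive-divergence (CD-1) parameter update for a Bernoulli RBM
    with E(v, h) = -v^T W h - b_v^T v - b_h^T h. Returns (W, b_v, b_h)."""
    rng = rng or np.random.default_rng(0)
    # (2) Gibbs sampling: h ~ P(h|v) = sigmoid(b_h + W^T v)
    p_h0 = sigmoid(b_h + W.T @ v0)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # reconstruction: v' ~ P(v|h) = sigmoid(b_v + W h)
    p_v1 = sigmoid(b_v + W @ h0)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(b_h + W.T @ v1)
    # (3) correlations and gradient step
    pos_corr = np.outer(v0, p_h0)   # data-phase correlation v * h^T
    neg_corr = np.outer(v1, p_h1)   # reconstruction-phase correlation v' * h'^T
    W = W + learning_rate * (pos_corr - neg_corr)
    b_v = b_v + learning_rate * (v0 - v1)
    b_h = b_h + learning_rate * (p_h0 - p_h1)
    return W, b_v, b_h
```

Step (4) would call this repeatedly over the training samples until convergence.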
In this embodiment, the raw material prediction model is constructed specifically as follows:
(1) Dividing the data processed by the word embedding layer into a training set and a testing set;
constructing an LSTM model, wherein the LSTM model comprises three LSTM layers, a Dropout layer and a full connection layer;
in this embodiment, each LSTM layer includes a forget gate, an input gate, and an output gate, as follows:
forget Gate (Forget Gate):
f_t =σ(W_f *[h_(t-1), x_t]+ b_f)
wherein f_t is the output of the forgetting gate, W_f is the weight matrix of the forgetting gate, h_ (t-1) is the hidden state of the last time step, x_t is the input of the current time step, b_f is the bias vector of the forgetting gate, and sigma is the sigmoid function.
Input Gate (Input Gate):
i_t =σ(W_i * [h_(t-1), x_t] + b_i)
wherein i_t is the output of the input gate, W_i is the weight matrix of the input gate, h_ (t-1) is the hidden state of the previous time step, x_t is the input of the current time step, b_i is the bias vector of the input gate, and sigma is the sigmoid function.
New Cell State:
C̃_t = tanh(W_c * [h_(t-1), x_t] + b_c)
wherein C̃_t is the output of the new (candidate) cell state, W_c is the weight matrix of the new cell state, h_(t-1) is the hidden state of the last time step, x_t is the input of the current time step, b_c is the bias vector of the new cell state, and tanh is the hyperbolic tangent function.
Update Cell State:
C_t = f_t ⊙ C_(t-1) + i_t ⊙ C̃_t
wherein C_t is the cell state of the current time step, C_(t-1) is the cell state of the last time step, f_t is the output of the forget gate, i_t is the output of the input gate, C̃_t is the output of the new cell state, and ⊙ denotes element-wise multiplication.
Output Gate (Output Gate):
o_t =σ(W_o * [h_(t-1), x_t] + b_o)
wherein o_t is the output of the output gate, W_o is the weight matrix of the output gate, h_ (t-1) is the hidden state of the previous time step, x_t is the input of the current time step, b_o is the bias vector of the output gate, and sigma is the sigmoid function.
Hidden State (Hidden State):
h_t = o_t * tanh(C_t)
wherein h_t is the hidden state of the current time step, o_t is the output of the output gate, C_t is the unit state of the current time step, and tanh is the hyperbolic tangent function.
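The gate equations above can be sketched as a single forward time step in numpy; this is an illustrative sketch, with each weight matrix W_* acting on the concatenation [h_(t-1), x_t] as in the formulas:

```python
import numpy as np

def lstm_cell(x_t, h_prev, c_prev, W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o):
    """One LSTM time step implementing the forget/input/output gate and
    cell-state equations above."""
    z = np.concatenate([h_prev, x_t])         # [h_(t-1), x_t]
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    f_t = sigmoid(W_f @ z + b_f)              # forget gate
    i_t = sigmoid(W_i @ z + b_i)              # input gate
    c_tilde = np.tanh(W_c @ z + b_c)          # new (candidate) cell state
    c_t = f_t * c_prev + i_t * c_tilde        # update cell state (element-wise)
    o_t = sigmoid(W_o @ z + b_o)              # output gate
    h_t = o_t * np.tanh(c_t)                  # hidden state
    return h_t, c_t
```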
(2) During training, the model is compiled with an Adam optimizer and a mean square error loss function, and trained on the training set data with the number of training iterations epochs and the per-batch sample count batch_size specified; the weights and biases of the LSTM network model are updated using the back propagation algorithm to minimize the loss function, yielding the raw material prediction model.
Preferably, in this embodiment, the weight and bias of the LSTM network model are updated using a back propagation algorithm, minimizing the loss function, as follows:
the steps and formulas for using the stochastic gradient descent method to accelerate the training process, combined with regularization techniques to prevent overfitting, are as follows:
(1) Defining the loss function: a cross entropy loss function is selected to measure the difference between the prediction result of the model and the real label;
(2) Initializing network parameters: the weights and biases of the LSTM network are randomly initialized;
(3) Iterative training:
-selecting a mini-batch of samples: m samples, denoted T1, are randomly selected from the training set T = {(u_i, v_i)};
Forward propagation: and inputting the selected small batch of samples T1 into an LSTM network for forward propagation to obtain an output value of the network.
-calculating the loss function: comparing the network output f(u_j) with the actual value v_j, the value of the loss function L = (1/m) * Σ_{j=1}^{m} loss(f(u_j), v_j) is calculated.
Back propagation: the gradients of the loss function with respect to the weights and biases are calculated using the back propagation algorithm, specifically:
starting from the last layer, the gradients are calculated layer by layer according to the chain rule.
a. Calculating the gradient of the output layer:
δL = ∂Loss/∂aL
wherein ∂ denotes the partial derivative, Loss is the loss function, and aL is the output value of the output layer;
b. Calculating the gradient of a hidden layer:
δl = (∂Loss/∂zl) ⊙ (Wl+1^T * δl+1)
wherein ⊙ denotes element-wise multiplication, zl is the weighted input value of the hidden layer, Wl+1 is the weight of the next layer, Wl+1^T is its transpose, and δl+1 is the gradient of the next layer;
c. Calculating the gradients of the weights and the biases:
∂Loss/∂Wl = δl * al-1^T, ∂Loss/∂bl = δl
wherein al-1 is the output value of the previous layer.
-updating parameters: the weights and biases of the network are updated from the gradients using the stochastic gradient descent method.
-stochastic gradient descent update formulas:
- W = W - learning_rate * gradient_W
- b = b - learning_rate * gradient_b
where W represents the weight, b represents the bias, learning_rate represents the learning rate, and gradient_W and gradient_b represent the corresponding gradients.
Preferably, in order to prevent overfitting, larger weight values can be penalized by adding an L2 regularization term to the loss function when updating the parameters, thereby reducing the complexity of the model.
L2 regularization:
loss = loss + lambda * sum(W^2)
where lambda represents the regularization parameter and W represents the weight.
(4) Repeating the step (3) for a plurality of times until reaching a convergence condition or reaching a predetermined training iteration number.
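The parameter update in the steps above (stochastic gradient descent with the L2 penalty lambda * sum(W^2) folded into the loss) can be sketched as follows; the function name and default values are illustrative assumptions:

```python
import numpy as np

def sgd_l2_step(W, grad_W, learning_rate=0.1, lam=0.01):
    """One SGD step where the loss carries the penalty lam * sum(W^2);
    the penalty contributes 2 * lam * W to the gradient, shrinking large
    weights toward zero and reducing model complexity."""
    return W - learning_rate * (grad_W + 2.0 * lam * W)
```

The same update applies to the bias b, usually without the regularization term.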
In this embodiment, the output of the DBN is preferably connected to the input layer of the LSTM network. Assuming the DBN has n layers, the output of the i-th layer serves as the input at the i-th time step of the LSTM network. A cross entropy loss function is used to define the fine-tuning target of the DBN. Assuming the output of the DBN is y and the output of the LSTM network is y', the fine-tuning target is defined as: loss = -sum(y * log(y')), where sum denotes summation over all time steps.
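A one-line sketch of the fine-tuning target loss = -sum(y * log(y')); the epsilon guard against log(0) is an addition for numerical safety, not part of the original description:

```python
import numpy as np

def dbn_finetune_loss(y, y_pred, eps=1e-12):
    """Cross-entropy fine-tuning target -sum(y * log(y')), summed over
    all time steps; eps guards against taking log(0)."""
    return float(-np.sum(y * np.log(y_pred + eps)))
```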
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and is not intended to limit the application in any way, and any person skilled in the art may make modifications or alterations to the disclosed technical content to the equivalent embodiments. However, any simple modification, equivalent variation and variation of the above embodiments according to the technical substance of the present application still fall within the protection scope of the technical solution of the present application.
Claims (2)
1. The impregnated paper raw material management system integrating the knowledge graph and the neural network is characterized by comprising a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit and a prediction optimization unit; the stock management unit, the material analysis and comparison unit, the quality management unit and the prediction optimization unit are respectively connected with the raw material knowledge graph library;
the stock management unit automatically acquires supplier information and creates a purchase order through an interface with a supplier database, and associates the warehousing information of raw materials with a knowledge graph;
the material analysis and comparison unit is used for analyzing and comparing the attributes of the raw materials through the knowledge graph, and finding out the raw materials related to the specific attributes for selecting or optimizing the formula;
the quality management unit is used for associating the quality inspection result with the raw material knowledge graph through an interface of the quality inspection system;
the prediction optimizing unit analyzes the historical data of the raw materials through a knowledge graph, predicts the demand quantity and the inventory change of the raw materials, and optimizes the purchase plan and the inventory management of the raw materials;
the inventory management unit comprises a raw material purchasing database and an inventory database;
in the purchasing process, a raw material purchasing database records raw material information purchased each time, and correlates a material knowledge graph with purchasing and warehousing data;
associating the raw materials in the knowledge graph with the raw materials in the stock database, recording the stock information into the stock database when the raw materials are put in stock, and updating the stock quantity;
the material knowledge graph is associated with purchasing and warehousing data, and the material knowledge graph is concretely as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to obtain the source and quality of the raw materials;
the prediction optimization unit comprises a word vector conversion module, a word embedding layer, an LSTM model and an analysis optimization module, and is specifically as follows:
the word vector conversion module extracts entities and relations of the inventory change related data in the knowledge graph to convert word vectors and convert unstructured and semi-structured data into structured data;
the word embedding layer constructs a training data set of the word embedding layer by taking the converted structured data as input, learns word vectors by using a DBN deep confidence network, captures semantic information of words and enriches the characteristics of the training set;
the LSTM model is trained based on a training set to obtain a raw material prediction model;
the DBN deep belief network is constructed as follows:
for RBM of each layer, training by using contrast divergence algorithm, wherein a gradient descent method is used to minimize the energy function of RBM during training;
for the first layer RBM, the input is word vector representation, and the output is the activation value of the hidden layer; for RBM of the subsequent layer, inputting an activation value of a hidden layer of the RBM of the previous layer, and outputting an activation value of a hidden layer of the current layer;
after training, each RBM learns a set of weights and deviations;
the training is performed by using a contrast divergence algorithm, specifically:
the gradient descent method is used to minimize the energy function of the RBM, which is defined as follows:
E(v, h) = -sum(W * v * h) - sum(b_v * v) - sum(b_h * h)
wherein v represents the state of the visible layer, h represents the state of the hidden layer, W represents the weight matrix between the visible layer and the hidden layer, b_v represents the bias vector of the visible layer, and b_h represents the bias vector of the hidden layer;
the training process using the contrast divergence algorithm is as follows:
(1) Initializing the state v of the visible layer of the RBM as a training sample;
(2) Using Gibbs sampling to perform approximate inference to obtain a state h of the hidden layer;
sampling the state h of the hidden layer by calculating the conditional probability P (h|v)
P(h|v) = sigmoid(b_h + W^T * v)
wherein sigmoid denotes the sigmoid function, b_h is the bias vector of the hidden layer, W^T is the transpose of the weight matrix W, and v is the given visible state;
sampling the state v of the visible layer by calculating the conditional probability P (v|h)
P(v|h) = sigmoid(b_v + W * h)
wherein b_v is the bias vector of the visible layer, W is the weight matrix, and h is the given hidden state;
repeating the above two sampling steps a plurality of times until the preset requirement is met;
(3) Calculating the gradient of the RBM and updating the parameters of the RBM by using a gradient descent method:
calculating the correlation between the visible layer and the hidden layer: pos_corr = v * h^T;
calculating the balanced correlation between the visible layer and the hidden layer: neg_corr = v' * h'^T;
where v' and h' are new samples obtained by Gibbs sampling, pos_corr is the correlation factor between the visible layer and the hidden layer, and neg_corr is the balanced correlation factor between the visible layer and the hidden layer;
updating parameters: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate * (pos_corr - neg_corr); delta_b_v = learning_rate * (v - v'); delta_b_h = learning_rate * (h - h'), learning_rate being the learning rate;
(4) Repeating the step (2) and the step (3) until reaching a convergence condition or reaching a preset training iteration number;
the raw material prediction model is constructed specifically as follows:
dividing the data processed by the word embedding layer into a training set and a testing set;
constructing an LSTM model, wherein the LSTM model comprises three LSTM layers, a Dropout layer and a full connection layer;
in the training process of the model, compiling by using an Adam optimizer and a mean square error loss function, and training based on data of a training set and designating the training iteration times epochs and the sample number batch_size of each batch;
finally, predicting test set data by using the trained LSTM model, and updating the weight and deviation of the LSTM network model by using a back propagation algorithm to minimize a loss function;
the gradient of the loss function to the weight and deviation is calculated by using a back propagation algorithm, in particular:
calculating the gradients layer by layer according to the chain rule, starting from the last layer;
a. calculating the gradient of the output layer:
δL = ∂Loss/∂aL ;
wherein ∂ represents the partial derivative, loss is a Loss function, aL is the output value of the output layer;
b. calculating gradient of hidden layer:
δl = (∂Loss/∂zl) ⊙ (Wl+1^T * δl+1)
wherein ⊙ denotes element-wise multiplication, zl is the weighted input value of the hidden layer, Wl+1 is the weight of the next layer, Wl+1^T is its transpose, and δl+1 is the gradient of the next layer;
c. calculating the gradient of the weights and the deviations:
∂Loss/∂Wl = δl * al-1^T;
wherein al-1 is the output value of the previous layer;
∂Loss/∂bl = δl
finally, the weights and biases of the network are updated according to the gradients using the stochastic gradient descent method.
2. The impregnated paper raw material management system fusing a knowledge graph and a neural network as claimed in claim 1, wherein the raw material knowledge graph base is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
-raw material type: representing the type of raw material;
-raw material suppliers: vendor information representing raw materials;
-raw material properties: other properties representing raw materials;
attribute relationship: using edges to represent attribute relationships between raw materials includes:
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
-supply relation: establishing edges according to the supply relation between suppliers and raw materials;
hierarchical structure of raw material properties: representing hierarchical relationships between raw material properties using a hierarchical structure, comprising:
-a parent attribute node;
-a child attribute node;
-attribute relationship edge: representing a hierarchical relationship between a parent attribute node and a child attribute node;
the knowledge graph comprises a secondary unfolding graph, wherein a primary unfolding schematic diagram shows the formula, the composition components and the raw materials of the impregnated paper; a two-stage development schematic diagram for showing the kinds of materials used for raw materials and corresponding supplier information, inventory data, purchasing information and attribute data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311005069.9A CN116720819B (en) | 2023-08-10 | 2023-08-10 | Impregnated paper raw material management system integrating knowledge graph and neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116720819A CN116720819A (en) | 2023-09-08 |
CN116720819B true CN116720819B (en) | 2023-10-27 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111061882A (en) * | 2019-08-19 | 2020-04-24 | 广州利科科技有限公司 | Knowledge graph construction method |
CN111428046A (en) * | 2020-03-18 | 2020-07-17 | 浙江网新恩普软件有限公司 | Knowledge graph generation method based on bidirectional L STM deep neural network |
WO2021107448A1 (en) * | 2019-11-25 | 2021-06-03 | 주식회사 데이터마케팅코리아 | Method and apparatus for providing knowledge graph-based marketing information analysis service to support efficient document classification processing |
CN114610898A (en) * | 2022-03-09 | 2022-06-10 | 北京航天智造科技发展有限公司 | Method and system for constructing supply chain operation knowledge graph |
CN114896408A (en) * | 2022-03-24 | 2022-08-12 | 北京大学深圳研究生院 | Construction method of material knowledge graph, material knowledge graph and application |
CN115605894A (en) * | 2020-09-03 | 2023-01-13 | 京东方科技集团股份有限公司(Cn) | Intelligent management system, intelligent management method and computer program product |
CN115858805A (en) * | 2022-11-21 | 2023-03-28 | 江苏科继佳信息技术有限公司 | Knowledge graph construction management system and method |
CN116070973A (en) * | 2022-11-30 | 2023-05-05 | 中化创新(北京)科技研究院有限公司 | Knowledge graph-based digital factory management index intelligent recommendation method |
CN116362371A (en) * | 2022-12-26 | 2023-06-30 | 青岛檬豆网络科技有限公司 | Knowledge graph-based purchasing prediction system and method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114168745B (en) * | 2021-11-30 | 2022-08-09 | 大连理工大学 | Knowledge graph construction method for production process of ethylene oxide derivative |
Non-Patent Citations (2)
Title |
---|
Design and Implementation of an Online Ceramic Mineral Resource Management Knowledge Base Based on LINGO; Zhong Ling; Guo Ting; Zhang Ming; Intelligent Computer and Applications (Issue 01); full text *
Research on Inventory Management Based on Knowledge Graph Analysis; Ye Yong; Journal of Information Resources Management (Issue 01); full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||