CN116720819B - Impregnated paper raw material management system integrating knowledge graph and neural network - Google Patents


Info

Publication number
CN116720819B
CN116720819B (application number CN202311005069.9A)
Authority
CN
China
Legal status: Active
Application number
CN202311005069.9A
Other languages
Chinese (zh)
Other versions
CN116720819A (en)
Inventor
黄旭丹
黄建超
Current Assignee
Fujian Minqing Shuangleng Paper Co ltd
Original Assignee
Fujian Minqing Shuangleng Paper Co ltd
Priority date
Filing date
Publication date
Application filed by Fujian Minqing Shuangleng Paper Co., Ltd.
Priority to CN202311005069.9A
Publication of CN116720819A
Application granted
Publication of CN116720819B

Classifications

    • G06Q10/0875 — Itemisation or classification of parts, supplies or services, e.g. bill of materials
    • G06Q10/06315 — Needs-based resource requirements planning or analysis
    • G06Q10/06395 — Quality analysis or management
    • G06N3/042 — Knowledge-based neural networks; logical representations of neural networks
    • G06N3/0442 — Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/048 — Activation functions
    • G06N3/084 — Backpropagation, e.g. using gradient descent
    • Y02P90/30 — Computing systems specially adapted for manufacturing


Abstract

The application relates to an impregnated paper raw material management system integrating a knowledge graph and a neural network. The system comprises an inventory management unit, which automatically acquires supplier information and creates purchase orders through an interface with the supplier database and associates the warehousing information of raw materials with the knowledge graph; a material analysis and comparison unit, which analyses and compares the attributes of raw materials through the knowledge graph and finds raw materials related to specific attributes for formula selection or optimization; a quality management unit, which associates quality inspection results with the raw material knowledge graph through an interface of the quality inspection system; and a prediction optimization unit, which analyses the historical data of raw materials through the knowledge graph, predicts raw material demand and inventory changes, and optimizes purchasing plans and inventory management. The application realizes comprehensive visual management and monitoring of impregnated paper raw materials, improves production efficiency, and reduces costs.

Description

Impregnated paper raw material management system integrating knowledge graph and neural network
Technical Field
The application relates to the field of data management, in particular to an impregnated paper raw material management system integrating a knowledge graph and a neural network.
Background
At present, impregnated paper raw material management is mainly carried out through manual recording, which has the following problems:
Inaccurate data: manual operation is prone to errors and omissions, so the accuracy of raw material information cannot be guaranteed. This may lead to inaccurate impregnation proportioning during production, affecting product quality;
Data not real-time: manual recording and management require time and labor, so raw material information is not updated and transmitted promptly. This leads to inaccurate production planning and inventory management, affecting production efficiency and inventory costs;
Data not traceable: manual recording cannot provide a complete data tracing function, so the sources, usage, and trends of raw materials cannot be known accurately. This makes quality problems difficult to trace, affecting quality management and risk control;
Lack of analysis and decision support: manual recording cannot provide effective data analysis and decision support, so the usage and benefit of raw materials cannot be further analyzed and evaluated. This leads to unreasonable raw material procurement and use, affecting cost control and resource utilization efficiency.
With the development of the impregnated paper industry, production scale and complexity are increasing, and conventional manual recording and management can no longer keep up.
Disclosure of Invention
To solve these problems, the application provides an impregnated paper raw material management system integrating a knowledge graph and a neural network: a system for comprehensively managing and monitoring the quality control, inventory management, and related aspects of impregnated paper raw materials, which can improve production efficiency, reduce production cost, improve product quality, and realize visual management.
In order to achieve the above purpose, the present application adopts the following technical scheme:
a impregnated paper raw material management system integrating a knowledge graph and a neural network comprises a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit and a prediction optimization unit; the stock management unit, the material analysis and comparison unit, the quality management unit and the prediction optimization unit are respectively connected with the raw material knowledge graph library;
the stock management unit automatically acquires supplier information and creates a purchase order through an interface with a supplier database, and associates the warehousing information of raw materials with a knowledge graph;
the material analysis and comparison unit is used for analyzing and comparing the attributes of the raw materials through the knowledge graph, and finding out the raw materials related to the specific attributes for selecting or optimizing the formula;
the quality management unit is used for associating the quality inspection result with the raw material knowledge graph through an interface of the quality inspection system;
the prediction optimizing unit analyzes the historical data of the raw materials through a knowledge graph, predicts the demand quantity and the inventory change of the raw materials, and optimizes the purchase plan and the inventory management of the raw materials.
Further, the raw material knowledge graph is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
-raw material type: representing the type of raw material;
-raw material suppliers: vendor information representing raw materials;
-raw material properties: other properties representing raw materials;
attribute relationship: using edges to represent attribute relationships between raw materials includes:
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
-supply relation: establishing edges according to the supply relation between suppliers and raw materials;
hierarchical structure of raw material properties: representing hierarchical relationships between raw material properties using a hierarchical structure, comprising:
-a parent attribute node;
-a child attribute node;
-attribute relationship edge: representing a hierarchical relationship between a parent attribute node and a child attribute node;
the knowledge graph comprises a secondary unfolding graph, wherein a primary unfolding schematic diagram shows the formula, the composition components and the raw materials of the impregnated paper; a two-stage development schematic diagram for showing the kinds of materials used for raw materials and corresponding supplier information, inventory data, purchasing information and attribute data.
Further, the inventory management unit comprises a raw material purchasing database and an inventory database;
during purchasing, the raw material purchasing database records the raw material information of each purchase, and the material knowledge graph is associated with the purchasing and warehousing data;
the raw materials in the knowledge graph are associated with the raw materials in the inventory database; when raw materials are put into storage, the warehousing information is recorded in the inventory database and the inventory quantity is updated.
Further, the correlation between the material knowledge graph and the purchasing and warehousing data is specifically as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
and matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to acquire the source and quality of the raw materials.
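The three matching steps above can be sketched as follows; the field names, sample records, and the `check_record` helper are assumptions for illustration only:

```python
# Sketch of associating purchase/warehousing records with knowledge graph
# nodes through the raw material's unique identifier, then checking
# attribute and supplier consistency. All field names are assumptions.

graph_nodes = {
    "M001": {"name": "phenolic resin", "supplier": "Supplier A"},
    "M002": {"name": "cellulose acetate", "supplier": "Supplier B"},
}

purchase_records = [
    {"material_id": "M001", "name": "phenolic resin",
     "supplier": "Supplier A", "qty": 500},
    {"material_id": "M002", "name": "cellulose acetate",
     "supplier": "Supplier C", "qty": 200},   # deliberate supplier mismatch
]

def check_record(record, nodes):
    """Return (matched_node, list of inconsistencies) for one record."""
    node = nodes.get(record["material_id"])   # match on unique identifier
    if node is None:
        return None, ["unknown material id"]
    problems = []
    if node["name"] != record["name"]:
        problems.append("name mismatch")
    if node["supplier"] != record["supplier"]:
        problems.append("supplier mismatch")
    return node, problems

for rec in purchase_records:
    node, problems = check_record(rec, graph_nodes)
    print(rec["material_id"], problems or "OK")
```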
Further, the material analysis and comparison unit predefines the attribute relationships and rules of raw materials, infers other attributes related to the strength of the impregnated paper based on the reasoning and association functions of the raw material knowledge graph and the known attribute relationships and rules, and finds raw materials that are related to impregnated paper strength and meet user requirements;
given n defined attributes correlated with the strength attribute Y of the impregnated paper, the impregnated paper strength is:
Y = β0 + ΣβiXi,
where Xi is the i-th raw material attribute, βi the corresponding regression coefficient, and 1 ≤ i ≤ n.
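A minimal numpy sketch of fitting this linear strength model by ordinary least squares; the synthetic data and "true" coefficients below are invented purely for illustration:

```python
import numpy as np

# Sketch: fit Y = b0 + sum(bi * Xi) by ordinary least squares.
# The coefficients below are made up purely to generate test data.
rng = np.random.default_rng(0)
n_samples, n_attrs = 50, 3
X = rng.uniform(0, 1, size=(n_samples, n_attrs))   # raw material attributes
true_beta = np.array([2.0, 0.5, -1.0, 3.0])        # [b0, b1, b2, b3]
Y = true_beta[0] + X @ true_beta[1:]               # noiseless strengths

# Design matrix with an intercept column, solved by least squares.
A = np.hstack([np.ones((n_samples, 1)), X])
beta_hat, *_ = np.linalg.lstsq(A, Y, rcond=None)

print(np.round(beta_hat, 3))   # recovers [2.0, 0.5, -1.0, 3.0]
```

On noiseless data the fit is exact; with measured strength data the same call returns the least-squares estimate.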
Further, the prediction optimization unit comprises a word vector conversion module, a word embedding layer, an LSTM model, and an analysis optimization module, constructed as follows:
the word vector conversion module extracts entities and relations of the inventory-change-related data in the knowledge graph, converts them into word vectors, and turns unstructured and semi-structured data into structured data;
the word embedding layer takes the converted structured data as input to build its training data set, learns word vectors using a DBN (deep belief network), captures the semantic information of words, and enriches the features of the training set;
the LSTM model is trained on this training set to obtain the raw material prediction model.
Further, the DBN deep belief network is constructed as follows:
each layer's RBM is trained with the contrastive divergence algorithm, using gradient descent to minimize the RBM's energy function during training;
for the first-layer RBM, the input is the word vector representation and the output is the hidden-layer activations; for each subsequent RBM, the input is the hidden-layer activations of the previous RBM and the output is the hidden-layer activations of the current layer;
after training, each RBM has learned a set of weights and biases.
Further, training with the contrastive divergence algorithm is specifically as follows:
gradient descent is used to minimize the energy function of the RBM, defined as:
E(v, h) = −vᵀ W h − b_vᵀ v − b_hᵀ h
where v is the state of the visible layer, h the state of the hidden layer, W the weight matrix between the visible and hidden layers, b_v the bias vector of the visible layer, and b_h the bias vector of the hidden layer.
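The energy function can be written directly in numpy; the tiny dimensions and values below are arbitrary illustration:

```python
import numpy as np

# Direct translation of the RBM energy E(v,h) = -v^T W h - b_v^T v - b_h^T h.
# Dimensions and values are arbitrary, chosen so the result is easy to check.

def rbm_energy(v, h, W, b_v, b_h):
    return -v @ W @ h - b_v @ v - b_h @ h

v = np.array([1.0, 0.0, 1.0])        # visible states
h = np.array([1.0, 1.0])             # hidden states
W = np.zeros((3, 2)); W[0, 0] = 0.5  # one nonzero weight for clarity
b_v = np.array([0.1, 0.1, 0.1])
b_h = np.array([0.2, 0.2])

# Contributions: weight term -0.5, visible bias -(0.1+0.1), hidden bias -(0.2+0.2)
print(rbm_energy(v, h, W, b_v, b_h))   # → -1.1
```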
Further, the training process is as follows:
(1) Initializing the state v of the visible layer of the RBM as a training sample;
(2) Using Gibbs sampling to perform approximate inference to obtain a state h of the hidden layer;
sampling the state h of the hidden layer from the conditional probability P(h|v):
P(h|v) = sigmoid(b_h + Wᵀv)
where sigmoid is the logistic function, b_h the hidden-layer bias, W the weight matrix, and v the given visible state;
sampling the state v of the visible layer from the conditional probability P(v|h):
P(v|h) = sigmoid(b_v + Wh)
where b_v is the visible-layer bias and h the given hidden state;
-repeating the above two steps a number of times until a steady state is reached;
(3) Calculating the gradient of the RBM and updating its parameters by gradient descent:
- calculating the data-driven correlation between the visible and hidden layers: pos_corr = v · hᵀ;
- calculating the equilibrium (reconstruction) correlation: neg_corr = v′ · h′ᵀ;
where v′ and h′ are the new samples obtained by Gibbs sampling, pos_corr is the correlation term between the visible and hidden layers, and neg_corr is the equilibrium correlation term;
- updating the weight matrix W;
- updating the bias vector b_v of the visible layer;
- updating the bias vector b_h of the hidden layer;
- parameter updates: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate × (pos_corr − neg_corr), delta_b_v = learning_rate × (v − v′), delta_b_h = learning_rate × (h − h′), and learning_rate is the learning rate;
(4) Repeating (2) and (3) until a convergence condition is reached or a predetermined number of training iterations is reached.
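Steps (1)-(4) can be sketched as a CD-1 loop for a single RBM in numpy. This is an illustration, not the patent's implementation: the dimensions, learning rate, and data are arbitrary, and hidden-unit probabilities (a common contrastive divergence variant) are used in the correlation terms rather than sampled states:

```python
import numpy as np

# Sketch of CD-1 training for one RBM, following steps (1)-(4) above.
# Dimensions, learning rate, and the training sample are arbitrary.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
n_visible, n_hidden, lr = 6, 4, 0.1
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)

def cd1_step(v0):
    # (2) Gibbs sampling: P(h|v) = sigmoid(b_h + W^T v), then reconstruct.
    ph0 = sigmoid(b_h + W.T @ v0)
    h0 = (rng.uniform(size=n_hidden) < ph0).astype(float)
    pv1 = sigmoid(b_v + W @ h0)              # P(v|h) = sigmoid(b_v + W h)
    v1 = (rng.uniform(size=n_visible) < pv1).astype(float)
    ph1 = sigmoid(b_h + W.T @ v1)
    # (3) Correlations and parameter deltas (probabilities used for h).
    pos_corr = np.outer(v0, ph0)             # v · h^T  (data-driven)
    neg_corr = np.outer(v1, ph1)             # v' · h'^T (reconstruction)
    return (lr * (pos_corr - neg_corr),      # delta_W
            lr * (v0 - v1),                  # delta_b_v
            lr * (ph0 - ph1))                # delta_b_h

v0 = np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0])   # (1) a training sample
for _ in range(100):                             # (4) repeat until done
    dW, db_v, db_h = cd1_step(v0)
    W += dW; b_v += db_v; b_h += db_h

# Reconstruction probability of the training sample after training.
print(np.round(sigmoid(b_v + W @ sigmoid(b_h + W.T @ v0)), 2))
```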
Further, the raw material prediction model is constructed as follows:
the data processed by the word embedding layer is divided into a training set and a test set;
an LSTM model is constructed, comprising three LSTM layers, a Dropout layer, and a fully connected layer;
during training, the model is compiled with the Adam optimizer and a mean squared error loss function, and trained on the training set with a specified number of training iterations (epochs) and samples per batch (batch_size);
finally, the trained LSTM model predicts on the test set; during training, the backpropagation algorithm updates the weights and biases of the LSTM network to minimize the loss function.
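The three-layer LSTM described here would normally be built in a deep learning framework; as a framework-free illustration of the underlying cell arithmetic, here is a single LSTM cell step in numpy (the shapes, weights, and input sequence are hypothetical, and training is omitted):

```python
import numpy as np

# Framework-free sketch of one LSTM cell step (gates only, no training).
# In practice the three-layer LSTM + Dropout + dense model described above
# would be built in a deep learning framework; this shows the cell math.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, params):
    """One step: input/forget/output gates plus candidate cell state."""
    W, U, b = params            # input weights, recurrent weights, biases
    z = W @ x + U @ h_prev + b  # stacked pre-activations, shape (4*H,)
    H = h_prev.shape[0]
    i = sigmoid(z[0*H:1*H])     # input gate
    f = sigmoid(z[1*H:2*H])     # forget gate
    o = sigmoid(z[2*H:3*H])     # output gate
    g = np.tanh(z[3*H:4*H])     # candidate cell state
    c = f * c_prev + i * g      # new cell state
    h = o * np.tanh(c)          # new hidden state
    return h, c

rng = np.random.default_rng(2)
D, H = 5, 3                                   # input dim, hidden dim
params = (0.1 * rng.standard_normal((4*H, D)),
          0.1 * rng.standard_normal((4*H, H)),
          np.zeros(4*H))
h, c = np.zeros(H), np.zeros(H)
for t in range(4):                            # a length-4 input sequence
    x = rng.standard_normal(D)                # e.g. an embedded word vector
    h, c = lstm_step(x, h, c, params)
print(h.shape)   # (3,)
```

Stacking three such layers means feeding each layer's sequence of `h` values as the next layer's inputs; the final `h` goes through the dense output layer.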
The application has the following beneficial effects:
1. The application manages impregnated paper raw materials based on knowledge graph technology, realizing full-life-cycle management of raw materials and providing more intelligent and efficient raw material management;
2. The application uses the knowledge graph to deeply analyze and compare raw material attributes, helping users better understand the characteristics, strengths, and weaknesses of raw materials and assisting analysis and decision-making;
3. The application fuses the knowledge graph, the DBN, and the LSTM to analyze and predict the historical data of raw materials, helping users better understand raw material demand and inventory changes and make more accurate supply-chain decisions. The weights and biases of the LSTM network are updated with backpropagation to minimize the loss function; an improved stochastic gradient descent method accelerates training, effectively improving the computational efficiency of the LSTM model and saving computation cost.
Drawings
FIG. 1 is a diagram of a system architecture of the present application;
FIG. 2 is a schematic diagram of an embodiment of the present application;
FIG. 3 is a first-order development schematic diagram of a knowledge graph according to an embodiment of the application;
fig. 4 is a schematic diagram of a second-level development of a knowledge graph according to an embodiment of the application.
Detailed Description
The application is described in further detail below with reference to the attached drawings and specific examples:
referring to fig. 1, the application provides a impregnated paper raw material management system integrating a knowledge graph and a neural network, which comprises a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit and a prediction optimization unit; the stock management unit, the material analysis and comparison unit, the quality management unit and the prediction optimization unit are respectively connected with the raw material knowledge graph library;
the inventory management unit automatically acquires supplier information and creates a purchase order through an interface with a supplier database, and associates the warehousing information of the raw materials with a knowledge graph;
the material analysis and comparison unit is used for analyzing and comparing the attributes of the raw materials through the knowledge graph, and finding out the raw materials related to the specific attributes for selecting or optimizing the formula;
the quality management unit is used for correlating the quality inspection result with the raw material knowledge graph through an interface of the quality inspection system;
and the prediction optimizing unit is used for analyzing the historical data of the raw materials through the knowledge graph, predicting the demand and the inventory change of the raw materials and optimizing the purchase plan and the inventory management of the raw materials.
In this example, the impregnated paper preparation formulation specifically includes: a paper substrate, an impregnating solution and a solvent;
paper base material: cellulose paper is generally used as a base material, and may be wood pulp paper, bamboo pulp paper, or the like.
Impregnating solution: the impregnating solution is a solution for impregnating the paper substrate, and different components can be selected according to different applications, and common impregnating solution components comprise:
resin: such as phenolic resin, cellulose acetate, etc., for increasing the strength and water resistance of the paper.
Flame retardant: such as magnesium chloride, ammonium phosphate, etc., for improving the flame retardant properties of paper.
Waterproof agent: such as silicone oil, wax, etc., for increasing the water resistance of the paper.
Antibacterial agent: such as silver ions, antimicrobial agents, etc., for increasing the antimicrobial properties of the paper.
Other additives: such as thickeners, pH adjusters, and the like, are used to adjust the properties and performance of the impregnating solution.
Solvent: solvents are used to dissolve and uniformly distribute the components of the impregnating solution in the paper substrate, and common solvents include water, organic solvents, and the like.
Preferably, referring to fig. 3 and 4, the raw material knowledge graph is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
-raw material type: representing the type of raw material, such as plant cellulose, non-woven wood pulp, PE, etc.;
-raw material suppliers: vendor information for the raw material;
-raw material properties: other properties of the raw material, such as inventory quantity and vendor information;
Attribute relationship: using edges to represent attribute relationships between raw materials;
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
hierarchical structure of raw material properties: using a hierarchical structure to represent hierarchical relationships between raw material properties;
-a parent attribute node representing a higher level attribute;
-a sub-attribute node representing a lower level attribute;
-attribute relationship edge: representing a hierarchical relationship between parent attribute nodes and child attribute nodes.
In a preferred embodiment, referring to fig. 3, the first-level expansion diagram of the knowledge graph of this example shows the formulation, composition, and raw materials of the impregnated paper; fig. 4 is the second-level expansion diagram, from which the kinds of materials usable as raw materials and the corresponding supplier information, inventory data, purchasing information, and attribute data can be read directly.
In this embodiment, the inventory management unit includes a raw material purchasing database and an inventory database;
in the purchasing process, a raw material purchasing database records raw material information purchased each time, and correlates a material knowledge graph with purchasing and warehousing data;
and associating the raw materials in the knowledge graph with the raw materials in the stock database, recording the stock information into the stock database when the raw materials are put in stock, and updating the stock quantity.
In this embodiment, it is preferable to correlate the material knowledge graph with the purchase and warehouse-in data, specifically as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
and matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to acquire the source and quality of the raw materials.
In this embodiment, preferably, the attribute relationships and rules of the raw materials are predefined in the material analysis and comparison unit; other attributes related to the required attribute are inferred based on the reasoning and association functions of the raw material knowledge graph and the known attribute relationships and rules, and the optimal scheme for the required attribute is solved;
there are defined n raw material properties correlated with the desired properties Y of the impregnated paper, then the impregnated paper strength:
wherein, the liquid crystal display device comprises a liquid crystal display device,,/> , />respectively the firstiThe parameter vector corresponding to each raw material attribute and the preset correlation function,β 0 and i is more than or equal to 1 and less than or equal to n as regression coefficients.
Preferably, in this embodiment, the material analysis and comparison unit solves an optimal solution for the required attribute, specifically as follows:
collecting a set of data samples containing the required attribute Y and the corresponding raw material attribute x;
selecting a mean square error to measure the difference between the model predicted value and the actual value;
the least squares method is used to estimate the parameter β in the model:
(1) Giving an initial value for beta;
(2) Randomly selecting a training sample (x_j, y_j);
(3) For the selected training sample, updating the parameter β by gradient descent:
β ← β − η·∇E_j(β),
where η is the learning coefficient and ∇E_j(β) is the gradient of the squared training error for the selected sample;
(4) Repeating steps (2) and (3) until β converges to the required accuracy.
Substituting the raw material attributes x into the model with the estimated optimal parameter β, the optimal scheme for the required attribute Y is computed.
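Steps (1)-(4) amount to stochastic gradient descent on the squared error; a sketch assuming linear correlation functions, with data, learning rate, and coefficients invented purely for illustration:

```python
import numpy as np

# Sketch of steps (1)-(4): stochastic gradient descent on the squared
# error of a linear model y = b0 + sum(bi*xi). The learning rate, data,
# and the linear choice of correlation function are assumptions.

rng = np.random.default_rng(3)
n, d, eta = 200, 2, 0.05
X = rng.uniform(-1, 1, size=(n, d))
true_beta = np.array([1.0, 2.0, -0.5])          # [b0, b1, b2], made up
y = true_beta[0] + X @ true_beta[1:]            # noiseless targets

beta = np.zeros(d + 1)                          # (1) initial value for beta
for step in range(5000):
    j = rng.integers(n)                         # (2) random training sample
    xj = np.concatenate(([1.0], X[j]))          # prepend intercept term
    err = beta @ xj - y[j]
    beta -= eta * err * xj                      # (3) gradient step
                                                # (4) repeat to convergence
print(np.round(beta, 2))   # close to [1.0, 2.0, -0.5]
```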
In this embodiment, the prediction optimization unit includes a word vector conversion module, a word embedding layer, an LSTM model, and an analysis optimization module, which work as follows:
the word vector conversion module extracts entities and relations of the inventory-change-related data in the knowledge graph, converts them into word vectors, and turns unstructured and semi-structured data into structured data;
the word embedding layer takes the converted structured data as input to build its training data set, learns word vectors using a DBN (deep belief network), captures the semantic information of words, and enriches the features of the training set;
the LSTM model is trained on this training set to obtain the raw material prediction model.
In this embodiment, the DBN deep belief network is preferably constructed as follows:
each layer's RBM is trained with the contrastive divergence algorithm, using gradient descent to minimize the RBM's energy function during training;
for the first-layer RBM, the input is the word vector representation and the output is the hidden-layer activations; for each subsequent RBM, the input is the hidden-layer activations of the previous RBM and the output is the hidden-layer activations of the current layer;
after training, each RBM has learned a set of weights and biases.
In this embodiment, in the contrastive divergence algorithm, a gradient descent method is used to minimize the energy function of the RBM, which is defined as follows:
E(v, h) = -sum(W * v * h) - sum(b_v * v) - sum(b_h * h)
wherein v represents the state of the visible layer, h represents the state of the hidden layer, W represents the weight matrix between the visible layer and the hidden layer, b_v represents the bias vector of the visible layer, b_h represents the bias vector of the hidden layer, and the training process is as follows:
(1) Initializing the state v of the visible layer of the RBM as a training sample;
(2) Using Gibbs sampling to perform approximate inference to obtain a state h of the hidden layer;
sampling the state h of the hidden layer by calculating the conditional probability P(h|v):
P(h|v) = sigmoid(b_h + W*v)
wherein sigmoid represents the sigmoid function, b_h represents the hidden-layer bias, W represents the weight matrix, and v represents the given visible-layer state;
sampling the state v of the visible layer by calculating the conditional probability P(v|h):
P(v|h) = sigmoid(b_v + W^T*h)
wherein b_v represents the visible-layer bias and h represents the given hidden-layer state;
-repeating the above two steps a number of times until a steady state is reached;
(3) Calculating the gradient of the RBM and updating the parameters of the RBM by using a gradient descent method:
-calculating the correlation between the visible layer and the hidden layer: pos_corr = v * h.transpose();
-calculating the balanced (equilibrium) correlation between the visible layer and the hidden layer: neg_corr = v' * h'.transpose();
where v' and h' are new samples obtained by Gibbs sampling, pos_corr is the correlation factor between the visible layer and the hidden layer, and neg_corr is the balanced correlation factor between the visible layer and the hidden layer;
-updating the weight matrix W;
-updating the bias vector b_v of the visible layer;
-updating the bias vector b_h of the hidden layer;
-updating parameters: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate * (pos_corr - neg_corr); delta_b_v = learning_rate * (v - v'); delta_b_h = learning_rate * (h - h'), learning_rate being the learning rate;
(4) Repeating the step (2) and the step (3) until reaching a convergence condition or reaching a predetermined training iteration number.
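A minimal numerical sketch of the CD-1 procedure in steps (1)-(4) is given below; the layer sizes, random weights, and binary training sample are illustrative assumptions (the correlations here use hidden-unit probabilities rather than sampled states, a common smoothing):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_visible, n_hidden, learning_rate = 6, 4, 0.1
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))  # weight matrix between layers
b_v = np.zeros(n_visible)                              # visible-layer bias b_v
b_h = np.zeros(n_hidden)                               # hidden-layer bias b_h

v = rng.integers(0, 2, size=n_visible).astype(float)   # (1) visible state = training sample

for _ in range(100):
    # (2) Gibbs sampling: P(h|v) = sigmoid(b_h + W v), P(v|h) = sigmoid(b_v + W^T h)
    p_h = sigmoid(b_h + W @ v)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(b_v + W.T @ h)
    v_new = (rng.random(n_visible) < p_v).astype(float)
    p_h_new = sigmoid(b_h + W @ v_new)

    # (3) positive/negative correlations and gradient-style parameter updates
    pos_corr = np.outer(p_h, v)
    neg_corr = np.outer(p_h_new, v_new)
    W += learning_rate * (pos_corr - neg_corr)
    b_v += learning_rate * (v - v_new)
    b_h += learning_rate * (p_h - p_h_new)
# (4) in practice, repeat over many samples until convergence
```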
In this embodiment, the raw material prediction model is constructed specifically as follows:
(1) Dividing the data processed by the word embedding layer into a training set and a testing set;
constructing an LSTM model, wherein the LSTM model comprises three LSTM layers, a Dropout layer and a full connection layer;
in this embodiment, each LSTM layer includes a forget gate, an input gate, and an output gate, as follows:
forget Gate (Forget Gate):
f_t =σ(W_f *[h_(t-1), x_t]+ b_f)
wherein f_t is the output of the forgetting gate, W_f is the weight matrix of the forgetting gate, h_ (t-1) is the hidden state of the last time step, x_t is the input of the current time step, b_f is the bias vector of the forgetting gate, and sigma is the sigmoid function.
Input Gate (Input Gate):
i_t =σ(W_i * [h_(t-1), x_t] + b_i)
wherein i_t is the output of the input gate, W_i is the weight matrix of the input gate, h_ (t-1) is the hidden state of the previous time step, x_t is the input of the current time step, b_i is the bias vector of the input gate, and sigma is the sigmoid function.
New Cell State (Candidate):
C~_t = tanh(W_c * [h_(t-1), x_t] + b_c)
wherein C~_t is the output of the new cell state, W_c is the weight matrix of the new cell state, h_(t-1) is the hidden state of the last time step, x_t is the input of the current time step, b_c is the bias vector of the new cell state, and tanh is the hyperbolic tangent function.
Update Cell State:
C_t = f_t ⊙ C_(t-1) + i_t ⊙ C~_t
wherein C_t is the cell state of the current time step, C_(t-1) is the cell state of the last time step, f_t is the output of the forget gate, i_t is the output of the input gate, C~_t is the output of the new cell state, and ⊙ denotes element-wise multiplication.
Output Gate (Output Gate):
o_t =σ(W_o * [h_(t-1), x_t] + b_o)
wherein o_t is the output of the output gate, W_o is the weight matrix of the output gate, h_ (t-1) is the hidden state of the previous time step, x_t is the input of the current time step, b_o is the bias vector of the output gate, and sigma is the sigmoid function.
Hidden State (Hidden State):
h_t = o_t * tanh(C_t)
wherein h_t is the hidden state of the current time step, o_t is the output of the output gate, C_t is the unit state of the current time step, and tanh is the hyperbolic tangent function.
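The gate equations above can be sketched as a single LSTM cell step; the layer sizes and random weights below are illustrative assumptions rather than the patented three-layer model:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 3, 5
n_cat = n_hid + n_in  # size of the concatenation [h_(t-1), x_t]
W_f, W_i, W_c, W_o = (rng.normal(scale=0.1, size=(n_hid, n_cat)) for _ in range(4))
b_f, b_i, b_c, b_o = (np.zeros(n_hid) for _ in range(4))

def lstm_step(x_t, h_prev, C_prev):
    z = np.concatenate([h_prev, x_t])       # [h_(t-1), x_t]
    f_t = sigmoid(W_f @ z + b_f)            # forget gate
    i_t = sigmoid(W_i @ z + b_i)            # input gate
    C_tilde = np.tanh(W_c @ z + b_c)        # new (candidate) cell state
    C_t = f_t * C_prev + i_t * C_tilde      # update cell state (element-wise)
    o_t = sigmoid(W_o @ z + b_o)            # output gate
    h_t = o_t * np.tanh(C_t)                # hidden state
    return h_t, C_t

h, C = np.zeros(n_hid), np.zeros(n_hid)
for _ in range(4):                          # run a short random input sequence
    h, C = lstm_step(rng.normal(size=n_in), h, C)
```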
(2) In the training process, the model is compiled with an Adam optimizer and a mean square error loss function and trained on the training set data with a specified number of training iterations (epochs) and samples per batch (batch_size); the weights and biases of the LSTM network model are updated using a back propagation algorithm to minimize the loss function, yielding the raw material prediction model.
Preferably, in this embodiment, the weight and bias of the LSTM network model are updated using a back propagation algorithm, minimizing the loss function, as follows:
the training process is accelerated by using a stochastic gradient descent method and combined with regularization techniques to prevent overfitting; the steps and formulas are as follows:
(1) Defining a loss function: selecting a cross entropy loss function to measure the difference between the prediction result of the model and the real label;
(2) Initializing network parameters: randomly initializing the weights and biases of the LSTM network;
(3) Iterative training:
-selecting a small batch of samples (mini-batch): m samples are randomly selected from the training set T = {(u_i, v_i)} to form the mini-batch T1.
Forward propagation: and inputting the selected small batch of samples T1 into an LSTM network for forward propagation to obtain an output value of the network.
-calculating the loss function: comparing the output value f(u_j) of the network with the actual value v_j and calculating the value of the loss function;
Back propagation: the gradient of the loss function over weight and bias is calculated using a back propagation algorithm, specifically:
from the last layer, the gradient is calculated layer by layer according to the chain law.
a. Calculating the gradient of the output layer:
δL = ∂Loss/∂aL
wherein ∂ represents the partial derivative, Loss is the loss function, and aL is the output value of the output layer;
b. Calculating the gradient of the hidden layer:
δl = (∂Loss/∂zl) ⊙ (Wl+1^T * δl+1)
wherein ⊙ denotes element-wise multiplication, zl is the weighted input value of the hidden layer, Wl+1 is the weight matrix of the next layer, Wl+1^T is its transpose, and δl+1 is the gradient of the next layer;
c. Calculating the gradient of the weights and the deviations:
∂Loss/∂Wl = δl * al-1^T; ∂Loss/∂bl = δl
wherein al-1 is the output value of the previous layer;
-updating parameters: the weights and biases of the network are updated from the gradients using the stochastic gradient descent method.
-stochastic gradient descent formula:
- W = W - learning_rate * gradient_W
- b = b - learning_rate * gradient_b
where W represents the weight, b represents the bias, learning_rate represents the learning rate, and gradient_W and gradient_b represent the corresponding gradients.
Preferably, in order to prevent overfitting, larger weight values can be penalized when updating the parameters by adding an L2 regularization term to the loss function, thereby reducing the complexity of the model.
L2 regularization:
loss = loss + lambda * sum(W^2)
where lambda represents the regularization parameter and W represents the weight.
(4) Repeating the step (3) for a plurality of times until reaching a convergence condition or reaching a predetermined training iteration number.
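One parameter-update step combining the stochastic gradient descent formula with the L2 regularization term can be sketched as follows; the small matrices and the "gradients from back propagation" are purely illustrative values, not outputs of the patented model:

```python
import numpy as np

learning_rate, lam = 0.1, 0.01

W = np.array([[0.5, -1.0],
              [2.0,  0.3]])
b = np.array([0.1, -0.2])
gradient_W = np.array([[0.2, 0.0],
                       [-0.1, 0.4]])       # assumed dLoss/dW from back propagation
gradient_b = np.array([0.05, -0.05])       # assumed dLoss/db from back propagation

# the L2 term lambda * sum(W^2) adds 2 * lambda * W to the weight gradient
gradient_W_reg = gradient_W + 2 * lam * W

W = W - learning_rate * gradient_W_reg     # W = W - learning_rate * gradient_W
b = b - learning_rate * gradient_b         # b = b - learning_rate * gradient_b
```

Note that the bias b is conventionally left out of the L2 penalty, as in the formula above, since only large weights inflate model complexity.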
In this embodiment, the output of the DBN is preferably connected to the input layer of the LSTM network. Assuming that the DBN has n layers, the output of the ith layer serves as the input of the ith time step of the LSTM network. A cross entropy loss function is used to define the tuning targets for the DBN. Assuming that the output of the DBN is y and the output of the LSTM network is y', the fine tuning target is defined as: loss= -sum (y x log (y')), where sum represents the sum over all time steps.
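The fine-tuning target loss = -sum(y * log(y')) can be sketched numerically; the toy DBN outputs y and LSTM outputs y_pred below are illustrative assumptions:

```python
import numpy as np

y = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 0.0]])       # assumed DBN outputs, one row per time step
y_pred = np.array([[0.9, 0.1],
                   [0.2, 0.8],
                   [0.6, 0.4]])  # assumed LSTM outputs y'

# cross entropy summed over all time steps: loss = -sum(y * log(y'))
loss = float(-np.sum(y * np.log(y_pred)))
```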
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and is not intended to limit the application in any way, and any person skilled in the art may make modifications or alterations to the disclosed technical content to the equivalent embodiments. However, any simple modification, equivalent variation and variation of the above embodiments according to the technical substance of the present application still fall within the protection scope of the technical solution of the present application.

Claims (2)

1. The impregnated paper raw material management system integrating the knowledge graph and the neural network is characterized by comprising a raw material knowledge graph library, an inventory management unit, a material analysis and comparison unit, a quality management unit and a prediction optimization unit; the stock management unit, the material analysis and comparison unit, the quality management unit and the prediction optimization unit are respectively connected with the raw material knowledge graph library;
the stock management unit automatically acquires supplier information and creates a purchase order through an interface with a supplier database, and associates the warehousing information of raw materials with a knowledge graph;
the material analysis and comparison unit is used for analyzing and comparing the attributes of the raw materials through the knowledge graph, and finding out the raw materials related to the specific attributes for selecting or optimizing the formula;
the quality management unit is used for associating the quality inspection result with the raw material knowledge graph through an interface of the quality inspection system;
the prediction optimizing unit analyzes the historical data of the raw materials through a knowledge graph, predicts the demand quantity and the inventory change of the raw materials, and optimizes the purchase plan and the inventory management of the raw materials;
the inventory management unit comprises a raw material purchasing database and an inventory database;
in the purchasing process, a raw material purchasing database records raw material information purchased each time, and correlates a material knowledge graph with purchasing and warehousing data;
associating the raw materials in the knowledge graph with the raw materials in the stock database, recording the stock information into the stock database when the raw materials are put in stock, and updating the stock quantity;
the material knowledge graph is associated with purchasing and warehousing data, and the material knowledge graph is concretely as follows:
associating the raw materials in the knowledge graph with the raw materials in the purchasing and warehousing data through the unique identifiers of the raw materials;
matching raw material information in the purchasing and warehousing data with raw material attributes in the knowledge graph to verify the accuracy and consistency of the raw materials;
matching the supplier information in the purchasing and warehousing data with the supplier information in the knowledge graph to obtain the source and quality of the raw materials;
the prediction optimization unit comprises a word vector conversion module, a word embedding layer, an LSTM model and an analysis optimization module, and is specifically as follows:
the word vector conversion module extracts entities and relations of the inventory change related data in the knowledge graph to convert word vectors and convert unstructured and semi-structured data into structured data;
the word embedding layer constructs a training data set of the word embedding layer by taking the converted structured data as input, learns word vectors by using a DBN deep confidence network, captures semantic information of words and enriches the characteristics of the training set;
the LSTM model is trained based on a training set to obtain a raw material prediction model;
the DBN deep belief network is constructed as follows:
for the RBM of each layer, training is performed using the contrastive divergence algorithm, wherein a gradient descent method is used during training to minimize the energy function of the RBM;
for the first layer RBM, the input is word vector representation, and the output is the activation value of the hidden layer; for RBM of the subsequent layer, inputting an activation value of a hidden layer of the RBM of the previous layer, and outputting an activation value of a hidden layer of the current layer;
after training, each RBM learns a set of weights and deviations;
the training is performed by using a contrastive divergence algorithm, specifically:
the gradient descent method is used to minimize the energy function of the RBM, which is defined as follows:
E(v, h) = -sum(W * v * h) - sum(b_v * v) - sum(b_h * h)
wherein v represents the state of the visible layer, h represents the state of the hidden layer, W represents the weight matrix between the visible layer and the hidden layer, b_v represents the bias vector of the visible layer, and b_h represents the bias vector of the hidden layer;
the training process using the contrastive divergence algorithm is as follows:
(1) Initializing the state v of the visible layer of the RBM as a training sample;
(2) Using Gibbs sampling to perform approximate inference to obtain a state h of the hidden layer;
sampling the state h of the hidden layer by calculating the conditional probability P(h|v):
P(h|v) = sigmoid(b_h + W*v)
wherein sigmoid represents the sigmoid function, b_h represents the hidden-layer bias, W represents the weight matrix, and v represents the given visible-layer state;
sampling the state v of the visible layer by calculating the conditional probability P(v|h):
P(v|h) = sigmoid(b_v + W^T*h)
wherein b_v represents the visible-layer bias and h represents the given hidden-layer state;
repeating the above two sampling steps a plurality of times until the preset requirement is met;
(3) Calculating the gradient of the RBM and updating the parameters of the RBM by using a gradient descent method:
calculating the correlation between the visible layer and the hidden layer: pos_corr = v * h.transpose();
calculating the balanced (equilibrium) correlation between the visible layer and the hidden layer: neg_corr = v' * h'.transpose();
where v' and h' are new samples obtained by Gibbs sampling, pos_corr is the correlation factor between the visible layer and the hidden layer, and neg_corr is the balanced correlation factor between the visible layer and the hidden layer;
updating parameters: W = W + delta_W; b_v = b_v + delta_b_v; b_h = b_h + delta_b_h;
where delta_W = learning_rate * (pos_corr - neg_corr); delta_b_v = learning_rate * (v - v'); delta_b_h = learning_rate * (h - h'), learning_rate being the learning rate;
(4) Repeating the step (2) and the step (3) until reaching a convergence condition or reaching a preset training iteration number;
the raw material prediction model is constructed specifically as follows:
dividing the data processed by the word embedding layer into a training set and a testing set;
constructing an LSTM model, wherein the LSTM model comprises three LSTM layers, a Dropout layer and a full connection layer;
in the training process of the model, compiling by using an Adam optimizer and a mean square error loss function, and training based on data of a training set and designating the training iteration times epochs and the sample number batch_size of each batch;
finally, predicting test set data by using the trained LSTM model, and updating the weight and deviation of the LSTM network model by using a back propagation algorithm to minimize a loss function;
the gradient of the loss function to the weight and deviation is calculated by using a back propagation algorithm, in particular:
calculating gradients layer by layer according to the chain law, starting from the last layer;
a. calculating the gradient of the output layer:
δL = ∂Loss/∂aL ;
wherein ∂ represents the partial derivative, Loss is the loss function, and aL is the output value of the output layer;
b. calculating gradient of hidden layer:
δl = (∂Loss/∂zl) ⊙ (Wl+1^T * δl+1)
wherein ⊙ denotes element-wise multiplication, zl is the weighted input value of the hidden layer, Wl+1 is the weight matrix of the next layer, Wl+1^T is its transpose, and δl+1 is the gradient of the next layer;
c. calculating the gradient of the weights and the deviations:
∂Loss/∂Wl = δl * al-1^T;
wherein al-1 is the output value of the previous layer;
∂Loss/∂bl = δl
finally, the weights and biases of the network are updated according to the gradients using the stochastic gradient descent method.
2. The impregnated paper raw material management system fusing a knowledge graph and a neural network as claimed in claim 1, wherein the raw material knowledge graph base is constructed as follows:
raw material node: each raw material is represented as a node, which contains the following attributes:
raw material name: a name indicating the raw material;
-raw material number: a unique identifier representing the raw material;
-raw material type: representing the type of raw material;
-raw material suppliers: vendor information representing raw materials;
-raw material properties: other properties representing raw materials;
attribute relationship: using edges to represent attribute relationships between raw materials includes:
-similar attribute relationship: representing that the two raw materials have similar attributes, and establishing edges according to the similarity of the attributes;
-dependency attribute relationship: representing that the attribute of one raw material depends on the attribute of another raw material, edges can be established according to the dependency of the attribute;
-supply relation: establishing edges according to the supply relation between suppliers and raw materials;
hierarchical structure of raw material properties: representing hierarchical relationships between raw material properties using a hierarchical structure, comprising:
-a parent attribute node;
-a child attribute node;
-attribute relationship edge: representing a hierarchical relationship between a parent attribute node and a child attribute node;
the knowledge graph comprises a secondary unfolding graph, wherein a primary unfolding schematic diagram shows the formula, the composition components and the raw materials of the impregnated paper; a two-stage development schematic diagram for showing the kinds of materials used for raw materials and corresponding supplier information, inventory data, purchasing information and attribute data.
CN202311005069.9A 2023-08-10 2023-08-10 Impregnated paper raw material management system integrating knowledge graph and neural network Active CN116720819B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311005069.9A CN116720819B (en) 2023-08-10 2023-08-10 Impregnated paper raw material management system integrating knowledge graph and neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311005069.9A CN116720819B (en) 2023-08-10 2023-08-10 Impregnated paper raw material management system integrating knowledge graph and neural network

Publications (2)

Publication Number Publication Date
CN116720819A CN116720819A (en) 2023-09-08
CN116720819B true CN116720819B (en) 2023-10-27

Family

ID=87872017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311005069.9A Active CN116720819B (en) 2023-08-10 2023-08-10 Impregnated paper raw material management system integrating knowledge graph and neural network

Country Status (1)

Country Link
CN (1) CN116720819B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061882A (en) * 2019-08-19 2020-04-24 广州利科科技有限公司 Knowledge graph construction method
CN111428046A (en) * 2020-03-18 2020-07-17 浙江网新恩普软件有限公司 Knowledge graph generation method based on bidirectional L STM deep neural network
WO2021107448A1 (en) * 2019-11-25 2021-06-03 주식회사 데이터마케팅코리아 Method and apparatus for providing knowledge graph-based marketing information analysis service to support efficient document classification processing
CN114610898A (en) * 2022-03-09 2022-06-10 北京航天智造科技发展有限公司 Method and system for constructing supply chain operation knowledge graph
CN114896408A (en) * 2022-03-24 2022-08-12 北京大学深圳研究生院 Construction method of material knowledge graph, material knowledge graph and application
CN115605894A (en) * 2020-09-03 2023-01-13 京东方科技集团股份有限公司(Cn) Intelligent management system, intelligent management method and computer program product
CN115858805A (en) * 2022-11-21 2023-03-28 江苏科继佳信息技术有限公司 Knowledge graph construction management system and method
CN116070973A (en) * 2022-11-30 2023-05-05 中化创新(北京)科技研究院有限公司 Knowledge graph-based digital factory management index intelligent recommendation method
CN116362371A (en) * 2022-12-26 2023-06-30 青岛檬豆网络科技有限公司 Knowledge graph-based purchasing prediction system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114168745B (en) * 2021-11-30 2022-08-09 大连理工大学 Knowledge graph construction method for production process of ethylene oxide derivative

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061882A (en) * 2019-08-19 2020-04-24 广州利科科技有限公司 Knowledge graph construction method
WO2021107448A1 (en) * 2019-11-25 2021-06-03 주식회사 데이터마케팅코리아 Method and apparatus for providing knowledge graph-based marketing information analysis service to support efficient document classification processing
CN111428046A (en) * 2020-03-18 2020-07-17 浙江网新恩普软件有限公司 Knowledge graph generation method based on bidirectional L STM deep neural network
CN115605894A (en) * 2020-09-03 2023-01-13 京东方科技集团股份有限公司(Cn) Intelligent management system, intelligent management method and computer program product
CN114610898A (en) * 2022-03-09 2022-06-10 北京航天智造科技发展有限公司 Method and system for constructing supply chain operation knowledge graph
CN114896408A (en) * 2022-03-24 2022-08-12 北京大学深圳研究生院 Construction method of material knowledge graph, material knowledge graph and application
CN115858805A (en) * 2022-11-21 2023-03-28 江苏科继佳信息技术有限公司 Knowledge graph construction management system and method
CN116070973A (en) * 2022-11-30 2023-05-05 中化创新(北京)科技研究院有限公司 Knowledge graph-based digital factory management index intelligent recommendation method
CN116362371A (en) * 2022-12-26 2023-06-30 青岛檬豆网络科技有限公司 Knowledge graph-based purchasing prediction system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of an Online Ceramic Mineral Resources Management Knowledge Base Based on LINGO; Zhong Ling; Guo Ting; Zhang Ming; Intelligent Computer and Applications (Issue 01); full text *
Research on Inventory Management Based on Knowledge Graph Analysis; Ye Yong; Journal of Information Resources Management (Issue 01); full text *

Also Published As

Publication number Publication date
CN116720819A (en) 2023-09-08

Similar Documents

Publication Publication Date Title
JP7324827B2 (en) Systems and methods for dynamic incremental recommendations within real-time visual simulations
Yamada et al. OSS reliability measurement and assessment
US20080010232A1 (en) Apparatus and method for learning and reasoning for systems with temporal and non-temporal variables
Soman et al. Automating look-ahead schedule generation for construction using linked-data based constraint checking and reinforcement learning
CN113807900A (en) RF order demand prediction method based on Bayesian optimization
Siddiqi et al. Genetic algorithm for the mutual information-based feature selection in univariate time series data
CN111898867A (en) Airplane final assembly production line productivity prediction method based on deep neural network
Wang et al. On the use of time series and search based software engineering for refactoring recommendation
CN116720819B (en) Impregnated paper raw material management system integrating knowledge graph and neural network
Li et al. Research on the application of multimedia entropy method in data mining of retail business
Rajbahadur et al. Pitfalls analyzer: quality control for model-driven data science pipelines
Fang et al. Using Bayesian network technology to predict the semiconductor manufacturing yield rate in IoT
CN116306923A (en) Evaluation weight calculation method based on knowledge graph
CN111523685B (en) Method for reducing performance modeling overhead based on active learning
Khurana et al. Autonomous predictive modeling via reinforcement learning
Liu et al. The management of simulation validation
Liu An intelligent planning technique-based software requirement analysis
Natarajan et al. Programming by rewards
Beiranvand et al. Bridging the semantic gap for software effort estimation by hierarchical feature selection techniques
Gupta et al. A meta level data mining approach to predict software reusability
Su et al. Scene-aware Activity Program Generation with Language Guidance Supplementary Material
Taktak et al. A Computer-assisted Performance Analysis and Optimization (CPAO) of Manufacturing Systems based on ARENA® Software
Ebufegha et al. A hybrid algorithm for task sequencing problems with iteration in product development
Shapot et al. Creating Intersectoral Economic Models Based on Multi-agent and Linear Programming Approaches: Methods and Software ToolMedium-term forecasts for both sustainable economic development and crisis periods
Iftikhar et al. Automated Classification of Building Objects Using Machine Learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant