CN114372181B - Equipment production intelligent planning method based on multi-mode data - Google Patents
- Publication number: CN114372181B (application CN202111609391.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- neural network
- deep neural
- manufacturing process
- manufacturing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/906—Clustering; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention discloses an intelligent planning method for equipment production based on multi-modal data, comprising the following steps: 1) collecting data on the manufacturing process to obtain multi-modal data; 2) standardizing the multi-modal data; 3) building a training set for a single manufacturing step; 4) constructing a category-to-action mapping for the states of each manufacturing step; 5) constructing and training a deep neural network; 6) classifying the data collected in each manufacturing process with the trained deep neural network; 7) selecting the corresponding action according to the classification result; 8) repeating the classification and action selection for each manufacturing step to complete the action planning. The method makes full use of the multi-modal data collected during manufacturing together with deep neural networks: it can both identify the current state of each step and plan actions according to that state, thereby realizing intelligent action planning for the manufacturing process, reducing manual intervention, and improving the interpretability and accuracy of the action planning.
Description
Technical Field
The invention relates to the technical field of computer artificial intelligence, in particular to an intelligent planning method for equipment production based on multi-mode data.
Background
At present, with the development of Industry 4.0, artificial intelligence is beginning to permeate traditional manufacturing, and the industry is shifting toward intelligent, information-driven production. Intelligent manufacturing generates large amounts of multi-modal data of different types and from multiple sources. How to use the multi-modal data generated during manufacturing to plan the manufacturing process, reduce manual intervention, and improve the interpretability and accuracy of production planning has become a technical difficulty on the path toward intelligent manufacturing, and is also an important research direction in the intelligent-manufacturing field at the present stage.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings of the prior art by providing an intelligent planning method for equipment production based on multi-modal data. The method makes effective use of multi-modal data for production planning: exploiting the multi-modal data collected during manufacturing together with a deep neural network, it can both identify the current state of each step and plan actions according to that state, thereby realizing intelligent action planning for the manufacturing process, reducing manual intervention, and improving the interpretability and accuracy of the action planning.
In order to achieve the above purpose, the technical scheme provided by the invention is as follows: an intelligent planning method for equipment production based on multi-mode data comprises the following steps:
1) Data acquisition is carried out on the manufacturing process by utilizing different types of sensors to obtain multi-mode data, and the multi-mode data are stored in a server;
2) Extracting the multi-modal data from the server, dividing it into image data and numerical data, and preprocessing both types to obtain preprocessed multi-modal data; specifically, for image data, the pixel value matrix is obtained and standardized, while numerical data is standardized directly;
3) Extracting multi-modal data regarding the manufacturing process t for the preprocessed multi-modal data; then, manually labeling the data according to expert knowledge to obtain labels of each piece of data, wherein the extracted multi-mode data and the corresponding labels form a training set;
4) Constructing a mapping relation between category and manufacturing action, so that each category corresponds to one manufacturing action in the training set obtained in the step 3);
5) Inputting the training set in the step 3) into a deep neural network for training; the method comprises the steps that a ReLU function is selected as an activation function of a deep neural network, a loss function is a cross entropy loss function, and a softmax function is adopted to output class probability; obtaining a trained deep neural network by optimizing a loss function;
6) When actual equipment is produced, data acquisition is carried out by utilizing different types of sensors to form a multi-mode data stream, and a trained deep neural network is used for predicting the multi-mode data stream to obtain a classification result;
7) Selecting a manufacturing action corresponding to the classification result of the step 6) according to the class-manufacturing action mapping relation constructed in the step 4), and completing action planning of an actual manufacturing process t;
8) Repeating steps 3) to 7) for each process in the actually operated manufacturing process, thereby forming an intelligent action-strategy plan for equipment production and manufacturing under multi-modal data.
Further, in step 2), the data standardization process is as follows: after the numerical data stored in the server are read, they are standardized by category according to:

X_a* = (X_a - μ_a) / σ_a

where X_a is the original numerical data, μ_a is the mean of the numerical data, σ_a is the standard deviation of the numerical data, and X_a* is the standardized numerical data;

for image data, the pixel value matrix obtained from the sensor is first flattened into a one-dimensional vector to obtain the image vector X_b, which is then standardized according to:

X_b* = (X_b - μ_b) / σ_b

where μ_b is the mean of the image vector, σ_b is the standard deviation of the image vector, and X_b* is the standardized image data;

the standardized image and numerical data are two one-dimensional vectors, which are concatenated to obtain the standardized data X*:

X* = [X_a*, X_b*]
Further, in step 3), data of the manufacturing process t are collected M times, and the collected multi-modal data are represented as the data set X_t:

X_t = {X*_1t, X*_2t, ..., X*_Mt}

where X*_Mt is the standardized data collected at the M-th acquisition; the collected multi-modal data are labeled to obtain the labels of the data set X_t:

Y_t = {Y_1t, Y_2t, ..., Y_Mt}

where Y_Mt is the label corresponding to X*_Mt, determined by an expert; the data set X_t and the corresponding labels Y_t form the training set.
Further, in step 4), a mapping relationship of category-manufacturing action is constructed for the manufacturing process t, specifically as follows:
Assuming that there are S categories for the manufacturing process t, the category set C t is expressed as:
Ct={c1t,c2t,...,cSt}
wherein c St is the S-th class of the manufacturing process t; finally, a manufacturing action set A t is constructed:
At={a1t,a2t,...,aSt}
where a_St is the manufacturing action corresponding to category c_St; this establishes the mapping between categories and manufacturing actions.
Further, in step 5), the training set of step 3) is input into the deep neural network H_t for training; the ReLU function is selected as the activation function of the network, the loss function is the cross-entropy loss, and the softmax function is used for probability calculation. Assuming the deep neural network has L layers, with h_k and h_{k+1} the hidden layers corresponding to the k-th and (k+1)-th layers respectively, the (k+1)-th hidden layer h_{k+1} can be expressed as:

h_{k+1} = δ(W_k h_k)

where W_k is the weight parameter of the k-th hidden layer and δ(·) is the activation function; here the ReLU function is adopted, which introduces nonlinearity into the deep neural network:

δ(x) = max(0, x)

Then, the output Z of the last layer is computed:

Z = W_L h_{L-1}

where W_L is the weight parameter of the last hidden layer and h_{L-1} is the penultimate hidden layer. Using the softmax function, the probability p_i that the input data belongs to class i (i = 1, 2, ..., S) is:

p_i = exp(Z_i) / Σ_{j=1}^{S} exp(Z_j)

where Z_i is the i-th element of the output Z and Z_j is the j-th element (j = 1, 2, ..., S). After obtaining the probabilities p_i, the cross-entropy loss Loss is used as the loss function of the deep neural network H_t:

Loss = - Σ_{r=1}^{S} y_r log(p_r)

where p_r is the probability output for category r, y_r is the label for category r, and S is the number of categories. Finally, the deep neural network is optimized with the RMSprop method to obtain the trained network.
Further, in step 6), the multi-modal data stream R_t of the manufacturing process during actual operation is predicted using the deep neural network trained in step 5); the trained deep neural network is denoted H_t*, and the classification result ĉ_t is then expressed as:

ĉ_t = H_t*(R_t)
Further, in step 7), the classification result obtained in step 6) is matched against the category-manufacturing-action mapping constructed in step 4), and the corresponding manufacturing action is selected.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. the method can effectively reduce the difficulty of manually designing a complex manufacturing process and realize intelligent planning of the action strategy in the manufacturing process.
2. In this method, multi-modal data are collected for each manufacturing step, and a deep neural network is trained for each step from the collected data. The multi-modal data produced during actual operation are then classified by the trained networks, and different manufacturing action strategies are planned for different classification results.
3. The method designs a deep neural network for each manufacturing process, and can effectively perform action adjustment aiming at the data characteristics of different steps; and the status of each step in the manufacturing process can be automatically identified.
4. The method designs different actions according to the current state of each manufacturing process, so that the action adjustment in the manufacturing process has higher interpretability. After the deep neural network is constructed, the action strategy planning method can automatically adjust actions of each step according to multi-mode data in the manufacturing process, can effectively reduce manual intervention and improve automation level.
Drawings
FIG. 1 is a schematic logic flow diagram of the method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but embodiments of the present invention are not limited thereto.
As shown in fig. 1, the method for intelligent planning of equipment production based on multi-mode data provided in this embodiment specifically includes the following steps:
1) And acquiring data in the manufacturing process by using different types of sensors to obtain multi-mode data, and storing the multi-mode data in a server.
2) The multi-modal data is extracted from the server and divided into image data and numerical data, and both types are preprocessed to obtain preprocessed multi-modal data; specifically, for image data, the pixel value matrix is obtained and standardized, while numerical data is standardized directly.
The data standardization process is as follows: after the numerical data stored in the server are read, they are standardized by category according to:

X_a* = (X_a - μ_a) / σ_a

where X_a is the original numerical data, μ_a is the mean of the numerical data, σ_a is the standard deviation of the numerical data, and X_a* is the standardized numerical data;

for image data, the pixel value matrix obtained from the sensor is first flattened into a one-dimensional vector to obtain the image vector X_b, which is then standardized according to:

X_b* = (X_b - μ_b) / σ_b

where μ_b is the mean of the image vector, σ_b is the standard deviation of the image vector, and X_b* is the standardized image data;

the standardized image and numerical data are two one-dimensional vectors, which are concatenated to obtain the standardized data X*:

X* = [X_a*, X_b*]
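As an illustration, the standardization and concatenation described above can be sketched in Python with NumPy; the sample values and array sizes below are arbitrary assumptions, not data from the invention:

```python
import numpy as np

def standardize(v):
    """Z-score a 1-D vector: (v - mean) / std, per the standardization formulas."""
    v = np.asarray(v, dtype=float)
    return (v - v.mean()) / v.std()

def preprocess(numeric, image_matrix):
    """Standardize numerical data X_a and a flattened image pixel matrix X_b,
    then concatenate them into one standardized feature vector X*."""
    xa = standardize(numeric)                                  # X_a* = (X_a - mu_a) / sigma_a
    xb = standardize(np.asarray(image_matrix, float).ravel())  # flatten matrix, then standardize
    return np.concatenate([xa, xb])                            # X* = [X_a*, X_b*]

# Three hypothetical sensor readings plus a 2x2 pixel matrix
x_star = preprocess([20.5, 101.3, 0.8], np.array([[12, 240], [33, 180]]))
print(x_star.shape)  # (7,)
```

Each standardized piece has zero mean and unit variance, so the concatenated vector mixes image and numerical channels on a comparable scale.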
3) For the preprocessed multi-modal data, the multi-modal data concerning the manufacturing process t is extracted; the data are then manually labeled according to expert knowledge to obtain a label for each piece of data, and the extracted multi-modal data and the corresponding labels form a training set, specifically as follows:

Data of the manufacturing process t are collected M times, and the collected multi-modal data are represented as the data set X_t:

X_t = {X*_1t, X*_2t, ..., X*_Mt}

where X*_Mt is the standardized data collected at the M-th acquisition; the collected multi-modal data are labeled to obtain the labels of the data set X_t:

Y_t = {Y_1t, Y_2t, ..., Y_Mt}

where Y_Mt is the label corresponding to X*_Mt, determined by an expert; the data set X_t and the corresponding labels Y_t form the training set.
4) Constructing a mapping relation between category and manufacturing action, so that each category corresponds to one manufacturing action in the training set obtained in the step 3); wherein, the mapping relation of category-manufacturing action is constructed for the manufacturing process t, specifically as follows:
assuming that there are S categories for manufacturing process t, category set C t may be expressed as:
Ct={c1t,c2t,...,cSt}
wherein c St is the S-th class of the manufacturing process t; finally, a manufacturing action set A t is constructed:
At={a1t,a2t,...,aSt}
where a_St is the manufacturing action corresponding to category c_St; this establishes the mapping between categories and manufacturing actions. The correspondence between categories and manufacturing actions is determined by an expert.
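For illustration, such an expert-defined category-to-action mapping can be held in a plain dictionary; the category and action names below are hypothetical examples, not part of the invention:

```python
# Hypothetical categories C_t and manufacturing actions A_t for one process t;
# in practice both sets and their correspondence are defined by domain experts.
categories = ["normal", "overheating", "tool_wear"]      # C_t = {c_1t, c_2t, c_3t}
actions = ["continue", "reduce_speed", "replace_tool"]   # A_t = {a_1t, a_2t, a_3t}

# One-to-one mapping c_it -> a_it
action_map = dict(zip(categories, actions))

print(action_map["overheating"])  # reduce_speed
```

A dictionary keeps the mapping explicit and auditable, which supports the interpretability claim: every planned action traces back to a named state category.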
5) The training set of step 3) is input into the deep neural network H_t for training; the ReLU function is selected as the activation function of the network, the loss function is the cross-entropy loss, and the softmax function is used for probability calculation. Assuming the deep neural network has L layers, with h_k and h_{k+1} the hidden layers corresponding to the k-th and (k+1)-th layers respectively, the (k+1)-th hidden layer h_{k+1} can be expressed as:

h_{k+1} = δ(W_k h_k)

where W_k is the weight parameter of the k-th hidden layer and δ(·) is the activation function; here the ReLU function is adopted, which introduces nonlinearity into the deep neural network:

δ(x) = max(0, x)

Then, the output Z of the last layer is computed:

Z = W_L h_{L-1}

where W_L is the weight parameter of the last hidden layer and h_{L-1} is the penultimate hidden layer. Using the softmax function, the probability p_i that the input data belongs to class i (i = 1, 2, ..., S) is:

p_i = exp(Z_i) / Σ_{j=1}^{S} exp(Z_j)

where Z_i is the i-th element of the output Z and Z_j is the j-th element (j = 1, 2, ..., S). After obtaining the probabilities p_i, the cross-entropy loss Loss is used as the loss function of the deep neural network H_t:

Loss = - Σ_{r=1}^{S} y_r log(p_r)

where p_r is the probability output for category r, y_r is the label for category r, and S is the number of categories. Finally, the deep neural network is optimized with the RMSprop method to obtain the trained deep neural network.
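A minimal NumPy sketch of the forward pass, softmax, cross-entropy loss, and a single RMSprop-style update follows; the layer sizes, random weights, and learning-rate settings are illustrative assumptions, not the network actually trained in the invention:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)                  # delta(x) = max(0, x)

def softmax(z):
    e = np.exp(z - z.max())                    # shift logits for numerical stability
    return e / e.sum()

def forward(x, weights):
    # Hidden layers: h_{k+1} = ReLU(W_k h_k); output layer is linear: Z = W_L h_{L-1}
    h = x
    for W in weights[:-1]:
        h = relu(W @ h)
    return weights[-1] @ h

# Toy network: 7 inputs -> 5 hidden units -> 3 classes (sizes are illustrative)
weights = [rng.normal(0.0, 0.1, (5, 7)), rng.normal(0.0, 0.1, (3, 5))]
x = rng.normal(size=7)                         # one standardized sample X*
y = np.array([0.0, 1.0, 0.0])                  # one-hot label for the second class

p = softmax(forward(x, weights))               # p_i = exp(Z_i) / sum_j exp(Z_j)
loss = -np.sum(y * np.log(p))                  # Loss = -sum_r y_r log(p_r)

# One RMSprop-style update of the output weights (a single illustrative step):
grad = np.outer(p - y, relu(weights[0] @ x))   # dLoss/dW_L for softmax + cross-entropy
cache = 0.1 * grad ** 2                        # running average of squared gradients (first step)
weights[-1] -= 0.01 * grad / (np.sqrt(cache) + 1e-8)
print(float(loss))
```

The gradient expression uses the standard identity that for softmax combined with cross-entropy, dLoss/dZ = p - y, which is why the two are usually paired in practice.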
6) The multi-modal data stream R_t of the manufacturing process in actual operation is predicted using the deep neural network trained in step 5). The trained deep neural network is denoted H_t*, and the classification result ĉ_t can be expressed as:

ĉ_t = H_t*(R_t)
7) The classification result ĉ_t obtained in step 6) is matched against the category-manufacturing-action mapping constructed in step 4), and the corresponding manufacturing action is selected. Suppose ĉ_t corresponds to the manufacturing action â_t in the category-manufacturing-action mapping; the action of the manufacturing process t is then planned as â_t.
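Steps 6) and 7) together amount to classify-then-look-up; the sketch below uses a stub in place of the trained network H_t*, with hypothetical category and action names:

```python
import numpy as np

def h_star_t(x):
    """Stand-in for the trained deep neural network H_t*: returns class
    probabilities. Here the logits are fixed for illustration only."""
    logits = np.array([0.2, 1.7, -0.5])        # stand-in for Z = W_L h_{L-1}
    e = np.exp(logits - logits.max())
    return e / e.sum()                         # softmax probabilities

categories = ["normal", "overheating", "tool_wear"]          # hypothetical C_t
action_map = {"normal": "continue",
              "overheating": "reduce_speed",
              "tool_wear": "replace_tool"}                    # hypothetical C_t -> A_t

x_stream = np.zeros(7)                          # one sample from the data stream R_t
c_hat = categories[int(np.argmax(h_star_t(x_stream)))]        # step 6): classification
a_hat = action_map[c_hat]                       # step 7): select the mapped action
print(c_hat, a_hat)  # overheating reduce_speed
```

Because action selection is a deterministic table lookup on the predicted category, the planning decision is directly explainable from the classifier's output.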
8) The operations of steps 3) to 7) are repeated for each step in the manufacturing process to form the intelligent manufacturing-process action plan under multi-modal data, Â = {â_1, â_2, ..., â_T}, where â_d is the manufacturing action of manufacturing process d (d = 1, 2, ..., T) and T is the number of steps in the manufacturing process.
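The repetition over all T manufacturing processes can be sketched as a simple loop; the per-process classifiers and category-action mappings below are hypothetical stand-ins for the trained networks and expert-defined mappings:

```python
def plan_actions(streams, classifiers, action_maps):
    """Build the action plan {a_1, ..., a_T}: for each process t, classify its
    data stream (step 6) and look up the mapped action (step 7)."""
    plan = []
    for t, x in enumerate(streams):             # one data stream per process t
        c_hat = classifiers[t](x)               # classification result for process t
        plan.append(action_maps[t][c_hat])      # category -> manufacturing action
    return plan

# Two hypothetical processes (T = 2) with a threshold rule standing in for H_t*
streams = [[0.1], [0.9]]
classifiers = [lambda x: "normal" if x[0] < 0.5 else "overheating"] * 2
action_maps = [{"normal": "continue", "overheating": "reduce_speed"}] * 2

print(plan_actions(streams, classifiers, action_maps))  # ['continue', 'reduce_speed']
```

Keeping one classifier and one mapping per process mirrors the patent's design of a separate network per manufacturing step, so each step's data characteristics are handled independently.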
The above examples are preferred embodiments of the present invention, but embodiments of the present invention are not limited thereto; any other change, modification, substitution, combination, or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included in the protection scope of the present invention.
Claims (3)
1. The intelligent planning method for equipment production based on multi-mode data is characterized by comprising the following steps:
1) Data acquisition is carried out on the manufacturing process t by utilizing different types of sensors, so that multi-mode data are obtained and stored in a server;
2) Extracting multi-mode data from a server, dividing the data into two types of image data and numerical data, and preprocessing the two types of data to obtain preprocessed multi-mode data; the method comprises the steps of obtaining a pixel value matrix of image data and carrying out standardization processing on the pixel value matrix of the image data; for numerical data, directly carrying out data standardization processing;
The data standardization process is as follows: after the numerical data stored in the server are read, the data are classified, and each category is standardized according to:

X_a* = (X_a - μ_a) / σ_a

where X_a is the original numerical data, μ_a is the mean of the numerical data, σ_a is the standard deviation of the numerical data, and X_a* is the standardized numerical data;

for image data, the pixel value matrix obtained from the sensor is first flattened into a one-dimensional vector to obtain the image vector X_b, which is then standardized according to:

X_b* = (X_b - μ_b) / σ_b

where μ_b is the mean of the image vector, σ_b is the standard deviation of the image vector, and X_b* is the standardized image data;

the standardized image and numerical data are two one-dimensional vectors, which are concatenated to obtain the standardized data X*:

X* = [X_a*, X_b*]
3) Extracting multi-modal data regarding the manufacturing process t for the preprocessed multi-modal data; then, manually labeling the data according to expert knowledge to obtain labels of each piece of data, wherein the extracted multi-mode data and the corresponding labels form a training set;
Data of the manufacturing process t are collected M times, and the collected multi-modal data are represented as the data set X_t:

X_t = {X*_1t, X*_2t, ..., X*_Mt}

where X*_Mt is the standardized data collected at the M-th acquisition; the collected multi-modal data are labeled to obtain the labels of the data set X_t:

Y_t = {Y_1t, Y_2t, ..., Y_Mt}

where Y_Mt is the label corresponding to X*_Mt, determined by an expert; the data set X_t and the corresponding labels Y_t form the training set;
4) Constructing a mapping relation between category and manufacturing action, so that each category corresponds to one manufacturing action in the training set obtained in the step 3);
the mapping relation of category and manufacturing action is constructed for the manufacturing process t, and the mapping relation is specifically as follows:
Assuming that there are S categories for the manufacturing process t, the category set C t is expressed as:
Ct={c1t,c2t,...,cSt}
wherein c St is the S-th class of the manufacturing process t; finally, a manufacturing action set A t is constructed:
At={a1t,a2t,...,aSt}
Wherein a St is a manufacturing operation corresponding to the category c St, and a mapping relation between the category and the manufacturing operation is constructed;
5) Inputting the training set in the step 3) into a deep neural network for training; the method comprises the steps that a ReLU function is selected as an activation function of a deep neural network, a loss function is a cross entropy loss function, and a softmax function is adopted to output class probability; obtaining a trained deep neural network by optimizing a loss function;
The training set of step 3) is input into the deep neural network H_t for training; the ReLU function is selected as the activation function of the network, the loss function is the cross-entropy loss, and the softmax function is used for probability calculation; assuming the deep neural network has L layers, with h_k and h_{k+1} the hidden layers corresponding to the k-th and (k+1)-th layers respectively, the (k+1)-th hidden layer h_{k+1} can be expressed as:

h_{k+1} = δ(W_k h_k)

where W_k is the weight parameter of the k-th hidden layer and δ(·) is the activation function; here the ReLU function is adopted, which introduces nonlinearity into the deep neural network:

δ(x) = max(0, x)

then, the output Z of the last layer is computed:

Z = W_L h_{L-1}

where W_L is the weight parameter of the last hidden layer and h_{L-1} is the penultimate hidden layer; using the softmax function, the probability p_i that the input data belongs to class i (i = 1, 2, ..., S) is:

p_i = exp(Z_i) / Σ_{j=1}^{S} exp(Z_j)

where Z_i is the i-th element of the output Z and Z_j is the j-th element (j = 1, 2, ..., S); after obtaining the probabilities p_i, the cross-entropy loss Loss is used as the loss function of the deep neural network H_t:

Loss = - Σ_{r=1}^{S} y_r log(p_r)

where p_r is the probability output for category r, y_r is the label for category r, and S is the number of categories; finally, the deep neural network is optimized with the RMSprop method to obtain the trained deep neural network;
6) When actual equipment is produced, data acquisition is carried out by utilizing different types of sensors to form a multi-mode data stream, and a trained deep neural network is used for predicting the multi-mode data stream to obtain a classification result;
7) Selecting a manufacturing action corresponding to the classification result of the step 6) according to the class-manufacturing action mapping relation constructed in the step 4), and completing action planning of an actual manufacturing process t;
8) Repeating the steps 3) to 7) for each process in the actually operated manufacturing process t to form an action strategy intelligent planning of the equipment production manufacturing process under the multi-mode data.
2. The intelligent planning method for equipment production based on multi-modal data according to claim 1, wherein in step 6), the multi-modal data stream R_t of the manufacturing process t during actual operation is predicted using the deep neural network trained in step 5); the trained deep neural network is denoted H_t*, and the classification result ĉ_t is expressed as ĉ_t = H_t*(R_t).
3. The intelligent planning method for equipment production based on multi-modal data according to claim 1, wherein in step 7), the classification result obtained in step 6) is matched against the category-manufacturing-action mapping constructed in step 4), and the corresponding manufacturing action is selected.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111609391.3A CN114372181B (en) | 2021-12-27 | 2021-12-27 | Equipment production intelligent planning method based on multi-mode data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114372181A CN114372181A (en) | 2022-04-19 |
CN114372181B true CN114372181B (en) | 2024-06-07 |
Family
ID=81142263
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111609391.3A Active CN114372181B (en) | 2021-12-27 | 2021-12-27 | Equipment production intelligent planning method based on multi-mode data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114372181B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117112857B (en) * | 2023-10-23 | 2024-01-05 | 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) | Machining path recommending method suitable for industrial intelligent manufacturing |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112784919A (en) * | 2021-02-03 | 2021-05-11 | 华南理工大学 | Intelligent manufacturing multi-mode data oriented classification method |
CN113112086A (en) * | 2021-04-22 | 2021-07-13 | 北京邮电大学 | Intelligent production system based on edge calculation and identification analysis |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9922272B2 (en) * | 2014-09-25 | 2018-03-20 | Siemens Healthcare Gmbh | Deep similarity learning for multimodal medical images |
- 2021-12-27: CN application CN202111609391.3A granted as patent CN114372181B (status: Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112784919A (en) * | 2021-02-03 | 2021-05-11 | 华南理工大学 | Intelligent manufacturing multi-mode data oriented classification method |
CN113112086A (en) * | 2021-04-22 | 2021-07-13 | 北京邮电大学 | Intelligent production system based on edge calculation and identification analysis |
Non-Patent Citations (2)
Title |
---|
Fuzzy classification of design documents in the manufacturing industry; Liu Zhihong, Gu Ning; Journal of Computer-Aided Design & Computer Graphics; 2004-11-20 (No. 11); full text *
A multi-modal feature adaptive clustering method based on deep neural networks; Jing Mingmin; Computer Applications and Software; 2020-10-12 (No. 10); full text *
Also Published As
Publication number | Publication date |
---|---|
CN114372181A (en) | 2022-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111694924B (en) | Event extraction method and system | |
CN108596203B (en) | Optimization method of parallel pooling layer for pantograph carbon slide plate surface abrasion detection model | |
CN111274817A (en) | Intelligent software cost measurement method based on natural language processing technology | |
CN111353379B (en) | Signal measurement feature matching and labeling method based on weight clustering | |
CN112732921B (en) | False user comment detection method and system | |
CN110705607A (en) | Industry multi-label noise reduction method based on cyclic re-labeling self-service method | |
CN114372181B (en) | Equipment production intelligent planning method based on multi-mode data | |
CN111368087A (en) | Chinese text classification method based on multi-input attention network | |
CN110363214A (en) | A kind of contact condition recognition methods of the robotic asssembly based on GWA-SVM | |
CN114676769A (en) | Visual transform-based small sample insect image identification method | |
CN113269182A (en) | Target fruit detection method and system based on small-area sensitivity of variant transform | |
CN114357221A (en) | Self-supervision active learning method based on image classification | |
CN116958700A (en) | Image classification method based on prompt engineering and contrast learning | |
CN114926702B (en) | Small sample image classification method based on depth attention measurement | |
CN108573275B (en) | Construction method of online classification micro-service | |
CN115217534A (en) | Method and system for monitoring service quality state of steam turbine | |
CN115512214A (en) | Indoor visual navigation method based on causal attention | |
CN111144464B (en) | Fruit automatic identification method based on CNN-Kmeans algorithm | |
CN115082726A (en) | Ceramic biscuit product classification method for toilet based on PointNet optimization | |
CN114170460A (en) | Multi-mode fusion-based artwork classification method and system | |
CN111626376A (en) | Domain adaptation method and system based on discrimination joint probability | |
CN115169617B (en) | Mold maintenance prediction model training method, mold maintenance prediction method and system | |
CN117274750B (en) | Knowledge distillation semi-automatic visual labeling method and system | |
CN116310463B (en) | Remote sensing target classification method for unsupervised learning | |
CN115294386B (en) | Image classification method based on regularization supervision loss function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |