CN111832610A - 3D printing organization prediction method, system, medium and terminal equipment - Google Patents
3D printing organization prediction method, system, medium and terminal equipment
- Publication number
- CN111832610A (application number CN202010485985.7A)
- Authority
- CN
- China
- Prior art keywords
- prediction
- learning model
- microstructure
- process parameters
- deep learning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention belongs to the technical field of 3D printing microstructure prediction, and specifically discloses a method, a system, a medium and a terminal device for 3D printing microstructure prediction. The method comprises the following steps: acquiring process parameters; constructing a deep learning model; and substituting the process parameters into the pre-constructed deep learning model to obtain a prediction result. The invention has the beneficial effects that the method can effectively predict the 3D printing microstructure with high precision and does not depend on the accuracy of a computational model.
Description
Technical Field
The invention relates to the technical field of 3D printing, in particular to a method, a system, a medium and a terminal device for 3D printing microstructure prediction.
Background
3D printing technology was born in the 1980s. Unlike the traditional "subtractive" machining approach, 3D printing is a bottom-up manufacturing method, also called additive manufacturing, which builds a part layer by layer from a mathematical model. The technology has attracted wide attention since its birth and has developed rapidly; in recent decades it has remained a focus of research. It is applied in industrial design, architecture, automotive, aerospace, dental, education and other fields, but its application and development are still limited by several factors. Besides the instrument hardware and the printing process parameters, the characterization of a part's microstructure and morphology is also a key factor affecting the quality of 3D printed products.
At present, however, the simulation of the 3D printing microstructure and residual stress is completed by a one-shot computer calculation, as follows:
1. converting the heat input according to the process used;
2. discretizing the data model into computational cells based on the STL digital model;
3. setting cell boundary conditions;
4. selecting a suitable calculation model, such as a Gaussian heat source model, a Cartesian model, etc.;
5. performing the simulation calculation.
The result of this simulation method often depends on the accuracy of the computational model; unfortunately, no standard computational model can fully describe the complete thermal process of 3D printing, so the microstructure prediction is only a rough estimate.
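For illustration, the "Gaussian thermal model" mentioned in step 4 above is commonly taken to be a Gaussian surface heat source. The sketch below is an illustrative assumption — the patent does not specify the model's form, and the power, efficiency and beam-radius values are invented for the example:

```python
import math

def gaussian_heat_flux(r, power_w=200.0, efficiency=0.75, radius_m=50e-6):
    """Surface heat flux (W/m^2) of a Gaussian heat source at distance r (m)
    from the beam axis: q(r) = 2*eta*P/(pi*r0^2) * exp(-2*r^2/r0^2)."""
    q0 = 2.0 * efficiency * power_w / (math.pi * radius_m ** 2)
    return q0 * math.exp(-2.0 * r ** 2 / radius_m ** 2)

# Peak flux at the beam axis (r = 0); flux decays with distance from the axis
peak = gaussian_heat_flux(0.0)
```

A model of this kind only approximates the heat input; as noted above, no single model fully captures the thermal history of the process.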
Disclosure of Invention
The invention aims to provide a novel 3D printing microstructure prediction method, system, medium and terminal device with high microstructure prediction precision.
The invention is realized by the following technical scheme:
A 3D printing microstructure prediction method specifically comprises the following steps:
acquiring process parameters;
constructing a deep learning model;
and substituting the process parameters into a pre-constructed deep learning model to obtain a prediction result.
Further, in order to better implement the present invention, the building of the deep learning model specifically includes:
acquiring a training parameter sample set, wherein the training samples are typical morphologies of the corresponding microstructures under typical process parameters;
extracting features according to the degree of etching, wherein the features comprise one or more of distribution features, texture features and morphology features;
inputting the features into a machine learning model, and training the machine learning model to obtain the deep learning model.
Further, in order to better implement the present invention, substituting the process parameters into the pre-constructed deep learning model to obtain the prediction result specifically comprises:
extracting the features from the process parameters, and vectorizing the extracted features;
carrying out weighted summation of the feature vectors of all samples in the training sample set to obtain a microstructure prediction vector;
clustering the microstructure prediction vectors so that vectors close in distance or similarity are gathered into the same cluster;
and extracting a picture at the cluster center of each microstructure prediction vector cluster, determining the microstructure of that picture, and using it to represent the microstructure of all pictures in the cluster.
A 3D printing microstructure prediction system comprising:
a training data acquisition module: used for acquiring process parameters;
a prediction processing module: used for extracting features from the process parameters and inputting the features into a pre-constructed deep learning model to obtain a prediction result;
an output module: used for outputting the prediction result.
A terminal device comprising a memory for storing executable instructions;
and a processor for implementing the above method when executing the executable instructions stored in the memory.
A computer-readable storage medium having stored thereon program code that, when invoked by a processor, performs the method described above.
Compared with the prior art, the invention has the following advantages and beneficial effects:
The method can accurately predict the 3D printing microstructure.
Drawings
Fig. 1 is a schematic diagram of a cluster center in embodiment 3 of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The present invention will be described in further detail with reference to examples, but the embodiments of the present invention are not limited thereto.
Example 1:
The invention is realized by the following technical scheme. The 3D printing microstructure prediction method specifically comprises the following steps:
acquiring process parameters;
constructing a deep learning model;
and substituting the process parameters into a pre-constructed deep learning model to obtain a prediction result.
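The three steps above can be sketched end-to-end. The sketch below is a deliberately minimal stand-in — a pure-Python nearest-neighbour lookup takes the place of the patent's deep learning model, and the process-parameter values and microstructure labels are invented for illustration:

```python
# Hypothetical training data: (laser power in W, scan speed in mm/s) -> label
TRAINING = [
    ((150.0, 800.0),  "martensite"),
    ((300.0, 400.0),  "widmanstatten"),
    ((200.0, 1200.0), "austenite"),
]

def build_model(samples):
    """'Training' here just stores the samples for nearest-neighbour lookup,
    a simplified stand-in for constructing the deep learning model."""
    return list(samples)

def predict(model, params):
    """Return the label of the stored sample closest to `params`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda s: dist(s[0], params))[1]

model = build_model(TRAINING)            # step 2: construct the model
result = predict(model, (160.0, 820.0))  # step 3: substitute new parameters
```

In the patent's scheme the lookup is replaced by a trained deep learning model, but the acquire-train-predict flow is the same.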
Example 2:
the embodiment is further optimized on the basis of the above embodiment, and further, in order to better implement the present invention, the constructing of the deep learning model specifically includes:
acquiring a training parameter sample set, wherein the training samples are typical morphologies of the corresponding microstructures under typical process parameters;
extracting features according to the degree of etching, wherein the features comprise one or more of distribution features, texture features and morphology features;
inputting the features into a machine learning model, and training the machine learning model to obtain the deep learning model.
It should be noted that, through the above improvement, the training samples are the typical morphologies of the corresponding microstructures under each set of typical process parameters, such as the typical morphologies of Widmanstätten structure, martensite, austenite, etc. The computer can extract features from these images according to the etching gray level; the extracted features may include color distribution features, texture features, morphology features, etc., or any combination thereof.
The extracted features are vectorized to obtain feature vectors; for example, a color distribution feature vector, a texture feature vector and a morphology feature vector can be obtained correspondingly.
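As one concrete (assumed) example of such a vectorized distribution feature, a normalised gray-level histogram of an etched micrograph can serve as a feature vector; the tiny synthetic image below is invented for illustration:

```python
def gray_histogram(image, bins=8):
    """Distribution feature: normalised gray-level histogram of an etched
    micrograph given as a 2-D list of 0-255 pixel values."""
    counts = [0] * bins
    n = 0
    for row in image:
        for px in row:
            counts[min(px * bins // 256, bins - 1)] += 1
            n += 1
    return [c / n for c in counts]  # fractions summing to 1

# Tiny synthetic "micrograph": dark matrix with a bright lath
img = [[30, 30, 200, 30],
       [30, 220, 210, 30],
       [30, 30, 30, 30]]
feature_vec = gray_histogram(img)
```

Texture and morphology features would be extracted analogously (e.g. from gray-level co-occurrence statistics or segmented region shapes) and concatenated or combined into the sample's feature vectors.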
Other parts of this embodiment are the same as those of the above embodiment, and thus are not described again.
Example 3:
The embodiment is further optimized on the basis of the above embodiment. Further, in order to better implement the present invention, substituting the process parameters into the pre-constructed deep learning model to obtain the prediction result specifically comprises:
extracting the features from the process parameters, and vectorizing the extracted features;
carrying out weighted summation of the feature vectors of all samples in the training sample set to obtain a microstructure prediction vector;
clustering the microstructure prediction vectors so that vectors close in distance or similarity are gathered into the same cluster;
and extracting a picture at the cluster center of each microstructure prediction vector cluster, determining the microstructure of that picture, and using it to represent the microstructure of all pictures in the cluster.
It should be noted that, with the above improvements, the following describes how microstructure judgment is realized from a microstructure picture once the vectors are obtained, taking the texture feature vector and the morphology feature vector as examples:
A) a similarity threshold is given, and the texture feature vector a and the morphology feature vector b of each sample in the training sample set are weighted and summed to obtain the microstructure prediction vector c of that sample, for example c = k1·a + k2·b, where k1 and k2 are adjustable coefficients.
B) the microstructure prediction vectors are clustered so that vectors close in distance are gathered together; the input of this step is the set of all microstructure prediction vectors in the sample set, and the output is the clusters of microstructure prediction vectors;
C) a "representative" picture is extracted at the cluster center of each microstructure prediction vector cluster, its microstructure is determined, and that microstructure is used to represent the microstructure of all pictures in the cluster.
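Steps A) and B) can be sketched as follows. This is a simplified illustration: plain k-means stands in for the clustering step (the patent does not name a specific algorithm), and the coefficient values and sample vectors are assumptions:

```python
import random

def combine(a, b, k1=0.6, k2=0.4):
    """Step A: microstructure prediction vector c = k1*a + k2*b
    (k1, k2 are the adjustable coefficients)."""
    return [k1 * x + k2 * y for x, y in zip(a, b)]

def kmeans(points, k, iters=20, seed=0):
    """Step B: plain k-means clustering; returns the cluster centres."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each vector to its nearest centre
            i = min(range(k),
                    key=lambda j: sum((x - c) ** 2 for x, c in zip(p, centres[j])))
            clusters[i].append(p)
        # move each centre to the mean of its cluster (keep it if empty)
        centres = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centres[i]
                   for i, cl in enumerate(clusters)]
    return centres

# Two well-separated groups of prediction vectors converge to two centres
vecs = [[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]]
centres = sorted(kmeans(vecs, k=2))
```

For step C), the picture whose prediction vector lies nearest each converged centre would then be taken as that cluster's representative.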
FIG. 1 shows some of the calculated cluster centers. In some schemes the cluster centers can also be set manually; manual setting prevents a cluster center from being physically meaningless.
The type of the microstructure is determined according to the distance between the feature vector and each cluster center. For example, if the cluster center of martensite is (0, 0), that of austenite is (100, 100) and that of Widmanstätten structure is (100, 0), and the position of a picture M in the vector space is (50, 50), then the distances from M to the three cluster centers are all approximately 70.7, and the system determines from these equal distances that each microstructure accounts for 33%.
For another example, if the position of a vector in the vector space is (99, 1), its distances from the martensite, austenite and Widmanstätten cluster centers are approximately 99.0, 99.0 and 1.4 respectively;
normalizing the distances gives s1 = s2 ≈ 0.4965 and s3 = 1.4/(99.0 + 99.0 + 1.4) ≈ 0.007;
the corresponding mass fractions are:
martensite content w1 = log_0.1(s1)/(log_0.1(s1) + log_0.1(s2) + log_0.1(s3)) ≈ 11%;
austenite content w2 = log_0.1(s2)/(log_0.1(s1) + log_0.1(s2) + log_0.1(s3)) ≈ 11%;
Widmanstätten content w3 = log_0.1(s3)/(log_0.1(s1) + log_0.1(s2) + log_0.1(s3)) ≈ 78%;
From these mass fractions, the typical morphology to which the microstructure belongs can be obtained.
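The worked example above can be reproduced numerically. The sketch below follows the same recipe, with the cluster-centre coordinates taken from the example (the function name and dictionary layout are ours):

```python
import math

CENTRES = {  # cluster centres in a 2-D feature space, as in the example
    "martensite":    (0.0, 0.0),
    "austenite":     (100.0, 100.0),
    "widmanstatten": (100.0, 0.0),
}

def mass_fractions(point, centres=CENTRES, base=0.1):
    """Normalise the distances to each centre, then weight each phase by
    log_0.1(s_i) / sum_j log_0.1(s_j), as in the worked example."""
    dists = {name: math.dist(point, c) for name, c in centres.items()}
    total = sum(dists.values())
    s = {name: d / total for name, d in dists.items()}         # normalised
    logs = {name: math.log(v, base) for name, v in s.items()}  # log base 0.1
    log_total = sum(logs.values())
    return {name: v / log_total for name, v in logs.items()}

w = mass_fractions((99.0, 1.0))
# Widmanstaetten dominates (about 0.78); martensite and austenite about 0.11 each
```

A point equidistant from all three centres, such as (50, 50), yields one third for each phase, matching the first example.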
Other parts of this embodiment are the same as those of the above embodiment, and thus are not described again.
Example 4:
The embodiment is further optimized on the basis of the above embodiment. A 3D printing microstructure prediction system comprises:
a training data acquisition module: used for acquiring process parameters;
a prediction processing module: used for extracting features from the process parameters and inputting the features into a pre-constructed deep learning model to obtain a prediction result;
an output module: used for outputting the prediction result.
Other parts of this embodiment are the same as those of the above embodiment, and thus are not described again.
Example 5:
the present embodiment is further optimized on the basis of the foregoing embodiments, and a terminal device includes a memory for storing executable instructions;
a processor for implementing the methods of Embodiment 1, Embodiment 2 and Embodiment 3 when executing the executable instructions stored in the memory.
It should be noted that, with the above improvement, the memory may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The nonvolatile memory may be a Read Only Memory (ROM). The volatile memory may be a Random Access Memory (RAM).
The memories described in the embodiments of the present application are intended to comprise any suitable type of memory.
The memory can store data to support operation. Examples of such data include any computer program, such as an operating system and application programs. The operating system contains various system programs, such as a framework layer, a core library layer and a driver layer, which implement basic services and process hardware-based tasks. The application programs may be of various kinds.
The execution instructions may be in the form of a program, software module, script, or code, written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
The executable instructions may, but need not, correspond to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts stored in a hypertext markup language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
Executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The processor may be an integrated circuit chip having signal processing capabilities, such as a general purpose processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like, wherein the general purpose processor may be a microprocessor or any conventional processor or the like.
Other parts of this embodiment are the same as those of the above embodiment, and thus are not described again.
Example 6:
the present embodiment is further optimized based on the above embodiments, and a computer-readable storage medium stores a program code, and when the program code is called by a processor, the program code executes the methods of embodiments 1, 2, and 3.
It should be noted that, with the above modification, the storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
The computer-readable storage medium may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium includes a non-volatile computer-readable storage medium. The computer readable storage medium has a storage space for program code for performing any of the method steps of the above-described method. The program code can be read from or written to one or more computer program products. The program code may be compressed, for example, in a suitable form.
Other parts of this embodiment are the same as those of the above embodiment, and thus are not described again.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, and all simple modifications and equivalent variations of the above embodiments according to the technical spirit of the present invention are included in the scope of the present invention.
Claims (6)
1. A 3D printing microstructure prediction method, characterized in that the method specifically comprises the following steps:
acquiring process parameters;
constructing a deep learning model;
and substituting the process parameters into a pre-constructed deep learning model to obtain a prediction result.
2. The 3D printing microstructure prediction method according to claim 1, characterized in that the building of the deep learning model specifically comprises:
acquiring a training parameter sample set, wherein the training samples are typical morphologies of the corresponding microstructures under typical process parameters;
extracting features according to the degree of etching, wherein the features comprise one or more of distribution features, texture features and morphology features;
inputting the features into a machine learning model, and training the machine learning model to obtain the deep learning model.
3. The 3D printing microstructure prediction method according to claim 2, characterized in that substituting the process parameters into the pre-constructed deep learning model to obtain the prediction result specifically comprises:
extracting the features from the process parameters, and vectorizing the extracted features;
carrying out weighted summation of the feature vectors of all samples in the training sample set to obtain a microstructure prediction vector;
clustering the microstructure prediction vectors so that vectors close in distance or similarity are gathered into the same cluster;
and extracting a picture at the cluster center of each microstructure prediction vector cluster, determining its microstructure, and using that microstructure to represent all pictures in the cluster.
4. A 3D printing microstructure prediction system, characterized by comprising:
a training data acquisition module: used for acquiring process parameters;
a prediction processing module: used for extracting features from the process parameters and inputting the features into a pre-constructed deep learning model to obtain a prediction result;
an output module: used for outputting the prediction result.
5. A terminal device, characterized by comprising: a memory for storing executable instructions;
a processor for implementing the method of any one of claims 1 to 3 when executing executable instructions stored in the memory.
6. A computer-readable storage medium, characterized in that: the computer-readable storage medium has stored therein program code that can be invoked by a processor to perform the method of any of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010485985.7A CN111832610A (en) | 2020-06-01 | 2020-06-01 | 3D printing organization prediction method, system, medium and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010485985.7A CN111832610A (en) | 2020-06-01 | 2020-06-01 | 3D printing organization prediction method, system, medium and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111832610A true CN111832610A (en) | 2020-10-27 |
Family
ID=72897529
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010485985.7A Pending CN111832610A (en) | 2020-06-01 | 2020-06-01 | 3D printing organization prediction method, system, medium and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111832610A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022161000A1 (en) * | 2021-01-29 | 2022-08-04 | 苏州奇流信息科技有限公司 | Training method and training apparatus for machine learning model, and evaluation system |
CN115223267A (en) * | 2022-07-18 | 2022-10-21 | 徐州医科大学 | 3D printing component surface roughness prediction method and device based on deep learning |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005114338A1 (en) * | 2004-05-20 | 2005-12-01 | Mcmaster University | Method for controlling the appearance of products and process performance by image analysis |
US20130236085A1 (en) * | 2012-03-12 | 2013-09-12 | Kla-Tencor Corporation | Systems and Methods of Advanced Site-Based Nanotopography for Wafer Surface Metrology |
US20180322234A1 (en) * | 2017-05-08 | 2018-11-08 | Globalfoundries Inc. | Prediction of process-sensitive geometries with machine learning |
CN108897972A (en) * | 2018-07-20 | 2018-11-27 | 辽宁石油化工大学 | A kind of prediction technique of electroslag remelting ingot solidification microstructure |
CN113240096A (en) * | 2021-06-07 | 2021-08-10 | 北京理工大学 | Casting cylinder cover micro-structure prediction method based on rough set and neural network |
- 2020-06-01: Application CN202010485985.7A filed; publication CN111832610A, status Pending
Non-Patent Citations (1)
Title |
---|
金泉林 (Jin Quanlin): "热锻过程宏观变形与微观组织预测的理论与技术" [Theory and technology for predicting macroscopic deformation and microstructure in hot forging], 《2004年中国材料研讨会论文摘要集》 [Abstracts of the 2004 China Materials Symposium] * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20201027 |