CN114820430B - Multi-light source collaborative exposure 3D printing nondestructive testing method - Google Patents

Multi-light source collaborative exposure 3D printing nondestructive testing method

Info

Publication number
CN114820430B
Authority
CN
China
Prior art keywords
layer
model
program
projection
database
Prior art date
Legal status
Active
Application number
CN202210150708.XA
Other languages
Chinese (zh)
Other versions
CN114820430A (en)
Inventor
刘顺涛
荣鹏
易元
高川云
Current Assignee
Chengdu Aircraft Industrial Group Co Ltd
Original Assignee
Chengdu Aircraft Industrial Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Aircraft Industrial Group Co Ltd
Priority to CN202210150708.XA
Publication of CN114820430A
Application granted
Publication of CN114820430B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C 64/00 Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C 64/30 Auxiliary operations or equipment
    • B29C 64/379 Handling of additively manufactured objects, e.g. using robots
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 40/00 Auxiliary operations or equipment, e.g. for material handling
    • B33Y 40/20 Post-treatment, e.g. curing, coating or polishing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Abstract

The invention discloses a multi-light-source collaborative exposure 3D printing nondestructive testing method. The method judges, from the projection features of a part, whether a similar-shaped part exists in a database; if so, the scanning program of the similar-shaped part in the database is called for shooting. If not, a feature-vector characterization of the part projection is formed, compared with the feature-vector characterizations of the parts stored in the database, and the scanning program of the closest part in the database is called as a pre-generated program. A scan is then simulated on the basis of the pre-generated program, and the program is adjusted and corrected until no area of the part is simultaneously exposed more than once or left uncovered by irradiation, finally yielding the scanning program of the part. The invention can rapidly determine a nondestructive scanning program for a part through the database and ensure that no area of the part is simultaneously double-exposed or left unirradiated, thereby guaranteeing nondestructive testing quality; the method has good practicability.

Description

Multi-light source collaborative exposure 3D printing nondestructive testing method
Technical Field
The invention belongs to the technical field of 3D printing nondestructive testing, and particularly relates to a multi-light-source collaborative exposure 3D printing nondestructive testing method.
Background
Owing to the forming modes and methods of 3D printed parts, a finished part usually has to undergo X-ray inspection to check whether the interior contains defects such as air holes, cracks and grooves. Industrial X-ray inspection equipment currently in use often exposes multiple X-ray sources simultaneously. This has the advantage of speed, but the disadvantage that some areas, exposed by several overlapping X-ray beams, become overexposed and unclear. An alternative approach is multi-view X-ray scanning with a single rotating source, in which the X-ray source is fixed on a rotating frame and moved along an arc. This measurement mode prolongs the scanning time, and the motion artifacts caused by the mechanical motion, together with the time delay of the thermionic emission mechanism, reduce the spatial resolution of the scanned image; motion artifacts readily arise during shooting and degrade image quality.
Disclosure of Invention
The invention aims to provide a multi-light-source collaborative exposure 3D printing nondestructive testing method that can quickly determine a nondestructive scanning program for a part and ensure that no area of the part is simultaneously double-exposed or left uncovered by irradiation, thereby guaranteeing nondestructive testing quality; the method has good practicability.
The invention is realized mainly by the following technical scheme:
a multi-light source collaborative exposure 3D printing nondestructive testing method comprises the following steps:
step S100: carrying out feature extraction on the projection of the surface of the part to be irradiated to obtain part projection features, the part projection features comprising the part projection area and part projection shape features;
step S200: judging, from the part projection features, whether a similar-shaped part exists in the database; if so, calling the scanning program of the similar-shaped part in the database for shooting; if not, executing step S300;
step S300: extracting features of the part to form a feature-vector characterization of the part projection, comparing this characterization with the feature-vector characterizations of the parts stored in the database, and calling the scanning program of the closest part in the database as a pre-generated program;
step S400: simulating a scan based on the pre-generated program and detecting whether any area of the part is simultaneously exposed more than once or left uncovered by irradiation; if not, executing the program directly; if so, executing step S500;
step S500: correcting the pre-generated program, either by manual adjustment or by automatically executing an exhaustive search, until no area of the part is simultaneously exposed more than once or left uncovered by irradiation, finally obtaining the scanning program of the part (the overall flow is sketched below).
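For illustration only, the S100-S500 flow can be sketched in Python as follows; this is a minimal sketch under assumed names (the StoredPart record, the 0.95 similarity cut-off, and the caller-supplied simulation and correction callables are hypothetical), not the patented implementation.

```python
"""Minimal sketch of the S100-S500 program-selection flow (assumed names)."""
from dataclasses import dataclass
import numpy as np

SIMILAR = 0.95  # assumed cut-off for deciding a "similar-shaped part" exists

@dataclass
class StoredPart:                  # hypothetical database record
    vector: np.ndarray             # feature-vector characterization of the projection
    scan_program: list             # e.g. exposure rounds: [[11, 22, 13], [12, 21, 23, 32]]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_scan_program(part_vector, database, needs_correction, correct):
    # S200: a similar-shaped part already in the database -> reuse its program.
    best = max(database, key=lambda p: cosine(part_vector, p.vector))
    if cosine(part_vector, best.vector) >= SIMILAR:
        return best.scan_program
    # S300: otherwise the closest stored program becomes the pre-generated program.
    program = best.scan_program
    # S400/S500: simulate, then correct until no area of the part is
    # simultaneously double-exposed or left uncovered by irradiation.
    while needs_correction(program):
        program = correct(program)
    return program
```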
In order to better implement the present invention, further, step S300 is implemented by a G-AFM model. Firstly, the feature vector is mapped into a low-dimensional vector by the embedding layer of the G-AFM model; the first-order features are then crossed pairwise by a feature-crossing layer to obtain all second-order feature terms. All second-order cross terms are input into the Attention Net for training, and each second-order cross term is given a weight coefficient (Attention score); a gate operation is then applied to all the second-order cross-term weight coefficients to decide which cross terms are retained and which are discarded.
To better implement the invention, further, the training of the G-AFM model comprises the following steps:
step A1: a trained picture analysis model is adopted to assist in training the G-AFM model, the G-AFM model comprising a front-end convolution layer, a back-end convolution layer, a feature-crossing layer and a fully connected layer arranged in sequence from front to back; the picture analysis model and the G-AFM model share the embedding layer and the front-end convolution layer;
step A2: during training, the parameters of the front-end convolution layer only propagate forward and its internal model parameters are not updated; the picture analysis model likewise only propagates forward to obtain a comprehensive vector, and its corresponding model parameters are not updated;
step A3: the G-AFM model finally outputs an image vector; if the distance in vector space between the image vector and the comprehensive vector is less than or equal to a preset distance delta, training is stopped;
step A4: the back-propagation loss function is the distance D between the two vectors in the Lagrangian space; a loss value LOSS is constructed from D, and after LOSS is obtained a back-propagation algorithm is adopted to determine the update amplitude of each weight of the G-AFM model.
In order to better implement the present invention, in step A4 the back-propagation algorithm determines how the loss function changes with respect to each weight: the gradient back-propagation algorithm propagates the value of the loss function backwards layer by layer, from the output layer to the hidden layers and the input layer, and determines in turn the correction values of the model parameters of each layer. The correction values of each layer comprise a number of matrix elements in one-to-one correspondence with the model parameters, each matrix element reflecting the correction direction and correction amount of its parameter.
In order to better realize the invention, further, in a scanning program, radiation sources whose irradiation coverage areas on the detector do not overlap are exposed simultaneously, while radiation sources whose coverage areas overlap are exposed separately. After the images irradiated by the radiation sources are acquired, the image data are stitched: at least one of the scanned images is selected as a reference scanned image; the acquired images are arranged according to the positional relation between each radiation source and its imaging area; the non-overlapping portions of the scanned images are extracted; the non-reference scanned images of the overlapping portions are deleted; and the portions are finally stitched together. According to the positional relation between each radiation source and its imaging area, distortion adjustment, color adjustment and/or gray-level adjustment are performed, and the scanned image of the photographed object is then determined by synthesis or by a three-dimensional image reconstruction method.
In order to better realize the invention, further, the three-dimensional image reconstruction methods comprise the algebraic method, the iterative method, the Fourier transform method and the convolution back-projection method.
The invention has the beneficial effects that:
(1) The invention can rapidly determine a nondestructive scanning program for a part through the database and ensure that no area of the part is simultaneously double-exposed or left uncovered by irradiation, thereby guaranteeing nondestructive testing quality; the method has good practicability;
(2) The invention constructs a G-AFM model on the basis of the existing AFM model: a "gate" mechanism is introduced to control the feature-selection process, retaining or discarding second-order cross feature terms. The specific operation of the "gate" mechanism is as follows: when the coefficient of a second-order cross term is greater than a preset threshold, the gate state is "on" and the term is retained; when the coefficient is smaller than the preset threshold, the gate state is "off" and the term is discarded. Selecting the second-order cross features through the gate operation improves matching accuracy, so the method has good practicability.
Drawings
FIG. 1 is a schematic block diagram of a G-AFM model;
FIG. 2 is a schematic diagram of the training principle of the G-AFM model.
Detailed Description
Example 1:
a multi-light source collaborative exposure 3D printing nondestructive testing method comprises the following steps:
step S100: carrying out feature extraction on the projection of the surface of the part to be irradiated to obtain part projection features, the part projection features comprising the part projection area and part projection shape features;
step S200: judging, from the part projection features, whether a similar-shaped part exists in the database; if so, calling the scanning program of the similar-shaped part in the database for shooting; if not, executing step S300;
step S300: extracting features of the part to form a feature-vector characterization of the part projection, comparing this characterization with the feature-vector characterizations of the parts stored in the database, and calling the scanning program of the closest part in the database as a pre-generated program;
step S400: simulating a scan based on the pre-generated program and detecting whether any area of the part is simultaneously exposed more than once or left uncovered by irradiation; if not, executing the program directly; if so, executing step S500;
step S500: correcting the pre-generated program, either by manual adjustment or by automatically executing an exhaustive search, until no area of the part is simultaneously exposed more than once or left uncovered by irradiation, finally obtaining the scanning program of the part, i.e. the radiation-source combination mode.
The invention can rapidly determine the nondestructive scanning program of a part through the database and ensure that no area of the part is simultaneously double-exposed or left uncovered by irradiation, thereby guaranteeing nondestructive testing quality; the method has good practicability.
Example 2:
This embodiment is optimized on the basis of embodiment 1: step S300 is implemented by a G-AFM model. Firstly, the feature vector is mapped into a low-dimensional vector by the embedding layer of the G-AFM model; the first-order features are then crossed pairwise by a feature-crossing layer to obtain all second-order feature terms. All second-order cross terms are input into the Attention Net for training, and each second-order cross term is given a weight coefficient (Attention score); a gate operation is then applied to all the second-order cross-term weight coefficients to decide which cross terms are retained and which are discarded.
Further, the training of the G-AFM model comprises the following steps:
step A1: a trained picture analysis model is adopted to assist in training the G-AFM model, the G-AFM model comprising a front-end convolution layer, a back-end convolution layer, a feature-crossing layer and a fully connected layer arranged in sequence from front to back; the picture analysis model and the G-AFM model share the embedding layer and the front-end convolution layer;
step A2: during training, the parameters of the front-end convolution layer only propagate forward and its internal model parameters are not updated; the picture analysis model likewise only propagates forward to obtain a comprehensive vector, and its corresponding model parameters are not updated;
step A3: the G-AFM model finally outputs an image vector; if the distance in vector space between the image vector and the comprehensive vector is less than or equal to a preset distance delta, training is stopped;
step A4: the back-propagation loss function is the distance D between the two vectors in the Lagrangian space; a loss value LOSS is constructed from D, and after LOSS is obtained a back-propagation algorithm is adopted to determine the update amplitude of each weight of the G-AFM model.
Further, in step A4 the back-propagation algorithm determines how the loss function changes with respect to each weight: the gradient back-propagation algorithm propagates the value of the loss function backwards layer by layer, from the output layer to the hidden layers and the input layer, and determines in turn the correction values of the model parameters of each layer; the correction values of each layer comprise a number of matrix elements in one-to-one correspondence with the model parameters, each matrix element reflecting the correction direction and correction amount of its parameter.
Other portions of this embodiment are the same as those of embodiment 1, and thus will not be described in detail.
Example 3:
This embodiment is optimized on the basis of embodiment 1 or 2. In the scanning program, radiation sources whose irradiation coverage areas on the detector do not overlap are exposed simultaneously, while radiation sources whose coverage areas overlap are exposed separately. After the images irradiated by the radiation sources are acquired, the image data are stitched: at least one of the scanned images is selected as a reference scanned image; the acquired images are arranged according to the positional relation between each radiation source and its imaging area; the non-overlapping portions of the scanned images are extracted; the non-reference scanned images of the overlapping portions are deleted; and the portions are finally stitched together. According to the positional relation between each radiation source and its imaging area, distortion adjustment, color adjustment and/or gray-level adjustment are performed, and the scanned image of the photographed object is then determined by synthesis or by a three-dimensional image reconstruction method.
Further, the three-dimensional image reconstruction methods comprise the algebraic method, the iterative method, the Fourier transform method and the convolution back-projection method.
Other portions of this embodiment are the same as those of embodiment 1 or 2 described above, and thus will not be described again.
Example 4:
A multi-light-source collaborative exposure 3D printing nondestructive testing method uses a detector and a plurality of radiation sources arranged on one side of the detector; the irradiation ranges of the sources may partially overlap, and each source can cover a different area on the detector. For example, with sources 1-4: since the coverage of source 1 does not overlap that of source 4, sources 1 and 4 may be exposed simultaneously; likewise, since the coverage of source 2 does not overlap that of source 3, sources 2 and 3 may be exposed simultaneously. This simultaneous exposure mode is called collaborative exposure. Collaborative exposure greatly improves exposure efficiency while guaranteeing image quality.
Further, after the images irradiated by the radiation sources are acquired, the industrial computer can stitch the image data obtained from the multiple exposures. According to the positional relation between each radiation source and its imaging area, distortion adjustment, color adjustment and/or gray-level adjustment are performed, and the scanned image of the photographed object is then determined by methods such as synthesis or three-dimensional image reconstruction. Three-dimensional image reconstruction methods may include, but are not limited to, the algebraic method, the iterative method, the Fourier transform method and the convolution back-projection method.
Specifically, during stitching, the industrial computer may select at least one of the scanned images as a reference scanned image; arrange the acquired images according to the positional relation between the area-array radiation sources and their imaging areas; extract the non-overlapping portions of the scanned images; delete the non-reference scanned images of the overlapping portions; and stitch the non-overlapping and overlapping portions together, as sketched below.
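A minimal numpy sketch of this stitching rule, assuming each exposure has already been registered onto a common detector grid and is accompanied by a boolean coverage mask (both assumptions for illustration):

```python
"""Sketch of the stitching rule: the reference image wins overlap regions,
every other pixel comes from whichever single exposure covers it."""
import numpy as np

def stitch(images, masks, reference=0):
    """images: list of HxW arrays registered to the detector grid;
    masks: matching boolean coverage masks; reference: index of the
    reference scanned image whose pixels are kept where exposures overlap."""
    out = np.zeros_like(images[0], dtype=float)
    covered = np.zeros(images[0].shape, dtype=bool)
    order = [reference] + [i for i in range(len(images)) if i != reference]
    for i in order:                    # reference first, so it wins overlaps
        take = masks[i] & ~covered     # the non-reference overlap part is dropped
        out[take] = images[i][take]
        covered |= masks[i]
    return out
```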
Further, the invention comprises the following steps:
1. Determining the features of the part to be irradiated.
A feature-extraction method is applied to the projection of the surface of the part to be irradiated; for example, a graph-convolution method can extract the projection edges from the projection gray levels of the machined part and finally obtain the part projection features. The part projection features include the part projection area and the part projection shape features. Features may be expressed as a matrix; for example, a three-dimensional matrix may represent the shape of the part projection, e.g. (0, 1) indicating the presence of a sharp corner in the part.
A digit can also be appended to this matrix to reflect the area of the corresponding shape; for example, if the sharp-corner area is 15, then (0, 1, 15) can represent the shape feature.
Specifically, the part area can be normalized or data-binned: for example, an area within (0, 30] maps to 0.2, (30, 60] to 0.5, (60, 80] to 0.7, and an area greater than 120 maps to 1, as in the sketch below.
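For illustration, a short Python sketch of this shape-flag-plus-binned-area encoding; the bin value for the unspecified (80, 120] range and the helper names are assumptions:

```python
"""Sketch of the shape-flag-plus-binned-area encoding using the bins quoted above."""
import numpy as np

def bin_area(area):
    # Quoted bins: (0, 30] -> 0.2, (30, 60] -> 0.5, (60, 80] -> 0.7, > 120 -> 1.
    if area <= 30:
        return 0.2
    if area <= 60:
        return 0.5
    if area <= 80:
        return 0.7
    if area > 120:
        return 1.0
    return 0.85  # assumption: the source leaves (80, 120] unspecified

def encode(shape_flags, area):
    # e.g. shape_flags = (0, 1) marking a sharp corner, binned area appended last
    return np.array(list(shape_flags) + [bin_area(area)], dtype=float)

print(encode((0, 1), 15))  # -> [0.  1.  0.2]
```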
2. Judging, from the part features, whether a similar-shaped part has already been shot. The similarity criterion may be the vector distance between the two matrices, or a vector-similarity measure may be introduced. The database already stores the scanning programs of a number of parts (each program specifies which X-ray sources are invoked and the shooting order of those sources).
Further, for a new part, or for a new positioning of a previously shot part, the following steps are performed:
s1, extracting features, namely extracting features of parts, wherein a conventional feature extraction mode is adopted, and the feature extraction is performed in a mode of feature barrel division and the like as shown above.
S2, running the simulation software once based on a pre-generated program, and looking at whether the part area is repeatedly exposed or not at the same time, if not, directly executing the program; if so, step S3 is performed.
S3, correcting the place where the manual intervention adjustment program is wrong to obtain the radiation source combination mode. The radiation source combination mode can also be realized by an automatic program executing exhaustion method.
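The simulated check in S2 might look like the following Python sketch, where the per-source boolean coverage masks on the detector and the part mask are assumed inputs:

```python
"""Sketch of the S2 simulation check (coverage masks are assumed inputs)."""
import numpy as np

def check_program(rounds, coverage, part_mask):
    """rounds: list of source groups fired together, e.g. [[11, 22, 13], [12, 21, 23, 32]];
    coverage: dict mapping source id -> HxW boolean mask on the detector;
    part_mask: HxW boolean mask of where the part projects on the detector."""
    hits = np.zeros(part_mask.shape, dtype=int)
    for group in rounds:
        per_round = sum(coverage[s].astype(int) for s in group)
        # a part pixel hit by two sources in the same round is double-exposed
        if np.any((per_round > 1) & part_mask):
            return False
        hits += per_round
    # every part pixel must be irradiated by at least one round
    return bool(np.all(hits[part_mask] >= 1))
```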
The number of radiation sources required to irradiate the part is then determined. For example, a range can first be determined from the projection area of the part, e.g. 6-8 sources, and then an exact number, e.g. 7 sources.
The best radiation-source combination mode is obtained by an exhaustive search or another preferred search method. For example, suppose 7 sources numbered 11, 12, 13, 21, 22, 23 and 32 are used and some of their irradiation ranges interfere with one another; several non-interfering sources can then be exposed simultaneously in the same exposure, for instance sources 11, 22 and 13 do not interfere, and sources 12, 21, 23 and 32 likewise do not interfere. Since multiple combinations of non-interfering sources may exist, the shooting combination with the shortest shooting time is selected so as to minimize shooting time, and the part is shot according to this preferred setting.
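A small Python sketch of such an exhaustive search over exposure groupings; the interference pairs below are invented for illustration and merely chosen to be consistent with the 7-source example above:

```python
"""Sketch of an exhaustive search for the fewest exposure rounds (shortest time)."""
from itertools import product

sources = [11, 12, 13, 21, 22, 23, 32]
# assumed pairs whose irradiation coverage areas overlap on the detector
interferes = {(11, 12), (12, 13), (21, 22), (22, 23), (13, 32)}

def clash(a, b):
    return (a, b) in interferes or (b, a) in interferes

def fewest_rounds(srcs):
    for k in range(1, len(srcs) + 1):          # fewer rounds = shorter shooting time
        for assign in product(range(k), repeat=len(srcs)):
            groups = [[s for s, r in zip(srcs, assign) if r == g] for g in range(k)]
            if all(not clash(a, b)
                   for grp in groups
                   for i, a in enumerate(grp)
                   for b in grp[i + 1:]):
                return groups                  # first valid partition with k rounds
    return [[s] for s in srcs]

# -> [[11, 13, 21, 23], [12, 22, 32]] with the assumed interference pairs
print(fewest_rounds(sources))
```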
Further, in step S1 the following alternative technical scheme may be used:
To better reflect the shape of the part, a G-AFM model can be adopted to form second-order vector representations of the model features; a feature-vector characterization of the whole part projection is formed from all the second-order vectors together with a cascade (splicing, inner product or other forms) of the first-order vectors. This characterization is compared with the parts stored in the database, and the program of the closest part is called as the pre-generated program of the part.
It will be appreciated that a second-order vector is a more comprehensive representation that can be obtained by combining multiple first-order vectors. For example, for the feature matrix of the part outline and the gray matrix (the gray matrix reflects part accuracy), a weighted summation can be used to derive the combined second-order vector representation.
Further, as shown in Fig. 1, the AFM model assigns a weight to each second-order feature through the Attention Net; the Attention score represents the degree of contribution of that feature to the predicted result. On this basis, the AFM model can order the second-order cross feature terms, thereby realizing feature ranking. However, since the second-order features in the AFM model are all pairwise crossings of the first-order features, some second-order cross terms interfere with the weight coefficients of the other terms during training. To address this feature-selection problem of the AFM model, the invention improves it into the G-AFM model. The G-AFM model introduces a "gate" mechanism on the basis of the AFM model to control the feature-selection process, retaining or discarding second-order cross feature terms. The specific operation of the "gate" mechanism is as follows: when the coefficient of a second-order cross term is greater than a preset threshold, the gate state is "on" and the term is retained; when the coefficient is smaller than the preset threshold, the gate state is "off" and the term is discarded. The second-order cross features are thus selected by the gate operation. Firstly, the input candidate feature set passes through the embedding layer of the G-AFM model, which maps the feature vectors into low-dimensional vectors; the first-order features are then crossed pairwise by the feature-crossing layer to obtain all second-order feature terms; all second-order cross terms are input into the Attention Net for training and are each given a weight coefficient (Attention score); finally, the gate operation is applied to all the second-order cross-term weight coefficients to decide which cross terms are retained and which are discarded.
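By way of a sketch, the gate operation could be written as follows in Python; the element-wise pairwise crossing and the 0.1 threshold are illustrative assumptions, and the attention scores stand in for the output of the trained Attention Net:

```python
"""Sketch of the G-AFM gate over attention-weighted second-order cross terms."""
import numpy as np

GATE_THRESHOLD = 0.1  # assumed preset threshold

def gated_second_order(embeddings, attention_scores):
    """embeddings: (n, d) low-dimensional first-order feature embeddings;
    attention_scores: (n, n) weight coefficients from the Attention Net."""
    n, d = embeddings.shape
    kept = []
    for i in range(n):
        for j in range(i + 1, n):                  # pairwise feature crossing
            cross = embeddings[i] * embeddings[j]  # element-wise second-order term
            if attention_scores[i, j] > GATE_THRESHOLD:
                kept.append(attention_scores[i, j] * cross)  # gate "on": retain
            # gate "off": the cross term is discarded
    return np.sum(kept, axis=0) if kept else np.zeros(d)
```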
Further, as shown in Fig. 2, the G-AFM model is trained with the following architecture:
A trained picture analysis model assists in training the G-AFM model; the G-AFM model and the picture analysis model share the embedding layer and part of the convolution layers (the front-end convolution layer).
During training, the parameters of the front-end convolution layer only propagate forward, and its internal model parameters are not updated.
The picture analysis model likewise only propagates forward, yielding the comprehensive vector (label value), and its corresponding model parameters are not updated.
The final output of the G-AFM model is an image vector. If the distance in vector space between this vector and the comprehensive vector is smaller than a preset distance delta, training can stop; otherwise training continues until the distance between the comprehensive vector and the G-AFM output image vector satisfies delta.
The back-propagation loss function is the distance D of the two vectors in the Lagrangian space, and the loss value LOSS is constructed from D.
After LOSS is obtained, a back-propagation algorithm can be used to determine the update amplitude of each weight of the G-AFM model. That is, the back-propagation algorithm determines the change of the loss function with respect to each weight (also referred to as the gradient or error derivative). Furthermore, the gradient back-propagation algorithm propagates the value of the loss function backwards layer by layer, from the output layer to the hidden layers and the input layer, and determines in turn the correction values (gradients) of the model parameters of each layer. The correction values (gradients) of each layer comprise a number of matrix elements (gradient elements) in one-to-one correspondence with the model parameters, each gradient element reflecting the correction direction (increase or decrease) and correction amount of its parameter.
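A compact sketch of this training scheme, written against PyTorch as an assumed framework (the module names gafm_head, frontend and analysis_model are hypothetical stand-ins, and the delta and learning-rate values are illustrative):

```python
"""Sketch of the Fig. 2 training scheme: frozen shared front end, frozen
picture analysis model, distance loss, early stop once within delta."""
import torch

def train_gafm(gafm_head, frontend, analysis_model, loader, delta=0.05, lr=1e-3):
    for p in frontend.parameters():            # shared front end: forward only
        p.requires_grad_(False)
    for p in analysis_model.parameters():      # analysis model: forward only
        p.requires_grad_(False)
    opt = torch.optim.Adam(gafm_head.parameters(), lr=lr)
    for images in loader:
        feats = frontend(images)
        with torch.no_grad():
            target = analysis_model(feats)     # "comprehensive vector"
        out = gafm_head(feats)                 # image vector from the G-AFM
        loss = torch.norm(out - target, dim=1).mean()   # distance D -> LOSS
        if loss.item() <= delta:               # stop once within preset delta
            return
        opt.zero_grad()
        loss.backward()                        # layer-by-layer back-propagation
        opt.step()
```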
The invention can rapidly determine the nondestructive scanning program of a part through the database and ensure that no area of the part is simultaneously double-exposed or left uncovered by irradiation, thereby guaranteeing nondestructive testing quality; the method has good practicability.
The foregoing description is only a preferred embodiment of the present invention and is not intended to limit the present invention in any way; any simple modification, equivalent variation or the like made to the above embodiment according to the technical essence of the present invention falls within the protection scope of the present invention.

Claims (5)

1. A multi-light-source collaborative exposure 3D printing nondestructive testing method, characterized by comprising the following steps:
step S100: carrying out feature extraction on the projection of the surface of the part to be irradiated to obtain part projection features, the part projection features comprising the part projection area and part projection shape features;
step S200: judging, from the part projection features, whether a similar-shaped part exists in the database; if so, calling the scanning program of the similar-shaped part in the database for shooting; if not, executing step S300;
step S300: extracting features of the part to form a feature-vector characterization of the part projection, comparing this characterization with the feature-vector characterizations of the parts stored in the database, and calling the scanning program of the closest part in the database as a pre-generated program;
step S400: simulating a scan based on the pre-generated program and detecting whether any area of the part is simultaneously exposed more than once or left uncovered by irradiation; if not, executing the program directly; if so, executing step S500;
step S500: correcting the pre-generated program, either by manual adjustment or by automatically executing an exhaustive search, until no area of the part is simultaneously exposed more than once or left uncovered by irradiation, finally obtaining the scanning program of the part;
wherein, in the scanning program, radiation sources whose irradiation coverage areas on the detector do not overlap are exposed simultaneously, while radiation sources whose coverage areas overlap are exposed separately; after the images irradiated by the radiation sources are acquired, the image data are stitched: at least one of the scanned images is selected as a reference scanned image, the acquired images are arranged according to the positional relation between each radiation source and its imaging area, the non-overlapping portions of the scanned images are extracted, the non-reference scanned images of the overlapping portions are deleted, and the portions are finally stitched together; according to the positional relation between each radiation source and its imaging area, distortion adjustment, color adjustment and/or gray-level adjustment are performed, and the scanned image of the photographed object is then determined by synthesis or by a three-dimensional image reconstruction method.
2. The multi-light-source collaborative exposure 3D printing nondestructive testing method according to claim 1, characterized in that step S300 is implemented by a G-AFM model: firstly, the feature vector is mapped into a low-dimensional vector by the embedding layer of the G-AFM model; the first-order features are then crossed pairwise by a feature-crossing layer to obtain all second-order feature terms; all second-order cross terms are input into the Attention Net for training and are each given a weight coefficient (Attention score); a gate operation is then applied to all the second-order cross-term weight coefficients to decide which cross terms are retained and which are discarded.
3. The multi-light-source collaborative exposure 3D printing nondestructive testing method according to claim 2, characterized in that the training of the G-AFM model comprises the following steps:
step A1: a trained picture analysis model is adopted to assist in training the G-AFM model, the G-AFM model comprising a front-end convolution layer, a back-end convolution layer, a feature-crossing layer and a fully connected layer arranged in sequence from front to back; the picture analysis model and the G-AFM model share the embedding layer and the front-end convolution layer;
step A2: during training, the parameters of the front-end convolution layer only propagate forward and its internal model parameters are not updated; the picture analysis model likewise only propagates forward to obtain a comprehensive vector, and its corresponding model parameters are not updated;
step A3: the G-AFM model finally outputs an image vector; if the distance in vector space between the image vector and the comprehensive vector is less than or equal to a preset distance delta, training is stopped;
step A4: the back-propagation loss function is the distance D between the two vectors in the Lagrangian space; a loss value LOSS is constructed from D, and after LOSS is obtained a back-propagation algorithm is adopted to determine the update amplitude of each weight of the G-AFM model.
4. The multi-light-source collaborative exposure 3D printing nondestructive testing method according to claim 3, characterized in that, in step A4, the back-propagation algorithm determines how the loss function changes with respect to each weight; the gradient back-propagation algorithm propagates the value of the loss function backwards layer by layer, from the output layer to the hidden layers and the input layer, and determines in turn the correction values of the model parameters of each layer; the correction values of each layer comprise a number of matrix elements in one-to-one correspondence with the model parameters, each matrix element reflecting the correction direction and correction amount of its parameter.
5. The multi-light-source collaborative exposure 3D printing nondestructive testing method according to claim 1, characterized in that the three-dimensional image reconstruction methods comprise the algebraic method, the iterative method, the Fourier transform method and the convolution back-projection method.
CN202210150708.XA 2022-02-18 2022-02-18 Multi-light source collaborative exposure 3D printing nondestructive testing method Active CN114820430B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210150708.XA CN114820430B (en) 2022-02-18 2022-02-18 Multi-light source collaborative exposure 3D printing nondestructive testing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210150708.XA CN114820430B (en) 2022-02-18 2022-02-18 Multi-light source collaborative exposure 3D printing nondestructive testing method

Publications (2)

Publication Number Publication Date
CN114820430A (en) 2022-07-29
CN114820430B (en) 2023-10-03

Family

ID=82527893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210150708.XA Active CN114820430B (en) 2022-02-18 2022-02-18 Multi-light source collaborative exposure 3D printing nondestructive testing method

Country Status (1)

Country Link
CN (1) CN114820430B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116373293B (en) * 2023-06-06 2023-09-29 成都飞机工业(集团)有限责任公司 Wire feeding device and method for FDM numerical control 3D printing equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8330673B2 (en) * 2009-04-02 2012-12-11 GM Global Technology Operations LLC Scan loop optimization of vector projection display
WO2017097763A1 (en) * 2015-12-08 2017-06-15 U-Nica Technology Ag Three-dimensional printing method for producing a product protected against forgery by means of a security feature

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000133563A (en) * 1998-10-22 2000-05-12 Nikon Corp Exposure method and aligner
JP2003347196A (en) * 2002-05-27 2003-12-05 Nikon System:Kk Device management method, exposure method, lithography system, and program
CN102207690A (en) * 2011-05-20 2011-10-05 合肥芯硕半导体有限公司 Multi-SLM (Spatial Light Modulator) exposure and data processing method
CN108883575A (en) * 2016-02-18 2018-11-23 维洛3D公司 Accurate 3 D-printing
CN110070521A (en) * 2019-03-19 2019-07-30 广东工业大学 A kind of 3D printing model flaw anticipation system and method for view-based access control model nerve study
CN109991251A (en) * 2019-04-08 2019-07-09 中国工程物理研究院应用电子学研究所 A kind of industrial CT scanning method based on multilayer fan-delta sandbody
CN111299581A (en) * 2020-03-30 2020-06-19 成都飞机工业(集团)有限责任公司 Method for improving success rate of 3D printing of thin-wall metal component
CN111735735A (en) * 2020-06-11 2020-10-02 山东滨州烟草有限公司 Intelligent cigarette package nondestructive identification method and identifier
CN111923411A (en) * 2020-09-01 2020-11-13 卢振武 Dynamic imaging 3D printing system and printing method thereof
CN112595262A (en) * 2020-12-08 2021-04-02 广东省科学院智能制造研究所 Binocular structured light-based high-light-reflection surface workpiece depth image acquisition method
CN113246466A (en) * 2021-01-14 2021-08-13 西安交通大学 Single-light-source multi-irradiation large-size surface exposure additive manufacturing equipment, system and method
CN113334767A (en) * 2021-06-16 2021-09-03 上海联泰科技股份有限公司 3D printing method, device, data processing method, system and storage medium
CN113889229A (en) * 2021-09-29 2022-01-04 浙江德尚韵兴医疗科技有限公司 Construction method of medical image diagnosis standard based on human-computer combination

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Advances in Optics and Exposure Devices Employed in Excimer Laser/EUV Lithography; Akiyoshi Suzuki; SpringerLink; full text *
Application of 3D laser technology to specific building-construction inspection; Yang Xuejiao et al.; Science of Surveying and Mapping; full text *
Research on key technologies of surface-exposure 3D printing integrating intelligent information processing; Zhao Lidong; China Master's Theses Full-text Database, Information Science and Technology (No. 05); full text *

Also Published As

Publication number Publication date
CN114820430A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
Quéau et al. Led-based photometric stereo: Modeling, calibration and numerical solution
US8908910B2 (en) Provision of image data
US6563942B2 (en) Method for adjusting positions of radiation images
KR101892321B1 (en) A method and apparatus for providing image data for constructing an image of a region of a target object
JP6519265B2 (en) Image processing method
CN114820430B (en) Multi-light source collaborative exposure 3D printing nondestructive testing method
Douxchamps et al. High-accuracy and robust localization of large control markers for geometric camera calibration
CN102576410B (en) Evaluation of image processing algorithms
US8942345B2 (en) Method for obtaining a 3D image dataset of an object of interest
WO2015022999A1 (en) Image processing apparatus, image processing system, image processing method, and computer program
US20100254591A1 (en) Verification method for repairs on photolithography masks
CN110599578A (en) Realistic three-dimensional color texture reconstruction method
KR20220073766A (en) Multi-imaging mode image alignment
CN113935948A (en) Grating image target positioning optimization and wavelength characteristic analysis method and device
CN113989353A (en) Pig backfat thickness measuring method and system
JP2000113198A (en) Method for automatically inspecting print quality using elastic model
WO2020165976A1 (en) Simulation device, simulation method, and simulation program
US7574051B2 (en) Comparison of patterns
US9237873B2 (en) Methods and systems for CT projection domain extrapolation
US20030185339A1 (en) Tomography of curved surfaces
CN115164776B (en) Three-dimensional measurement method and device for fusion of structured light decoding and deep learning
US9886749B2 (en) Apparatus and method for parameterizing a plant
US20220130081A1 (en) Computer-implemented method for determining at least one geometric parameter required for evaluating measurement data
JP7178621B2 (en) Image processing device and image processing method
Moorthi et al. Co-registration of LISS-4 multispectral band data using mutual information-based stochastic gradient descent optimization

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant