CN116175977A - Method and device for detecting printing quality of 3D printer and 3D printer


Info

Publication number
CN116175977A
Authority
CN
China
Prior art keywords: images, image, model, pixel, partial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211124617.5A
Other languages
Chinese (zh)
Inventor
唐克坦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Outline Technology Co., Ltd.
Original Assignee
Shanghai Outline Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Outline Technology Co., Ltd.
Priority to CN202211124617.5A
Publication of CN116175977A
Priority to PCT/CN2023/107561 (published as WO2024055742A1)
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B29 - WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C - SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00 - Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30 - Auxiliary operations or equipment
    • B29C64/386 - Data acquisition or data processing for additive manufacturing
    • B29C64/393 - Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • B29C64/20 - Apparatus for additive manufacturing; Details thereof or accessories therefor
    • B33 - ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y - ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y30/00 - Apparatus for additive manufacturing; Details thereof or accessories therefor
    • B33Y50/00 - Data acquisition or data processing for additive manufacturing
    • B33Y50/02 - Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes

Landscapes

  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)

Abstract

A method for detecting the print quality of a 3D printer is provided. The method comprises the following steps: obtaining a model reference map; generating a scan path; moving a color camera carried by a print head along the scan path and capturing images at a plurality of different positions to obtain a plurality of first partial images; causing the print head to print a first layer of the 3D model on a hotbed; moving the color camera along the scan path again and capturing a plurality of second partial images at the same plurality of different positions; determining a plurality of partial result images based on respective differences between the plurality of second partial images and the plurality of first partial images; stitching the plurality of partial result images to generate a global image corresponding to the model reference map; and determining a print quality result based on the model reference map and the global image.

Description

Method and device for detecting printing quality of 3D printer and 3D printer
Technical Field
The present disclosure relates to the field of 3D printing technology, and in particular to a method for detecting the print quality of a 3D printer, an apparatus for detecting the print quality of a 3D printer, a computer-readable storage medium, and a computer program product.
Background
3D printing technology, also known as additive manufacturing technology, builds objects by layer-by-layer printing using bondable materials based on digital model files. 3D printing is typically implemented using a 3D printer. A 3D printer, also called a three-dimensional printer, is a process device for rapid prototyping. 3D printers are often used in mold manufacturing, industrial design, and similar fields to manufacture models or parts. One typical 3D printing technique is fused deposition modeling (FDM), which builds objects by selectively depositing molten material layer by layer along a predetermined path; the material used is a thermoplastic polymer in filament form. At present, there remains considerable room for improving the print quality of 3D printers.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, the problems mentioned in this section should not be considered as having been recognized in any prior art unless otherwise indicated.
Disclosure of Invention
The present disclosure provides methods for detecting print quality of a 3D printer, apparatuses for detecting print quality of a 3D printer, 3D printers, computer readable storage media, and computer program products.
According to an aspect of the present disclosure, there is provided a method for detecting print quality of a 3D printer, wherein the 3D printer includes: a hotbed; a printhead movable relative to the hotbed; a color camera disposed on the printhead for capturing an image of a portion of the hotbed; and at least one processor for controlling the printhead to move relative to the hotbed to print a 3D model layer by layer based on control code generated by slicing software. The method comprises: obtaining a model reference map, wherein the model reference map is generated by parsing control information generated by the slicing software, and the model reference map represents an occupied area of at least a portion of a first layer of the 3D model on the hotbed; generating a scan path based on the model reference map, wherein the scan path is generated such that the color camera sequentially captures images at a plurality of different locations of the occupied area as the color camera moves along the scan path while the printhead moves relative to the hotbed; moving the color camera, carried by the printhead, along the scan path and capturing images at the plurality of different locations during the movement to obtain a plurality of first partial images respectively indicative of corresponding images of the hotbed at the plurality of different locations; causing the printhead to print the first layer of the 3D model on the hotbed; moving the color camera, carried by the printhead, along the scan path and capturing images at the plurality of different locations during this movement to obtain a plurality of second partial images respectively indicative of corresponding images of the hotbed at the plurality of different locations after the first layer of the 3D model has been printed on the hotbed; determining a plurality of partial result images based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images; stitching the plurality of partial result images according to the scan path to generate a global image corresponding to the model reference map; and determining, based on the model reference map and the global image, a print quality result indicating a print quality of the at least a portion of the first layer of the 3D model.
According to another aspect of the present disclosure, there is provided an apparatus for detecting print quality of a 3D printer, wherein the 3D printer includes: a hotbed; a printhead movable relative to the hotbed; a color camera disposed on the printhead for capturing an image of a portion of the hotbed; and at least one processor for controlling the printhead to move relative to the hotbed to print a 3D model layer by layer based on control code generated by slicing software. The apparatus comprises: a first module for obtaining a model reference map, wherein the model reference map is generated by parsing control information generated by the slicing software, and the model reference map represents an occupied area of at least a portion of a first layer of the 3D model on the hotbed; a second module for generating a scan path based on the model reference map, wherein the scan path is generated such that the color camera sequentially captures images at a plurality of different locations of the occupied area as the color camera moves along the scan path while the printhead moves relative to the hotbed; a third module for moving the color camera, carried by the printhead, along the scan path and capturing images at the plurality of different locations during the movement to obtain a plurality of first partial images respectively indicative of corresponding images of the hotbed at the plurality of different locations; a fourth module for causing the printhead to print the first layer of the 3D model on the hotbed; a fifth module for moving the color camera, carried by the printhead, along the scan path and capturing images at the plurality of different locations during this movement to obtain a plurality of second partial images respectively indicative of corresponding images of the hotbed at the plurality of different locations after the first layer of the 3D model has been printed on the hotbed; a sixth module for determining a plurality of partial result images based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images; a seventh module for stitching the plurality of partial result images according to the scan path to generate a global image corresponding to the model reference map; and an eighth module for determining, based on the model reference map and the global image, a print quality result indicating a print quality of the at least a portion of the first layer of the 3D model.
According to another aspect of the present disclosure, there is provided a 3D printer including: a hotbed; a printhead movable relative to the hotbed; a depth sensor disposed on the printhead for measuring a distance of a portion of the hotbed relative to the depth sensor; and at least one processor configured to derive a local depth map of the portion of the hotbed from the measurements of the depth sensor and to control the printhead to move relative to the hotbed to print a 3D model layer by layer based on control code generated by slicing software, wherein the at least one processor is further configured to execute instructions to implement the method as described above.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing instructions, wherein the instructions, when executed by the at least one processor of a 3D printer as described above, implement a method as described above.
According to another aspect of the present disclosure, there is provided a computer program product comprising instructions which, when executed by the at least one processor of a 3D printer as described above, implement a method as described above.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The accompanying drawings illustrate exemplary embodiments and, together with the description, serve to explain exemplary implementations of the embodiments. The illustrated embodiments are for exemplary purposes only and do not limit the scope of the claims. Throughout the drawings, identical reference numerals designate similar, but not necessarily identical, elements.
FIG. 1 shows a schematic diagram of a 3D printer according to an example embodiment;
FIG. 2 shows a flowchart of a method for detecting print quality of a 3D printer, according to an example embodiment;
FIG. 3 shows an example graphical representation of a first layer of a 3D model on a hotbed;
FIG. 4 shows a model reference map corresponding to the example of FIG. 3;
FIG. 5 shows an example of a scan path for the model reference map of FIG. 4;
FIG. 6 shows a flowchart of steps for determining a plurality of partial result images according to an example embodiment;
FIG. 7 shows a flowchart of steps for determining print quality results, according to an example embodiment;
FIG. 8 shows a flowchart of another method for detecting print quality of a 3D printer, according to an example embodiment;
FIG. 9a shows an example of a global image stitched from multiple partial result images without illumination compensation;
FIG. 9b shows an example of the global image of FIG. 9a after illumination compensation;
FIG. 10 shows a block diagram of a structure of an apparatus for detecting print quality of a 3D printer according to an example embodiment.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the present disclosure, the use of the terms "first," "second," and the like to describe various elements is not intended to limit the positional relationship, timing relationship, or importance relationship of the elements, unless otherwise indicated, and such terms are merely used to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, they may also refer to different instances based on the description of the context.
The terminology used in the description of the various illustrated examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, elements may be one or more if their number is not specifically limited. Furthermore, the term "and/or" as used in this disclosure encompasses any and all possible combinations of the listed items. The term "based on" should be construed as "based at least in part on".
3D printing constructs an object by printing layer by layer. In 3D printing, the print quality of the first layer of the 3D model is key to whether printing can succeed. If the print quality of the first layer is poor, the quality of the finally formed 3D model will be seriously affected. Therefore, to ensure that printing succeeds, it is necessary to detect the print quality of the first layer: if it does not meet requirements, printing can be stopped in time, avoiding an eventual print failure. Current 3D printers have no function for detecting the print quality of the first layer, so the quality of the first layer cannot be known promptly.
The inventors have realized that the print quality of the first layer can be detected by image processing of images captured with an optical camera. Moreover, compared to other quality detection techniques (e.g., using depth detection to detect the presence of print voids), using an optical camera and image processing requires less equipment and can be implemented with an ordinary consumer color camera, and can therefore be easily integrated into various types of 3D printing devices.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Fig. 1 shows a schematic diagram of a 3D printer 100 according to an embodiment of the present disclosure. As shown in fig. 1, the 3D printer 100 includes a hotbed 110, a print head 120 movable relative to the hotbed 110, and a color camera 130 disposed on the print head 120 for capturing an image of a portion of the hotbed 110. Herein, the phrase "a printhead movable relative to a hotbed" may refer to any of the following: (1) the hotbed remains stationary and the printhead moves; (2) the hotbed moves and the printhead remains stationary; (3) both the hotbed and the printhead move. Examples of the color camera 130 include, but are not limited to, ordinary consumer 2D color cameras. For ease of description, embodiments of the present disclosure are described herein with such an ordinary consumer 2D color camera as an example, but the present disclosure is not limited in this respect.
The 3D printer 100 also includes at least one processor (not shown). The at least one processor is configured to control movement of the print head 120 relative to the hotbed 110 to print the 3D model layer by layer based on control code generated by the slicing software. As shown in fig. 1, at least one processor may drive a motor (not shown) and in turn drive the extrusion wheel 150 to feed the printing material 170 on the tray 160 into the printhead 120. During movement of the print head 120, printing material is extruded from the print head 120 and deposited on the thermal bed 110. Typically, the slicing software runs on a computing device communicatively connected to the 3D printer 100 and operates to generate control information for controlling the printing process. For example, slicing software may provide a Graphical User Interface (GUI) to allow a user to select or adjust layout information representing the position and orientation of the 3D model on hotbed 110. Slicing software may slice the 3D graphical representation of the 3D model to generate slice data (e.g., number of slices, height of slices per layer, etc.), and then convert the slice data into control code for controlling the print head 120 of the 3D printer 100 to move along a print path to print the slices of each layer. Such control codes are typically in the form of gcode. The control code is downloaded to the 3D printer 100 for execution by at least one processor. To this end, the 3D printer 100 may further include at least one memory (not shown) for storing programs and/or data.
The at least one processor is also configured to implement the various functions described below. In an example, the processor includes a microcontroller or computer that executes instructions stored in firmware and/or software (not shown). The processor may be programmable to perform the functions described herein. As used herein, the term computer is not limited to just those integrated circuits referred to in the art as computers, but broadly refers to computers, processors, microcontrollers, microcomputers, programmable logic controllers, application-specific integrated circuits, and other programmable circuits, and these terms are used interchangeably herein. The computers and/or processors discussed herein may each employ a computer-readable medium or machine-readable medium, which refers to any medium that participates in providing instructions to the processor for execution. The memory discussed above constitutes a computer-readable medium. Such a medium may take many forms, including but not limited to non-volatile media, volatile media, and transmission media.
It will be appreciated that example embodiments of the present disclosure are described below in connection with an FDM printer, but the present disclosure is not limited to FDM printers. In embodiments, the print head 120 may be configured to be capable of extruding any material suitable for 3D printing, including, for example, thermoplastics, alloys, metal powders, ceramic materials, ceramic powders, polymers, and the like.
Fig. 2 is a flowchart illustrating a method 200 for detecting print quality of a 3D printer according to an example embodiment. For discussion purposes, the method 200 is described below in connection with the 3D printer 100 shown in fig. 1. In an example, the method 200 may be implemented by at least one processor in the 3D printer 100.
At step 210, a model reference map is acquired. The model reference map represents the footprint of at least a portion of the first layer of the 3D model on the hotbed 110.
The model reference map is described below in connection with fig. 3 and 4. Fig. 3 shows an example graphical representation of the first layer of a 3D model on a hotbed 310, and fig. 4 shows a model reference map corresponding to the example of fig. 3. In this example, the footprint of the first layer of the 3D model on the hotbed 310 includes four discrete areas 340a, 340b, 340c, and 340d. These areas may be formed of the same printing material or of different printing materials. Although the discrete areas 340a, 340b, 340c, and 340d are shown in fig. 3 as having rectangular shapes, this is merely illustrative; in other examples, the footprint of the first layer of the 3D model may have other shapes or configurations (e.g., a single connected region), and the disclosure is not limited in this respect. Corresponding to the graphical representation of fig. 3, the model reference map 410 shown in fig. 4 includes four pixel regions 440a, 440b, 440c, and 440d. In general, the model reference map 410 may be generated under the hotbed coordinate system oxyz (fig. 3), and the coordinates of the pixel regions 440a, 440b, 440c, and 440d in the model reference map 410 correspond one-to-one with the coordinates of the discrete areas 340a, 340b, 340c, and 340d on the hotbed 310. It will be appreciated that while the model reference map 410 is shown in fig. 4 as having dimensions corresponding to those of the hotbed 310 of fig. 3, this is not required. In other examples, the model reference map 410 may have only a size corresponding to a bounding box of the occupied area (the discrete areas 340a, 340b, 340c, and 340d as a whole in fig. 3), thereby saving storage space.
It will also be appreciated that the model reference map need not represent all of the first layer of the 3D model; it may represent only a portion of the first layer. This is because, in some cases, it may only be necessary to detect the print quality of a portion of the first layer of the 3D model. For example, because the temperature distribution of the hotbed may be non-uniform, the temperature may be higher in some hotbed regions and lower in others. In a hotbed region where the temperature is low, the printing material may not form properly, resulting in a printing defect. In this case, print quality detection can be performed only for the lower-temperature hotbed regions, thereby improving detection efficiency.
The model reference map may be generated by parsing control information generated by the slicing software. In some embodiments, the control information generated by the slicing software includes control code (e.g., gcode) for printing the first layer of the 3D model. In such embodiments, obtaining the model reference map (step 210) may include receiving the model reference map from a computing device communicatively connected to the 3D printer 100, the model reference map being generated by the slicing software running on the computing device by parsing the control code for printing the first layer of the 3D model. Alternatively, obtaining the model reference map (step 210) may include reading the model reference map locally from the 3D printer 100, the model reference map being generated by the at least one processor by parsing the control code for printing the first layer of the 3D model, or being extracted directly from a slice of the model. Since the control code specifies the movement path of the print head, the footprint of the first layer of the 3D model on the hotbed can be recovered from it.
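As an illustration only, the footprint recovery described above might be sketched as follows. This is a sketch under stated assumptions, not the patented parser: it assumes absolute XY/E coordinates (G90/M82-style gcode) and treats every move that advances the extruder axis E as depositing material; the function name, bed size, and raster resolution are illustrative.

```python
import re
import numpy as np

def footprint_from_gcode(gcode_lines, bed_mm=(220, 220), px_per_mm=10):
    """Rasterize extruded first-layer moves into a binary model reference map."""
    w, h = int(bed_mm[0] * px_per_mm), int(bed_mm[1] * px_per_mm)
    ref = np.zeros((h, w), dtype=np.uint8)
    x = y = e = 0.0
    for line in gcode_lines:
        if not line.startswith(("G0", "G1")):
            continue
        params = dict(re.findall(r"([XYE])([-+]?\d*\.?\d+)", line))
        nx, ny, ne = (float(params.get(k, v)) for k, v in (("X", x), ("Y", y), ("E", e)))
        if ne > e:  # the move extrudes filament: mark the whole segment as occupied
            steps = max(int(max(abs(nx - x), abs(ny - y)) * px_per_mm), 1)
            for t in np.linspace(0.0, 1.0, steps + 1):
                px = int((x + t * (nx - x)) * px_per_mm)
                py = int((y + t * (ny - y)) * px_per_mm)
                if 0 <= px < w and 0 <= py < h:
                    ref[py, px] = 255
        x, y, e = nx, ny, ne
    return ref
```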
In some embodiments, the control information generated by the slicing software includes layout information representing the position and orientation of the 3D model on the hotbed 110. In such an embodiment, obtaining the model reference map (step 210) may include: the model reference map is received from a computing device communicatively connected to the 3D printer 100. The model reference map is generated by slicing software that parses the layout information when running on the computing device. Since the layout information defines the position and orientation of the 3D model on the hotbed, the footprint of the first layer of the 3D model on the hotbed can be recovered therefrom.
Referring back to FIG. 2, at step 220, a scan path is generated based on the model reference map. The scan path is generated such that, as the printhead 120 moves relative to the hotbed 110 and the color camera 130 moves along the scan path, the color camera 130 sequentially captures images at a plurality of different locations of the occupied area.
Fig. 5 shows an example of a scan path for the model reference map of fig. 4. In some embodiments, the footprint comprises at least one discrete area, the areas being spaced apart from each other; the model reference map comprises at least one pixel region respectively representing the at least one discrete area; and generating the scan path (step 220) may comprise the following operations (a minimal code sketch follows this list):
(1a) Determining the bounding box of each of the at least one pixel region to obtain at least one bounding box respectively corresponding to the at least one pixel region. In the example of fig. 5, the bounding box of each of the pixel regions 440a, 440b, 440c, and 440d may be determined, resulting in four bounding boxes.
(1b) Determining, in the model reference map, a scan path along which a virtual frame representing the field of view (FOV) of the color camera moves so as to cover a portion of the at least one bounding box at a time and finally traverse the entire area of the at least one bounding box. In the example of fig. 5, a virtual frame representing the FOV of the color camera 130 is shown, and the determined scan path is shown with an open arrow. In this example, the scan path is a Zig-Zag path, but this is illustrative and not limiting.
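Operation (1b) can be made concrete with a short sketch; a rectangular FOV tiled edge to edge without overlap is assumed here for simplicity, and all names are illustrative:

```python
def zigzag_scan_path(bbox, fov):
    """Return camera-center waypoints whose FOV tiles cover a bounding box.

    bbox: (x_min, y_min, x_max, y_max) in hotbed coordinates.
    fov:  (width, height) of the camera field of view.
    """
    x_min, y_min, x_max, y_max = bbox
    fw, fh = fov
    xs, ys = [], []
    x = x_min + fw / 2.0
    while x - fw / 2.0 < x_max:
        xs.append(x)
        x += fw
    y = y_min + fh / 2.0
    while y - fh / 2.0 < y_max:
        ys.append(y)
        y += fh
    path = []
    for row, yy in enumerate(ys):
        cols = xs if row % 2 == 0 else xs[::-1]  # alternate direction: Zig-Zag
        path.extend((xx, yy) for xx in cols)
    return path
```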
It will be appreciated that the operation of generating the bounding boxes is not necessary. In some embodiments, the scan path may be generated for the original shape of the pixel regions in the model reference map representing the footprint of the first layer of the 3D model on the hotbed. In other embodiments, the scan path may be generated in any other suitable way, as long as the color camera can capture images at a plurality of different locations of the area occupied by the first layer of the 3D model on the hotbed.
In some embodiments, the footprint comprises at least one discrete area, the areas being spaced apart from each other; the model reference map comprises at least one pixel region respectively representing the at least one discrete area; and generating the scan path (step 220) may comprise the following operations (see the sketch after this list):
(2a) Determining the connected domain of each of the at least one pixel region to obtain at least one connected domain respectively corresponding to the at least one pixel region. In the example of fig. 5, the connected domain of each of the pixel regions 440a, 440b, 440c, and 440d may be determined, thereby obtaining four connected domains.
(2b) For each connected domain, determining, in the model reference map, a movement path along which a virtual frame representing the field of view of the color camera moves so as to cover a portion of the connected domain at a time and finally traverse the entire region of the connected domain. This is similar to operation (1b) described above and will not be described again.
(2c) Merging the movement paths for all the connected domains into one merged path serving as the scan path. By generating a separate scan path for each discrete region of the occupied area and combining the individual paths into a final scan path, scanning of non-target regions (e.g., the blank regions in fig. 5) can be reduced, thereby improving detection efficiency.
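A sketch of operations (2a)-(2c) follows, assuming the model reference map is a binary NumPy array; SciPy's connected-component labeling stands in for whatever implementation is actually used, and zigzag_scan_path is the helper from the previous sketch:

```python
import numpy as np
from scipy import ndimage

def merged_scan_path(ref_map, fov):
    """(2a) label connected domains, (2b) path per domain, (2c) merge the paths."""
    labeled, num = ndimage.label(ref_map > 0)
    merged = []
    for k in range(1, num + 1):
        ys, xs = np.nonzero(labeled == k)
        # Bounding box in map pixels; in practice this would be scaled to
        # hotbed coordinates before driving the printhead.
        bbox = (xs.min(), ys.min(), xs.max() + 1, ys.max() + 1)
        merged.extend(zigzag_scan_path(bbox, fov))  # skips blank regions entirely
    return merged
```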
It will be appreciated that, in embodiments, the scan path is generated for the field of view of the color camera 130, and the scan path of the color camera 130 does not necessarily coincide with the movement path of the print head 120, since there may be rotation and/or translation between the pose of the color camera 130 and the pose of the print head 120. The rotation and/or translation between the print head 120 and the color camera 130 in a three-dimensional coordinate system (e.g., the hotbed coordinate system) may be pre-calibrated by extrinsic calibration; the scan path for the color camera 130 is then converted into a movement path for the print head 120, and corresponding control code is generated to control the movement of the print head 120 such that the color camera 130, carried by the print head 120, moves along the scan path. Extrinsic calibration is a known technique and is not described in detail here so as not to obscure the subject matter of the present disclosure.
Referring back to fig. 2, at step 230, the color camera 130, carried by the printhead 120, is moved along the scan path and captures images at a plurality of different positions during the movement to obtain a plurality of first partial images. The plurality of first partial images respectively indicate corresponding images of hotbed 110 at the plurality of different positions.
In some implementations, each first partial image of the plurality of first partial images is numbered and stored in memory according to the physical coordinates of the print head 120 in the hotbed coordinate system at the moment the color camera 130 captured that image on the scan path. The purpose of numbering is to associate each first partial image with the plurality of different locations on hotbed 110. It will be appreciated that the plurality of first partial images may be stored under the camera coordinate system or converted and stored under the image coordinate system. The following description uses storage under the camera coordinate system as an example.
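The numbering and storage described here might look like the following sketch; move_to and capture are hypothetical placeholders for the printer's motion and camera interfaces, which are not specified in this disclosure:

```python
def capture_along_path(printer, camera, waypoints):
    """Capture one partial image per waypoint, keyed by number and coordinates."""
    images = {}
    for idx, (x, y) in enumerate(waypoints):
        printer.move_to(x, y)     # hypothetical motion API: position the printhead
        frame = camera.capture()  # hypothetical capture API: grab a color image
        images[idx] = {"coords": (x, y), "image": frame}
    return images
```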
At step 240, the printhead 120 is caused to print the first layer of the 3D model on the hotbed 110.
At step 250, the color camera 130, carried by the printhead 120, is moved along the scan path and captures images at the plurality of different locations during the movement to obtain a plurality of second partial images. The plurality of second partial images respectively indicate corresponding images of the hotbed 110 at the plurality of different positions after the first layer of the 3D model has been printed on the hotbed 110. This step causes the color camera 130 to capture images again at the same plurality of different locations along the same scan path as in step 230.
Likewise, in some implementations, each of the plurality of second partial images may be numbered and stored in memory under the camera coordinate system according to the physical coordinates of the print head 120 in the hotbed coordinate system at the moment the color camera 130 captured that image on the scan path. This allows each second partial image to correspond to the plurality of different locations on hotbed 110, and thus also to each first partial image stored in step 230.
In step 260, a plurality of partial result images are determined based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images. The plurality of partial result images respectively represent corresponding images at the plurality of different locations after the hotbed background has been subtracted. With further reference to fig. 6, determining the plurality of partial result images (step 260) may include the following operations (a code sketch follows them):
In step 610, a plurality of difference images are obtained by calculating the difference between each of the plurality of second partial images and the corresponding one of the plurality of first partial images. For example, if the second partial image is represented by a pixel matrix I and the corresponding first partial image (the hotbed image) by a pixel matrix J, the pixel matrix of the difference image is D = abs(I - J).
In step 620, for each difference image, a threshold for judging valid pixels is determined, each valid pixel indicating that the pixel belongs to the footprint. In some embodiments, the average variance of the pixels in the first partial image (hotbed image) corresponding to the difference image is calculated, and the threshold is determined based on this average variance. For example, with the average variance denoted mean_std, the threshold for judging valid pixels is defined as diff_thresh = clip(2 × mean_std, 30, 80), i.e., diff_thresh equals 2 times mean_std but is clamped to the range [30, 80].
In step 630, a masking operation is performed for each pixel in each difference image. The masking operation includes: comparing the pixel value of each pixel with the determined threshold value; in response to the pixel value of the pixel being greater than the threshold, determining that the pixel belongs to a valid pixel and replacing the pixel value of the pixel with a pixel value of a corresponding pixel of the second partial image, otherwise preserving the pixel value.
In step 640, the plurality of difference images after the masking operation are taken as a plurality of partial result images, thereby obtaining a plurality of partial result images.
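Steps 610-640 can be combined for one pair of captures as in the sketch below. Interpreting the "average variance" as the per-channel standard deviation of the hotbed image averaged over channels is an assumption, and background pixels keep their (small) difference values in line with the masking operation of step 630:

```python
import numpy as np

def local_result_image(first, second):
    """Subtract the hotbed background from one (first, second) capture pair.

    first, second: HxWx3 uint8 images of the same location before/after printing.
    """
    I = second.astype(np.int16)
    J = first.astype(np.int16)
    D = np.abs(I - J)                                   # step 610: difference image
    mean_std = float(np.mean(np.std(J, axis=(0, 1))))   # assumed 'average variance'
    diff_thresh = float(np.clip(2.0 * mean_std, 30.0, 80.0))  # step 620
    valid = D.max(axis=2) > diff_thresh                 # step 630: valid-pixel mask
    result = D.astype(np.uint8)                         # background keeps differences
    result[valid] = second[valid]                       # valid pixels take new image
    return result, valid                                # step 640: local result image
```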
Referring back to FIG. 2, at step 270, the plurality of partial result images are stitched according to the scan path to generate a global image corresponding to the model reference map. In some embodiments, the stitching operation includes: in response to determining that the plurality of partial result images overlap each other, averaging or linearly interpolating the pixel values of the pixels at the overlapping positions between the plurality of partial result images.
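A stitching sketch with averaged overlaps is given below; it assumes each local result's top-left placement can be derived from its scan-path coordinates and that all placements fit inside the output canvas (the mm-to-pixel scale is an illustrative parameter):

```python
import numpy as np

def stitch(results, coords_mm, out_hw, px_per_mm=10):
    """Paste local result images onto a global canvas, averaging overlaps."""
    acc = np.zeros(out_hw + (3,), dtype=np.float64)
    cnt = np.zeros(out_hw, dtype=np.float64)
    for img, (x_mm, y_mm) in zip(results, coords_mm):
        h, w = img.shape[:2]
        y0, x0 = int(y_mm * px_per_mm), int(x_mm * px_per_mm)
        acc[y0:y0 + h, x0:x0 + w] += img
        cnt[y0:y0 + h, x0:x0 + w] += 1
    cnt = np.maximum(cnt, 1)  # avoid dividing uncovered pixels by zero
    return (acc / cnt[..., None]).astype(np.uint8)
```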
At step 280, a print quality result is determined based on the model reference map and the global image, the print quality result indicating a print quality of at least a portion of the first layer of the 3D model. It should be appreciated that the model reference map indicates target print images at a plurality of different locations, and the global image indicates actual print images at those locations. Therefore, whether there is a print defect, and its severity, can be detected from the error between the target print image and the actual print image.
With further reference to fig. 7, determining the print quality result (step 280) may include the following operations:
at step 710, the target print image at the plurality of different locations is compared with the actual print image at the corresponding one of the plurality of different locations.
At step 720, a print quality result is determined based on the comparison operation.
It will also be appreciated that the error between the target value and the actual value can be measured in a variety of possible ways and the print quality result determined therefrom. Hereinafter, some illustrative implementations are provided; they should not be considered limiting.
In some embodiments, the comparing step 710 may include:
(3a) A normal print threshold for the first layer of the 3D model is determined. The upper and lower bounds of the normal print threshold are related to the print height set for the first layer and to the printing material.
(3b) The global image and the model reference map are binarized based on the determined normal print threshold. In some embodiments, the global image and the model reference map are converted to the Lab color space, and the luminance component is binarized to obtain a binarized image B of the global image and a binarized image R of the model reference map, where the binarization threshold may be the threshold diff_thresh/2 used above for background subtraction. Binarization is a known technique and is not described in detail here so as not to obscure the subject matter of the present disclosure.
(3c) Error pixels in the global image are identified by comparing pixel values of corresponding pixels of the binarized global image and the binarized model reference map, an error pixel indicating that the first layer of the 3D model has not been printed normally at that location. Continuing with the previous example (which yields the binarized image B of the global image and the binarized image R of the model reference map), the binarized image R is traversed; if at some position R(x, y) > 0 and B(x, y) = 0, that position is regarded as a hole and marked as an error pixel. When the traversal is complete, an error map errmap is obtained (a code sketch of operations (3b)-(3c) follows this list).
(3d) At least one pixel region representing the occupied area is determined in the model reference map. Each pixel region is typically in the form of a connected domain, each pixel in the connected domain representing that the corresponding location on the hotbed is occupied by the first layer of the 3D model. It should be appreciated that different pixel regions representing the occupied area may have different areas. For detection efficiency, detection may be performed only for the pixel regions whose area is large (e.g., greater than a threshold T1), instead of for all pixel regions.
(3e) For at least one of the at least one pixel region:
(3e-1) Counting the number of error pixels among the pixels in the global image corresponding to the pixel region. For example, for each connected domain C whose area is greater than the threshold T1, the number of error pixels and the number n of pixels of the global image within the connected domain C are counted separately.
(3e-2) Comparing the number of error pixels with a corresponding threshold, and/or comparing the ratio of the number of error pixels to the number of pixels in the global image corresponding to the pixel region with a corresponding threshold.
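As a concrete sketch of operations (3b)-(3c), the following uses OpenCV and NumPy (the library choice is an assumption; no implementation is named in this disclosure), binarizing the Lab luminance channel with the diff_thresh/2 threshold mentioned above and marking holes where the reference map expects material but the binarized global image shows none:

```python
import cv2
import numpy as np

def error_map(global_img, ref_map, diff_thresh):
    """Binarize in Lab space and mark 'holes' as error pixels."""
    lum = cv2.cvtColor(global_img, cv2.COLOR_BGR2Lab)[..., 0]  # luminance channel
    B = (lum > diff_thresh / 2.0).astype(np.uint8)  # binarized global image
    R = (ref_map > 0).astype(np.uint8)              # binarized model reference map
    errmap = ((R > 0) & (B == 0)).astype(np.uint8)  # expected material but none seen
    return B, R, errmap
```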
As previously described, the error between the target value and the actual value of the first-layer print image can be measured in various possible ways. Here, the number of normal pixels, the number of error pixels, and the relative relationship between them are all measurement criteria reflecting this error. In one example, the error level l may be defined as follows (the original formula appears in the patent record only as an image; the piecewise form below is a reconstruction consistent with the surrounding text):

$$l = \begin{cases} 2, & n_{\mathrm{err}} > T_3 \ \text{or} \ n_{\mathrm{err}}/n > T_4 \\ 1, & n_{\mathrm{err}} > T_2 \\ 0, & \text{otherwise} \end{cases}$$

where n_err is the number of error pixels in the connected domain C, T2, T3, and T4 are all thresholds, and n is the number of pixels of the global image in the connected domain C. These thresholds may be preset or adaptive; for example, they may change adaptively as a function of the number of pixels in the connected domain C. In this example, the error level l may have a value of 0, 1, or 2. It will be appreciated that such error levels are merely illustrative and not limiting.
Based on the comparison in step 710, a print quality result may be determined in step 720. Continuing with the example above regarding the error level l, the following decision logic may be defined (a code sketch follows it):
if any connected domain C has error level equal to 2, the print quality result is judged to be error, and the final error level 2 is output.
Otherwise, if the connected domain with the error level equal to 1 is more than or equal to 2 and the total error pixel number is more than the threshold value T 4 Or the error pixel proportion is greater than the threshold T 5 The print quality result is also determined to be "error", and the final error level 2 is output.
Otherwise, if the connected domain with the error level equal to 1 is more than or equal to 1, the print quality result is judged to be 'warning', and the final error level 1 is output.
Otherwise, the print quality result is judged to be "normal", and the final error level is 0.
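The decision logic above might be implemented as in the following sketch; the grouping of the "and/or" condition in the second rule is one plausible reading of the text, and the thresholds are left as parameters:

```python
def final_quality(domain_levels, n_err_total, n_pix_total, T4, T5):
    """Aggregate per-connected-domain error levels into the final result."""
    n_level1 = sum(1 for lvl in domain_levels if lvl == 1)
    if any(lvl == 2 for lvl in domain_levels):
        return "error", 2
    if n_level1 >= 2 and (n_err_total > T4 or n_err_total / n_pix_total > T5):
        return "error", 2
    if n_level1 >= 1:
        return "warning", 1
    return "normal", 0
```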
It will be understood that such decision logic is merely illustrative and not limiting. In other embodiments, other decision logic may be applicable. For example, the print quality result may be determined based on the accumulated absolute value of the error between the target value and the actual value of the first-layer print image. It should be appreciated that if the error between the target value and the actual value of the first-layer print image is known, various possible decision criteria can be devised to detect print quality. The present disclosure cannot exhaust all decision criteria, but this does not prevent such other decision criteria from falling within the scope of the present disclosure.
In some embodiments, the print quality result includes a confidence indicating a degree of reliability of the detection, the confidence being a function of a total number of valid pixels and a total number of pixels of the model reference map. For example, the confidence is the ratio of the total number of valid pixels to the total number of pixels of the model reference map. In other embodiments, the confidence is another suitable function of the total number of valid pixels and the total number of pixels of the model reference map.
Owing to various systematic errors, the actually printed first-layer pattern is not necessarily exactly aligned with the first-layer pattern of the model reference map; therefore, registration of the model reference map with the global image may also be required so that the error is computed at the best matching position. In some embodiments, prior to operation (3e) (i.e., counting, for at least one of the at least one pixel region, the number of error pixels among the pixels in the global image corresponding to the pixel region), the global image may be registered with the model reference map such that the two are aligned according to a registration criterion.
In embodiments, various registration methods may be used, such as:
(1) Gray-scale-based template matching algorithms: searching another image for sub-images similar to a known template image. For example, the global image and the model reference map are binarized, and template matching is performed on the binarized global image and model reference map.
(2) Feature-based matching algorithms: first extracting features from the images, then generating feature descriptors, and finally matching the features of the two images according to the similarity of the descriptors. Image features may include points, lines (edges), and regions (faces), and may be divided into local features and global features.
(3) Relationship-based matching algorithm: the images are matched using a machine learning algorithm.
In one example, a brute-force search may be used to find the best matching position of the global image and the model reference map, and the error is then calculated at the best matching position. Specifically, the global image is shifted in both the x and y directions, and the error is calculated at each new position; if the number of error pixels at the new position is smaller than the number of error pixels at the previously recorded best position, the best matching position is updated to the new position. To reduce the amount of computation, the search range may be limited to a window (e.g., 20 pixels, corresponding to 2 mm in physical coordinates). If the number of error pixels at the new position is much larger (e.g., 20% larger) than at the previous position, the search in the current direction is stopped.
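The windowed search might be sketched as follows; np.roll is used for shifting (so wrap-around at the borders is ignored within the small window), and comparing the early-stop condition against the best error so far, rather than the immediately previous position, is a simplifying assumption:

```python
import numpy as np

def register(B, R, window=20):
    """Brute-force search for the shift of B that minimizes the error pixels."""
    def n_err(dx, dy):
        shifted = np.roll(np.roll(B, dy, axis=0), dx, axis=1)
        return int(np.count_nonzero((R > 0) & (shifted == 0)))
    best, best_err = (0, 0), n_err(0, 0)
    for sx, sy in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # +x, -x, +y, -y directions
        for step in range(1, window + 1):
            dx, dy = sx * step, sy * step
            err = n_err(dx, dy)
            if err < best_err:
                best, best_err = (dx, dy), err
            elif err > best_err * 1.2:  # much worse: stop searching this direction
                break
    return best, best_err
```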
As previously described, the error between the target value and the actual value can be measured in various possible ways, and the print quality result can be determined therefrom. In some embodiments, determining the print quality result may include: inputting the model reference map, the print image set by the slicing software, and the global image into a trained machine learning algorithm (e.g., a classification neural network) to obtain a print quality result output by the trained machine learning algorithm. As previously described, the model reference map and the print image set by the slicing software indicate target print images at a plurality of different positions on the hotbed, while the global image indicates actual print images at those positions. Machine learning algorithms are well suited to application scenarios in which the error between a target value and an actual value of a print image is to be determined; given a large number of training samples, a machine learning algorithm can be trained to detect whether a print defect exists.
Fig. 8 is a flowchart illustrating another method 800 for detecting print quality of a 3D printer according to an example embodiment. The steps of method 800 (steps 810, 820, 830, 840, 850, 860, 880, 890) are identical to the corresponding steps of the method shown in fig. 2, except that method 800 further comprises, after subtracting the hotbed background (step 860), step 870: determining a luminance field indicating the degree of uniformity of the shooting illumination of the color camera, and performing illumination compensation on the plurality of partial result images based on the luminance field. The illumination compensation step is added because the fill light is on one side of the camera, so the illumination of the image is non-uniform; without compensation, the stitched image has obvious seams, which can affect the detection result. Fig. 9a and 9b respectively show a global image 900 stitched from a plurality of partial result images without illumination compensation and a global image 910 stitched with illumination compensation.
In some embodiments, determining the luminance field indicative of the degree of illumination uniformity comprises determining the luminance field L1 based on predetermined calibration information, the calibration information indicating a default illumination uniformity level within the shooting field of view of the color camera.
In other embodiments, determining the luminance field comprises determining an average image of the plurality of first partial images, where the pixel value of each pixel of the average image is the average of the pixel values of the corresponding pixels of the plurality of first partial images; the luminance field L2 is then determined based on the average image.
In other embodiments, the total luminance field may be determined based on the average of the luminance field L1 and the luminance field L2 obtained as described above. For example, the total luminance field L = (L1 + L2) / 2.
It will be appreciated that the manner in which the luminance field is determined may be determined according to the particular situation and application scenario.
After determining the luminance field, illumination compensation is performed on the plurality of partial result images based on the luminance field. In some embodiments, the effects of luminance non-uniformities are substantially eliminated by dividing the multiple partial result images pixel-by-pixel by the luminance field L. It is noted that either L1 or L2 may be used directly, and the effect is acceptable.
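A compensation sketch using only the luminance field L2 derived from the average of the first partial images is given below; normalizing the field to unit mean before the pixel-wise division is an illustrative choice:

```python
import numpy as np

def illumination_compensate(result_images, first_images):
    """Divide each local result by a luminance field estimated from hotbed shots."""
    avg = np.mean(np.stack(first_images).astype(np.float64), axis=0)  # average image
    L2 = avg.mean(axis=2)                  # luminance field from the average image
    L2 /= max(float(L2.mean()), 1e-6)      # normalize to unit mean brightness
    compensated = []
    for img in result_images:
        comp = img.astype(np.float64) / L2[..., None]
        compensated.append(np.clip(comp, 0, 255).astype(np.uint8))
    return compensated
```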
Fig. 10 shows a block diagram of an apparatus 1000 for detecting print quality of a 3D printer according to an example embodiment. The apparatus 1000 includes a first module 1100, a second module 1200, a third module 1300, a fourth module 1400, a fifth module 1500, a sixth module 1600, a seventh module 1700, and an eighth module 1800. For discussion purposes, the apparatus 1000 is described below in connection with the 3D printer 100 of fig. 1.
The first module 1100 is used to obtain a model reference map. The model reference map is generated by parsing control information generated by the slicing software, and represents an occupied area of at least a portion of the first layer of the 3D model on the hotbed 110.
The second module 1200 is for generating a scan path based on the model reference map, wherein the scan path is generated such that the color camera 130 sequentially captures images at a plurality of different locations of the occupied area as the color camera 130 moves along the scan path while the printhead 120 moves relative to the hotbed 110.
The third module 1300 is for moving the color camera 130, carried by the printhead 120, along the scan path and capturing images at the plurality of different positions during the movement to obtain a plurality of first partial images respectively indicative of corresponding images of the hotbed 110 at the plurality of different positions.
The fourth module 1400 is for causing the printhead 120 to print the first layer of the 3D model on the hotbed 110.
The fifth module 1500 is for moving the color camera 130, carried by the printhead 120, along the scan path and capturing images at the plurality of different positions during this movement to obtain a plurality of second partial images respectively indicative of corresponding images of the hotbed 110 at the plurality of different positions after the first layer of the 3D model has been printed on the hotbed 110.
The sixth module 1600 is configured to determine a plurality of partial result images based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images.
The seventh module 1700 is configured to stitch the plurality of local result images according to the scan path to generate a global image corresponding to the model reference map.
The eighth module 1800 is for determining, based on the model reference map and the global image, a print quality result indicating a print quality of the at least a portion of the first layer of the 3D model.
It should be appreciated that the various modules of the apparatus 1000 shown in fig. 10 may correspond to the various steps in the method 200 described with reference to fig. 2 and the method 800 described with reference to fig. 8. Thus, the operations, features, and advantages described above with respect to method 200 and method 800 apply equally to apparatus 1000 and the modules comprising it. For brevity, certain operations, features and advantages are not described in detail herein.
Although specific functions are discussed above with reference to specific modules, it should be noted that the functions of the various modules discussed herein may be divided among multiple modules, and/or at least some functions of multiple modules may be combined into a single module. A particular module performing an action discussed herein includes that particular module itself performing the action and/or another component or module that the particular module invokes or otherwise accesses performing the action (or performing it in conjunction with the particular module).
It should also be appreciated that various techniques may be described herein in the general context of software, hardware elements, or program modules. The various modules described above with respect to fig. 10 may be implemented in hardware or in hardware combined with software and/or firmware. For example, these modules may be implemented as computer program code/instructions configured to be executed by one or more processors and stored in a computer-readable storage medium. Alternatively, these modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of these modules may be implemented together in a system on chip (SoC). The SoC may include an integrated circuit chip (including one or more components of a processor (e.g., a central processing unit (CPU), microcontroller, microprocessor, digital signal processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry), and may optionally execute received program code and/or include embedded firmware to perform functions.
There is also provided, in accordance with an embodiment of the present disclosure, a non-transitory computer-readable storage medium storing instructions for causing the 3D printer 100 as described above to perform a method as described in any of the embodiments of the present disclosure.
There is also provided, in accordance with an embodiment of the present disclosure, a computer program product comprising instructions for causing the 3D printer 100 as described above to perform a method as described in any of the embodiments of the present disclosure.
It should be appreciated that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions of the present disclosure can be achieved; no limitation is imposed herein.
Although embodiments or examples of the present disclosure have been described with reference to the accompanying drawings, it is to be understood that the foregoing methods, systems, and apparatuses are merely exemplary embodiments or examples, and that the scope of the present disclosure is not limited by these embodiments or examples, but only by the granted claims and their equivalents. Various elements of the embodiments or examples may be omitted or replaced with equivalents. Furthermore, the steps may be performed in an order different from that described in the present disclosure, and various elements of the embodiments or examples may be combined in various ways. Importantly, as technology evolves, many of the elements described herein may be replaced by equivalent elements that appear after the present disclosure.

Claims (14)

1. A method for detecting print quality of a 3D printer, wherein the 3D printer comprises: a hot bed; a printhead movable relative to the hot bed; a color camera disposed on the printhead for capturing an image of a portion of the hot bed; and at least one processor for controlling the printhead to move relative to the hot bed to print a 3D model layer by layer based on control code generated by slicing software, the method comprising:
obtaining a model reference map, wherein the model reference map is generated by analyzing control information generated by the slicing software, and the model reference map represents an occupied area of at least a portion of a first layer of the 3D model on the hot bed;
generating a scan path based on the model reference map, wherein the scan path is generated such that, as the printhead moves relative to the hot bed and the color camera thereby moves along the scan path, the color camera sequentially captures images at a plurality of different locations of the occupied area;
moving the color camera, carried by the printhead, along the scan path and capturing at the plurality of different locations during the movement to obtain a plurality of first partial images respectively indicative of images of the hot bed at the plurality of different locations;
causing the printhead to print the first layer of the 3D model on the hot bed;
moving the color camera, carried by the printhead, along the scan path and capturing at the plurality of different locations during this movement to obtain a plurality of second partial images respectively indicative of images of the hot bed at the plurality of different locations after the first layer of the 3D model has been printed on the hot bed;
determining a plurality of partial result images based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images;
stitching the plurality of partial result images according to the scan path to generate a global image corresponding to the model reference map; and
determining, based on the model reference map and the global image, a print quality result indicative of the print quality of the at least a portion of the first layer of the 3D model.
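
For illustration only, the following Python sketch shows one way the capture–difference–stitch–compare flow recited in claim 1 could be organized. The helper names `capture_at`, `print_first_layer`, `stitch`, and `compare` are hypothetical placeholders, not functions recited in the disclosure.

```python
import numpy as np

def inspect_first_layer(scan_positions, capture_at, print_first_layer,
                        stitch, compare, model_reference_map):
    """Sketch of the claimed flow; every callable here is a placeholder."""
    # Pre-print pass: image the bare hot bed at each scan position.
    first_partials = [capture_at(pos) for pos in scan_positions]

    # Print the first layer of the 3D model.
    print_first_layer()

    # Post-print pass along the same scan path.
    second_partials = [capture_at(pos) for pos in scan_positions]

    # Partial result images from per-position before/after differences.
    partial_results = [
        np.abs(after.astype(np.int16) - before.astype(np.int16)).astype(np.uint8)
        for before, after in zip(first_partials, second_partials)
    ]

    # Stitch along the scan path, then compare with the reference map.
    global_image = stitch(partial_results, scan_positions)
    return compare(model_reference_map, global_image)
```
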
2. The method of claim 1, wherein determining the plurality of partial result images comprises:
obtaining a plurality of difference images by calculating a difference between each second partial image of the plurality of second partial images and the corresponding first partial image of the plurality of first partial images;
determining, for each difference image, a threshold for identifying valid pixels, wherein each valid pixel indicates that the pixel belongs to the occupied area;
performing a masking operation for each pixel in each difference image, the masking operation comprising:
comparing the pixel value of the pixel with the threshold; and
in response to the pixel value of the pixel being greater than the threshold, determining that the pixel is a valid pixel and replacing the pixel value of the pixel with the pixel value of the corresponding pixel of the second partial image; and
taking the plurality of difference images after the masking operation as the plurality of partial result images.
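
A minimal Python sketch of the masking operation recited in claim 2, assuming 8-bit images held as NumPy arrays. Collapsing an RGB difference to a single channel via the per-pixel maximum is an assumption; the claim does not specify how color differences are reduced.

```python
import numpy as np

def masked_partial_result(first, second, threshold):
    """Where the before/after difference exceeds the threshold the pixel is
    valid and takes the post-print (second) image's value; others are zero."""
    diff = np.abs(second.astype(np.int16) - first.astype(np.int16))
    if diff.ndim == 3:              # collapse RGB difference to one channel
        diff = diff.max(axis=-1)    # (assumed; the claim does not specify)
    valid = diff > threshold        # mask of valid pixels
    result = np.zeros_like(second)
    result[valid] = second[valid]   # replace with second image's pixel value
    return result, valid
```
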
3. The method of claim 2, wherein determining the threshold for identifying valid pixels comprises:
determining an average variance of the pixels in the first partial image corresponding to the difference image; and
determining the threshold based on the average variance.
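
One plausible reading of claim 3, sketched in Python: derive the threshold from the average pixel variance of the corresponding pre-print image, so the threshold grows with bed-image noise. The scale factor `k` is an assumed tuning parameter, not taken from the disclosure.

```python
import numpy as np

def threshold_from_variance(first_partial, k=3.0):
    """Average the per-channel pixel variance of the pre-print image and
    scale its square root by an assumed factor k."""
    avg_variance = float(np.var(first_partial, axis=(0, 1)).mean())
    return k * np.sqrt(avg_variance)
```
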
4. The method of claim 1, further comprising:
determining a luminance field indicative of a degree of uniformity of the shooting illumination of the color camera; and
performing illumination compensation on the plurality of partial result images based on the luminance field.
5. The method of claim 4, wherein determining the luminance field indicative of the degree of uniformity of the shooting illumination comprises:
determining the luminance field according to predetermined calibration information, wherein the calibration information indicates a default degree of illumination uniformity within a shooting field of view of the color camera.
6. The method of claim 4, wherein determining the luminance field indicative of the degree of uniformity of the shooting illumination comprises:
determining an average image of the plurality of first partial images, wherein the pixel value of each pixel of the average image corresponds to an average of the pixel values of the corresponding pixels of the plurality of first partial images; and
determining the luminance field based at least in part on the average image.
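
A sketch of claims 4 and 6 under stated assumptions: the luminance field is estimated from the per-pixel average of the pre-print partial images (claim 6) and then divided out of each partial result image (claim 4). The normalization, the single-channel reduction, and the 0.1 floor are illustrative choices, not from the disclosure.

```python
import numpy as np

def luminance_field(first_partials):
    """Average the pre-print partial images per pixel, reduce to one
    channel, and normalize so the brightest point is 1.0."""
    avg = np.mean([img.astype(np.float32) for img in first_partials], axis=0)
    field = avg.mean(axis=-1) if avg.ndim == 3 else avg
    return field / max(float(field.max()), 1e-6)

def compensate(partial_result, field):
    """Divide out the field so dimly lit areas are brightened; assumes an
    H x W x 3 image and an H x W field, with a floor to limit amplification."""
    out = partial_result.astype(np.float32) / np.maximum(field[..., None], 0.1)
    return np.clip(out, 0, 255).astype(np.uint8)
```
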
7. The method of claim 1, wherein the model reference map indicates a target print image at the plurality of different locations and the global image indicates an actual print image at the plurality of different locations, and wherein determining the print quality result comprises:
comparing the target print image at each of the plurality of different locations with the actual print image at the corresponding location; and
determining the print quality result based on the comparison.
8. The method of claim 7, wherein the comparing comprises:
determining a normal printing threshold for the first layer of the 3D model;
binarizing the global image and the model reference map based on the normal printing threshold;
identifying erroneous pixels in the global image by comparing pixel values of corresponding pixels of the binarized global image and the binarized model reference map, wherein an erroneous pixel indicates that the first layer of the 3D model is not printed normally;
determining at least one pixel region in the model reference map representing the occupied area;
for at least one of the at least one pixel region:
counting a number of erroneous pixels among the pixels in the global image corresponding to the pixel region; and
comparing the number of erroneous pixels with a corresponding threshold, and/or comparing a ratio of the number of erroneous pixels to the number of pixels in the global image corresponding to the pixel region with a corresponding threshold.
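
An illustrative Python reading of claim 8 for a single pixel region, assuming grayscale images and a boolean region mask; the threshold parameters are placeholders. The binarized reference map may already be binary in practice, in which case the second comparison is a no-op.

```python
import numpy as np

def region_is_defective(global_image, reference_map, normal_threshold,
                        region_mask, count_threshold, ratio_threshold):
    """Binarize both images, mark disagreeing pixels inside the region as
    erroneous, then test the count and the ratio against their thresholds."""
    printed = global_image > normal_threshold    # binarized global image
    expected = reference_map > normal_threshold  # binarized reference map
    erroneous = (printed != expected) & region_mask
    n_errors = int(erroneous.sum())
    ratio = n_errors / max(int(region_mask.sum()), 1)
    return n_errors > count_threshold or ratio > ratio_threshold
```
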
9. The method of claim 2, wherein the print quality result comprises a confidence indicating a degree of reliability of the detection, the confidence being a function of a total number of the valid pixels and a total number of pixels of the model reference map.
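
Claim 9 leaves the confidence function open; one simple candidate, shown for illustration only, is the share of the reference map's pixels accounted for by valid pixels, clipped to [0, 1].

```python
def detection_confidence(total_valid_pixels, total_reference_pixels):
    """Fraction of the reference map covered by valid pixels, in [0, 1]."""
    return min(total_valid_pixels / max(total_reference_pixels, 1), 1.0)
```
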
10. The method of any of claims 1 to 9, wherein stitching the plurality of partial result images according to the scan path comprises:
in response to determining that the plurality of partial result images overlap each other, averaging or linearly interpolating the pixel values of pixels at overlapping positions between the plurality of partial result images.
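
A sketch of the overlap handling of claim 10 using the averaging alternative; the claim's linear-interpolation alternative (e.g., distance-weighted feathering) would replace the uniform weights. Grayscale partials, (row, col) offsets derived from the scan path, and a common output canvas are all assumptions.

```python
import numpy as np

def stitch_by_averaging(partials, offsets, out_shape):
    """Paste each grayscale partial result at its (row, col) offset on a
    common canvas; where partials overlap, accumulated values are averaged."""
    acc = np.zeros(out_shape, dtype=np.float32)
    weight = np.zeros(out_shape, dtype=np.float32)
    for img, (r, c) in zip(partials, offsets):
        h, w = img.shape[:2]
        acc[r:r + h, c:c + w] += img.astype(np.float32)
        weight[r:r + h, c:c + w] += 1.0
    return (acc / np.maximum(weight, 1.0)).astype(np.uint8)
```
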
11. An apparatus for detecting print quality of a 3D printer, wherein the 3D printer comprises: a hot bed; a printhead movable relative to the hot bed; a color camera disposed on the printhead for capturing an image of a portion of the hot bed; and at least one processor for controlling the printhead to move relative to the hot bed to print a 3D model layer by layer based on control code generated by slicing software, the apparatus comprising:
a first module for obtaining a model reference map, wherein the model reference map is generated by analyzing control information generated by the slicing software, and the model reference map represents an occupied area of at least a portion of a first layer of the 3D model on the hot bed;
a second module for generating a scan path based on the model reference map, wherein the scan path is generated such that, as the printhead moves relative to the hot bed and the color camera thereby moves along the scan path, the color camera sequentially captures images at a plurality of different locations of the occupied area;
a third module for moving the color camera, carried by the printhead, along the scan path and capturing at the plurality of different locations during the movement to obtain a plurality of first partial images respectively indicative of images of the hot bed at the plurality of different locations;
a fourth module for causing the printhead to print the first layer of the 3D model on the hot bed;
a fifth module for moving the color camera, carried by the printhead, along the scan path and capturing at the plurality of different locations during this movement to obtain a plurality of second partial images respectively indicative of images of the hot bed at the plurality of different locations after the first layer of the 3D model has been printed on the hot bed;
a sixth module for determining a plurality of partial result images based on respective differences between each of the plurality of second partial images and a corresponding one of the plurality of first partial images;
a seventh module for stitching the plurality of partial result images according to the scan path to generate a global image corresponding to the model reference map; and
an eighth module for determining a print quality result based on the model reference map and the global image, the print quality result indicating the print quality of the at least a portion of the first layer of the 3D model.
12. A 3D printer, comprising:
a hot bed;
a printhead movable relative to the hot bed;
a depth sensor disposed on the printhead for measuring a distance of a portion of the hot bed relative to the depth sensor; and
at least one processor configured to derive a local depth map of the portion of the hot bed from measurements of the depth sensor and to control the printhead to move relative to the hot bed to print a 3D model layer by layer based on control code generated by slicing software,
wherein the at least one processor is further configured to execute instructions to implement the method of any one of claims 1 to 10.
13. A non-transitory computer-readable storage medium storing instructions which, when executed by the at least one processor of the 3D printer of claim 12, implement the method of any of claims 1 to 10.
14. A computer program product comprising instructions which, when executed by the at least one processor of the 3D printer of claim 12, implement the method of any one of claims 1 to 10.
CN202211124617.5A 2022-09-15 2022-09-15 Method and device for detecting printing quality of 3D printer and 3D printer Pending CN116175977A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211124617.5A CN116175977A (en) 2022-09-15 2022-09-15 Method and device for detecting printing quality of 3D printer and 3D printer
PCT/CN2023/107561 WO2024055742A1 (en) 2022-09-15 2023-07-14 Method and apparatus for testing printing quality of 3d printer, and 3d printer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211124617.5A CN116175977A (en) 2022-09-15 2022-09-15 Method and device for detecting printing quality of 3D printer and 3D printer

Publications (1)

Publication Number Publication Date
CN116175977A (en) 2023-05-30

Family

ID=86431344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211124617.5A Pending CN116175977A (en) 2022-09-15 2022-09-15 Method and device for detecting printing quality of 3D printer and 3D printer

Country Status (2)

Country Link
CN (1) CN116175977A (en)
WO (1) WO2024055742A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024055742A1 (en) * 2022-09-15 2024-03-21 上海轮廓科技有限公司 Method and apparatus for testing printing quality of 3d printer, and 3d printer

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10715683B2 (en) * 2017-01-25 2020-07-14 Hewlett-Packard Development Company, L.P. Print quality diagnosis
CN108638497A (en) * 2018-04-28 2018-10-12 浙江大学 The comprehensive detecting system and method for a kind of 3D printer printer model outer surface
CN112596981B (en) * 2020-12-24 2023-04-28 深圳市汉森软件有限公司 Monitoring method, device, equipment and storage medium for three-dimensional printing process
CN114770946A (en) * 2022-04-24 2022-07-22 上海轮廓科技有限公司 Method and device for detecting printing quality of 3D printer and 3D printer
CN114789555A (en) * 2022-05-13 2022-07-26 上海轮廓科技有限公司 Method and device for 3D printer, 3D printer and storage medium
CN116175977A (en) * 2022-09-15 2023-05-30 上海轮廓科技有限公司 Method and device for detecting printing quality of 3D printer and 3D printer

Also Published As

Publication number Publication date
WO2024055742A1 (en) 2024-03-21

Similar Documents

Publication Publication Date Title
US10008005B2 (en) Measurement system and method for measuring multi-dimensions
WO2023207861A1 (en) Method and apparatus for testing printing quality of 3d printer, and 3d printer
CN104180769B (en) The system and method for carrying out active surface measurement using laser displacement sensor
US8786700B2 (en) Position and orientation measurement apparatus, position and orientation measurement method, and storage medium
KR102103853B1 (en) Defect inspection device and defect inspection method
US10885626B2 (en) Identifying apparatus, identifying method, and program
WO2024055742A1 (en) Method and apparatus for testing printing quality of 3d printer, and 3d printer
Okarma et al. Improved quality assessment of colour surfaces for additive manufacturing based on image entropy
JP6161276B2 (en) Measuring apparatus, measuring method, and program
US10554945B2 (en) Stereo camera
US20210031507A1 (en) Identifying differences between images
US20180082115A1 (en) Methods of detecting moire artifacts
JP7392488B2 (en) Recognition method, device, and image processing device for false detection of remains
JP2010157093A (en) Motion estimation device and program
CN110390682B (en) Template self-adaptive image segmentation method, system and readable storage medium
CN111833371A (en) Image edge detection method based on pq-mean sparse measurement
CN113246473B (en) Compensation method and compensation device for 3D printer, 3D printer and storage medium
US20210382496A1 (en) Position detection apparatus, position detection system, remote control apparatus, remote control system, position detection method, and program
CN113902740A (en) Construction method of image blurring degree evaluation model
CN113034492A (en) Printing quality defect detection method and storage medium
US20150117769A1 (en) Blob-Encoding
KR20150106878A (en) Image generation device, defect inspection device, and defect inspection method
CA3219745A1 (en) Texture mapping to polygonal models for industrial inspections
TW202329043A (en) Defect detection method, defect detection system, and recording medium
JP2004326382A (en) Component recognizing system and method, and program for recognizing component

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination