US20230081400A1 - Enhanced three dimensional printing of vertical edges - Google Patents


Info

Publication number
US20230081400A1
US20230081400A1 (Application No. US17/828,271)
Authority
US
United States
Prior art keywords
intensity level
light intensity
pixel
printer
changed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/828,271
Inventor
Richard M. Greene
Brian James Adzima
Stephen James Kranz
Current Assignee
Holo Inc
Original Assignee
Holo Inc
Priority date
Application filed by Holo Inc filed Critical Holo Inc
Priority to US17/828,271
Assigned to HOLO, INC. (ASSIGNMENT OF ASSIGNORS INTEREST). Assignors: AUTODESK, INC.
Assigned to AUTODESK, INC. (ASSIGNMENT OF ASSIGNORS INTEREST). Assignors: ADZIMA, BRIAN JAMES; GREENE, RICHARD M.; KRANZ, STEPHEN JAMES
Publication of US20230081400A1
Assigned to VENTURE LENDING & LEASING IX, INC. and WTI FUND X, INC. (SECURITY INTEREST). Assignors: HOLO, INC.
Assigned to Southwest Greene International, Inc. (UCC ARTICLE 9 SALE). Assignors: VENTURE LENDING & LEASING IX, INC.; WTI FUND X, INC.


Classifications

    • B33Y 10/00 — Processes of additive manufacturing
    • B33Y 30/00 — Apparatus for additive manufacturing; details thereof or accessories therefor
    • B33Y 50/02 — Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • B29C 64/129 — Processes of additive manufacturing using layers of liquid which are selectively solidified, characterised by the energy source therefor, e.g. by global irradiation combined with a mask
    • B29C 64/135 — Processes of additive manufacturing using layers of liquid which are selectively solidified, the energy source being concentrated, e.g. scanning lasers or focused light sources
    • B29C 64/393 — Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • G06T 11/00 — 2D [two-dimensional] image generation
    • G06T 17/00 — Three-dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • This patent document relates to three-dimensional (3D) printing using photopolymers.
  • Photopolymer-based 3D printers that use bottom-up illumination can project light upwards through an optically transparent window into a vat of photoactive resin to cure at least a portion of the resin.
  • Such printers can build a 3D structure by forming one layer at a time, where a subsequent layer adheres to the previous layer.
  • the light can be patterned to cause some portions of the resin to cure and other portions not to cure, thereby creating substructures of the 3D structure.
  • A described technique includes accessing, by a data processing apparatus, an original image corresponding to a slice of a three-dimensional model prepared for printing on a three-dimensional printer that uses a photopolymer to create a three-dimensional structure; accessing a pixel reduction factor that is associated with an increased exposure duration parameter, the increased exposure duration parameter being greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; and maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter and the maintained intensity levels, the second curing doses being greater than the first curing doses.
  • Implementations can include sending the modified image to the three-dimensional printer; and controlling the three-dimensional printer to use the increased exposure duration parameter when printing the slice in accordance with the modified image to build a portion of a three-dimensional structure. Controlling the three-dimensional printer to use the increased exposure duration parameter can include sending the increased exposure duration parameter to the three-dimensional printer.
  • the increased exposure duration parameter is selected to increase a curing quality at edges of the three-dimensional structure, and the pixel reduction factor is selected based on the increased exposure duration parameter to eliminate or minimize over-curing for interior areas within the three-dimensional structure.
  • Implementations can include reducing a build area represented by the original image to preserve dimensional accuracy under the increased exposure duration parameter.
  • Classifying pixels of the original image can include using the reduced build area to identify the interior pixels and the exterior pixels. Classifying the pixels can include accessing neighboring pixels of a target pixel of the original image, the target pixel having an intensity level greater than a black intensity level; classifying the target pixel as an exterior pixel if one or more of the neighboring pixels have the black intensity level; and classifying the target pixel as an interior pixel if all of the neighboring pixels have an intensity level greater than the black intensity level.
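The neighbor-based classification above can be sketched in Python; the 8-connected neighborhood, the treatment of image borders as black, and all names here are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch of the neighbor-based pixel classification described
# above: a non-black pixel is "exterior" if any neighbor has the black
# intensity level, and "interior" if all neighbors are brighter than black.

BLACK = 0

def classify_pixels(image):
    """Classify non-black pixels of a 2D slice image as interior or exterior.

    `image` is a list of rows of 8-bit intensity values (0=black, 255=white).
    Pixels on the image border are treated as adjacent to black (assumption).
    """
    h, w = len(image), len(image[0])
    interior, exterior = set(), set()
    for y in range(h):
        for x in range(w):
            if image[y][x] == BLACK:
                continue  # non-printing pixel, not classified
            # 8-connected neighborhood; out-of-bounds counts as black
            neighbors = [
                image[ny][nx] if 0 <= ny < h and 0 <= nx < w else BLACK
                for ny in (y - 1, y, y + 1)
                for nx in (x - 1, x, x + 1)
                if (ny, nx) != (y, x)
            ]
            if any(n == BLACK for n in neighbors):
                exterior.add((y, x))  # touches a non-printing pixel: edge
            else:
                interior.add((y, x))  # fully surrounded by printing pixels
    return interior, exterior
```

For a 3x3 all-white image, only the center pixel is classified as interior; the eight border pixels are exterior because they abut the (assumed black) image boundary.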
  • a described technology can be used to smooth out vertical edges in three-dimensional printing.
  • a described technology can improve edge quality without over-curing interior areas.
  • a described technology can improve edge quality for thick slices, such as those of 100 microns thickness or greater.
  • a described technology can be used to more accurately print a 3D structure.
  • FIG. 1 shows an example of a 3D printing system coupled with a computer.
  • FIGS. 2 A, 2 B, 2 C, and 2 D show different intensity and curing profiles that motivate using a vertical edge enhancement technique.
  • FIGS. 3 A and 3 B respectively show cross-sections of an example of multiple printed layers of a 3D structure that are printed without and with vertical edge enhancement.
  • FIG. 4 shows a flowchart of an example of a process that enhances edges of a 3D digital model for manufacturing with a 3D printer.
  • FIG. 5 shows a flowchart of an example of a process that performs an edge enhancement routine.
  • FIG. 6 shows a flowchart of another example of a process that performs an edge enhancement routine.
  • FIG. 7 shows a flowchart of an example of a process to determine an exposure duration parameter and a pixel reduction factor.
  • FIGS. 8 A and 8 B show edge views of an example of a 3D structure printed without and with edge enhancement from the same digital model.
  • FIGS. 9 A, 9 B, 9 C, 9 D, and 9 E show images associated with another example of a 3D structure printed without and with edge enhancement from the same digital model.
  • FIG. 1 shows an example of a 3D printing system 100 coupled with a computer 150 .
  • the computer 150 can provide information about a 3D structure to the 3D printing system 100 for printing.
  • the computer 150 can communicate with a controller 145 of the printing system 100 via a wireline or wireless connection.
  • the controller 145 can include integrated circuit technology, such as an integrated circuit board with embedded processor and firmware to control various system components such as a 3D printing mechanism 140 and a light projection device 142 .
  • the system 100 includes a vat 110 to hold a liquid 120 , which includes one or more photoactive resins.
  • the vat 110 includes a window 115 in its bottom through which light is transmitted to cure resin to form a 3D printed structure 160 in a layer-by-layer build process.
  • the 3D printed structure 160 is shown as a block, but as will be appreciated, a wide variety of complicated shapes can be 3D printed.
  • the structure 160 is 3D printed on a build plate 130 , which can be connected by a rod 135 to a 3D printing mechanism 140 .
  • the printing mechanism 140 can include various mechanical structures for moving the build plate 130 within the vat 110 . This movement is relative, so the moving piece can be the build plate 130 , the vat 110 , or both, in various implementations.
  • the window 115 includes a material such as polydimethylsiloxane (PDMS) to prevent resin from adhering to the window 115 during a curing procedure.
  • Other techniques can be used to prevent resin from adhering to the window 115 such as a photo-inhibition technique that prevents resin from curing within a section of the vat 110 immediately above the window 115 , while allowing resin to cure further away from the window 115 .
  • the light projection device 142 can be positioned below the window 115 .
  • the controller 145 can operate the light projection device 142 to project a pattern of light 185 into the vat 110 to form substructures of the structure 160 .
  • the light 185 has a wavelength which is used to create the 3D structure 160 on the build plate 130 by curing the photoactive resin in the liquid 120 within a photo-initiation region 175 , in accordance with a defined pattern or patterns.
  • the wavelength can be selected based on the characteristics of the photoactive resin in the liquid 120 .
  • the build plate 130 can start at a position near the bottom of the vat 110 , and varying patterns of the light 185 are directed through the window 115 to create layers of the solid structure 160 as the build plate 130 is raised out of the vat 110 by the printing mechanism 140 .
  • the printing mechanism 140 can employ a stepwise separation mechanism that raises the build plate 130 by a predetermined amount after each layer completion, e.g., after a predetermined curing time.
  • the printing mechanism 140 can include mechanisms to aid in separation, e.g., by providing a rotation out of the plane of FIG. 1 .
  • the printing mechanism 140 can employ a continuous separation mechanism that continuously raises the build plate 130 .
  • the light projection device 142 can be configured to modulate its light output based on a two dimensional grid of pixels.
  • the light projection device 142 can include a pixel addressable filter to allow controlled amounts of light to pass at some pixel locations while blocking or deflecting light at other pixel locations from a light source within the light projection device 142 .
  • a pixel addressable filter can include a digital micro-mirror device (DMD).
  • the light projection device 142 can include a pixel addressable light source to produce controlled amounts of light at some pixel locations and not produce light at other pixel locations.
  • the light projection device 142 includes a liquid crystal display (LCD) device, discrete light emitting diode (LED) array device, laser, or a digital light processing (DLP) projector.
  • the 3D printing system 100 can include sensors and be designed to modify its operations based on feedback from these sensors.
  • the 3D printing system 100 can use closed loop feedback from sensors in the printer to improve print reliability.
  • Such feedback sensors can include one or more strain sensors on the rod 135 holding the build plate 130 to detect if adhesion has occurred and stop and/or adjust the print, and one or more sensors to detect polymer conversion, such as a spectrometer, a pyrometer, etc. These sensors can be used to confirm that the 3D printing is proceeding correctly, to determine if the resin has been fully cured before the 3D printing system 100 proceeds to the next layer, or both.
  • one or more cameras can be used along with computer vision techniques to check that the print is proceeding as expected. Such cameras can be positioned under the vat 110 to examine the output, e.g., 3D printed layer, which the controller 145 can compare to the input, e.g., mask or layer image.
  • the computer 150 can include a processor 152 , memory 154 , and interfaces such as a network interface or a Universal Serial Bus (USB) interface.
  • the processor 152 can be one or multiple processors, which can each include multiple processor cores.
  • the memory 154 can include volatile memory such as Random Access Memory (RAM).
  • the memory 154 can include non-volatile memory such as flash memory or read-only memory (ROM).
  • the computer 150 can include one or more types of computer storage media and devices, which can include the memory 154 , to store instructions of programs that run on the processor 152 .
  • a 3D printing program 156 can be stored in the memory 154 and run on the processor 152 to implement the techniques described herein.
  • the controller 145 can include the 3D printing program 156 .
  • the 3D printing program 156 can transform a digital model into a sequence of layers that collectively describe the structure 160 .
  • the 3D printing program 156 can access a file containing mesh data that represents a digital model.
  • Mesh data can include descriptions of geometric shapes such as polygons and their locations within the digital model.
  • the 3D printing program 156 can map the digital model into three-dimensional discrete points called voxels.
  • a voxel can be mapped to a pixel within a layer.
  • the digital model can be sliced into grids of pixels and each pixel represents a voxel.
  • a voxel can be fully contained within the digital model, partially contained within the digital model, or outside of the digital model.
  • a pixel corresponding to a voxel fully contained within the model can be assigned to a white intensity level which causes light to be projected onto a corresponding printing area to cure resin.
  • a pixel corresponding to a voxel partially contained within the model can be assigned to a grayscale intensity level which causes some light to be projected onto a corresponding printing area to cure some resin.
  • a pixel corresponding to a voxel not contained within the model can be assigned to a black intensity level which causes light not to be projected onto a corresponding printing area so that resin does not cure; this type of pixel can be referred to as a black pixel or a non-printing pixel.
  • the pixels on the exterior of the model are those that are either fully or partially contained but adjacent to one or more pixels that are not contained.
  • the pixels on the interior of the model are those that are fully contained and adjacent to other pixels that are all either fully or partially contained.
  • the 3D printing program 156 , the controller 145 , or both can employ a vertical edge enhancement technique 158 to enhance vertical edges.
  • the vertical edge enhancement technique 158 can use an increased exposure duration parameter to deliver greater than nominal curing doses to the one or more edges of the 3D printed structure 160 to form smoother vertical edges.
  • the vertical edge enhancement technique 158 can modify slice image data to increase curing at exterior pixels of a slice image and prevent over-curing at interior pixels of the slice image.
  • exterior pixels can correspond to the one or more edges of the 3D printed structure 160 and can be referred to as edge pixels.
  • the 3D printing program 156 can output layer information, such as graphic files or light modulation command sequences, that represent respective patterns of light to be generated for each layer of the model.
  • the 3D printing program 156 , the controller 145 , or both can modify slice images by reducing the intensity values of interior portions, e.g., corresponding to interior pixels, of the objects to be printed from white to a dark gray (e.g., 25% of their original values) via the vertical edge enhancement technique 158 . Portions of the slice images at the edges of objects are left at their original gray or white values.
  • the exposure time can be increased in inverse proportion to the reduction of the interior values (e.g., by 400%), such that the interior portions receive the same dose as they would have with the original white at the original exposure time, while the edge regions receive a proportionately larger (e.g., 400% larger) dose. This can improve the verticality of the printed edges by providing them with a higher dose of light than is received by the interior portions of the print.
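The dose arithmetic in this example can be checked with a short sketch; modeling the curing dose as intensity multiplied by exposure time is a simplifying assumption, and the variable names are illustrative:

```python
# Sketch of the dose bookkeeping: dose ~ intensity * exposure_time (assumed
# linear model). Interior intensity is reduced by the pixel reduction factor
# while exposure time is increased in inverse proportion, so the interior
# dose is unchanged and the edge dose grows by 1/factor (here, 400%).

def doses(white=255, reduction_factor=0.25):
    t_nominal = 1.0                              # nominal exposure time (a.u.)
    t_increased = t_nominal / reduction_factor   # e.g., 400% of nominal

    interior_intensity = white * reduction_factor  # white -> dark gray
    edge_intensity = white                         # edges keep full white

    interior_dose = interior_intensity * t_increased
    edge_dose = edge_intensity * t_increased
    nominal_dose = white * t_nominal
    return interior_dose, edge_dose, nominal_dose

interior, edge, nominal = doses()
assert interior == nominal      # interior dose matches the unmodified dose
assert edge == 4 * nominal      # edges receive a 400% larger dose
```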
  • FIGS. 2 A, 2 B, 2 C, and 2 D show different intensity and curing profiles that motivate using a vertical edge enhancement technique.
  • FIG. 2 A shows a graph 205 of an example of an intensity profile.
  • the intensity profile represents a Gaussian-like fall-off in light intensity in the x-direction starting at the edge 208 of an exposed region in a slice.
  • pixels to the left of the edge 208 are illuminated, whereas pixels to the right of the edge 208 are not.
  • FIG. 2 B shows a cross-section of an example of a curing profile 210 without absorption. Since there is no absorption in this theoretical example, the curing region 220 extends fully and vertically in the z-direction. Due to the fall-off shown in FIG. 2 A in the x-direction, the curing region 220 gradually stops at the curing threshold 215 in the x-direction. Areas to the right of the curing threshold 215 do not receive a sufficient dose to cure a photopolymer.
  • FIG. 2 C shows a cross-section of an example of a curing profile 230 that experiences absorption under a nominal curing dose.
  • the nominal curing dose is associated with a nominal exposure duration parameter. Since there is absorption in this example, the curing region 240 gradually stops at the curing threshold 235 in the z-direction. Due to the fall-off shown in FIG. 2 A in the x-direction, the curing region 240 gradually stops at the curing threshold 235 in the x-direction.
  • FIG. 2 D shows a cross-section of an example of a curing profile 250 that experiences less absorption using a greater than nominal curing dose associated with a vertical edge enhancement technique.
  • the vertical edge enhancement technique uses an increased exposure duration parameter to provide a greater than nominal curing dose at the edge. Since there is absorption in this example, the curing region 260 gradually stops at the curing threshold 255 in the z-direction. Due to the fall-off shown in FIG. 2 A in the x-direction, the curing region 260 gradually stops at the curing threshold 255 in the x-direction.
  • the curing threshold 255 is pushed out farther in the z-direction, e.g., a greater volume of photopolymer is cured, when compared to the curing threshold 235 of FIG. 2 C under the nominal dose. Such a push out can result in smoother edges between layers.
  • FIGS. 3 A and 3 B respectively show cross-sections of an example of multiple printed layers of a 3D structure that are printed without and with vertical edge enhancement.
  • FIG. 3 A shows a cross-section of an example of multiple printed layers 305 a , 305 b , 305 c of a 3D structure 301 that are printed without vertical edge enhancement.
  • the printed layers 305 a - c exhibit scalloping due to absorption losses using nominal curing doses associated with a nominal exposure duration parameter.
  • the curved shaded regions 307 a , 307 b , 307 c , representing cured resin, do not fully extend from a left side 320 a of a printing region corresponding to an exterior pixel to a right side 320 b of the printing region for each of the layers 305 a - c .
  • the right side 320 b of the printing region represents an edge of the 3D structure 301 . The reason for the uncured areas of each printing region is explained above with respect to FIG. 2 C .
  • FIG. 3 B shows a cross-section of an example of multiple printed layers 355 a , 355 b , 355 c of a 3D structure 351 that are printed with vertical edge enhancement.
  • Vertical edge enhancement can minimize scalloping due to absorption losses by using a greater than nominal curing dose.
  • the partially curved shaded regions 357 a , 357 b , 357 c , representing cured resin, extend further from a left side 320 a of a printing region corresponding to an exterior pixel to a right side 320 b of the printing region for each of the layers 355 a - c than the respective layers 305 a - c of FIG. 3 A .
  • the right side 320 b of the printing region represents an edge of the 3D structure 351 .
  • the reason for the greater area of cured resin is explained above with respect to FIG. 2 D .
  • FIG. 4 shows a flowchart of an example of a process that enhances edges of a 3D digital model for manufacturing with a 3D printer.
  • a device such as a printer controller or a computer can perform this process.
  • the process obtains a digital model that describes a 3D structure.
  • Obtaining a digital model can include accessing a file that defines the meshes that cover the surface of the structure.
  • the file can be in a format such as the STL (STereoLithography) file format or the Polygon File Format (PLY). Other types of file formats are possible.
  • the file can identify a list of voxels, or other representation thereof, that are included in the structure.
  • the digital model is received over a network connection.
  • a user can upload a digital model to a server on the Internet for manufacturing with a 3D printer.
  • a digital model can be created from actual objects.
  • a 3D scanner such as a magnetic resonance imaging (MRI) scanner, a computed tomography (CT) scanner, or a laser scanner can be used to generate digital models.
  • a digital model can be generated from different images of an object.
  • a digital model can be generated based on a microtome sectioning process.
  • the process accesses a pixel reduction factor that is associated with an increased exposure duration parameter.
  • the increased exposure duration parameter can increase curing at the edges during 3D printing, while the pixel reduction factor can prevent over-curing at interior areas during 3D printing.
  • Accessing a pixel reduction factor can include retrieving a value from a database or a value embedded in a software program.
  • the pixel reduction factor is determined based on a pixel factor determination process; see, e.g., the process of FIG. 7 .
  • the pixel reduction factor is determined slice-by-slice based on the digital model.
  • the process can selectively modify the digital model to preserve dimensional accuracy. For example, using the increased exposure duration parameter may cause thicker edges to develop during printing, and eroding the original image may be required to preserve dimensional accuracy.
  • the process can reduce, e.g., shrink, one or more build areas represented by the digital model to preserve dimensional accuracy under the increased exposure duration parameter.
  • build area reduction can occur later in the process such as when individual slices undergo edge enhancement.
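One way to sketch the build-area reduction is a one-pixel morphological erosion of the slice image; treating the shrink step as erosion, and the names below, are assumptions for illustration:

```python
# Hypothetical one-pixel erosion of a slice image: any printing pixel that
# touches a black (non-printing) pixel is turned black, shrinking the build
# area to compensate for edge growth under the increased exposure duration.

BLACK = 0

def erode_once(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if image[y][x] == BLACK:
                continue
            # 4-connected neighborhood; out-of-bounds counts as black
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if not (0 <= ny < h and 0 <= nx < w) or image[ny][nx] == BLACK:
                    out[y][x] = BLACK
                    break
    return out
```

On a 3x3 all-white image, erosion leaves only the center pixel white, since every border pixel touches the (assumed black) image boundary.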
  • the process maps the digital model onto a three dimensional grid of voxels associated with a 3D printer.
  • Mapping the digital model can include identifying voxels that are fully contained within the digital model, voxels that are partially contained within the digital model, and voxels that are outside of the digital model.
  • the process can receive one or more parameters that describe the capabilities of the 3D printer such as resolutions in the X, Y, and Z dimensions, and maximum sizes for each dimension. The process can use these parameters to determine the number and shape of voxels for the grid.
  • each voxel in the grid can correspond to a voxel that the 3D printer can form.
  • each voxel in the grid can correspond to a pixel that the 3D printer can form within a layer.
  • a voxel is sliced in an X-Y plane at its midpoint location along the printer's formation axis, e.g., Z axis, which is perpendicular to that plane, to form a corresponding pixel.
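Midpoint slicing along the formation axis can be sketched as follows; the uniform layer thickness and function names are illustrative assumptions:

```python
# Sketch of slicing along the Z (formation) axis: each voxel layer is
# sampled at its midpoint height, producing the Z position of each 2D slice.

def slice_midpoints(model_height_mm, layer_thickness_mm):
    """Return the Z midpoint of each layer, assuming uniform thickness."""
    n_layers = round(model_height_mm / layer_thickness_mm)
    return [(i + 0.5) * layer_thickness_mm for i in range(n_layers)]
```

For example, a 0.3 mm tall model sliced at 0.1 mm yields three slices sampled at roughly 0.05, 0.15, and 0.25 mm.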
  • the process creates an image of a 2D slice of the 3D grid.
  • Creating an image can include accessing a rectangular slice of the 3D grid, where the slice is perpendicular to the direction of printing.
  • Voxels, and their corresponding pixels within the image, are assigned a white intensity level if they are a part of the 3D structure, e.g., corresponding to points that should be cured.
  • Voxels, and their corresponding pixels within the image, are assigned a black intensity level if they are not a part of the 3D structure, e.g., corresponding to points that should not be cured.
  • intensity levels can be represented as 8-bit values that range from 0 (black) to 255 (white).
  • the white intensity level is sufficient to cure photoactive resin during a predetermined curing time for a layer.
  • the white intensity level is a percentage (e.g., 90% or 95%) of a maximum intensity level generated by a 3D printer; in this case, the maximum intensity level exceeds an intensity level sufficient to cure photoactive resin.
  • different photoactive resins can require different curing intensities, durations, or both.
  • the process creates a slice of a specified thickness, uses mesh data to generate a 2D image at that slice, and uses the specified slice thickness to set the exposure duration for that slice.
  • the process performs an edge enhancement routine on the 2D slice image using the pixel reduction factor to produce a modified image.
  • the edge enhancement routine can include reducing intensity levels of the interior pixels by the pixel reduction factor and maintaining intensity levels of the exterior pixels and black pixels. See FIGS. 5 and 6 for examples of an edge enhancement routine.
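A minimal sketch of such an edge enhancement routine, under the assumption of an 8-connected neighbor test with image borders treated as black, and with illustrative names:

```python
# Illustrative sketch of the edge enhancement routine described above:
# interior pixels are dimmed by the pixel reduction factor, while exterior
# (edge) pixels and black pixels keep their original intensity levels.

BLACK = 0

def enhance_edges(image, reduction_factor=0.25):
    h, w = len(image), len(image[0])

    def is_interior(y, x):
        # Interior: all 8 neighbors are in bounds and brighter than black.
        for ny in (y - 1, y, y + 1):
            for nx in (x - 1, x, x + 1):
                if (ny, nx) == (y, x):
                    continue
                if not (0 <= ny < h and 0 <= nx < w) or image[ny][nx] == BLACK:
                    return False
        return True

    return [
        [
            int(v * reduction_factor) if v != BLACK and is_interior(y, x) else v
            for x, v in enumerate(row)
        ]
        for y, row in enumerate(image)
    ]
```

Applied to a 3x3 all-white slice, only the center pixel is dimmed (255 becomes 63 with a 0.25 factor); the eight edge pixels stay white, so they receive the larger dose under the increased exposure duration.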
  • the process determines whether there is a next slice to create. If so, the process loops back to 415 to create another slice. If there are no more slices, then the process continues at 435 . In some implementations, all slice images are created, and then the edge enhancement routine is applied to each of the slice images.
  • a batch of slice images is created, and then the edge enhancement routine is applied to each of the slice images in the batch, together or in sequence, e.g., a 3D printer can process a small number of slices (e.g., 2-5 slices) at a time in a processing pipeline, and the edge enhancement routine can operate on each slice (as appropriate for that slice) in the processing pipeline in turn.
  • the process generates one or more graphic files based on the modified images.
  • the digital model is sliced into N layers in the Z dimension and the process outputs a graphic file such as a file in accordance with a file format such as Portable Network Graphics (PNG) for each layer.
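A minimal sketch of the per-layer bookkeeping described above: slicing into N layers in Z and naming one PNG per layer. The dimensions and the `layer_NNNN.png` naming scheme are hypothetical, not from the source:

```python
import math

# Slice a model of a given height into N layers and produce one PNG
# filename per layer. Integer microns avoid floating-point slice counts.
def slice_filenames(model_height_um: int, slice_thickness_um: int) -> list:
    n_layers = math.ceil(model_height_um / slice_thickness_um)
    return [f"layer_{i:04d}.png" for i in range(n_layers)]

# A hypothetical 25 mm tall model at 100-micron layers yields 250 slices.
names = slice_filenames(model_height_um=25000, slice_thickness_um=100)
```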
  • the process outputs a graphic file containing multiple images for respective layers. Note that the pixels within an image for a slice may go unmodified for various reasons, such as the slice containing no interior pixels; nonetheless, such an image may be deemed a modified image because it was analyzed by the edge enhancement routine at 420.
  • the process sends the one or more graphic files to the 3D printer.
  • the process controls the 3D printer to use the increased exposure duration parameter when printing the one or more graphic files.
  • the process controls the 3D printer by sending the increased exposure duration parameter together with the one or more graphic files.
  • an increased exposure duration parameter is embedded as metadata within the one or more graphic files.
  • sending information such as the one or more graphic files and the one or more exposure duration parameters can include transmitting data via a network connection (e.g., wireline or wirelessly) or Universal Serial Bus (USB).
  • a 3D printer can receive the digital model itself and perform the process of FIG. 4 internally.
  • a standalone computer performs the process of FIG. 4 and sends the contents of the one or more graphic files to the 3D printer via a wireline or wireless connection.
  • the process can determine an exposure duration on a per-slice basis. As such, a print job can use two or more different exposure durations for two or more respective slices. Moreover, the process can use a longer duration for at least the initial slice that is printed directly on the build plate to help provide a solid foundation for subsequent slices.
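The per-slice duration idea above can be sketched as follows; the specific values (2 s nominal, 30 s for the initial slice on the build plate) are illustrative assumptions, not values from the document:

```python
# Build a per-slice exposure schedule, with a longer duration for the first
# slice printed directly on the build plate to give a solid foundation.
def exposure_schedule(n_slices: int,
                      nominal_s: float = 2.0,
                      first_slice_s: float = 30.0) -> list:
    durations = [nominal_s] * n_slices
    if durations:
        durations[0] = first_slice_s  # longer cure for the foundation layer
    return durations

sched = exposure_schedule(5)  # [30.0, 2.0, 2.0, 2.0, 2.0]
```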
  • FIG. 5 shows a flowchart of an example of a process that performs an edge enhancement routine.
  • a device such as a printer controller or a computer can perform this process.
  • the process accesses an original image corresponding to a slice of a 3D model prepared for printing on a 3D printer that uses a photopolymer to create 3D objects. Accessing an original image can include retrieving an image from a multi-image file or a single-image file.
  • the process accesses a pixel reduction factor that is associated with an increased exposure duration parameter.
  • the increased exposure duration parameter is greater than an unmodified, e.g., nominal, exposure duration parameter associated with the photopolymer.
  • the pixel reduction factor is a fixed parameter within the process.
  • the pixel reduction factor is passed as a variable.
  • the pixel reduction factor is determined based on a pixel factor determination process; see, e.g., the process of FIG. 7.
  • the process classifies pixels of the original image to identify interior, exterior, and black pixels.
  • the process includes accessing neighboring pixels of a target pixel of the original image, the target pixel having an intensity level greater than a black intensity level.
  • the process can classify the target pixel as an exterior pixel based on one or more of the neighboring pixels having the black intensity level.
  • the process can classify the target pixel as an interior pixel based on all of the neighboring pixels having an intensity level greater than the black intensity level.
  • grayscale input values for input intensity levels are possible.
  • the process can use a high-pass filtering technique such as a technique based on a Sobel filter or a Canny filter to classify pixels.
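The neighborhood-based classification rule above can be sketched in Python. Representing the image as a list of rows of 8-bit values, and the function name `classify`, are assumptions for illustration; out-of-bounds neighbors are simply skipped, since fewer than eight neighbors exist at an image boundary:

```python
BLACK = 0  # intensity level of pixels outside the 3D structure

# A non-black target pixel is "exterior" if any neighbor has the black
# intensity level, and "interior" if all of its neighbors are non-black.
def classify(img, x, y) -> str:
    if img[y][x] == BLACK:
        return "black"
    h, w = len(img), len(img[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue  # skip the target pixel itself
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and img[ny][nx] == BLACK:
                return "exterior"
    return "interior"
```

For example, in a 5x5 image whose inner 3x3 block is white on a black background, the block's center pixel is interior while the rest of the block is exterior.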
  • the process includes reducing a build area represented by the original image to preserve dimensional accuracy under the increased exposure duration parameter. For example, using the increased exposure duration parameter may cause thicker edges to develop during printing, and eroding the original image may be required.
  • classifying pixels of the original image, at 515, can include using the reduced build area to identify interior, exterior, and black pixels.
  • the process reduces intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the 3D printer corresponding to the interior pixels receive curing doses that are comparable to nominal doses. For such pixels, doses received under the reduced intensity levels and the increased exposure duration parameter are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels.
  • reducing an intensity level can include retrieving a pixel intensity level from an input image buffer, reducing the level, and writing the reduced level to an output image buffer.
  • reducing an intensity level can include reading a value from an array, reducing the value, and writing the reduced value back to the array.
  • comparable doses are the same doses or about the same, e.g., within a 1% or a 5% variance from the original doses.
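One way to see why the doses stay comparable is a simple linear dose model (dose proportional to intensity times exposure time); the model and the specific numbers are illustrative assumptions, not taken from the source:

```python
# Illustrative linear dose model: dose = (intensity fraction) * exposure time.
def dose(intensity: int, duration_s: float) -> float:
    return (intensity / 255) * duration_s

nominal = dose(255, 2.0)  # white interior pixel at a nominal 2 s exposure
reduced = dose(64, 8.0)   # interior pixel reduced to ~25%, 8 s exposure

# 64 is the closest 8-bit level to 25% of 255, so the two doses agree to
# within about 1% -- "comparable" in the sense used above.
variance = abs(reduced - nominal) / nominal
```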
  • the process maintains intensity levels of the exterior pixels so that printing areas of the 3D printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than nominal doses. Nominal doses for such pixels correspond to doses received under an unmodified exposure duration parameter.
  • maintaining an intensity level can include copying a pixel intensity level from an input image buffer to an output image buffer.
  • maintaining an intensity level can include leaving a pixel intensity level unchanged.
  • the process outputs a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing on the 3D printer.
  • Outputting a modified image can include dumping the contents of an output image buffer into an image file.
  • FIG. 6 shows a flowchart of another example of a process that performs an edge enhancement routine.
  • the process retrieves a 2D slice image of a 3D structure.
  • the process accesses a target pixel of the image. Accessing a target pixel can include retrieving a pixel intensity value.
  • the process determines whether the target pixel is black. Typically, a pixel intensity value of zero corresponds to a black pixel. In some implementations, pixel intensity values falling below a threshold can be classified as black pixels and reassigned a zero intensity value. If the target pixel is black, then the process inserts the pixel unchanged into the output image at 620. Otherwise, the process continues at 625. In some implementations, inserting the pixel unchanged into the output image can include writing a pixel intensity level as-is to an output buffer.
  • the process accesses pixels that neighbor the target pixel. Accessing pixels that neighbor the target pixel can include retrieving intensity values for the pixel neighborhood of the target pixel. Typically, the pixel neighborhood includes eight adjacent pixels that immediately surround the target pixel in the same plane as the slice; however, less than eight pixels may be used if the target pixel is on or close to an image boundary.
  • the process determines whether any neighboring pixels are black. If any neighboring pixels are black, then the target pixel is classified as an exterior pixel and the process, at 620, inserts the pixel unchanged into the output image.
  • the process inserts the modified pixel into the output image.
  • inserting the modified pixel into the output image can include writing a reduced pixel intensity level to an output buffer.
  • the process determines whether there is a next target pixel. In some implementations, the process iterates through each pixel of each row until the last pixel of the last row is processed. If there is a next target pixel, then the process continues at 610. Otherwise, the process, at 650, produces the output image including any modified pixels. In some implementations, producing the output image can include dumping an output buffer into an image file. In some implementations, a process can extract edges from an original image into an extracted edge image by using one or more image processing techniques, such as by using Sobel or Canny filters. The process can reduce the values in the entire original image by a pixel reduction factor to produce a reduced version of the original image. The process can copy any non-black pixels in the extracted edge image back into the reduced version of the original image.
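The FIG. 6 walk-through above can be sketched end to end. Plain Python lists stand in for image buffers, and the function names are hypothetical; out-of-bounds neighbors are skipped, matching the note that fewer than eight neighbors exist at an image boundary:

```python
BLACK = 0  # intensity level for pixels that should not be cured

def _neighbors(img, x, y):
    """Yield the in-plane neighbors of (x, y); fewer than eight exist
    at an image boundary."""
    h, w = len(img), len(img[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                yield img[ny][nx]

def enhance_edges(img, reduction):
    """Black and exterior pixels pass through unchanged; interior pixels
    (all neighbors non-black) are scaled by the pixel reduction factor."""
    out = [row[:] for row in img]  # output buffer, initialized as a copy
    for y in range(len(img)):
        for x in range(len(img[0])):
            if img[y][x] == BLACK:
                continue  # black pixel: inserted unchanged
            if all(v != BLACK for v in _neighbors(img, x, y)):
                out[y][x] = round(img[y][x] * reduction)  # interior: reduce
            # else: exterior pixel (some black neighbor), left unchanged
    return out

# Example: a 3x3 white square on a black 5x5 background; only the square's
# center pixel is interior, so only it is reduced (round(255 * 0.25) -> 64).
img = [[0] * 5 for _ in range(5)]
for yy in range(1, 4):
    for xx in range(1, 4):
        img[yy][xx] = 255
out = enhance_edges(img, reduction=0.25)
```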
  • FIG. 7 shows a flowchart of an example of a process to determine an exposure duration parameter and a pixel reduction factor.
  • a device such as a printer controller or a computer can perform this process.
  • the process accesses a nominal exposure duration parameter for a photopolymer used by a 3D printer. Accessing a nominal exposure duration parameter can include retrieving a value stored in a database or a value embedded in a software routine.
  • the process determines the nominal exposure duration parameter based on a slice thickness parameter, a resin type used by the 3D printer, a light source used by the 3D printer, or a combination thereof.
  • the process determines an increased exposure duration parameter to enhance vertical edge formation during printing by the 3D printer.
  • the duration parameters can be expressed in any appropriate units such as seconds or milliseconds.
  • the increased exposure duration parameter can improve curing quality at the edges. Determining an increased exposure duration parameter can include using a slice thickness parameter, a resin type used by the 3D printer, a light source used by the 3D printer, or a combination thereof.
  • there is a trade-off between edge enhancement and exposure duration.
  • the increase in edge fidelity can be expressed as the contrast between the brightness of an exterior pixel and the brightness of an interior pixel. If the brightness ratio of edge to interior is 4:1, the layer will require a 4× increase in exposure duration. Different ratios are possible. However, ratios past a threshold may result in diminishing returns and may harm print quality, e.g., loss of resolution in the XY plane caused by light spreading. Further, print time increases as the ratio increases due to the longer exposure durations.
  • the process determines an exposure delta based on the nominal exposure duration parameter and the increased exposure duration parameter. For example, if the nominal exposure duration parameter is two seconds, and the increased exposure duration parameter is 8 seconds, then the exposure delta is a factor of 400%.
  • the process determines a pixel reduction factor based on the exposure delta to prevent over-curing of interior areas.
  • the pixel reduction factor can be expressed as a percentage. Determining the pixel reduction factor can include computing an inverse of the exposure delta and using the inverse value as the pixel reduction factor, and accordingly, the product of the pixel reduction factor and the exposure delta is one. If the exposure delta is a factor of 400% for example, then a corresponding pixel reduction factor is 25%.
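The arithmetic above, worked through with the document's 2-second/8-second example:

```python
# Nominal and increased exposure durations from the example above (seconds).
nominal_s = 2.0
increased_s = 8.0

# Exposure delta: the increased duration as a multiple of the nominal one.
exposure_delta = increased_s / nominal_s        # 4.0, i.e., a factor of 400%

# Pixel reduction factor: the inverse of the delta, so their product is one
# and interior doses are held comparable to nominal doses.
pixel_reduction_factor = 1.0 / exposure_delta   # 0.25, i.e., 25%
```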
  • the process can access a slice thickness parameter for a slice, and trigger edge enhancement to be performed based on the slice thickness exceeding a threshold.
  • the process can output a file that includes the final exposure duration values on a per-slice basis for use by the printer during printing.
  • FIGS. 8A and 8B show edge views of an example of a 3D structure printed without and with edge enhancement from the same digital model.
  • the slices derived from the digital model to create the printed structure have a layer height of 100 microns.
  • FIG. 8A shows an edge view 801 of an example of a 3D structure that was printed without vertical edge enhancement.
  • in FIG. 8A, pixel intensity levels were left unchanged and a nominal exposure duration parameter of 2 seconds was used during printing.
  • FIG. 8B shows an edge view 851 of the example of the 3D structure that was printed with vertical edge enhancement.
  • in FIG. 8B, pixel intensity levels of interior pixels were reduced and a greater-than-nominal exposure duration parameter of 8 seconds was used during printing.
  • scalloping is reduced in FIG. 8B.
  • FIGS. 9A, 9B, and 9C show images of another example of a 3D structure printed without and with edge enhancement from the same digital model. Further, FIGS. 9D and 9E show examples of slice images that are used to print the 3D structure.
  • FIG. 9A shows an image 901 of a 3D structure that was printed without and with vertical edge enhancement from the same digital model. The upper portion 920 of the image 901 was printed without vertical edge enhancement. The lower portion 930 of the image 901 was printed with vertical edge enhancement.
  • FIG. 9B shows a magnified image 951 of the 3D structure that corresponds to the upper portion 920 of FIG. 9A.
  • FIG. 9C shows an image 971 of the 3D structure that corresponds to the lower portion 930 of FIG. 9A.
  • FIG. 9D shows an example of a slice image 981 that is used to print the upper portion 920 of the image 901 of FIG. 9A.
  • the slice image 981 includes a region 982 of black pixels, and a region 984 of white pixels.
  • FIG. 9E shows an example of a slice image 991, transformed by an edge enhancement process, that is used to print the lower portion 930 of the image 901 of FIG. 9A.
  • the slice image 991 includes a region 992 of black pixels, a region 994 of white pixels, and a region 996 of grayscale pixels.
  • a system can include a processor; and a memory structure coupled with the processor, the memory structure configured to store an original image corresponding to a slice of a three-dimensional model prepared for printing on a three-dimensional printer that uses a photopolymer to create a three-dimensional structure.
  • the processor can be configured to perform operations comprising: accessing a pixel reduction factor that is associated with an increased exposure duration parameter, wherein the increased exposure duration parameter is greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and outputting a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing on the three-dimensional printer.
  • a three-dimensional printer can include a vat capable of holding a liquid comprising a photopolymer, the vat including a window.
  • the printer can include a memory structure to store information including an original image corresponding to a slice of a three-dimensional model prepared for creation of a three-dimensional structure via the three-dimensional printer.
  • the printer can include a build plate configured and arranged to move within the vat during three-dimensional printing of the three-dimensional structure on the build plate.
  • the printer can include a light projection device to project light through the window.
  • the printer can include a controller to control the printing of the three-dimensional structure, movement of the build plate, and light modulation of the light projection device.
  • the controller can be configured to perform operations that include accessing a pixel reduction factor that is associated with an increased exposure duration parameter, wherein the increased exposure duration parameter is greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and creating a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing.
  • Operations can include printing the slice in accordance with the modified image to build a portion of the three-dimensional structure; and controlling the light
  • Embodiments of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Embodiments of the subject matter described in this specification can be implemented using one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer-readable medium can be a manufactured product, such as a hard drive in a computer system or an optical disc sold through retail channels, or an embedded system.
  • the computer-readable medium can be acquired separately and later encoded with the one or more modules of computer program instructions, such as by delivery of the one or more modules of computer program instructions over a wired or wireless network.
  • the computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, or a combination of one or more of them.
  • data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a runtime environment, or a combination of one or more of them.
  • the apparatus can employ various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by, and/or under the control of, one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few.
  • Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
  • Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Abstract

Techniques and systems for enhancing vertical edges during photopolymer-based three-dimensional (3D) printing are described. A described technique includes accessing an original image corresponding to a slice of a 3D model prepared for printing on a 3D printer; accessing a pixel reduction factor that is associated with an increased exposure duration parameter; classifying pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that corresponding printing areas receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that corresponding printing areas receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and outputting a resulting modified image for 3D printing.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 17/508,091, filed Oct. 22, 2021, which is a continuation of U.S. patent application Ser. No. 17/206,810, filed Mar. 19, 2021, which is a continuation of U.S. patent application Ser. No. 16/998,106, filed Aug. 20, 2020, which is a continuation of U.S. patent application Ser. No. 15/719,118, filed Sep. 28, 2017, now U.S. Pat. No. 10,780,641, issued Sep. 22, 2020, which claims the benefit of U.S. Provisional Application No. 62/401,664, entitled “ENHANCED THREE DIMENSIONAL PRINTING OF VERTICAL EDGES” and filed on Sep. 29, 2016, each of which is incorporated herein by reference in its entirety.
  • BACKGROUND
  • This patent document relates to three-dimensional (3D) printing using photopolymers.
  • Photopolymer-based 3D printers that use bottom-up illumination can project light upwards through an optically transparent window into a vat of photoactive resin to cure at least a portion of the resin. Such printers can build a 3D structure by forming one layer at a time, where a subsequent layer adheres to the previous layer. The light can be patterned to cause some portions of the resin to cure and other portions not to cure, thereby creating substructures of the 3D structure.
  • SUMMARY
  • This patent document describes technologies relating to enhancing vertical edges during photopolymer-based three dimensional (3D) printing. In one aspect, a described technique includes accessing, by a data processing apparatus, an original image corresponding to a slice of a three-dimensional model prepared for printing on a three-dimensional printer that uses a photopolymer to create a three-dimensional structure; accessing a pixel reduction factor that is associated with an increased exposure duration parameter, the increased exposure duration parameter being greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and outputting a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing on the three-dimensional printer. Other implementations can include corresponding systems, apparatus, and computer program products.
  • These and other implementations can include one or more of the following features. Implementations can include sending the modified image to the three-dimensional printer; and controlling the three-dimensional printer to use the increased exposure duration parameter when printing the slice in accordance with the modified image to build a portion of a three-dimensional structure. Controlling the three-dimensional printer to use the increased exposure duration parameter can include sending the increased exposure duration parameter to the three-dimensional printer. In some implementations, the increased exposure duration parameter is selected to increase a curing quality at edges of the three-dimensional structure, and the pixel reduction factor is selected based on the increased exposure duration parameter to eliminate or minimize over-curing for interior areas within the three-dimensional structure. Implementations can include reducing a build area represented by the original image to preserve dimensional accuracy under the increased exposure duration parameter. Classifying pixels of the original image can include using the reduced build area to identify the interior pixels and the exterior pixels. Classifying the pixels can include accessing neighboring pixels of a target pixel of the original image, the target pixel having an intensity level greater than a black intensity level; classifying the target pixel as an exterior pixel if one or more of the neighboring pixels have the black intensity level; and classifying the target pixel as an interior pixel if all of the neighboring pixels have an intensity level greater than the black intensity level.
  • Particular implementations disclosed herein can provide one or more of the following advantages. A described technology can be used to smooth out vertical edges in three-dimensional printing. A described technology can improve edge quality without over-curing interior areas. Further, a described technology can improve edge quality for thick slices, such as those of 100 microns thickness or greater. A described technology can be used to more accurately print a 3D structure.
  • Details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages may be apparent from the description and drawings, and from the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an example of a 3D printing system coupled with a computer.
  • FIGS. 2A, 2B, 2C, and 2D show different intensity and curing profiles that motivate using a vertical edge enhancement technique.
  • FIGS. 3A and 3B respectively show cross-sections of an example of multiple printed layers of a 3D structure that are printed without and with vertical edge enhancement.
  • FIG. 4 shows a flowchart of an example of a process that enhances edges of a 3D digital model for manufacturing with a 3D printer.
  • FIG. 5 shows a flowchart of an example of a process that performs an edge enhancement routine.
  • FIG. 6 shows a flowchart of another example of a process that performs an edge enhancement routine.
  • FIG. 7 shows a flowchart of an example of a process to determine an exposure duration parameter and a pixel reduction factor.
  • FIGS. 8A and 8B show edge views of an example of a 3D structure printed without and with edge enhancement from the same digital model.
  • FIGS. 9A, 9B, 9C, 9D, and 9E show images associated with another example of a 3D structure printed without and with edge enhancement from the same digital model.
  • Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION
  • FIG. 1 shows an example of a 3D printing system 100 coupled with a computer 150. The computer 150 can provide information about a 3D structure to the 3D printing system 100 for printing. The computer 150 can communicate with a controller 145 of the printing system 100 via a wireline or wireless connection. The controller 145 can include integrated circuit technology, such as an integrated circuit board with embedded processor and firmware to control various system components such as a 3D printing mechanism 140 and a light projection device 142.
• The system 100 includes a vat 110 to hold a liquid 120, which includes one or more photoactive resins. The vat 110 includes a window 115 in its bottom through which light is transmitted to cure resin to form a 3D printed structure 160 in a layer-by-layer build process. The 3D printed structure 160 is shown as a block, but as will be appreciated, a wide variety of complicated shapes can be 3D printed. The structure 160 is 3D printed on a build plate 130, which can be connected by a rod 135 to a 3D printing mechanism 140. The printing mechanism 140 can include various mechanical structures for moving the build plate 130 within the vat 110. This movement is relative movement, and thus the moving piece can be the build plate 130, the vat 110, or both, in various implementations.
  • In some implementations, the window 115 includes a material such as polydimethylsiloxane (PDMS) to prevent resin from adhering to the window 115 during a curing procedure. Other techniques can be used to prevent resin from adhering to the window 115 such as a photo-inhibition technique that prevents resin from curing within a section of the vat 110 immediately above the window 115, while allowing resin to cure further away from the window 115.
  • The light projection device 142 can be positioned below the window 115. The controller 145 can operate the light projection device 142 to project a pattern of light 185 into the vat 110 to form substructures of the structure 160. The light 185 has a wavelength which is used to create the 3D structure 160 on the build plate 130 by curing the photoactive resin in the liquid 120 within a photo-initiation region 175, in accordance with a defined pattern or patterns. The wavelength can be selected based on the characteristics of the photoactive resin in the liquid 120. The build plate 130 can start at a position near the bottom of the vat 110, and varying patterns of the light 185 are directed through the window 115 to create layers of the solid structure 160 as the build plate 130 is raised out of the vat 110 by the printing mechanism 140. In some implementations, the printing mechanism 140 can employ a stepwise separation mechanism that raises the build plate 130 by a predetermined amount after each layer completion, e.g., after a predetermined curing time. In some implementations, the printing mechanism 140 can include mechanisms to aid in separation, e.g., by providing a rotation out of the plane of FIG. 1 . In some implementations, the printing mechanism 140 can employ a continuous separation mechanism that continuously raises the build plate 130.
  • The light projection device 142 can be configured to modulate its light output based on a two dimensional grid of pixels. In some implementations, the light projection device 142 can include a pixel addressable filter to allow controlled amounts of light to pass at some pixel locations while blocking or deflecting light at other pixel locations from a light source within the light projection device 142. A pixel addressable filter can include a digital micro-mirror device (DMD). In some implementations, the light projection device 142 can include a pixel addressable light source to produce controlled amounts of light at some pixel locations and not produce light at other pixel locations. In some implementations, the light projection device 142 includes a liquid crystal display (LCD) device, discrete light emitting diode (LED) array device, laser, or a digital light processing (DLP) projector.
  • In some implementations, the 3D printing system 100 can include sensors and be designed to modify its operations based on feedback from these sensors. For example, the 3D printing system 100 can use closed loop feedback from sensors in the printer to improve print reliability. Such feedback sensors can include one or more strain sensors on the rod 135 holding the build plate 130 to detect if adhesion has occurred and stop and/or adjust the print, and one or more sensors to detect polymer conversion, such as a spectrometer, a pyrometer, etc. These sensors can be used to confirm that the 3D printing is proceeding correctly, to determine if the resin has been fully cured before the 3D printing system 100 proceeds to the next layer, or both. Moreover, in some implementations, one or more cameras can be used along with computer vision techniques to check that the print is proceeding as expected. Such cameras can be positioned under the vat 110 to examine the output, e.g., 3D printed layer, which the controller 145 can compare to the input, e.g., mask or layer image.
  • The computer 150 can include a processor 152, memory 154, and interfaces such as a network interface or a Universal Serial Bus (USB) interface. The processor 152 can be one or multiple processors, which can each include multiple processor cores. The memory 154 can include volatile memory such as Random Access Memory (RAM). The memory 154 can include non-volatile memory such as flash memory or read-only memory (ROM). The computer 150 can include one or more types of computer storage media and devices, which can include the memory 154, to store instructions of programs that run on the processor 152. For example, a 3D printing program 156 can be stored in the memory 154 and run on the processor 152 to implement the techniques described herein. In some implementations, the controller 145 can include the 3D printing program 156.
• The 3D printing program 156 can transform a digital model into a sequence of layers that collectively describe the structure 160. The 3D printing program 156 can access a file containing mesh data that represents a digital model. Mesh data can include descriptions of geometric shapes such as polygons and their locations within the digital model. The 3D printing program 156 can map the digital model into three-dimensional discrete points called voxels. In some implementations, a voxel can be mapped to a pixel within a layer. In some implementations, the digital model can be sliced into grids of pixels and each pixel represents a voxel. A voxel can be fully contained within the digital model, partially contained within the digital model, or outside of the digital model. For example, a pixel corresponding to a voxel fully contained within the model can be assigned to a white intensity level which causes light to be projected onto a corresponding printing area to cure resin. A pixel corresponding to a voxel partially contained within the model can be assigned to a grayscale intensity level which causes some light to be projected onto a corresponding printing area to cure some resin. A pixel corresponding to a voxel not contained within the model can be assigned to a black intensity level which causes light not to be projected onto a corresponding printing area so that resin does not cure; this type of pixel can be referred to as a black pixel or a non-printing pixel. The pixels on the exterior of the model are those that are either fully or partially contained but adjacent to one or more pixels that are not contained. The pixels on the interior of the model are those that are fully contained and adjacent to other pixels that are all either fully or partially contained.
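The containment-to-intensity mapping described above can be sketched as follows. This is an illustrative sketch only, not part of the claimed method; the function name, the use of a containment fraction, and the 8-bit levels are assumptions for the example:

```python
def voxel_to_intensity(containment):
    """Map a voxel's containment fraction (0.0..1.0) to an 8-bit pixel level.

    Fully contained voxels map to white (full curing dose), voxels outside
    the model map to black (no curing), and partially contained voxels map
    to a proportional grayscale level, as described in the text.
    """
    if containment <= 0.0:
        return 0                 # black pixel: no light projected
    if containment >= 1.0:
        return 255               # white pixel: full curing dose
    return int(round(containment * 255))  # grayscale for partial containment

# Example: a voxel one quarter inside the model maps to a dim gray.
level = voxel_to_intensity(0.25)
```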
  • The 3D printing program 156, the controller 145, or both can employ a vertical edge enhancement technique 158 to enhance vertical edges. The vertical edge enhancement technique 158 can use an increased exposure duration parameter to deliver greater than nominal curing doses to the one or more edges of the 3D printed structure 160 to form smoother vertical edges. Further, the vertical edge enhancement technique 158 can modify slice image data to increase curing at exterior pixels of a slice image and prevent over-curing at interior pixels of the slice image. Note that exterior pixels can correspond to the one or more edges of the 3D printed structure 160 and can be referred to as edge pixels. Based on the output of the vertical edge enhancement technique 158, the 3D printing program 156, the controller 145, or both can output layer information, such as graphic files or light modulation command sequences, that represent respective patterns of light to be generated for each layer of the model.
  • In more detail, the 3D printing program 156, the controller 145, or both can modify slice images by reducing the intensity values of interior portions, e.g., corresponding to interior pixels, of the objects to be printed from white to a dark gray (e.g., 25% of their original values) via the vertical edge enhancement technique 158. Portions of the slice images at the edges of objects are left at their original gray or white values. The exposure time can be increased in inverse proportion to the reduction of the interior values (e.g., by 400%), such that the interior portions receive the same dose as they would have with the original white at the original exposure time, while the edge regions receive a proportionately larger (e.g., 400% larger) dose. This can improve the verticality of the printed edges by providing them with a higher dose of light than is received by the interior portions of the print.
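The dose bookkeeping in the paragraph above can be checked numerically. This is a sketch using the example values given in the text (interior intensities reduced to 25% of original, exposure increased by 400%); the normalized intensity units are an assumption for the illustration:

```python
# Nominal settings, normalized: white intensity 1.0, 2-second exposure.
nominal_intensity = 1.0
nominal_exposure_s = 2.0

# Interior pixels are reduced to 25% of their original intensity, and the
# exposure time is increased in inverse proportion (to 400%, i.e. 8 s).
reduction_factor = 0.25
increased_exposure_s = nominal_exposure_s / reduction_factor

# Dose is intensity multiplied by exposure duration.
nominal_dose = nominal_intensity * nominal_exposure_s
interior_dose = (nominal_intensity * reduction_factor) * increased_exposure_s
edge_dose = nominal_intensity * increased_exposure_s

# Interior areas receive the same dose as before; edge areas receive 4x.
assert interior_dose == nominal_dose
assert edge_dose == 4 * nominal_dose
```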
  • FIGS. 2A, 2B, 2C, and 2D show different intensity and curing profiles that motivate using a vertical edge enhancement technique. FIG. 2A shows a graph 205 of an example of an intensity profile. The intensity profile represents a Gaussian-like fall-off in light intensity in the x-direction starting at the edge 208 of an exposed region in a slice. Here, pixels to the left of the edge 208 are illuminated, whereas pixels to the right of the edge 208 are not.
  • FIG. 2B shows a cross-section of an example of a curing profile 210 without absorption. Since there is no absorption in this theoretical example, the curing region 220 extends fully and vertically in the z-direction. Due to the fall-off shown in FIG. 2A in the x-direction, the curing region 220 gradually stops at the curing threshold 215 in the x-direction. Areas to the right of the curing threshold 215 do not receive a sufficient dose to cure a photopolymer.
  • FIG. 2C shows a cross-section of an example of a curing profile 230 that experiences absorption under a nominal curing dose. The nominal curing dose is associated with a nominal exposure duration parameter. Since there is absorption in this example, the curing region 240 gradually stops at the curing threshold 235 in the z-direction. Due to the fall-off shown in FIG. 2A in the x-direction, the curing region 240 gradually stops at the curing threshold 235 in the x-direction.
  • FIG. 2D shows a cross-section of an example of a curing profile 250 that experiences less absorption using a greater than nominal curing dose associated with a vertical edge enhancement technique. Here, the vertical edge enhancement technique uses an increased exposure duration parameter to provide a greater than nominal curing dose at the edge. Since there is absorption in this example, the curing region 260 gradually stops at the curing threshold 255 in the z-direction. Due to the fall-off shown in FIG. 2A in the x-direction, the curing region 260 gradually stops at the curing threshold 255 in the x-direction. However, with the greater than nominal dose used with this curing profile 250, the curing threshold 255 is pushed out farther in the z-direction, e.g., a greater volume of photopolymer is cured, when compared to the curing threshold 235 of FIG. 2C under the nominal dose. Such a push out can result in smoother edges between layers.
  • FIGS. 3A and 3B respectively show cross-sections of an example of multiple printed layers of a 3D structure that are printed without and with vertical edge enhancement. FIG. 3A shows a cross-section of an example of multiple printed layers 305 a, 305 b, 305 c of a 3D structure 301 that are printed without vertical edge enhancement. The printed layers 305 a-c exhibit scalloping due to absorption losses using nominal curing doses associated with a nominal exposure duration parameter. For example, the curved shaded regions 307 a, 307 b, 307 c, representing cured resin, do not fully extend from a left side 320 a of a printing region corresponding to an exterior pixel to a right side 320 b of the printing region for each of the layers 305 a-c. Note that the right side 320 b of the printing region represents an edge of the 3D structure 301. The reason for the uncured areas of each printing region is explained above with respect to FIG. 2C.
  • FIG. 3B shows a cross-section of an example of multiple printed layers 355 a, 355 b, 355 c of a 3D structure 351 that are printed with vertical edge enhancement. Vertical edge enhancement can minimize scalloping due to absorption losses by using a greater than nominal curing dose. For example, the partially curved shaded regions 357 a, 357 b, 357 c, representing cured resin, extend further from a left side 320 a of a printing region corresponding to an exterior pixel to a right side 320 b of the printing region for each of the layers 355 a-c than the respective layers 305 a-c of FIG. 3A. Note that the right side 320 b of the printing region represents an edge of the 3D structure 351. The reason for the greater area of cured resin is explained above with respect to FIG. 2D.
• FIG. 4 shows a flowchart of an example of a process that enhances edges of a 3D digital model for manufacturing with a 3D printer. A device such as a printer controller or a computer can perform this process. At 401, the process obtains a digital model that describes a 3D structure. Obtaining a digital model can include accessing a file that defines the meshes that cover the surface of the structure. The file can be in a format such as the STL (STereoLithography) file format or the Polygon File Format (PLY). Other types of file formats are possible. In some implementations, the file can identify a list of voxels, or other representation thereof, that are included in the structure. In some implementations, the digital model is received over a network connection. For example, a user can upload a digital model to a server on the Internet for manufacturing with a 3D printer. In some implementations, a digital model can be created from actual objects. For example, a 3D scanner such as a magnetic resonance imaging (MRI) scanner, a computed tomography (CT) scanner, or a laser scanner can be used to generate digital models. In some implementations, a digital model can be generated from different images of an object. In some implementations, a digital model can be generated based on a microtome sectioning process.
  • At 405, the process accesses a pixel reduction factor that is associated with an increased exposure duration parameter. The increased exposure duration parameter can increase curing at the edges during 3D printing, while the pixel reduction factor can prevent over-curing at interior areas during 3D printing. Accessing a pixel reduction factor can include retrieving a value from a database or a value embedded in a software program. In some implementations, the pixel reduction factor is determined based on a pixel factor determination process; see, e.g., the process of FIG. 7 . In some implementations, the pixel reduction factor is determined slice-by-slice based on the digital model.
  • At 408, the process can selectively modify the digital model to preserve dimensional accuracy. For example, using the increased exposure duration parameter may cause thicker edges to develop during printing, and eroding the original image may be required to preserve dimensional accuracy. In some implementations, if it is determined that edge enhancement would thicken one or more edges by more than a predetermined threshold, the process can reduce, e.g., shrink, one or more build areas represented by the digital model to preserve dimensional accuracy under the increased exposure duration parameter. In some implementations, build area reduction can occur later in the process such as when individual slices undergo edge enhancement.
  • At 410, the process maps the digital model onto a three dimensional grid of voxels associated with a 3D printer. Mapping the digital model can include identifying voxels that are fully contained within the digital model, voxels that are partially contained within the digital model, and voxels that are outside of the digital model. In some implementations, the process can receive one or more parameters that describe the capabilities of the 3D printer such as resolutions in the X, Y, and Z dimensions, and maximum sizes for each dimension. The process can use these parameters to determine the number and shape of voxels for the grid. In some implementations, each voxel in the grid can correspond to a voxel that the 3D printer can form. In some implementations, each voxel in the grid can correspond to a pixel that the 3D printer can form within a layer. In some implementations, a voxel is sliced in an X-Y plane at its midpoint location along the printer's formation axis, e.g., Z axis, which is perpendicular to that plane, to form a corresponding pixel.
  • At 415, the process creates an image of a 2D slice of the 3D grid. Creating an image can include accessing a rectangular slice of the 3D grid, where the slice is perpendicular to the direction of printing. Voxels, and their corresponding pixels within the image, are assigned a white intensity level if they are a part of the 3D structure, e.g., corresponding to points that should be cured. Voxels, and their corresponding pixels within the image, are assigned a black intensity level if they are not a part of the 3D structure, e.g., corresponding to points that should not be cured. In some implementations, intensity levels can be represented as 8-bit values that range from 0 (black) to 255 (white). The white intensity level is sufficient to cure photoactive resin during a predetermined curing time for a layer. In some implementations, the white intensity level is a percentage (e.g., 90% or 95%) of a maximum intensity level generated by a 3D printer; in this case, the maximum intensity level exceeds an intensity level sufficient to cure photoactive resin. Note that different photoactive resins can require different curing intensities, durations, or both. In some implementations, the process creates a slice of a specified thickness, uses mesh data to generate a 2D image at that slice, and uses the specified slice thickness to set the exposure duration for that slice.
  • At 420, the process performs an edge enhancement routine on the 2D slice image using the pixel reduction factor to produce a modified image. The edge enhancement routine can include reducing intensity levels of the interior pixels by the pixel reduction factor and maintaining intensity levels of the exterior pixels and black pixels. See FIGS. 5 and 6 for examples of an edge enhancement routine. At 425, the process determines whether there is a next slice to create. If so, the process loops back to 415 to create another slice. If there are no more slices, then the process continues at 435. In some implementations, all slice images are created, and then the edge enhancement routine is applied to each of the slice images. In some implementations, a batch of slice images is created, and then the edge enhancement routine is applied to each of the slice images in the batch, together or in sequence, e.g., a 3D printer can process a small number of slices (e.g., 2-5 slices) at a time in a processing pipeline, and the edge enhancement routine can operate on each slice (as appropriate for that slice) in the processing pipeline in turn.
  • At 435, the process generates one or more graphic files based on the modified images. In some implementations, the digital model is sliced into N layers in the Z dimension and the process outputs a graphic file such as a file in accordance with a file format such as Portable Network Graphics (PNG) for each layer. In some implementations, the process outputs a graphic file containing multiple images for respective layers. Note that pixels within an image for a slice may not be modified for various reasons such as a slice's lack of containment within the 3D structure, e.g., a slice's lack of interior pixels, but such an image may be deemed as a modified image, nonetheless, due to the image being analyzed by the edge enhancement routine at 420.
  • At 440, the process sends the one or more graphic files to the 3D printer. At 445, the process controls the 3D printer to use the increased exposure duration parameter when printing the one or more graphic files. In some implementations, the process controls the 3D printer by sending the increased exposure duration parameter together with the one or more graphic files. In some implementations, an increased exposure duration parameter is embedded as metadata within the one or more graphic files. In some implementations, sending information such as the one or more graphic files and the one or more exposure duration parameters can include transmitting data via a network connection (e.g., wireline or wirelessly) or Universal Serial Bus (USB). In some implementations, a 3D printer can receive the digital model itself, perform the process of FIG. 4 , and send the contents of the one or more graphic files to a light projection device within the 3D printer. Sending the contents can include transmitting a sequence of bits over a serial bus between a controller and a projection system. In some implementations, a standalone computer performs the process of FIG. 4 and sends the contents of the one or more graphic files to the 3D printer via a wireline or wireless connection. In some implementations, the process can determine an exposure duration on a per-slice basis. As such, a print job can use two or more different exposure durations for two or more respective slices. Moreover, the process can use a longer duration for at least the initial slice that is printed directly on the build plate to help provide a solid foundation for subsequent slices.
  • FIG. 5 shows a flowchart of an example of a process that performs an edge enhancement routine. A device such as a printer controller or a computer can perform this process. At 505, the process accesses an original image corresponding to a slice of a 3D model prepared for printing on a 3D printer that uses a photopolymer to create 3D objects. Accessing an original image can include retrieving an image from a multi-image file or a single-image file. At 510, the process accesses a pixel reduction factor that is associated with an increased exposure duration parameter. The increased exposure duration parameter is greater than an unmodified, e.g., nominal, exposure duration parameter associated with the photopolymer. In some implementations, the pixel reduction factor is a fixed parameter within the process. In some implementations, the pixel reduction factor is passed as a variable. In some implementations, the pixel reduction factor is determined based on a pixel factor determination process; see, e.g., the process of FIG. 7 .
• At 515, the process classifies pixels of the original image to identify interior, exterior, and black pixels. In some implementations, the process includes accessing neighboring pixels of a target pixel of the original image, the target pixel having an intensity level greater than a black intensity level. The process can classify the target pixel as an exterior pixel based on one or more of the neighboring pixels having the black intensity level. The process can classify the target pixel as an interior pixel based on all of the neighboring pixels having an intensity level greater than the black intensity level. Note that grayscale values are possible as input intensity levels. In some implementations, the process can use a high-pass filtering technique such as a technique based on a Sobel filter or a Canny filter to classify pixels. In some implementations, the process includes reducing a build area represented by the original image to preserve dimensional accuracy under the increased exposure duration parameter. For example, using the increased exposure duration parameter may cause thicker edges to develop during printing, and eroding the original image may be required. Thus, classifying pixels of the original image, at 515, can include using the reduced build area to identify interior, exterior, and black pixels.
  • At 520, the process reduces intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the 3D printer corresponding to the interior pixels receive curing doses that are comparable to nominal doses. For such pixels, doses received under the reduced intensity levels and the increased exposure duration parameter are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels. In some implementations, reducing an intensity level can include retrieving a pixel intensity level from an input image buffer, reducing the level, and writing the reduced level to an output image buffer. In some implementations, reducing an intensity level can include reading a value from an array, reducing the value, and writing the reduced value back to the array. In some implementations, comparable doses are the same doses or about the same, e.g., within a 1% or a 5% variance from the original doses.
  • At 525, the process maintains intensity levels of the exterior pixels so that printing areas of the 3D printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than nominal doses. Nominal doses for such pixels correspond to doses received under an unmodified exposure duration parameter. In some implementations, maintaining an intensity level can include copying a pixel intensity level from an input image buffer to an output image buffer. In some implementations, maintaining an intensity level can include leaving a pixel intensity level unchanged.
  • At 530, the process outputs a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing on the 3D printer. Outputting a modified image can include dumping the contents of an output image buffer into an image file.
• FIG. 6 shows a flowchart of another example of a process that performs an edge enhancement routine. At 605, the process retrieves a 2D slice image of a 3D structure. At 610, the process accesses a target pixel of the image. Accessing a target pixel can include retrieving a pixel intensity value. At 615, the process determines whether the target pixel is black. Typically, a pixel intensity value of zero corresponds to a black pixel. In some implementations, a pixel intensity value falling below a threshold can be classified as black and reassigned a zero intensity value. If the target pixel is black, then the process inserts the pixel unchanged into the output image at 620. Otherwise, the process continues at 625. In some implementations, inserting the pixel unchanged into the output image can include writing a pixel intensity level as-is to an output buffer.
• To determine whether the pixel is an interior or exterior pixel, the process, at 625, accesses pixels that neighbor the target pixel. Accessing pixels that neighbor the target pixel can include retrieving intensity values for the pixel neighborhood of the target pixel. Typically, the pixel neighborhood includes eight adjacent pixels that immediately surround the target pixel in the same plane as the slice; however, fewer than eight pixels may be used if the target pixel is on or close to an image boundary. At 630, the process determines whether any neighboring pixels are black. If any neighboring pixels are black, then the target pixel is classified as an exterior pixel and the process, at 620, inserts the pixel unchanged into the output image. If none of the neighboring pixels are black, then the target pixel is classified as an interior pixel and the process, at 635, reduces the target pixel's value. At 640, the process inserts the modified pixel into the output image. In some implementations, inserting the modified pixel into the output image can include writing a reduced pixel intensity level to an output buffer.
• At 645, the process determines whether there is a next target pixel. In some implementations, the process iterates through each pixel of each row until the last pixel of the last row is processed. If there is a next target pixel, then the process continues at 610. Otherwise, the process, at 650, produces the output image including any modified pixels. In some implementations, producing the output image can include dumping an output buffer into an image file. In some implementations, a process can extract edges from an original image into an extracted edge image by using one or more image processing techniques, such as by using Sobel or Canny filters. The process can reduce the values in the entire original image by a pixel reduction factor to produce a reduced version of the original image. The process can copy any non-black pixels in the extracted edge image back into the reduced version of the original image.
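The per-pixel routine of FIG. 6 can be sketched as follows. This is a minimal sketch, assuming an 8-bit grayscale slice stored as a list of rows; the function name and signature are illustrative, not from the source:

```python
def enhance_edges(image, reduction_factor):
    """Reduce interior pixel intensities; keep black and exterior pixels.

    A non-black pixel is classified as exterior if any of its up-to-eight
    in-plane neighbors is black, and as interior otherwise.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]           # output image buffer
    for y in range(h):
        for x in range(w):
            if image[y][x] == 0:
                continue                      # black pixel: copy unchanged
            # Gather the pixel neighborhood (clipped at image boundaries).
            neighbors = [image[ny][nx]
                         for ny in range(max(0, y - 1), min(h, y + 2))
                         for nx in range(max(0, x - 1), min(w, x + 2))
                         if (ny, nx) != (y, x)]
            if any(v == 0 for v in neighbors):
                continue                      # exterior pixel: keep intensity
            # Interior pixel: reduce intensity by the pixel reduction factor.
            out[y][x] = int(image[y][x] * reduction_factor)
    return out

# A 3x3 white square inside a black border: only the center is interior,
# so only the center pixel is reduced (255 * 0.25 -> 63).
slice_img = [[0,   0,   0,   0,   0],
             [0, 255, 255, 255,   0],
             [0, 255, 255, 255,   0],
             [0, 255, 255, 255,   0],
             [0,   0,   0,   0,   0]]
out = enhance_edges(slice_img, 0.25)
```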
• FIG. 7 shows a flowchart of an example of a process to determine an exposure duration parameter and a pixel reduction factor. A device such as a printer controller or a computer can perform this process. At 701, the process accesses a nominal exposure duration parameter for a photopolymer used by a 3D printer. Accessing a nominal exposure duration parameter can include retrieving a value stored in a database or a value embedded in a software routine. In some implementations, the process determines the nominal exposure duration parameter based on a slice thickness parameter, a resin type used by the 3D printer, a light source used by the 3D printer, or a combination thereof. At 702, the process determines an increased exposure duration parameter to enhance vertical edge formation during printing by the 3D printer. Note that the duration parameters can be expressed in any appropriate units such as seconds or milliseconds. The increased exposure duration parameter can improve curing quality at the edges. Determining an increased exposure duration parameter can include using a slice thickness parameter, a resin type used by the 3D printer, a light source used by the 3D printer, or a combination thereof. Note that there is a trade-off between edge enhancement and exposure duration. The increase in edge fidelity can be expressed in terms of the contrast ratio between the brightness of an exterior pixel and the brightness of an interior pixel. If the brightness ratio of edge to interior is 4:1, the layer will require a 4× increase in exposure duration. Different ratios are possible. However, ratios past a threshold may result in diminishing returns and may harm print quality, e.g., loss of resolution in the XY plane caused by light spreading. Further, print time increases as the ratio increases due to the longer exposure durations.
  • At 703, the process determines an exposure delta based on the nominal exposure duration parameter and the increased exposure duration parameter. For example, if the nominal exposure duration parameter is 2 seconds, and the increased exposure duration parameter is 8 seconds, then the exposure delta is a factor of 400%. At 704, the process determines a pixel reduction factor based on the exposure delta to prevent over-curing of interior areas. In some implementations, the pixel reduction factor can be expressed as a percentage. Determining the pixel reduction factor can include computing an inverse of the exposure delta and using the inverse value as the pixel reduction factor, and accordingly, the product of the pixel reduction factor and the exposure delta is one. If the exposure delta is a factor of 400% for example, then a corresponding pixel reduction factor is 25%. In some implementations, the process can access a slice thickness parameter for a slice, and trigger edge enhancement to be performed based on the slice thickness exceeding a threshold.
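The arithmetic at 703 and 704 can be sketched as a small helper. Function and argument names here are illustrative, not from the patent; the key invariant is that the reduction factor is the inverse of the exposure delta, so their product is one.

```python
def exposure_parameters(nominal_s, increased_s):
    """Compute the exposure delta and the matching pixel reduction factor.

    Illustrative sketch. The exposure delta is the ratio of the increased
    to the nominal exposure duration; the pixel reduction factor is its
    inverse, keeping the interior curing dose unchanged.
    """
    delta = increased_s / nominal_s      # e.g. 8 s / 2 s = 4.0 (a 400% factor)
    reduction_factor = 1.0 / delta       # e.g. 0.25 (25%)
    return delta, reduction_factor

delta, factor = exposure_parameters(nominal_s=2.0, increased_s=8.0)
print(delta, factor)    # 4.0 0.25
```

For the worked example in the text (2 s nominal, 8 s increased), the exposure delta is a factor of 400% and the corresponding pixel reduction factor is 25%.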
  • Further, the process can output a file that includes the final exposure duration values on a per-slice basis for use by the printer during printing.
  • FIGS. 8A and 8B show edge views of an example of a 3D structure printed without and with edge enhancement from the same digital model. The slices derived from the digital model to create the printed structure have a layer height of 100 microns. FIG. 8A shows an edge view 801 of an example of a 3D structure that was printed without vertical edge enhancement. For FIG. 8A, pixel intensity levels were left unchanged and a nominal exposure duration parameter of 2 seconds was used during printing. FIG. 8B shows an edge view 851 of the example of the 3D structure that was printed with vertical edge enhancement. For FIG. 8B, pixel intensity levels of interior pixels were reduced and a greater than nominal exposure duration parameter of 8 seconds was used during printing. As compared to FIG. 8A, scalloping is reduced in FIG. 8B.
  • FIGS. 9A, 9B, and 9C show images of another example of a 3D structure printed without and with edge enhancement from the same digital model. Further, FIGS. 9D and 9E show examples of slice images that are used to print the 3D structure. FIG. 9A shows an image 901 of a 3D structure that was printed without and with vertical edge enhancement from the same digital model. The upper portion 920 of the image 901 was printed without vertical edge enhancement. The lower portion 930 of the image 901 was printed with vertical edge enhancement. FIG. 9B shows a magnified image 951 of the 3D structure that corresponds to the upper portion 920 of FIG. 9A. FIG. 9C shows an image 971 of the 3D structure that corresponds to the lower portion 930 of FIG. 9A that was printed with vertical edge enhancement. As compared to FIG. 9B, scalloping is reduced in FIG. 9C resulting in smoother edges. FIG. 9D shows an example of a slice image 981 that is used to print the upper portion 920 of the image 901 of FIG. 9A. The slice image 981 includes a region 982 of black pixels, and a region 984 of white pixels. FIG. 9E shows an example of a slice image 991, transformed by an edge enhancement process, that is used to print the lower portion 930 of the image 901 of FIG. 9A. The slice image 991 includes a region 992 of black pixels, a region of white pixels 994, and a region 996 of grayscale pixels.
  • In some implementations, a system can include a processor; and a memory structure coupled with the processor, the memory structure configured to store an original image corresponding to a slice of a three-dimensional model prepared for printing on a three-dimensional printer that uses a photopolymer to create a three-dimensional structure. The processor can be configured to perform operations comprising: accessing a pixel reduction factor that is associated with an increased exposure duration parameter, wherein the increased exposure duration parameter is greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and outputting a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing on the three-dimensional printer.
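The dose-equivalence property described above (reduced interior intensity under the increased duration yields a curing dose comparable to the unmodified case, while maintained exterior intensity yields a greater dose) can be checked numerically. This sketch models dose as intensity times time, a simple first-order assumption not specified by the patent; all names are illustrative.

```python
def curing_dose(intensity, duration_s):
    # Simple dose model: intensity x exposure time. The patent does not
    # commit to a specific dose model; this is a first-order assumption.
    return intensity * duration_s

nominal_duration = 2.0       # seconds (unmodified exposure duration)
increased_duration = 8.0     # seconds (increased exposure duration)
reduction_factor = nominal_duration / increased_duration   # 0.25

interior = 255 * reduction_factor   # interior intensity, reduced
exterior = 255                      # exterior intensity, maintained

# Interior dose under the increased duration matches the nominal dose...
interior_ratio = (curing_dose(interior, increased_duration)
                  / curing_dose(255, nominal_duration))
# ...while exterior pixels receive a proportionally larger dose at the edges.
exterior_ratio = (curing_dose(exterior, increased_duration)
                  / curing_dose(255, nominal_duration))
print(interior_ratio, exterior_ratio)   # 1.0 4.0
```

Under these numbers the interior areas cure exactly as they would without enhancement, while the exterior (edge) areas receive four times the dose.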
  • A three-dimensional printer can include a vat capable of holding a liquid comprising a photopolymer, the vat including a window. The printer can include a memory structure to store information including an original image corresponding to a slice of a three-dimensional model prepared for creation of a three-dimensional structure via the three-dimensional printer. The printer can include a build plate configured and arranged to move within the vat during three-dimensional printing of the three-dimensional structure on the build plate. The printer can include a light projection device to project light through the window. The printer can include a controller to control the printing of the three-dimensional structure, movement of the build plate, and light modulation of the light projection device.
  • The controller can be configured to perform operations that include accessing a pixel reduction factor that is associated with an increased exposure duration parameter, wherein the increased exposure duration parameter is greater than an unmodified exposure duration parameter associated with the photopolymer; classifying pixels of the original image to identify interior pixels of the original image and exterior pixels of the original image; reducing intensity levels of the interior pixels by the pixel reduction factor so that printing areas of the three-dimensional printer corresponding to the interior pixels receive first curing doses under the increased exposure duration parameter and the reduced intensity levels that are comparable to doses received under an unmodified exposure duration parameter and unreduced intensity levels; maintaining intensity levels of the exterior pixels so that printing areas of the three-dimensional printer corresponding to the exterior pixels receive second curing doses under the increased exposure duration parameter that are greater than doses associated with the unmodified exposure duration parameter; and creating a modified image based on the reduced intensity levels and the maintained intensity levels that correspond to the slice for printing. Operations can include printing the slice in accordance with the modified image to build a portion of the three-dimensional structure; and controlling the light projection device based on the increased exposure duration parameter when printing the slice in accordance with the modified image to build the portion of the three-dimensional structure.
  • Embodiments of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented using one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer-readable medium can be a manufactured product, such as hard drive in a computer system or an optical disc sold through retail channels, or an embedded system. The computer-readable medium can be acquired separately and later encoded with the one or more modules of computer program instructions, such as by delivery of the one or more modules of computer program instructions over a wired or wireless network. The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, or a combination of one or more of them.
  • The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a runtime environment, or a combination of one or more of them. In addition, the apparatus can employ various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification can be performed by, and/or under the control of, one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • While this specification contains many specifics, these should not be construed as limitations on the scope of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • Other embodiments are within the scope of the following claims.

Claims (22)

1.-19. (canceled)
20. A method for processing a three-dimensional (3D) object for printing by a 3D printer, comprising:
(a) obtaining, by a computer processor, a digital image corresponding to at least a portion of said 3D object;
(b) identifying, by said computer processor, at least one interior pixel of said digital image and at least one exterior pixel of said digital image;
(c) changing, by said computer processor, a light intensity level of said at least one interior pixel and an additional light intensity level of said at least one exterior pixel relative to one another, to generate a light intensity level profile of said digital image, wherein said light intensity profile is usable by said 3D printer to print said at least said portion of said 3D object.
21. The method of claim 20, wherein, in (c), one of said light intensity level and said additional light intensity level is reduced, while the other of said light intensity level and said additional light intensity level remains substantially unchanged.
22. The method of claim 20, further comprising assigning increased duration of exposure of (i) said at least one interior pixel to light at said light intensity level that is changed or (ii) said at least one exterior pixel to light at said additional light intensity level that is changed.
23. The method of claim 22, comprising assigning increased duration of exposure of (i) said at least one interior pixel to light at said light intensity level that is changed and (ii) said at least one exterior pixel to light at said additional light intensity level that is changed.
24. The method of claim 22, wherein said light intensity level and said additional light intensity level are changed based at least in part on said increased duration of exposure.
25. The method of claim 20, wherein said light intensity level and said additional light intensity level are changed based on a nominal curing parameter of a resin that is usable for forming said at least said portion of said 3D object.
26. The method of claim 20, wherein said light intensity level and said additional light intensity level are changed to enhance curing quality of said at least said portion of said 3D object.
27. The method of claim 20, wherein said light intensity level and said additional light intensity level are changed to enhance curing quality at one or more edges of said at least said portion of said 3D object.
28. The method of claim 20, wherein said identifying comprises classifying a target pixel as said at least one exterior pixel when one or more neighboring pixels of said target pixel has a black light intensity level.
29. The method of claim 20, further comprising sending said light intensity level profile to said 3D printer to print said at least said portion of said 3D object.
30. A system for processing a three-dimensional (3D) object for printing by a 3D printer, comprising:
a computer processor in digital communication with a computer memory, wherein said computer processor is configured to:
(a) obtain a digital image corresponding to at least a portion of said 3D object;
(b) identify at least one interior pixel of said digital image and at least one exterior pixel of said digital image;
(c) change a light intensity level of said at least one interior pixel and an additional light intensity level of said at least one exterior pixel relative to one another, to generate a light intensity level profile of said digital image, wherein said light intensity profile is usable by said 3D printer to print said at least said portion of said 3D object.
31. The system of claim 30, wherein, in (c), one of said light intensity level and said additional light intensity level is reduced, while the other of said light intensity level and said additional light intensity level remains substantially unchanged.
32. The system of claim 30, wherein said computer processor is further configured to increase duration of exposure of (i) said at least one interior pixel to light at said light intensity level that is changed or (ii) said at least one exterior pixel to light at said additional light intensity level that is changed.
33. The system of claim 32, wherein said computer processor is configured to increase duration of exposure of (i) said at least one interior pixel to light at said light intensity level that is changed and (ii) said at least one exterior pixel to light at said additional light intensity level that is changed.
34. The system of claim 32, wherein said light intensity level and said additional light intensity level are changed based at least in part on said increased duration of exposure.
35. The system of claim 30, wherein said light intensity level and said additional light intensity level are changed based on a nominal curing parameter of a resin that is usable for forming said at least said portion of said 3D object.
36. The system of claim 30, wherein said light intensity level and said additional light intensity level are changed to enhance curing quality of said at least said portion of said 3D object.
37. The system of claim 30, wherein said light intensity level and said additional light intensity level are changed to enhance curing quality at one or more edges of said at least said portion of said 3D object.
38. The system of claim 30, wherein said identifying in (b) comprises classifying a target pixel as said at least one exterior pixel when one or more neighboring pixels of said target pixel has a black light intensity level.
39. The system of claim 30, wherein said computer processor is further configured to send said light intensity level profile to said 3D printer to print said at least said portion of said 3D object.
19. The three-dimensional printer of claim 14, wherein classifying the pixels comprises:
accessing neighboring pixels of a target pixel of the original image, the target pixel having an intensity level greater than a black intensity level;
classifying the target pixel as an exterior pixel if one or more of the neighboring pixels have the black intensity level; and
classifying the target pixel as an interior pixel if all of the neighboring pixels have an intensity level greater than the black intensity level.
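The neighbor-based classification recited in claims 19, 28, and 38 can be sketched as follows. This is an illustrative sketch, not the claimed implementation: the function name is invented, a 4-connected neighborhood is assumed, and pixels beyond the image border are treated as black (an assumption the claims do not specify).

```python
def classify_pixel(img, r, c):
    """Classify a non-black target pixel as 'exterior' or 'interior'.

    img is a 2D list of intensity levels, where 0 is the black intensity
    level. A target pixel is exterior if any 4-connected neighbor is
    black, and interior if all neighbors are brighter than black.
    """
    rows, cols = len(img), len(img[0])
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        # Out-of-bounds neighbors are treated as black (assumption).
        neighbor = img[nr][nc] if 0 <= nr < rows and 0 <= nc < cols else 0
        if neighbor == 0:
            return "exterior"
    return "interior"

img = [[0,   0,   0,   0,   0],
       [0, 200, 200, 200,   0],
       [0, 200, 200, 200,   0],
       [0, 200, 200, 200,   0],
       [0,   0,   0,   0,   0]]
print(classify_pixel(img, 1, 1))   # exterior (black neighbor above)
print(classify_pixel(img, 2, 2))   # interior (all neighbors non-black)
```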
US17/828,271 2016-09-29 2022-05-31 Enhanced three dimensional printing of vertical edges Pending US20230081400A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/828,271 US20230081400A1 (en) 2016-09-29 2022-05-31 Enhanced three dimensional printing of vertical edges

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201662401664P 2016-09-29 2016-09-29
US15/719,118 US10780641B2 (en) 2016-09-29 2017-09-28 Enhanced three dimensional printing of vertical edges
US202016998106A 2020-08-20 2020-08-20
US202117206810A 2021-03-19 2021-03-19
US202117508091A 2021-10-22 2021-10-22
US17/828,271 US20230081400A1 (en) 2016-09-29 2022-05-31 Enhanced three dimensional printing of vertical edges

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US202117508091A Continuation 2016-09-29 2021-10-22

Publications (1)

Publication Number Publication Date
US20230081400A1 true US20230081400A1 (en) 2023-03-16

Family

ID=61687517

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/719,118 Active 2038-11-11 US10780641B2 (en) 2016-09-29 2017-09-28 Enhanced three dimensional printing of vertical edges
US17/828,271 Pending US20230081400A1 (en) 2016-09-29 2022-05-31 Enhanced three dimensional printing of vertical edges

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/719,118 Active 2038-11-11 US10780641B2 (en) 2016-09-29 2017-09-28 Enhanced three dimensional printing of vertical edges

Country Status (1)

Country Link
US (2) US10780641B2 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017062630A1 (en) 2015-10-07 2017-04-13 Autodesk, Inc. Sub-pixel grayscale three-dimensional printing
US10252468B2 (en) 2016-05-13 2019-04-09 Holo, Inc. Stereolithography printer
CN108927993B (en) * 2017-05-26 2020-12-15 三纬国际立体列印科技股份有限公司 Photocuring 3D printing method of multi-light source module
CN108673899A (en) * 2018-04-13 2018-10-19 重庆三峡学院 A kind of networking 3D printer monitoring system and monitoring method
EP3844659A4 (en) * 2018-09-12 2021-10-27 Siemens Industry Software Inc. Internal channel network detections for 3d printing
WO2020076304A1 (en) 2018-10-09 2020-04-16 Hewlett-Packard Development Company, L.P. Modifying object geometries based on radiant heating distribution
US10723069B2 (en) 2018-11-01 2020-07-28 Origin Laboratories, Inc. Method for build separation from a curing interface in an additive manufacturing process
US10532577B1 (en) * 2018-11-07 2020-01-14 Electronics For Imaging, Inc. Unitary ink tank for printer system
CN109703033A (en) * 2019-01-18 2019-05-03 深圳市硬核智娱科技有限公司 It is a kind of to be stably connected with formula building blocks 3D printing system with comparing function
CN111745959B (en) * 2020-07-06 2022-06-28 优你造科技(北京)有限公司 3D printing method and 3D printing equipment
US11794411B2 (en) 2020-12-04 2023-10-24 Stratasys, Inc. Part quality monitoring in a stereolithographic additive manufacturing system
CN112862705B (en) * 2021-01-23 2023-08-25 西安点云生物科技有限公司 Device, equipment, method and storage medium for optimizing edge antialiasing of photo-cured slice image
US11833758B2 (en) * 2021-04-30 2023-12-05 Hewlett-Packard Development Company, L.P. Submitting 3D object models for 3D printing having stored digital model in a 3D print file as an integerized triangle mesh
EP4119329A1 (en) * 2021-07-12 2023-01-18 Essilor International Method for additively manufacturing an ophthalmic device and manufacturing system configured to carry out such a method
CN114407364B (en) * 2021-12-31 2023-10-24 深圳市纵维立方科技有限公司 Slicing method, printing system and electronic equipment of three-dimensional model

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5403680A (en) 1988-08-30 1995-04-04 Osaka Gas Company, Ltd. Photolithographic and electron beam lithographic fabrication of micron and submicron three-dimensional arrays of electronically conductive polymers
US5418608A (en) 1993-05-04 1995-05-23 Harbor Branch Oceanographic Institution Inc. Three dimensional mapping systems and methods
JP3476114B2 (en) 1996-08-13 2003-12-10 富士通株式会社 Stereoscopic display method and apparatus
US6500378B1 (en) 2000-07-13 2002-12-31 Eom Technologies, L.L.C. Method and apparatus for creating three-dimensional objects by cross-sectional lithography
US6867774B1 (en) 2002-12-02 2005-03-15 Ngrain (Canada) Corporation Method and apparatus for transforming polygon data to voxel data for general purpose applications
JP2005280073A (en) 2004-03-29 2005-10-13 Fuji Photo Film Co Ltd Method for exposure processing of lithographic plate and equipment for performing this method
US8217939B1 (en) 2008-10-17 2012-07-10 Ngrain (Canada) Corporation Method and system for calculating visually improved edge voxel normals when converting polygon data to voxel data
US8666142B2 (en) 2008-11-18 2014-03-04 Global Filtration Systems System and method for manufacturing
JP5516145B2 (en) 2010-06-30 2014-06-11 セイコーエプソン株式会社 Optical detection device, display device, and electronic apparatus
CN103561927B (en) 2011-05-31 2016-07-27 3M创新有限公司 For the method preparing the microstructured tool with discontinuous shape characteristic and the goods manufactured by described instrument
US10248740B2 (en) 2012-04-09 2019-04-02 Autodesk, Inc. Three-dimensional printing preparation
WO2014092680A1 (en) 2012-12-10 2014-06-19 Dirtt Environmental Solutions Inc. Efficient lighting effects in design software
US9836879B2 (en) 2013-04-16 2017-12-05 Autodesk, Inc. Mesh skinning technique
US20160250809A1 (en) 2013-05-24 2016-09-01 Looking Glass Hk Ltd. Method for manufacturing a physical volumetric representation of a virtual three-dimensional object
WO2015072921A1 (en) 2013-11-14 2015-05-21 Structo Pte. Ltd Additive manufacturing device and method
US9987808B2 (en) 2013-11-22 2018-06-05 Johnson & Johnson Vision Care, Inc. Methods for formation of an ophthalmic lens with an insert utilizing voxel-based lithography techniques
US9457518B2 (en) 2014-02-07 2016-10-04 Adobe Systems Incorporated Method and apparatus for controlling printability of a 3-dimensional model
TWI629162B (en) 2014-03-25 2018-07-11 Dws有限責任公司 Computer-implementted method, and equipment and computer program product for defining a supporting structure for a three-dimensional object to be made through stereolithography
US10252466B2 (en) 2014-07-28 2019-04-09 Massachusetts Institute Of Technology Systems and methods of machine vision assisted additive fabrication
WO2016044483A1 (en) 2014-09-16 2016-03-24 The Regents Of The University Of California Method for fabrication of microwells for controlled formation of 3-dimensional multicellular-shapes
WO2016098364A1 (en) 2014-12-18 2016-06-23 コニカミノルタ株式会社 Optical unit, and projector provided with same
US9840045B2 (en) 2014-12-31 2017-12-12 X Development Llc Voxel 3D printer
WO2016186609A1 (en) 2015-05-15 2016-11-24 Hewlett-Packard Development Company, L.P. Three-dimensional printing systems
KR101741212B1 (en) 2015-08-25 2017-05-29 삼성에스디에스 주식회사 System and method for transmitting cross-sectional images of three-dimensional object, and transmitting apparatus for executing the same
WO2017062630A1 (en) 2015-10-07 2017-04-13 Autodesk, Inc. Sub-pixel grayscale three-dimensional printing
US9833839B2 (en) 2016-04-14 2017-12-05 Desktop Metal, Inc. Fabricating an interface layer for removable support
US10252468B2 (en) 2016-05-13 2019-04-09 Holo, Inc. Stereolithography printer

Also Published As

Publication number Publication date
US20180086003A1 (en) 2018-03-29
US10780641B2 (en) 2020-09-22

Similar Documents

Publication Publication Date Title
US20230081400A1 (en) Enhanced three dimensional printing of vertical edges
US20230101921A1 (en) Sub-pixel grayscale three-dimensional printing
US10780643B2 (en) Stereolithography printer mapping a plurality of pixels of a cross-sectional image to corresponding mirrors of a plurality of mirrors of a digital micromirror unit
US10933588B2 (en) Stereolithography printer
CN101109898B (en) Method for producing a three-dimensional object
US7636610B2 (en) Method and device for producing a three-dimensional object, and computer and data carrier useful therefor
US9415544B2 (en) Wall smoothness, feature accuracy and resolution in projected images via exposure levels in solid imaging
JP6474995B2 (en) Slice data creation device, slice data creation method, program, and computer-readable recording medium
US20190291341A1 (en) Light Homogenization Method for Multi-Source Large-Scale Surface Exposure 3D Printing
EP3311988A1 (en) A method for joint color and translucency 3d printing and a joint color and translucency 3d printing device
CN112677487A (en) Control method and control system for 3D printing and 3D printing equipment
KR20180105797A (en) Method and apparatus for generating 3d printing data
CN113103587B (en) Control method and control system for 3D printing and 3D printing equipment
US20200290282A1 (en) Techniques for optimizing photopolymer cure energy in additive fabrication
US20220350305A1 (en) Digital image transformation to reduce effects of scatter during digital light processing-style manufacturing
US11370165B2 (en) Method for improving resolution in LCD screen based 3D printers
US20220281178A1 (en) Systems and methods for three-dimensional printing and products produced thereby
WO2023243430A1 (en) Information processing method, information processing device, and program
WO2023193006A2 (en) Image transformations to enhance edge and surface clarity in additive manufacturing
CN116945610A (en) DLP3D printing method and system for layer pattern optimization
WO2022217128A9 (en) Digital image transformation to reduce effects of scatter during digital light processing-style manufacturing

Legal Events

Date Code Title Description
AS Assignment

Owner name: HOLO, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AUTODESK, INC.;REEL/FRAME:061942/0448

Effective date: 20171129

Owner name: AUTODESK, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GREENE, RICHARD M.;ADZIMA, BRIAN JAMES;KRANZ, STEPHEN JAMES;REEL/FRAME:061942/0233

Effective date: 20161004

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: WTI FUND X, INC., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:HOLO, INC.;REEL/FRAME:066888/0195

Effective date: 20220822

Owner name: VENTURE LENDING & LEASING IX, INC., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:HOLO, INC.;REEL/FRAME:066888/0195

Effective date: 20220822

AS Assignment

Owner name: SOUTHWEST GREENE INTERNATIONAL, INC., CALIFORNIA

Free format text: UCC ARTICLE 9 SALE;ASSIGNORS:VENTURE LENDING & LEASING IX, INC.;WTI FUND X, INC.;REEL/FRAME:066908/0833

Effective date: 20240322