US20190026953A1 - Generating a shape profile for a 3D object

Generating a shape profile for a 3D object

Info

Publication number
US20190026953A1
Authority
US
United States
Prior art keywords
bounding
shapes
shape
bounding shapes
layers
Prior art date
Legal status
Granted
Application number
US16/072,277
Other versions
US11043042B2 (en)
Inventor
Ana Patricia Del Angel
Jun Zeng
Sebastia Cortes i Herms
Scott White
Current Assignee
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP
Assigned to HP PRINTING AND COMPUTING SOLUTIONS, S.L.U. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CORTES I HERMS, Sebastia
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HP PRINTING AND COMPUTING SOLUTIONS, S.L.U., WHITE, SCOTT A, DEL ANGEL, ANA PATRICIA, ZENG, JUN
Publication of US20190026953A1
Application granted
Publication of US11043042B2
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
      • G06 - COMPUTING; CALCULATING OR COUNTING
        • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 19/00 - Manipulating 3D models or images for computer graphics
            • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
          • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
            • G06T 17/10 - Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
          • G06T 2210/00 - Indexing scheme for image generation or computer graphics
            • G06T 2210/12 - Bounding box
          • G06T 2219/00 - Indexing scheme for manipulating 3D models or images for computer graphics
            • G06T 2219/008 - Cut plane or projection plane definition

Definitions

  • 3D (three-dimensional) manufacturing systems typically employ additive manufacturing techniques to build or print parts within a 3D build envelope of the 3D manufacturing system. As individual parts typically do not require the entire 3D build envelope, 3D manufacturing systems are often operated to build multiple distinct parts within the build envelope concurrently during a common build operation.
  • FIG. 1 is a block diagram of an example computing apparatus for generating a shape profile for a 3D object
  • FIG. 2 depicts an example method for generating a shape profile for a 3D model
  • FIG. 3 shows an example method for partitioning superset polygons into bounding shapes
  • FIGS. 4A-4F illustrate example diagrams depicting various stages of bounding shape formation and partitioning
  • FIG. 5 shows a flowchart of an example method of further partitioning partitioned boundary shapes
  • FIGS. 6A-6C respectively, show an example mapping of an array of grid boxes onto a surface of a boundary shape
  • FIG. 7 illustrates examples of various shape profiles that may be generated with different resolution values provided for a 3D object.
  • the present disclosure is described by referring mainly to an example thereof.
  • numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure.
  • the terms “a” and “an” are intended to denote at least one of a particular element, the term “includes” means includes but not limited to, the term “including” means including but not limited to, and the term “based on” means based at least in part on.
  • Industrial grade 3D printers have large build volumes so that print service providers (PSPs) that need to produce many discrete and oftentimes different, complex parts have the necessary high throughput.
  • a 3D printer may produce hundreds of parts (or equivalently, 3D objects) in a batch, and each part may have tens of thousands of geometrical entities to be analyzed.
  • the printing process may be optimized to balance printing large quantities of geometrical entities, which requires analysis of complex shapes, against providing a quick turn-around.
  • Printing applications may also include multiple data pipeline applications that may demand quick computations.
  • Cage generation and cloud-based build bed previews and modifications are other processes that may also require fast computations.
  • the shape profiles for the 3D objects may include geometrical entities that closely approximate the shapes of 3D objects. That is, the shape profiles may approximate the shapes of the 3D objects with significantly fewer geometrical entities than the actual shapes of the 3D objects contain. In one regard, therefore, computations implementing the shape profiles may be significantly faster than computations implementing more accurate geometrical entities of the actual 3D objects.
  • the generated shape profiles may be used in the optimization of packaging of parts for 3D printing so that a significant reduction in geometric entities of shapes may be achieved, which in turn may enable faster computing and efficient data transmission even while ensuring accuracy of prints.
  • the methods disclosed herein may include the partitioning of layers into a plurality of stacked boxes, in which the stacked boxes include respective polygons that represent portions of a 3D model.
  • the polygons in the stacked box may be assembled into a superset polygon and the superset polygon may be partitioned into bounding shapes.
  • the bounding shapes may be further partitioned in multiple directions to form cells that contain fill values corresponding to the polygons contained in the cells.
  • the bounding shapes may be further partitioned based upon computed volume errors and replacement of the partitioned bounding shapes with further partitioned bounding shapes.
  • the shape profile of a 3D model may be generated from the formed cells, in which the cells may have any suitable geometrical shapes such as cubes, honeycombs, triangles, tetrahedrons, etc.
  • the cells may have any geometrical shapes that may be used to discretize a solid object.
  • the shape profile disclosed herein may define a geometry corresponding to the plurality of stacked boxes.
  • the amount of time and processing power that 3D printers may require to process a plurality of 3D objects for printing may be significantly reduced by using the generated shape profiles over other processing techniques.
  • the amount of storage space required to store profiles of the 3D objects may be significantly reduced, which may also result in greater efficiency in the operations of the 3D printers and/or data storage devices.
  • the computing apparatuses may be included in or may be 3D printers and may thus generate shape profiles, implement the generated shape profiles (e.g., use the shape profiles in parts packaging and/or cage generation operations), and may print 3D objects using the generated shape profiles.
  • FIG. 1 is a block diagram of an example computing apparatus 100 for generating a shape profile for a 3D model. It should be understood that the computing apparatus 100 depicted in FIG. 1 may include additional components and that some of the components described herein may be removed and/or modified without departing from a scope of the computing apparatus 100 .
  • the computing apparatus 100 is depicted as including a processing device 110 and a non-transitory computer readable storage medium 120 .
  • the processing device 110 may fetch, decode, and execute processor-readable instructions, such as instructions 122 - 132 stored on the computer readable storage medium 120 , to control various processes for generating a shape profile of a 3D model.
  • the processing device 110 may include one or more electronic circuits that include electronic components for performing the functionalities of the instructions 122 - 132 .
  • the processing device 110 may execute the instructions 122 to access a 3D model of a 3D object that is to be 3D printed.
  • the processing device 110 may access the 3D model and/or information regarding the 3D model from an external device or the information may be retrieved from a local data store (not shown).
  • the 3D object may be a unitary piece composed of a plurality of sections and extending along first, second and third directions, for example, the X, Y, and Z directions of a Cartesian coordinate system.
  • the second and third directions may extend along the length and breadth of a build area platform (not shown) of a 3D printer, while the first direction may extend normal to the plane formed by the second and third directions, along the height of the build volume (or build direction) in which the 3D printer builds the 3D object.
  • the processing device 110 may execute the instructions 124 to slice the 3D model along a first direction to produce a plurality of layers in parallel planes that are defined across the second and third directions.
  • the first, second, and third directions may be orthogonal to each other, as in a Cartesian coordinate system.
  • Each of the layers may thus be sliced through the 3D model such that each of the layers is composed of polygons that represent portions or sections of the 3D model.
  • the polygons in a layer may be contours of the 3D model in that layer.
  • the polygons may be marked, for instance, using an ordering of nodes in a clockwise or counter-clockwise manner, to be in one of two groups.
  • the first group may be an outer contour group, in which the inside of which is solid and so represents interior portions of the 3D model.
  • the second group may be an inner contour group, in which the inside of which is a hole.
  • each of the layers may be generated to have the same thickness and the thickness may be of a predetermined size.
  • the thickness of each of the layers may be based upon a printing resolution of a 3D printer that is to print the 3D object, e.g., around 100 microns.
  • the thickness of each of the layers may be relatively larger, e.g., around 1 millimeter.
  • the thickness of each of the layers may be user-defined.
  • the thicknesses of the layers may not be constant, i.e., may not be the same with respect to each other. Instead, the layers may be generated to have different thicknesses with respect to each other.
  • the processing device 110 may execute the instructions 126 to partition the plurality of layers into a plurality of stacked boxes containing the respective polygons.
  • the spaces between the locations at which the layers have been sliced may be construed as the stacked boxes.
  • the stacked boxes may also contain the respective polygons.
  • the processing device 110 may execute the instructions 128 to, for each of the stacked boxes, assemble the polygons in the stacked box into a superset polygon. For instance, for a particular stacked box, the processing device 110 may assemble the polygons in that stacked box together to form the superset polygon, or a polygon that includes all of the polygons in the stacked box.
  • the processing device 110 may execute the instructions 130 to, for each of the stacked boxes, partition the superset polygon into bounding shapes. For instance, the processing device 110 may partition the superset polygon into a plurality of bounding shapes such that each of the bounding shapes includes a polygon from the superset polygon.
  • the bounding shapes may be formed through an iterative partitioning operation based upon computed volume errors in the partitioned bounding shapes.
  • the bounding shapes may include any suitable geometric shape including, but not limited to square boxes, rectangular boxes, triangular shaped boxes, octagon shaped boxes, etc.
  • the bounding shapes may be any geometrical entity that may be used to discretize a solid object.
  • the sizes of the bounding shapes may be varied and may be based upon user input. For instance, a user may define the resolution at which the bounding shapes are generated to control the amount of storage space occupied by a shape profile generated using the bounding shapes.
  • the processing device 110 may also execute the instructions 132 to generate a shape profile of the 3D object using the bounding shapes.
  • the shape profile may include the bounding shapes arranged together according to the polygons of the 3D model contained in the bounding shapes.
  • the amount of storage space required by the shape profile may depend upon the resolution of the shape profiles, which, as discussed above, may be user-defined.
  • the shape profile of the 3D object may be a graphical representation of the generated bounding shapes and/or a data representation of the bounding shapes.
  • the processing device 110 may store the generated shape profile in a local data store and/or may communicate the generated shape profile to a 3D printer. Additionally, a plurality of the generated shape profiles may be used in a packing operation that may be implemented to determine an arrangement at which a plurality of 3D objects are to be printed in a build envelope of a 3D printer. Particularly, the packing operation may be implemented to determine how the generated shape profiles may be arranged to maximize the number of 3D objects that may be printed in the build envelope during a single printing operation. In one regard, by using the shape profiles of the 3D objects instead of the 3D models, the computational requirement and the time to execute the packing operation may be significantly reduced. In one example, the processing device 110 may implement the packing operation using the generated shape profiles. In another example, a separate processing device may implement the packing operation using the shape profiles generated by the processing device 110 .
  • the computing apparatus 100 may be a computing device, such as a personal computer, a server computer, a printer, a 3D printer, a smartphone, a tablet computer, etc.
  • the processing device 110 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, an application specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the computer readable storage medium 120 .
  • the computer readable storage medium 120 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
  • the computer readable storage medium 120 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like.
  • the computer readable storage medium 120 may be a non-transitory computer readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • the computing apparatus 100 may include a data store to which the processing device 110 may be in communication.
  • the data store may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, phase change RAM (PCRAM), memristor, flash memory, and the like.
  • the computing apparatus 100 may further include an input/output interface (not shown) through which the processing device 110 may communicate with an external device(s) (not shown), for instance, to receive and store the information pertaining to the 3D objects, e.g., 3D models, user-defined resolution values, etc.
  • the input/output interface may include hardware and/or software to enable the processing device 110 to communicate with the external device(s).
  • the input/output interface may enable a wired or wireless connection to the output device(s).
  • the input/output interface may further include a network interface card and/or may also include hardware and/or software to enable the processing device 110 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display, another computing device, etc., through which a user may input instructions into the apparatus 100 .
  • FIG. 2 depicts an example method 200 for generating a shape profile for a 3D model
  • FIG. 3 depicts an example method 300 for partitioning superset polygons into bounding shapes. It should be apparent to those of ordinary skill in the art that the methods 200 and 300 may represent generalized illustrations and that other operations may be added or existing operations may be removed, modified, or rearranged without departing from the scopes of the methods 200 and 300 .
  • a 3D model of a 3D object may be accessed.
  • the processing device 110 may execute the instructions 122 to access the 3D model, which may be a graphical representation of a 3D object that a 3D printer is to print, data pertaining to the physical parameters of the 3D object, or the like.
  • the processing device 110 may access the 3D model from a local data storage or may receive the 3D model from an external source.
  • the processing device 110 may execute the instructions 124 to slice the 3D model along a first direction to generate a plurality of layers in the 3D model.
  • the first direction may be, for instance, the z-direction in the Cartesian coordinate system and each of the plurality of layers may extend along respective parallel planes that are defined across a second direction and a third direction.
  • the second direction may be the x-direction and the third direction may be the y-direction in the Cartesian coordinate system.
  • Each of the layers may thus be sliced through the 3D model such that each of the layers is composed of polygons that represent portions or sections of the 3D model.
  • the polygons in a layer may be contours of the 3D model in that layer as discussed above.
  • each of the layers may have the same thickness or different thicknesses as discussed above.
  • An example of a 3D model 402 is depicted in FIG. 4A and may be used to illustrate some of the features of the method 200.
  • the 3D model 402 may be sliced into a plurality of layers 404 - 408 such that each of the layers 404 - 408 has the same thickness with respect to each other.
  • the layers 404 - 408 may have different thicknesses with respect to each other.
  • layers 404 - 408 may have a thickness that varies from about 100 microns to about 1 millimeter.
  • the layers 404 - 408 may be stacked in a first direction 410 and extend along respective parallel planes that are defined across a second direction 412 and a third direction 414 .
  • the directions 410 - 414 are depicted as being orthogonal to each other and may correspond to the axes of a Cartesian coordinate system.
  • each of the layers 404 - 408 is depicted as being composed of respective polygons 420 , which are shown in dashed lines. That is, for instance, each of the layers 404 - 408 may include polygons that represent respective portions of the 3D model 402 .
  • the processing device 110 may execute the instructions 126 to partition the plurality of layers into a plurality of stacked boxes containing the respective polygons.
  • the spaces between the locations at which the layers have been sliced may be construed as the stacked boxes.
  • the stacked boxes may also contain the respective polygons.
  • Various manners in which the layers may be partitioned into stacked boxes are described in greater detail below with respect to the method 300.
  • the processing device 110 may execute the instructions 128 to, for each of the stacked boxes, assemble the polygons in the stacked box into a superset polygon. For instance, for a particular stacked box, the processing device 110 may assemble the polygons in that stacked box together to form the superset polygon.
  • the superset polygon may be a polygon that includes all of the polygons in the stacked box. As shown in FIG. 4A , each of the layers 404 - 408 may represent a respective stacked box and the polygons 420 in each of the stacked boxes may be a respective superset polygon.
  • the processing device 110 may execute the instructions 130 to, for each of the stacked boxes, partition the superset polygon into bounding shapes. For instance, the processing device 110 may partition the superset polygon into a plurality of bounding shapes such that each of the bounding shapes includes a polygon from the superset polygon. As discussed in greater detail herein below with respect to the method 300 in FIG. 3 , the bounding shapes may be formed through an iterative partitioning operation based upon computed volume errors in the partitioned bounding shapes. In addition, as discussed above, the bounding shapes may include any suitable geometric shape, the sizes of the bounding shapes may be varied, and the sizes may be based upon user input.
  • the processing device 110 may apply a padding to enlarge the superset polygon in the stacked boxes and the processing device 110 may partition the enlarged superset polygon.
  • the enlarged superset polygon may cause the shape profile generated for the 3D object to be relatively larger than the 3D object to, for instance, provide padding between the 3D object and another 3D object in a build envelope of a 3D printer.
  • the padding may be uniform in thickness or may have user-defined forms. For instance, certain geometrical features may be provided with more padding to increase insulation from neighboring 3D objects. In this case, the padding thickness may depend on the geometrical features of the 3D object.
  • the padding thickness may also depend on the particular materials used in building the 3D object or the portions of the 3D object containing the geometrical features. In an example in which a 3D object is formed of multiple materials, different applied materials may require different insulation thicknesses to be decoupled from neighboring 3D objects.
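  • As an illustration of how a roughly uniform padding might be applied, the C++ sketch below pushes each vertex of a counter-clockwise polygon outward along the averaged outward normals of its two adjacent edges. This is only an approximation and not the patent's procedure; a production implementation would use a robust polygon-offset routine that also handles miters, holes (offset inward), and self-intersections.

        #include <cmath>
        #include <vector>

        struct Point2D { double x, y; };

        // Approximate uniform padding of a counter-clockwise polygon: each vertex is
        // moved outward along the averaged outward normals of its adjacent edges.
        std::vector<Point2D> padPolygon(const std::vector<Point2D>& poly, double pad) {
            const std::size_t n = poly.size();
            std::vector<Point2D> out(n);
            for (std::size_t i = 0; i < n; ++i) {
                const Point2D& prev = poly[(i + n - 1) % n];
                const Point2D& cur  = poly[i];
                const Point2D& next = poly[(i + 1) % n];
                // Outward edge normals for a counter-clockwise polygon.
                double n1x = cur.y - prev.y,  n1y = prev.x - cur.x;
                double n2x = next.y - cur.y,  n2y = cur.x - next.x;
                double nx = n1x + n2x, ny = n1y + n2y;
                double len = std::sqrt(nx * nx + ny * ny);
                if (len > 1e-12) { nx /= len; ny /= len; }
                out[i] = { cur.x + pad * nx, cur.y + pad * ny };
            }
            return out;
        }
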
  • the processing device 110 may execute the instructions 132 to generate a shape profile of the 3D object using the bounding shapes.
  • the shape profile may include the bounding shapes arranged together according to the arrangement of the polygons of the 3D model contained in the bounding shapes.
  • the amount of storage space required by the shape profile may depend upon the resolution of the shape profiles, which, as discussed above, may be user-defined.
  • the shape profile of the 3D object may be a graphical representation of the generated bounding shapes and/or a data representation of the bounding shapes. For instance, the shape profile may include various values to identify the locations of the bounding shapes.
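  • By way of illustration only, a minimal C++ sketch of one possible data representation is shown below, in which each bounding shape is recorded as a minimum corner, extents along the three directions, and a flag indicating whether it contains part of the 3D model. The field layout and names are assumptions, not the patent's format.

        #include <ostream>
        #include <vector>

        // One axis-aligned bounding shape of a shape profile.
        struct BoundingShape {
            double x, y, z;     // minimum corner
            double dx, dy, dz;  // extents along the first, second and third directions
            bool filled;        // whether the shape contains part of the 3D model
        };

        // Write the shape profile as one plain-text record per bounding shape:
        // "x y z dx dy dz filled".
        void writeShapeProfile(std::ostream& os, const std::vector<BoundingShape>& shapes) {
            for (const BoundingShape& b : shapes) {
                os << b.x << ' ' << b.y << ' ' << b.z << ' '
                   << b.dx << ' ' << b.dy << ' ' << b.dz << ' '
                   << (b.filled ? 1 : 0) << '\n';
            }
        }
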
  • the method 300 may depict an example of operations that may be performed at block 210 in the method 200 depicted in FIG. 2 .
  • the processing device 110 may execute the instructions 126 to perform the operations described with respect to the method 300.
  • a vector of a first bounding shape that is defined by a first layer and a last layer of the plurality of layers forming the stacked boxes may be initialized.
  • the first bounding shape may extend from the first layer to the last layer in which the stacked boxes are formed.
  • An example of a first bounding shape 430 is depicted in FIG. 4B .
  • the first bounding shape 430 has a rectangular shape and encompasses the first layer 404 , the second layer 406 , and the third layer 408 .
  • the first bounding shape 430 is defined by the layers 404 - 408 that form the stacked boxes.
  • Also depicted is a vector 432 that has a size equivalent to the height of the first bounding shape 430.
  • a volume error of the first bounding shape may be computed.
  • the volume error may be defined as the difference between a volume of a bounding shape and a volume of the portion of the 3D object contained in the bounding shape.
  • the volume error of the first bounding shape 430 may be computed by computing the volume of the first bounding shape 430 and the volume of the 3D object 402 contained in the first bounding shape 430 and subtracting the volume of the 3D object 402 from the volume of the first bounding shape 430 .
  • the volume error for each of the layers forming the stacked boxes may be computed and recorded.
  • the volume errors for each of the layers may be computed and the volume error for the first bounding shape may be computed from the computed volume errors of each of the layers.
  • the volume error for the first bounding shape may be computed by adding the volume errors of the layers contained in the bounding shape.
  • the volume errors for each of the layers 404 , 406 , and 408 may be computed and recorded and the volume error for the bounding shape 430 may be computed by adding up the volume errors for each of the layers 404 , 406 , and 408 .
  • the volume errors of the further partitioned bounding shapes may be computed by adding the volume errors of the layers contained in the partitioned bounding shapes.
  • the volume errors for the partitioned bounding shapes may be computed in a relatively quick and efficient manner.
  • the first bounding shape may be partitioned into two bounding shapes.
  • the first bounding shape may be partitioned into two bounding shapes by splitting the first bounding shape such that the two partitioned bounding shapes have similar volume errors with respect to each other.
  • the first bounding shape may be partitioned such that the difference in volume errors between the partitioned bounding shapes is minimized.
  • the volume errors for each of the partitioned bounding shapes may be computed by adding up the previously computed volume errors of the layers contained in each of the partitioned bounding shapes.
  • the first bounding shape may be partitioned along a plane that extends across the second 412 and the third 414 directions and through a layer.
  • int bipartition(int i0, int i1, float *err)
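  • The patent text gives only the signature above; the C++ body below is a minimal sketch of one way such a function might choose the split, assuming err holds the precomputed per-layer volume errors and that the returned index is the first layer of the second half. The layerVolumeError helper is likewise an illustrative assumption that approximates a layer's error from its box footprint area, superset-polygon area, and thickness.

        #include <cmath>

        // Volume error of one layer: the volume of the layer's box minus the volume
        // of the 3D-object slice it contains (approximated from areas and thickness).
        float layerVolumeError(float boxFootprintArea, float polygonArea, float thickness) {
            return (boxFootprintArea - polygonArea) * thickness;
        }

        // Split the inclusive layer range [i0, i1] so that the summed per-layer volume
        // errors of the two halves are as close as possible; returns the first layer
        // index of the second half.
        int bipartition(int i0, int i1, float *err) {
            if (i1 <= i0) return i1;                     // nothing to split
            float total = 0.0f;
            for (int i = i0; i <= i1; ++i) total += err[i];

            int best = i0 + 1;
            float left = 0.0f, bestDiff = -1.0f;
            for (int split = i0 + 1; split <= i1; ++split) {
                left += err[split - 1];                  // error of layers i0 .. split-1
                float diff = std::fabs(2.0f * left - total);   // |left half - right half|
                if (bestDiff < 0.0f || diff < bestDiff) { bestDiff = diff; best = split; }
            }
            return best;                                 // halves [i0, best-1] and [best, i1]
        }
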
  • the first bounding shape in the stacked boxes may be replaced with the two partitioned bounding shapes.
  • FIG. 4C depicts the partitioned bounding shapes as elements 440 and 442 .
  • one of the partitioned bounding shapes 440 may include a first stacked box 404 and the other partitioned bounding shape 442 may include the second stacked box 406 and the third stacked box 408 .
  • a vector 444 depicting a size of the partitioned bounding shape 440 is relatively smaller than the vector 432 depicting the size of the first bounding shape 430 .
  • a volume error of each of the two partitioned bounding shapes may be computed.
  • the volume error of a partitioned bounding shape may be computed by adding up the previously computed volume errors of the layers contained in the partitioned bounding shape.
  • the partitioned bounding shape having the highest volume error may be located. Thus, for instance, a determination may be made as to which of the partitioned bounding shapes 440 and 442 has the highest volume error. If the bounding shapes 440 and 442 have the same volume error, one of the bounding shapes 440 and 442 may be selected at random.
  • the located bounding shape having the highest volume error may be partitioned into two additional bounding shapes.
  • the located bounding shape in the stacked boxes may be replaced with the two additional bounding shapes.
  • An example of the additionally partitioned bounding shapes 450 and 452 replacing one of the partitioned bounding shapes 442 is depicted in FIG. 4D .
  • one of the additionally partitioned bounding shapes 450 may include the second stacked box 406 and the other additionally partitioned bounding shape 452 may include the third stacked box 408 .
  • a vector 454 depicting a size of the additionally partitioned bounding shape 450 may be the same as the vector 444 depicting the size of the partitioned bounding shape 440.
  • a determination may be made as to whether the size of the vector is less than a predetermined threshold value. For instance, a determination may be made as to whether the size of the smallest vector depicting the size of a partitioned bounding shape is smaller than a predetermined threshold value.
  • the predetermined threshold value may be equivalent to the thickness of the layers 404 - 408 and may thus correspond to the resolution at which the 3D model is initially partitioned into the layers 404 - 408 .
  • blocks 310 - 318 may be repeated.
  • the partitioned bounding shape 440 depicted in FIG. 4D may be additionally partitioned because that bounding shape 440 may have the highest volume error.
  • blocks 310 - 318 may be repeated until a determination is made at block 318 that the size of the smallest vector defining a size of a partitioned bounding shape is smaller than the predetermined threshold value, at which point the method 300 may end as indicated at block 320 .
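  • Putting the blocks of the method 300 together, the C++ sketch below shows one way the iterative partitioning might be organized; it is an illustration rather than the patent's implementation. It assumes the per-layer volume errors have already been computed, reuses the bipartition sketch above, and measures a bounding shape's vector as its layer count multiplied by the layer thickness.

        #include <algorithm>
        #include <vector>

        int bipartition(int i0, int i1, float *err);   // see the sketch above

        // A bounding shape along the first (build) direction: an inclusive range of layers.
        struct ShapeRange { int first, last; };

        // Repeatedly split the bounding shape with the highest summed volume error until
        // the smallest bounding shape's thickness drops below the threshold.
        std::vector<ShapeRange> partitionLayers(std::vector<float> layerErr,
                                                float layerThickness,
                                                float minThickness) {
            std::vector<ShapeRange> shapes{ {0, static_cast<int>(layerErr.size()) - 1} };

            auto rangeError = [&](const ShapeRange& r) {
                float e = 0.0f;
                for (int i = r.first; i <= r.last; ++i) e += layerErr[i];
                return e;
            };
            auto smallestVector = [&]() {
                float t = layerErr.size() * layerThickness;
                for (const ShapeRange& r : shapes)
                    t = std::min(t, (r.last - r.first + 1) * layerThickness);
                return t;
            };

            while (smallestVector() >= minThickness) {
                // Locate the partitioned bounding shape having the highest volume error.
                std::size_t worst = 0;
                for (std::size_t i = 1; i < shapes.size(); ++i)
                    if (rangeError(shapes[i]) > rangeError(shapes[worst])) worst = i;
                const ShapeRange r = shapes[worst];
                if (r.first == r.last) break;            // a single layer cannot be split
                const int split = bipartition(r.first, r.last, layerErr.data());
                // Replace the located bounding shape with the two partitioned shapes.
                shapes[worst] = { r.first, split - 1 };
                shapes.push_back({ split, r.last });
            }
            return shapes;
        }
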
  • a result of implementing the method 300 may be the identification of a plurality of partitioned bounding shapes that contain polygons representing the 3D object 402 and that extend along parallel planes across two axes.
  • Various operations of the method 300 may also be implemented to further partition the bounding shapes in the other two directions 412 and 414 . That is, for instance, the processing device 110 may execute the instructions 130 to partition the superset polygon in the respective stacked boxes into partitioned bounding shapes through implementation of various operations of the method 300 on each of the partitioned bounding shapes.
  • a plurality of partitioned bounding shapes 440 , 450 , and 452 may be identified.
  • the processing device 110 may, for each of the partitioned bounding shapes 440 , 450 , and 452 , further partition the partitioned bounding shape 440 , 450 , 452 along second planes, for instance, planes that extend vertically across the first 410 and second 412 directions as shown in the diagram 460 in FIG. 4E .
  • the processing device 110 may thus execute blocks 306 - 320 on each of the partitioned bounding shapes to further partition the bounding shapes along second planes.
  • the processing device 110 may execute blocks 306 - 320 on each of the further partitioned bounding shapes along third planes, e.g., planes that extend along the first 410 and third 414 directions.
  • An example of the further partitioned bounding shapes is shown in the diagram 470 in FIG. 4F .
  • the further partitioned bounding shapes may be individual cells 472 having cube shapes.
  • each of the cells 472 may include a polygon of the superset polygon contained in the stacked boxes.
  • the cells 472 may be formed to have other geometric shapes, such as, honeycomb shapes, triangular shapes, etc.
  • the resolutions at which the cells 472 may be formed may be user-defined. For instance, a user may specify resolution values (NX, NY, NZ) along the first, second and the third directions for example, the X, Y, and Z directions. Higher values for the resolution values (NX, NY, NZ) may result in better body-fitted resolutions, which entail more geometrical entities being analyzed.
  • the ceiling limit, which caps the maximum number of spatial points to be processed, may be denoted by (NX+1)*(NY+1)*(NZ+1).
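  • For example, resolution values of (NX, NY, NZ) = (16, 16, 16) give a ceiling of 17 * 17 * 17 = 4,913 spatial points, while (64, 64, 64) gives 274,625. A trivial helper (not part of the patent) makes the limit explicit:

        // Maximum number of spatial points implied by user-specified resolution values,
        // per the ceiling limit (NX+1)*(NY+1)*(NZ+1) noted above.
        long long maxSpatialPoints(int nx, int ny, int nz) {
            return static_cast<long long>(nx + 1) * (ny + 1) * (nz + 1);
        }
        // maxSpatialPoints(16, 16, 16) == 4913;  maxSpatialPoints(64, 64, 64) == 274625
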
  • the computing apparatus 100 may be used for a large spectrum of applications ranging from packing procedures that generally specify a particular number of geometric entities for processing to caging procedures where higher resolution is preferred.
  • Although the simple design of the 3D object 402 may result in the bounding shapes containing the polygons having a rectangular or square-shaped outline, other 3D object designs may result in the bounding shapes having more complicated outlines.
  • An example of a plurality of bounding shapes formed through implementation of the method 300 by partitioning the bounding shapes along multiple planes is depicted in the diagrams shown in FIG. 7 . As may be determined from that figure, the bounding shapes may follow the outline of the 3D object.
  • Although the bounding shapes have been described and depicted as square or rectangular boxes, it should be understood that the bounding shapes may have other suitable shapes, including triangular, hexagonal, octagonal, etc.
  • the processing device 110 may further partition the partitioned bounding shapes through a separate operation other than the ones used in the method 300 .
  • Turning now to FIG. 5, there is shown a flowchart of another example method 500 of further partitioning the partitioned boundary shapes in the other two directions. That is, the processing device 110 may further partition each of the boundary shapes, which have been partitioned along a first direction, in the other two directions in a manner that does not include the operations recited in the method 300.
  • the processing device 110 may implement the method 500 following implementation of the method 300 .
  • the processing device 110 may execute the instructions 130 to partition the superset polygon into bounding shapes to implement the method 500 . That is, the processing device 110 may partition the superset polygons in each of the bounding shapes generated through implementation of the method 300 along each of the other two directions.
  • a bounding shape containing a superset polygon may be selected.
  • the top bounding shape 452 may be selected.
  • the top bounding shape 452 may be bounded by a 2D top surface and a 2D bottom surface.
  • one of the top 2D surface and the bottom 2D surface may be selected.
  • the outline of the part of the 3D object 402 that extends through the selected surface may be identified in the 2D surface.
  • An example of a 2D surface 600 containing the outline 602 of the part of the 3D object 402 is shown in FIGS. 6A-6C .
  • an array of grid boxes may be overlaid or mapped on the selected 2D surface.
  • An example of the mapping or overlaying of the array of grid boxes 612 is shown in FIGS. 6A-6C .
  • the superset polygon 610 may be made up of a plurality of polygons such that a subset of the polygons represent the external contours (or solid portions of the 3D object) and another subset of the polygons represent internal contours or holes.
  • the superset polygon 610 represents a solid star shape. Accordingly, the grid boxes that lie within the polygons that make up the external contours may be turned on while the remaining grid boxes lying within the polygons representing the internal contours may be turned off.
  • a list of polygons within the selected 2D surface may be obtained. For instance, the list of polygons may denote the locations within the grid boxes 612 of where the polygons of the superset polygon 610 are located.
  • fill values for the grid boxes 612 overlaid on the 2D surface may be determined based on the positions of the grid boxes 612 with respect to the polygons in the selected 2D surface.
  • the grid boxes 612 lying within the solid portion of the superset polygon 610 may be set to have a certain fill value while the grid boxes 612 lying outside of the superset polygon may have a different fill value.
  • the shaded portions at 602 and 604 in FIGS. 6B and 6C may represent the progress of a scan line fill procedure in determining the values of grid boxes based on their positions relative to the superset polygon 610 .
  • An example of the cells containing polygons, i.e., cells having certain fill values, is shown in FIG. 6C as shaded cells.
  • the processing device 110 may thus determine the further partitioned boundary shapes as being the cells containing the polygons.
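  • A minimal C++ sketch of determining the fill values is shown below. Instead of the scan-line fill illustrated in the figures, it simply tests the centre of each grid box against all contours of the superset polygon with an even-odd (crossing-number) rule, which likewise turns on boxes inside external contours and leaves boxes inside internal contours (holes) off. The grid layout and helper names are illustrative assumptions.

        #include <vector>

        struct Point2D { double x, y; };
        using Contour = std::vector<Point2D>;

        // Even-odd test: a point inside an outer contour but also inside a nested inner
        // (hole) contour crosses an even number of edges and is reported as outside.
        bool insideSupersetPolygon(const std::vector<Contour>& contours, Point2D p) {
            bool inside = false;
            for (const Contour& c : contours) {
                for (std::size_t i = 0, n = c.size(), j = n - 1; i < n; j = i++) {
                    const Point2D& a = c[j];
                    const Point2D& b = c[i];
                    if ((b.y > p.y) != (a.y > p.y) &&
                        p.x < (a.x - b.x) * (p.y - b.y) / (a.y - b.y) + b.x)
                        inside = !inside;
                }
            }
            return inside;
        }

        // Fill values for an nx-by-ny array of grid boxes overlaid on the 2D surface
        // [minX, maxX] x [minY, maxY]: 1 where the box centre lies inside the superset
        // polygon, 0 otherwise.
        std::vector<int> gridFillValues(const std::vector<Contour>& contours,
                                        double minX, double minY,
                                        double maxX, double maxY,
                                        int nx, int ny) {
            std::vector<int> fill(nx * ny, 0);
            const double dx = (maxX - minX) / nx, dy = (maxY - minY) / ny;
            for (int iy = 0; iy < ny; ++iy)
                for (int ix = 0; ix < nx; ++ix) {
                    const Point2D centre{ minX + (ix + 0.5) * dx, minY + (iy + 0.5) * dy };
                    fill[iy * nx + ix] = insideSupersetPolygon(contours, centre) ? 1 : 0;
                }
            return fill;
        }
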
  • the processing device 110 may determine whether an additional 2D surface exists for which the fill values are to be determined. If yes, a next 2D surface may be selected at block 504 and blocks 506 - 512 may be repeated for the additional 2D surface. If no 2D surfaces remain for processing in the currently selected bounding shape, it is determined at block 514 if another bounding shape remains to be processed. If at block 514 , it is determined that no further bounding shapes remain for processing, the method 500 may terminate on the end block.
  • the method 500 may return to block 502 and blocks 504 - 514 may be repeated for additional bounding shapes until a determination is made that no further partitioned bounding shapes remain to be processed, at which point the method 500 may terminate at the end block.
  • Some or all of the operations set forth in the methods 200 , 300 , and 500 may be contained as utilities, programs, or subprograms, in any desired computer accessible medium.
  • the methods 200 , 300 , and 500 may be embodied by computer programs, which may exist in a variety of forms both active and inactive. For example, they may exist as machine readable instructions, including source code, object code, executable code or other formats. Any of the above may be embodied on a non-transitory computer readable storage medium.
  • non-transitory computer readable storage media include computer system RAM, ROM, EPROM, EEPROM, and magnetic or optical disks or tapes. It is therefore to be understood that any electronic device capable of executing the above-described functions may perform those functions enumerated above.
  • the processing of the partitioned bounding shapes is shown as occurring serially only by way of illustration; the 2D surfaces and the plurality of partitioned bounding shapes may be processed in parallel to speed up the generation of the shape profile.
  • the computing apparatus 100 may include multiple processing devices 110 and/or the processing device 110 may include multiple cores that may process multiple ones of the virtual layers as described with respect to FIG. 5 in parallel.
  • FIG. 7 illustrates examples of various shape profiles that may be generated through implementation of the methods disclosed herein with different resolution values provided for a 3D object 710 .
  • the number of geometrical entities and consequently the compute payload may depend on the resolution values NX, NY, and NZ respectively provided in the various directions, for example, in the X, Y, and Z directions.
  • a first shape profile 720 is based on relatively low resolution values and hence has the fewest geometrical entities (boundary shapes) to be processed.
  • the shape profile 720 may therefore be computed faster than the second and third shape profiles 730 and 740 .
  • the resolution values for the second shape profile 730 may be higher than those used for the first shape profile 720 while the resolution values used for the third shape profile 740 may be higher than those used for the first and second shape profiles 720 , 730 .
  • the third shape profile 740 may produce a 3D printed object that is closer to the original 3D object in shape but may have higher compute payload as compared with the other shape profiles 720 , 730 .
  • the resolution values used to generate the shape profiles 720 , 730 , 740 may be user-selectable based on the applications as outlined herein.
  • the shape profiles 720 , 730 and 740 may be defined by a plurality of partitioned boundary shapes, in which the boundary shapes may be partitioned in the manners described herein.
  • the efficiency of many 3D printing applications may be enhanced by implementing the shape profiles as discussed herein.
  • the shape profiles may be used to enhance security of 3D printing applications.
  • the shape profiles may be approximations of the polygons that represent the 3D object as opposed to identical replicas that are normally printed. Therefore, the 3D objects printed from the shape profiles may conceal the exact shape of a 3D object. As a result, unauthorized personnel may not inadvertently view the details of the 3D objects being printed from the shape profiles as disclosed herein.
  • the user selectable resolution values afforded by the examples disclosed herein may enable better usage of the space within the build volume.
  • a single bounding box, as required by the packing procedures, may be replaced by the vector of stacked boxes, and geometrical operations such as placement, collision detection, and the like may be applied simultaneously to the stacked boxes so that the geometrical operations are executed about the common center of mass of the 3D object enclosed by the stacked boxes.
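  • As an illustration (not taken from the patent) of applying such a geometrical operation directly to the stacked boxes, the C++ sketch below tests two shape profiles for collision by overlap-testing their axis-aligned boxes after translating the second profile to a candidate placement. A packing engine would typically add a spatial index rather than the all-pairs loop shown here.

        #include <vector>

        // An axis-aligned box of a shape profile: minimum corner plus extents.
        struct Box { double x, y, z, dx, dy, dz; };

        static bool boxesOverlap(const Box& a, const Box& b) {
            return a.x < b.x + b.dx && b.x < a.x + a.dx &&
                   a.y < b.y + b.dy && b.y < a.y + a.dy &&
                   a.z < b.z + b.dz && b.z < a.z + a.dz;
        }

        // Collision test between two shape profiles, each represented as a vector of
        // stacked boxes, after translating the second profile by (tx, ty, tz).
        bool profilesCollide(const std::vector<Box>& a, const std::vector<Box>& b,
                             double tx, double ty, double tz) {
            for (const Box& ba : a)
                for (Box bb : b) {
                    bb.x += tx; bb.y += ty; bb.z += tz;
                    if (boxesOverlap(ba, bb)) return true;
                }
            return false;
        }
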
  • Design rules such as including the required gaps between various 3D objects to be accommodated within a build envelope may be facilitated by enlarging the superset polygons for the bounding shapes (e.g., apply a padding) before the partitioning in the second and the third directions.
  • the padding may be added to the superset polygons at various thicknesses depending upon user-defined inputs, geometrical features of the 3D object, materials used to build the 3D object, combinations thereof, etc.
  • Another 3D printing application may include generation of support cages that isolate a 3D object or a group of 3D objects from a powder bed to protect the objects from abrasive finishing processes.
  • the support cages may be designed to allow powder outflow and provide sufficient padding while efficiently using the build volume with high packing density.
  • the shape profiles disclosed herein may enable automated generation of a cage skeleton used for printing the cage.
  • a contour defined by the virtual stacked boxes may be identified from the exposed edges of the shape profile, which may include the superset polygons of the NL layers, and the cage skeleton may be generated from the contour of the stacked boxes.
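  • A minimal C++ sketch of identifying such exposed cells is shown below; it assumes, purely for illustration, that the shape profile has been rasterized into a 3D occupancy grid, and it marks a filled cell as exposed when any of its six face neighbours is empty or lies outside the grid. The exposed cells trace the contour from which a cage skeleton could then be generated.

        #include <vector>

        // Occupancy grid built from a shape profile: fill[ix][iy][iz] is non-zero where
        // the profile contains part of the 3D object.
        using Grid = std::vector<std::vector<std::vector<int>>>;

        // A filled cell is "exposed" when at least one of its six face neighbours is
        // empty or outside the grid.
        bool isExposed(const Grid& fill, int ix, int iy, int iz) {
            if (!fill[ix][iy][iz]) return false;
            const int nx = static_cast<int>(fill.size());
            const int ny = static_cast<int>(fill[0].size());
            const int nz = static_cast<int>(fill[0][0].size());
            const int offsets[6][3] = { {1,0,0}, {-1,0,0}, {0,1,0}, {0,-1,0}, {0,0,1}, {0,0,-1} };
            for (const auto& o : offsets) {
                const int jx = ix + o[0], jy = iy + o[1], jz = iz + o[2];
                if (jx < 0 || jy < 0 || jz < 0 || jx >= nx || jy >= ny || jz >= nz)
                    return true;                        // neighbour lies outside the grid
                if (!fill[jx][jy][jz]) return true;     // neighbour cell is empty
            }
            return false;
        }
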
  • a padding may also be incorporated within the cage skeleton by enlarging the superset polygons for each of the stacked boxes.
  • a cage may be generated for a single part or for a group of parts in accordance with the examples discussed herein.
  • the procedure to generate the cage for a group of parts involves an operation that combines all of the part shapes together with a Boolean union prior to applying the shape profile.
  • Another 3D printing application that may be enhanced by incorporation of the shape profiles disclosed herein may be build bed packing preview, which may allow visual inspection of the placement of a 3D object in the build envelope.
  • Hundreds of 3D objects may be potentially included within the build envelope.
  • the shape profiles for the 3D objects as disclosed herein may allow significant geometrical entity reduction. This may reduce the data size required to transmit a view of the contents of the build envelope thereby reducing the network bandwidth required to transmit the view. Moreover, transmitting lower data sizes may also reduce the latency which may result in more responsive web interfaces.
  • the shape profiles as disclosed herein when implemented in the cloud-based visual inspection applications may reduce the cost of the visual inspection services while enhancing user experience.
  • the flexibility afforded by implementing the shape profiles disclosed herein may not only enable providing previews of the contents of the build envelope but may also enable cloud-based editing of the build envelope. Users previewing the contents of the build envelope may be enabled to provide alternative arrangements for shapes within the build envelope.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Image Generation (AREA)

Abstract

According to an example, a processing device may slice a 3D model along a first direction to generate a plurality of layers in parallel planes defined across a second direction and a third direction, in which each of the plurality of layers is composed of respective polygons representing portions of the 3D model. The plurality of layers may be partitioned into a plurality of stacked boxes containing the respective polygons and for each stacked box of the plurality of stacked boxes, the polygons in the stacked box may be assembled into a superset polygon and the superset polygon may be partitioned into bounding shapes. A shape profile of the 3D object may be generated using the bounding shapes.

Description

    BACKGROUND
  • 3D (three-dimensional) manufacturing systems typically employ additive manufacturing techniques to build or print parts within a 3D build envelope of the 3D manufacturing system. As individual parts typically do not require the entire 3D build envelope, 3D manufacturing systems are often operated to build multiple distinct parts within the build envelope concurrently during a common build operation.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Features of the present disclosure are illustrated by way of example and not limited in the following figure(s), in which like numerals indicate like elements, in which:
  • FIG. 1 is a block diagram of an example computing apparatus for generating a shape profile for a 3D object;
  • FIG. 2 depicts an example method for generating a shape profile for a 3D model;
  • FIG. 3 shows an example method for partitioning superset polygons into bounding shapes;
  • FIGS. 4A-4F illustrate example diagrams depicting various stages of bounding shape formation and partitioning;
  • FIG. 5 shows a flowchart of an example method of further partitioning partitioned boundary shapes;
  • FIGS. 6A-6C, respectively, show an example mapping of an array of grid boxes onto a surface of a boundary shape; and
  • FIG. 7 illustrates examples of various shape profiles that may be generated with different resolution values provided for a 3D object.
  • DETAILED DESCRIPTION
  • For simplicity and illustrative purposes, the present disclosure is described by referring mainly to an example thereof. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure. As used herein, the terms “a” and “an” are intended to denote at least one of a particular element, the term “includes” means includes but not limited to, the term “including” means including but not limited to, and the term “based on” means based at least in part on.
  • Industrial grade 3D printers have large build volumes so that print service providers (PSPs) that need to produce many discrete and oftentimes different, complex parts have the necessary high throughput. In a typical industrial setting, a 3D printer may produce hundreds of parts (or equivalently, 3D objects) in a batch, and each part may have tens of thousands of geometrical entities to be analyzed. In order to achieve the required high throughput, the printing process may be optimized to balance printing large quantities of geometrical entities, which requires analysis of complex shapes, against providing a quick turn-around. Printing applications may also include multiple data pipeline applications that may demand quick computations. For example, when optimally orienting and placing many different parts into the same build volume (parts packaging), it can be challenging to apply all the geometrical entities precisely within the build volume given the limited computing resources. Cage generation and cloud-based build bed previews and modifications are other processes that may also require fast computations.
  • Disclosed herein are apparatuses and methods for generating shape profiles for 3D objects, for instance, 3D objects that are to be printed. The shape profiles for the 3D objects may include geometrical entities that closely approximate the shapes of 3D objects. That is, the shape profiles may approximate the shapes of the 3D objects with significantly fewer geometrical entities than the actual shapes of the 3D objects contain. In one regard, therefore, computations implementing the shape profiles may be significantly faster than computations implementing more accurate geometrical entities of the actual 3D objects.
  • The generated shape profiles may be used in the optimization of packaging of parts for 3D printing so that a significant reduction in geometric entities of shapes may be achieved, which in turn may enable faster computing and efficient data transmission even while ensuring accuracy of prints. Particularly, the methods disclosed herein may include the partitioning of layers into a plurality of stacked boxes, in which the stacked boxes include respective polygons that represent portions of a 3D model. In addition, for each of the stacked boxes, the polygons in the stacked box may be assembled into a superset polygon and the superset polygon may be partitioned into bounding shapes. The bounding shapes may be further partitioned in multiple directions to form cells that contain fill values corresponding to the polygons contained in the cells. That is, for instance, the bounding shapes may be further partitioned based upon computed volume errors and replacement of the partitioned bounding shapes with further partitioned bounding shapes. In addition, the shape profile of a 3D model may be generated from the formed cells, in which the cells may have any suitable geometrical shapes such as cubes, honeycombs, triangles, tetrahedrons, etc. For instance, the cells may have any geometrical shapes that may be used to discretize a solid object. The shape profile disclosed herein may define a geometry corresponding to the plurality of stacked boxes.
  • Through implementation of the methods and computing apparatuses disclosed herein to generate shape profiles, the amount of time and processing power that 3D printers may require to process a plurality of 3D objects for printing may be significantly reduced by using the generated shape profiles over other processing techniques. In addition, the amount of storage space required to store profiles of the 3D objects may be significantly reduced, which may also result in greater efficiency in the operations of the 3D printers and/or data storage devices. As discussed herein, in an example, the computing apparatuses may be included in or may be 3D printers and may thus generate shape profiles, implement the generated shape profiles (e.g., use the shape profiles in parts packaging and/or cage generation operations), and may print 3D objects using the generated shape profiles.
  • Referring now to the figures, FIG. 1 is a block diagram of an example computing apparatus 100 for generating a shape profile for a 3D model. It should be understood that the computing apparatus 100 depicted in FIG. 1 may include additional components and that some of the components described herein may be removed and/or modified without departing from a scope of the computing apparatus 100.
  • The computing apparatus 100 is depicted as including a processing device 110 and a non-transitory computer readable storage medium 120. The processing device 110 may fetch, decode, and execute processor-readable instructions, such as instructions 122-132 stored on the computer readable storage medium 120, to control various processes for generating a shape profile of a 3D model. As an alternative or in addition to retrieving and executing instructions, the processing device 110 may include one or more electronic circuits that include electronic components for performing the functionalities of the instructions 122-132.
  • The processing device 110 may execute the instructions 122 to access a 3D model of a 3D object that is to be 3D printed. The processing device 110 may access the 3D model and/or information regarding the 3D model from an external device or the information may be retrieved from a local data store (not shown). In one example, the 3D object may be a unitary piece composed of a plurality of sections and extending along first, second and third directions, for example, the X, Y, and Z directions of a Cartesian coordinate system. By way of illustration and not limitation, the second and third directions may extend along the length and breadth of a build area platform (not shown) of a 3D printer, while the first direction may extend normal to the plane formed by the second and third directions, along the height of the build volume (or build direction) in which the 3D printer builds the 3D object.
  • The processing device 110 may execute the instructions 124 to slice the 3D model along a first direction to produce a plurality of layers in parallel planes that are defined across the second and third directions. For instance, the first, second, and third directions may be orthogonal to each other, as in a Cartesian coordinate system. Each of the layers may thus be sliced through the 3D model such that each of the layers is composed of polygons that represent portions or sections of the 3D model. The polygons in a layer may be contours of the 3D model in that layer. In addition, the polygons may be marked, for instance, using an ordering of nodes in a clockwise or counter-clockwise manner, to be in one of two groups. The first group may be an outer contour group, the inside of which is solid and so represents interior portions of the 3D model. The second group may be an inner contour group, the inside of which is a hole.
  • According to an example, each of the layers may be generated to have the same thickness and the thickness may be of a predetermined size. For instance, the thickness of each of the layers may be based upon a printing resolution of a 3D printer that is to print the 3D object, e.g., around 100 microns. In other examples, the thickness of each of the layers may be relatively larger, e.g., around 1 millimeter. In still further examples, the thickness of each of the layers may be user-defined. In another example, the thicknesses of the layers may not be constant, i.e., may not be the same with respect to each other. Instead, the layers may be generated to have different thicknesses with respect to each other.
  • The processing device 110 may execute the instructions 126 to partition the plurality of layers into a plurality of stacked boxes containing the respective polygons. According to an example, the spaces between the locations at which the layers have been sliced may be construed as the stacked boxes. In this regard, because the polygons representing features of the 3D model are contained in those locations, the stacked boxes may also contain the respective polygons.
  • The processing device 110 may execute the instructions 128 to, for each of the stacked boxes, assemble the polygons in the stacked box into a superset polygon. For instance, for a particular stacked box, the processing device 110 may assemble the polygons in that stacked box together to form the superset polygon, or a polygon that includes all of the polygons in the stacked box.
  • The processing device 110 may execute the instructions 130 to, for each of the stacked boxes, partition the superset polygon into bounding shapes. For instance, the processing device 110 may partition the superset polygon into a plurality of bounding shapes such that each of the bounding shapes includes a polygon from the superset polygon. As discussed in greater detail herein below, the bounding shapes may be formed through an iterative partitioning operation based upon computed volume errors in the partitioned bounding shapes. In addition, the bounding shapes may include any suitable geometric shape including, but not limited to, square boxes, rectangular boxes, triangle-shaped boxes, octagon-shaped boxes, etc. For instance, the bounding shapes may be any geometrical entity that may be used to discretize a solid object. Moreover, the sizes of the bounding shapes may be varied and may be based upon user input. For instance, a user may define the resolution at which the bounding shapes are generated to control the amount of storage space occupied by a shape profile generated using the bounding shapes.
  • The processing device 110 may also execute the instructions 132 to generate a shape profile of the 3D object using the bounding shapes. The shape profile may include the bounding shapes arranged together according to the polygons of the 3D model contained in the bounding shapes. In addition, the amount of storage space required by the shape profile may depend upon the resolution of the shape profiles, which, as discussed above, may be user-defined. The shape profile of the 3D object may be a graphical representation of the generated bounding shapes and/or a data representation of the bounding shapes.
  • According to an example, the processing device 110 may store the generated shape profile in a local data store and/or may communicate the generated shape profile to a 3D printer. Additionally, a plurality of the generated shape profiles may be used in a packing operation that may be implemented to determine an arrangement at which a plurality of 3D objects are to be printed in a build envelope of a 3D printer. Particularly, the packing operation may be implemented to determine how the generated shape profiles may be arranged to maximize the number of 3D objects that may be printed in the build envelope during a single printing operation. In one regard, by using the shape profiles of the 3D objects instead of the 3D models, the computational requirement and the time to execute the packing operation may be significantly reduced. In one example, the processing device 110 may implement the packing operation using the generated shape profiles. In another example, a separate processing device may implement the packing operation using the shape profiles generated by the processing device 110.
  • The computing apparatus 100 may be a computing device, such as a personal computer, a server computer, a printer, a 3D printer, a smartphone, a tablet computer, etc. The processing device 110 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, an application specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the computer readable storage medium 120. The computer readable storage medium 120 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, the computer readable storage medium 120 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like. In some implementations, the computer readable storage medium 120 may be a non-transitory computer readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • Although not shown, the computing apparatus 100 may include a data store with which the processing device 110 may be in communication. The data store may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, phase change RAM (PCRAM), memristor, flash memory, and the like. The computing apparatus 100 may further include an input/output interface (not shown) through which the processing device 110 may communicate with an external device(s) (not shown), for instance, to receive and store the information pertaining to the 3D objects, e.g., 3D models, user-defined resolution values, etc. The input/output interface may include hardware and/or software to enable the processing device 110 to communicate with the external device(s). The input/output interface may enable a wired or wireless connection to the external device(s). The input/output interface may further include a network interface card and/or may also include hardware and/or software to enable the processing device 110 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display, another computing device, etc., through which a user may input instructions into the apparatus 100.
  • Various manners in which the computing apparatus 100 may be implemented are discussed in greater detail with respect to the methods 200 and 300 respectively depicted in FIGS. 2 and 3. Particularly, FIG. 2 depicts an example method 200 for generating a shape profile for a 3D model and FIG. 3 depicts an example method 300 for partitioning superset polygons into bounding shapes. It should be apparent to those of ordinary skill in the art that the methods 200 and 300 may represent generalized illustrations and that other operations may be added or existing operations may be removed, modified, or rearranged without departing from the scopes of the methods 200 and 300.
  • The descriptions of the methods 200 and 300 are made with reference to the computing apparatus 100 illustrated in FIG. 1 for purposes of illustration. It should, however, be clearly understood that computing apparatuses having other configurations may be implemented to perform either or both of the methods 200 and 300 without departing from the scopes of the methods 200 and 300.
  • With reference first to FIG. 2, at block 202, a 3D model of a 3D object may be accessed. For instance, the processing device 110 may execute the instructions 122 to access the 3D model, which may be a graphical representation of a 3D object that a 3D printer is to print, data pertaining to the physical parameters of the 3D object, or the like. In addition, the processing device 110 may access the 3D model from a local data storage or may receive the 3D model from an external source.
  • At block 204, the processing device 110 may execute the instructions 124 to slice the 3D model along a first direction to generate a plurality of layers in the 3D model. The first direction may be, for instance, the z-direction in the Cartesian coordinate system and each of the plurality of layers may extend along respective parallel planes that are defined across a second direction and a third direction. The second direction may be the x-direction and the third direction may be the y-direction in the Cartesian coordinate system. Each of the layers may thus be sliced through the 3D model such that each of the layers is composed of polygons that represent portions or sections of the 3D model. The polygons in a layer may be contours of the 3D model in that layer as discussed above. In addition, each of the layers may have the same thickness or different thicknesses as discussed above.
  • An example of a 3D model 402 is depicted in FIG. 4A and may be used to illustrate some of the features of the method 200. As shown in FIG. 4A, the 3D model 402 may be sliced into a plurality of layers 404-408 such that each of the layers 404-408 has the same thickness with respect to each other. However, the layers 404-408 may have different thicknesses with respect to each other. As noted above, layers 404-408 may have a thickness that varies from about 100 microns to about 1 millimeter. As also shown in FIG. 4A, the layers 404-408 may be stacked in a first direction 410 and extend along respective parallel planes that are defined across a second direction 412 and a third direction 414. The directions 410-414 are depicted as being orthogonal to each other and may correspond to the axes of a Cartesian coordinate system.
  • In addition, each of the layers 404-408 is depicted as being composed of respective polygons 420, which are shown in dashed lines. That is, for instance, each of the layers 404-408 may include polygons that represent respective portions of the 3D model 402.
  • At block 206, the processing device 110 may execute the instructions 126 to partition the plurality of layers into a plurality of stacked boxes containing the respective polygons. According to an example, the spaces between the locations at which the layers have been sliced may be construed as the stacked boxes. In this regard, because the polygons representing features of the 3D model are contained in those locations, the stacked boxes may also contain the respective polygons. Various manners in which the layers may be partitioned into stacked boxes are described in greater detail below with respect to the method 300.
  • At block 208, the processing device 110 may execute the instructions 128 to, for each of the stacked boxes, assemble the polygons in the stacked box into a superset polygon. For instance, for a particular stacked box, the processing device 110 may assemble the polygons in that stacked box together to form the superset polygon. The superset polygon may be a polygon that includes all of the polygons in the stacked box. As shown in FIG. 4A, each of the layers 404-408 may represent a respective stacked box and the polygons 420 in each of the stacked boxes may be a respective superset polygon.
  • At block 210, the processing device 110 may execute the instructions 130 to, for each of the stacked boxes, partition the superset polygon into bounding shapes. For instance, the processing device 110 may partition the superset polygon into a plurality of bounding shapes such that each of the bounding shapes includes a polygon from the superset polygon. As discussed in greater detail herein below with respect to the method 300 in FIG. 3, the bounding shapes may be formed through an iterative partitioning operation based upon computed volume errors in the partitioned bounding shapes. In addition, as discussed above, the bounding shapes may include any suitable geometric shape, the sizes of the bounding shapes may be varied, and the sizes may be based upon user input.
  • According to an example, the processing device 110 may apply a padding to enlarge the superset polygon in the stacked boxes and the processing device 110 may partition the enlarged superset polygon. In one regard, the enlarged superset polygon may cause the shape profile generated for the 3D object to be relatively larger than the 3D object to, for instance, provide padding between the 3D object and another 3D object in a build envelope of a 3D printer. The padding may be uniform in thickness or may have user-defined forms. For instance, certain geometrical features may be provided with more padding to increase insulation from neighboring 3D objects. In this case, the padding thickness may depend on the geometrical features of the 3D object. The padding thickness may also depend on the particular materials used in building the 3D object or the portions of the 3D object containing the geometrical features. In an example in which a 3D object is formed of multiple materials, different applied materials may require different insulation thicknesses to be decoupled from neighboring 3D objects.
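  • As a minimal sketch of uniform padding, assuming the padding is applied to the axis-aligned extents of the superset polygon rather than as a true offset of its contours, the following enlarges an extent by a fixed distance; the Extent2D type and the pad_extent name are illustrative only. A feature- or material-dependent padding could pass a different distance per side instead of a single value.

    /* Hypothetical axis-aligned extent of a superset polygon within one stacked box. */
    typedef struct { double xmin, xmax, ymin, ymax; } Extent2D;

    /* Grow the extent uniformly by pad on every side, enlarging the region that
     * the subsequent partitioning in the second and third directions will cover. */
    static Extent2D pad_extent(Extent2D e, double pad)
    {
        e.xmin -= pad;
        e.xmax += pad;
        e.ymin -= pad;
        e.ymax += pad;
        return e;
    }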
  • At block 212, the processing device 110 may execute the instructions 132 to generate a shape profile of the 3D object using the bounding shapes. The shape profile may include the bounding shapes arranged together according to the arrangement of the polygons of the 3D model contained in the bounding shapes. In addition, the amount of storage space required by the shape profile may depend upon the resolution of the shape profiles, which, as discussed above, may be user-defined. The shape profile of the 3D object may be a graphical representation of the generated bounding shapes and/or a data representation of the bounding shapes. For instance, the shape profile may include various values to identify the locations of the bounding shapes.
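  • One possible data representation of such a shape profile, assuming axis-aligned bounding shapes identified by their minimum and maximum corners in the build volume, is sketched below; the BoundingShape and ShapeProfile names and fields are assumptions made for this illustration.

    #include <stddef.h>

    /* Hypothetical record identifying the location and size of one bounding shape. */
    typedef struct {
        double xmin, ymin, zmin;
        double xmax, ymax, zmax;
    } BoundingShape;

    /* A shape profile as a flat list of bounding shapes; the storage it occupies
     * grows with the user-defined resolution. */
    typedef struct {
        BoundingShape *shapes;
        size_t count;
    } ShapeProfile;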
  • With reference now to FIG. 3, there is shown an example method 300 for partitioning superset polygons into bounding shapes. In other words, the method 300 may depict an example of operations that may be performed at block 210 in the method 200 depicted in FIG. 2. According to an example, the processing device 110 may execute the instructions 126 to perform the operations described with respect to the method 300.
  • At block 302, a vector of a first bounding shape that is defined by a first layer and a last layer of the plurality of layers forming the stacked boxes may be initialized. For instance, the first bounding shape may extend from the first layer to the last layer in which the stacked boxes are formed. An example of a first bounding shape 430 is depicted in FIG. 4B. As shown in that figure, the first bounding shape 430 has a rectangular shape and encompasses the first layer 404, the second layer 406, and the third layer 408. In other words, the first bounding shape 430 is defined by the layers 404-408 that form the stacked boxes. Also shown in that figure is a vector 432 that has a size that is equivalent to the height of the bounding shape 430.
  • At block 304, a volume error of the first bounding shape may be computed. The volume error may be defined as the difference between the volume of a bounding shape and the volume of the portion of the 3D object contained in the bounding shape. In this regard, and with reference to FIG. 4B, the volume error of the first bounding shape 430 may be computed by computing the volume of the first bounding shape 430 and the volume of the 3D object 402 contained in the first bounding shape 430 and subtracting the volume of the 3D object 402 from the volume of the first bounding shape 430.
  • According to an example, the volume error for each of the layers forming the stacked boxes may be computed and recorded. Thus, for instance, at block 304, the volume errors for each of the layers may be computed and the volume error for the first bounding shape may be computed from the computed volume errors of each of the layers. For instance, the volume error for the first bounding shape may be computed by adding the volume errors of the layers contained in the bounding shape. By way of example, the volume errors for each of the layers 404, 406, and 408 may be computed and recorded and the volume error for the bounding shape 430 may be computed by adding up the volume errors for each of the layers 404, 406, and 408. Moreover, as discussed in greater detail herein below, as the bounding shape is further partitioned, the volume errors of the further partitioned bounding shapes may be computed by adding the volume errors of the layers contained in the partitioned bounding shapes. In one regard, by initially computing the volume errors for the layers, the volume errors for the partitioned bounding shapes may be computed in a relatively quick and efficient manner.
  • Shown below is an example program flow for computing the volume error of a bounding shape:
  • float getVolumeErrs(int i0, int i1) {
        // [i0, i1] defines a section of contiguous layers (this virtual bounding shape).
        // Bounding shape volume: compute the largest X/Y extents (min/max) over the
        // 2D virtual layers for all sliced layers in [i0, i1], then multiply by the
        // section height, H*(i1 - i0 + 1).
        // 3D object volume: sum the 2D layer areas for all sliced layers in [i0, i1],
        // then multiply by the slice layer thickness.
        volumeError = boundingShapeVolume - objectVolume
        return volumeError
    }
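  • A minimal concrete sketch of the program flow above is shown below, assuming that the X/Y extents and the 2D area of every sliced layer were recorded during slicing; the layerXMin, layerXMax, layerYMin, layerYMax, and layerArea arrays and the layer thickness H are hypothetical names introduced only for this example.

    /* Assumed per-layer data recorded during slicing (hypothetical globals). */
    extern float layerXMin[], layerXMax[], layerYMin[], layerYMax[], layerArea[];
    extern float H;   /* slice layer thickness */

    float getVolumeErrs(int i0, int i1)
    {
        float xmin = layerXMin[i0], xmax = layerXMax[i0];
        float ymin = layerYMin[i0], ymax = layerYMax[i0];
        float objectVolume = 0.0f;

        for (int i = i0; i <= i1; i++) {
            if (layerXMin[i] < xmin) xmin = layerXMin[i];
            if (layerXMax[i] > xmax) xmax = layerXMax[i];
            if (layerYMin[i] < ymin) ymin = layerYMin[i];
            if (layerYMax[i] > ymax) ymax = layerYMax[i];
            objectVolume += layerArea[i] * H;   /* sum of 2D areas times thickness */
        }

        /* Largest X/Y extents multiplied by the section height H*(i1 - i0 + 1). */
        float boundingVolume = (xmax - xmin) * (ymax - ymin)
                               * H * (float)(i1 - i0 + 1);

        return boundingVolume - objectVolume;   /* volume error */
    }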
  • At block 306, the first bounding shape may be partitioned into two bounding shapes. For instance, the first bounding shape may be partitioned into two bounding shapes by splitting the first bounding shape such that each of the partitioned bounding shapes has a similar volume error with respect to the other. In other words, the first bounding shape may be partitioned such that the difference in volume errors between the partitioned bounding shapes is minimized. As discussed above, the volume errors for each of the partitioned bounding shapes may be computed by adding up the previously computed volume errors of the layers contained in each of the partitioned bounding shapes. In addition, the first bounding shape may be partitioned along a plane that extends across the second 412 and the third 414 directions and through a layer.
  • Shown below is an example program flow for a bi-partition methodology which may be used to select the partition plane of a bounding shape to partition the bounding shape:
  • int bipartition(int i0, int i1, float *err) {
        // i0 and i1 are the starting and ending layer IDs for the section that is
        // to be partitioned via the binary partitioning.
        // Returns the ID of the partition layer.
        int head = i0;
        int tail = i1;
        while ((tail - head) > 1) {
            int ip = (head + tail) / 2;
            err[0] = getVolumeErrs(i0, ip);
            err[1] = getVolumeErrs(ip, i1);
            if ((err[0] - err[1]) > tolerance) {   // tolerance can be 0.1*totalVolume
                tail = ip;
            }
            else if ((err[1] - err[0]) > tolerance) {
                head = ip;
            }
            else {
                return ip;
            }
        }
        return (i0 + i1) / 2;   // head and tail are adjacent; fall back to the midpoint
    }
  • At block 308, the first bounding shape in the stacked boxes may be replaced with the two partitioned bounding shapes. An example of the partitioned bounding shapes replacing the first bounding shape 430 is shown in FIG. 4C, which depicts the partitioned bounding shapes as elements 440 and 442. As shown, one of the partitioned bounding shapes 440 may include a first stacked box 404 and the other partitioned bounding shape 442 may include the second stacked box 406 and the third stacked box 408. As also shown in FIG. 4C, a vector 444 depicting a size of the partitioned bounding shape 440 is relatively smaller than the vector 432 depicting the size of the first bounding shape 430.
  • At block 310, a volume error of each of the two partitioned bounding shapes may be computed. As discussed above, the volume error of a partitioned bounding shape may be computed by adding up the previously computed volume errors of the layers contained in the partitioned bounding shape.
  • At block 312, the partitioned bounding shape having the highest volume error may be located. Thus, for instance, a determination may be made as to which of the partitioned bounding shapes 440 and 442 has the highest volume error. If the bounding shapes 440 and 442 have the same volume error, one of the bounding shapes 440 and 442 may be selected at random.
  • At block 314, the located bounding shape having the highest volume error may be partitioned into two additional bounding shapes. In addition, at block 316 the located bounding shape in the stacked boxes may be replaced with the two additional bounding shapes. An example of the additionally partitioned bounding shapes 450 and 452 replacing one of the partitioned bounding shapes 442 is depicted in FIG. 4D. As shown, one of the additionally partitioned bounding shapes 450 may include the second stacked box 406 and the other additionally partitioned bounding shape 452 may include the third stacked box 408. As also shown in FIG. 4D, a vector 454 depicting a size of the additionally partitioned bounding shape 450 may be the same as the vector 444 depicting the size of the partitioned bounding shape 440.
  • At block 318, a determination may be made as to whether the size of the vector is less than a predetermined threshold value. For instance, a determination may be made as to whether the size of the smallest vector depicting the size of a partitioned bounding shape is smaller than a predetermined threshold value. The predetermined threshold value may be equivalent to the thickness of the layers 404-408 and may thus correspond to the resolution at which the 3D model is initially partitioned into the layers 404-408.
  • In response to a determination that the size of the vector is greater than or equal to the predetermined threshold value, blocks 310-318 may be repeated. Thus, for instance, the partitioned bounding shape 440 depicted in FIG. 4D may be additionally partitioned because that bounding shape 440 may have the highest volume error. In addition, blocks 310-318 may be repeated until a determination is made at block 318 that the size of the smallest vector defining a size of a partitioned bounding shape is smaller than the predetermined threshold value, at which point the method 300 may end as indicated at block 320. A result of implementing the method 300 may be that a plurality of partitioned bounding shapes that contain polygons representing the 3D object 402 and extend along parallel planes across two axes may be identified.
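  • By way of illustration only, the sketch below strings the getVolumeErrs and bipartition flows above into the iterative loop of blocks 306-318. The BShape record, the partitionStack name, the maxShapes capacity, and the choice to assign the partition layer only to the lower of the two resulting shapes (matching FIG. 4C rather than the layer-sharing convention of the program flows) are assumptions made for this sketch.

    /* Hypothetical record for one bounding shape along the build direction:
     * the first and last layer it spans, and its cached volume error. */
    typedef struct { int i0, i1; float err; } BShape;

    float getVolumeErrs(int i0, int i1);           /* see the program flows above */
    int bipartition(int i0, int i1, float *err);

    /* Repeatedly split the bounding shape with the highest volume error until every
     * shape spans no more than minLayers layers (and no fewer than one layer) or
     * the output array is full. Returns the number of bounding shapes written. */
    int partitionStack(int firstLayer, int lastLayer, int minLayers,
                       BShape *out, int maxShapes)
    {
        int count = 1;
        out[0].i0  = firstLayer;
        out[0].i1  = lastLayer;
        out[0].err = getVolumeErrs(firstLayer, lastLayer);

        while (count < maxShapes) {
            /* Locate the bounding shape with the highest volume error that still
             * spans more than minLayers layers. */
            int worst = -1;
            for (int i = 0; i < count; i++) {
                int span = out[i].i1 - out[i].i0 + 1;
                if (span > 1 && span > minLayers &&
                    (worst < 0 || out[i].err > out[worst].err))
                    worst = i;
            }
            if (worst < 0)
                break;   /* every remaining shape is at or below the threshold */

            float err[2];
            int ip = bipartition(out[worst].i0, out[worst].i1, err);

            /* Replace the located shape with the two partitioned shapes. */
            out[count].i0  = ip + 1;
            out[count].i1  = out[worst].i1;
            out[count].err = getVolumeErrs(out[count].i0, out[count].i1);
            out[worst].i1  = ip;
            out[worst].err = getVolumeErrs(out[worst].i0, ip);
            count++;
        }
        return count;
    }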
  • Various operations of the method 300 may also be implemented to further partition the bounding shapes in the other two directions 412 and 414. That is, for instance, the processing device 110 may execute the instructions 130 to partition the superset polygon in the respective stacked boxes into partitioned bounding shapes through implementation of various operations of the method 300 on each of the partitioned bounding shapes. By way of example and with reference back to FIG. 4D, following implementation of the method 300, a plurality of partitioned bounding shapes 440, 450, and 452 may be identified. The processing device 110 may, for each of the partitioned bounding shapes 440, 450, and 452, further partition the partitioned bounding shape 440, 450, 452 along second planes, for instance, planes that extend vertically across the first 410 and second 412 directions as shown in the diagram 460 in FIG. 4E.
  • The processing device 110 may thus execute blocks 306-320 on each of the partitioned bounding shapes to further partition the bounding shapes along second planes. In addition, the processing device 110 may execute blocks 306-320 on each of the further partitioned bounding shapes along third planes, e.g., planes that extend along the first 410 and third 414 directions. An example of the further partitioned bounding shapes is shown in the diagram 470 in FIG. 4F. As shown in that figure, the further partitioned bounding shapes may be individual cells 472 having cube shapes. In addition, each of the cells 472 may include a polygon of the superset polygon contained in the stacked boxes. However, it should be noted that the cells 472 may be formed to have other geometric shapes, such as, honeycomb shapes, triangular shapes, etc.
  • The resolutions at which the cells 472 may be formed may be user-defined. For instance, a user may specify resolution values (NX, NY, NZ) along the first, second, and third directions, for example, the X, Y, and Z directions. Higher resolution values (NX, NY, NZ) may result in better body-fitted resolutions, which entail more geometrical entities being analyzed. The ceiling limit, which caps the maximum number of spatial points to be processed, may be denoted by (NX+1)*(NY+1)*(NZ+1). By specifying the resolution values in the three directions, the user may be able to directly control the trade-off between the quality of the shape profile and the payload (number of geometrical entities). Hence, the computing apparatus 100 may be used for a large spectrum of applications, ranging from packing procedures that generally specify a particular number of geometric entities for processing to caging procedures where higher resolution is preferred.
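  • As a small numerical illustration of that ceiling, assuming hypothetical resolution values:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical user-selected resolution values along X, Y, and Z. */
        int NX = 8, NY = 8, NZ = 16;

        /* Ceiling on the number of spatial points to be processed. */
        long maxPoints = (long)(NX + 1) * (NY + 1) * (NZ + 1);

        printf("at most %ld spatial points\n", maxPoints);   /* 9 * 9 * 17 = 1377 */
        return 0;
    }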
  • Although the simple design of the 3D object 402 may result in the bounding shapes containing the polygons having a rectangular or square-shaped outline, other 3D object designs may result in the bounding shapes having more complicated outlines. An example of a plurality of bounding shapes formed through implementation of the method 300 by partitioning the bounding shapes along multiple planes is depicted in the diagrams shown in FIG. 7. As may be determined from that figure, the bounding shapes may follow the outline of the 3D object. In addition, although the bounding shapes have been described and depicted as being square or rectangular-shaped boxes, it should be understood that the bounding shapes may have other suitable shapes, including triangular, hexagonal, octagonal, etc.
  • According to another example, following implementation of the method 300 for one plane direction, the processing device 110 may further partition the partitioned bounding shapes through an operation separate from the ones used in the method 300. For instance, and with reference to FIG. 5, there is shown a flowchart of another example method 500 of further partitioning the partitioned bounding shapes in the other two directions. That is, the processing device 110 may further partition each of the bounding shapes, which have been partitioned along a first direction, in the other two directions in a manner that does not include the operations recited in the method 300. In one regard, the processing device 110 may implement the method 500 following implementation of the method 300.
  • The processing device 110 may execute the instructions 130 to partition the superset polygon into bounding shapes to implement the method 500. That is, the processing device 110 may partition the superset polygons in each of the bounding shapes generated through implementation of the method 300 along each of the other two directions.
  • At block 502, a bounding shape containing a superset polygon may be selected. For instance, with respect to FIG. 4D, the top bounding shape 452 may be selected. As also shown in that figure, the top bounding shape 452 may be bounded by a 2D top surface and a 2D bottom surface. At block 504, one of the top 2D surface and the bottom 2D surface may be selected. It should be noted that the outline of the part of the 3D object 402 that extends through the selected surface may be identified in the 2D surface. An example of a 2D surface 600 containing the outline 602 of the part of the 3D object 402 is shown in FIGS. 6A-6C.
  • At block 506, an array of grid boxes may be overlaid or mapped on the selected 2D surface. An example of the mapping or overlaying of the array of grid boxes 612 is shown in FIGS. 6A-6C. The superset polygon 610 may be made up of a plurality of polygons such that a subset of the polygons represent the external contours (or solid portions of the 3D object) and another subset of the polygons represent internal contours or holes. In the example shown in FIGS. 6A-6C, the superset polygon 610 represents a solid star shape. Accordingly, the grid boxes that lie within the polygons that make up the external contours may be turned on while the remaining grid boxes lying within the polygons representing the internal contours may be turned off. At block 508, a list of polygons within the selected 2D surface may be obtained. For instance, the list of polygons may denote the locations within the grid boxes 612 of where the polygons of the superset polygon 610 are located.
  • At block 510, fill values for the grid boxes 612 overlaid on the 2D surface may be determined based on the positions of the grid boxes 612 with respect to the polygons in the selected 2D surface. The grid boxes 612 lying within the solid portion of the superset polygon 610 may be set to have a certain fill value while the grid boxes 612 lying outside of the superset polygon may have a different fill value. The shaded portions at 602 and 604 in FIGS. 6B and 6C may represent the progress of a scan line fill procedure in determining the values of grid boxes based on their positions relative to the superset polygon 610. An example of the cells containing polygons, i.e., cells having certain fill values, is shown in FIG. 6C as shaded cells. The processing device 110 may thus determine the further partitioned bounding shapes as being the cells containing the polygons.
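  • A simplified sketch of determining fill values is shown below, using a point-in-polygon (even-odd) test on grid box centers for a single contour rather than the full scan line fill procedure; with the even-odd rule applied over all contours of a superset polygon, grid boxes inside inner contours (holes) would likewise come out unfilled. The Pt type and the function and parameter names are assumptions made for this example.

    #include <stddef.h>

    typedef struct { double x, y; } Pt;

    /* Even-odd (ray casting) point-in-polygon test for one closed contour. */
    static int inside_contour(const Pt *poly, size_t n, double x, double y)
    {
        int in = 0;
        for (size_t i = 0, j = n - 1; i < n; j = i++) {
            if (((poly[i].y > y) != (poly[j].y > y)) &&
                (x < (poly[j].x - poly[i].x) * (y - poly[i].y) /
                     (poly[j].y - poly[i].y) + poly[i].x))
                in = !in;
        }
        return in;
    }

    /* Set a fill value for each grid box whose center lies inside the contour.
     * fill is an NX-by-NY array in row-major order covering the 2D surface,
     * with grid boxes of size dx by dy starting at (xmin, ymin). */
    static void fill_grid(const Pt *poly, size_t n,
                          double xmin, double ymin, double dx, double dy,
                          int NX, int NY, unsigned char *fill)
    {
        for (int iy = 0; iy < NY; iy++) {
            for (int ix = 0; ix < NX; ix++) {
                double cx = xmin + (ix + 0.5) * dx;   /* grid box center */
                double cy = ymin + (iy + 0.5) * dy;
                fill[iy * NX + ix] = (unsigned char)inside_contour(poly, n, cx, cy);
            }
        }
    }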
  • At block 512, the processing device 110 may determine whether an additional 2D surface exists for which the fill values are to be determined. If yes, a next 2D surface may be selected at block 504 and blocks 506-512 may be repeated for the additional 2D surface. If no 2D surfaces remain for processing in the currently selected bounding shape, it is determined at block 514 whether another bounding shape remains to be processed. If, at block 514, it is determined that no further bounding shapes remain for processing, the method 500 may terminate at the end block. If, at block 514, it is determined that a further bounding shape remains for processing, the method 500 may return to block 502 and blocks 504-514 may be repeated for additional bounding shapes until a determination is made that no further partitioned bounding shapes remain to be processed, at which point the method 500 may terminate at the end block.
  • Following conclusion of the method 500, a plurality of further partitioned bounding shapes similar to the ones shown in FIG. 4F and FIG. 7 may be generated.
  • Some or all of the operations set forth in the methods 200, 300, and 500 may be contained as utilities, programs, or subprograms, in any desired computer accessible medium. In addition, the methods 200, 300, and 500 may be embodied by computer programs, which may exist in a variety of forms both active and inactive. For example, they may exist as machine readable instructions, including source code, object code, executable code or other formats. Any of the above may be embodied on a non-transitory computer readable storage medium.
  • Examples of non-transitory computer readable storage media include computer system RAM, ROM, EPROM, EEPROM, and magnetic or optical disks or tapes. It is therefore to be understood that any electronic device capable of executing the above-described functions may perform those functions enumerated above.
  • It may be appreciated that the processing of the partitioned bounding shapes is shown as occurring serially only by way of illustration and that the 2D surfaces and the plurality of partitioned bounding shapes may be similarly processed in parallel to speed up the process of generating the shape profile. For instance, the computing apparatus 100 may include multiple processing devices 110 and/or the processing device 110 may include multiple cores that may process multiple ones of the virtual layers as described with respect to FIG. 5 in parallel.
  • FIG. 7 illustrates examples of various shape profiles that may be generated through implementation of the methods disclosed herein with different resolution values provided for a 3D object 710. As mentioned above, the number of geometrical entities, and consequently the compute payload, may depend on the resolution values NX, NY, and NZ respectively provided in the various directions, for example, in the X, Y, and Z directions. It can be seen from the figure that a first shape profile 720 is based on relatively low resolution values and hence has the least number of geometrical entities (bounding shapes) to be processed. The shape profile 720 may therefore be computed faster than the second and third shape profiles 730 and 740. The resolution values for the second shape profile 730 may be higher than those used for the first shape profile 720, while the resolution values used for the third shape profile 740 may be higher than those used for the first and second shape profiles 720, 730. Hence, the third shape profile 740 may produce a 3D printed object that is closer to the original 3D object in shape but may have a higher compute payload as compared with the other shape profiles 720, 730. The resolution values used to generate the shape profiles 720, 730, 740 may be user-selectable based on the applications as outlined herein. As seen from FIG. 7, the shape profiles 720, 730, and 740 may be defined by a plurality of partitioned bounding shapes, in which the bounding shapes may be partitioned in the manners described herein.
  • The efficiency of many 3D printing applications may be enhanced by implementing the shape profiles as discussed herein. In addition, the shape profiles may be used to enhance security of 3D printing applications. Generally speaking, the shape profiles may be approximations of the polygons that represent the 3D object as opposed to identical replicas that are normally printed. Therefore, the 3D objects printed from the shape profiles may conceal the exact shape of a 3D object. As a result, unauthorized personnel may not inadvertently view the details of the 3D objects being printed from the shape profiles as disclosed herein.
  • Parts packing, or procedures that place many parts or 3D objects into the same build volume to be fabricated together, may use bounding boxes to represent a part and place the part within the build volume. While the use of the bounding boxes may lead to lower compute payloads, a downside may be that the space within the build volume may be used less efficiently. For example, if an L-shaped part is approximated as a stacked box, the empty space on the upper right hand side of the ‘L’ shape is wasted. In this case, a shape profile represented by a set of geometric entities with resolution values such as NX=NY=1 and NZ=2 may provide an arrangement such that the upper right hand side portion of the ‘L’ shape remains vacant and a different part may be placed within the vacancy. Thus, the user-selectable resolution values afforded by the examples disclosed herein may enable better usage of the space within the build volume. To incorporate the shape profiles as disclosed herein into packing procedures, the single bounding box required by the packing procedures may be replaced by the vector of stacked boxes, and geometrical operations such as placement, collision detection, and the like may be applied simultaneously to the stacked boxes so that the geometrical operations are executed about the common center of mass of the 3D object enclosed by the stacked boxes. Design rules, such as the required gaps between the various 3D objects to be accommodated within a build envelope, may be facilitated by enlarging the superset polygons for the bounding shapes (e.g., applying a padding) before the partitioning in the second and the third directions. As discussed above, the padding may be added to the superset polygons at various thicknesses depending upon user-defined inputs, geometrical features of the 3D object, materials used to build the 3D object, combinations thereof, etc.
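  • A minimal sketch of such a collision check is shown below, assuming each part's shape profile has already been placed in build coordinates as a list of axis-aligned boxes; the Box type and the boxes_overlap and parts_collide names are illustrative only. Because an empty region of a part (such as the notch of the ‘L’ shape) contributes no box, a second part placed in that region reports no collision.

    #include <stddef.h>

    /* Hypothetical axis-aligned box from a placed shape profile, in build coordinates. */
    typedef struct { float xmin, ymin, zmin, xmax, ymax, zmax; } Box;

    static int boxes_overlap(const Box *a, const Box *b)
    {
        return a->xmin < b->xmax && b->xmin < a->xmax &&
               a->ymin < b->ymax && b->ymin < a->ymax &&
               a->zmin < b->zmax && b->zmin < a->zmax;
    }

    /* Two parts collide if any pair of their stacked boxes overlaps. */
    static int parts_collide(const Box *a, size_t na, const Box *b, size_t nb)
    {
        for (size_t i = 0; i < na; i++)
            for (size_t j = 0; j < nb; j++)
                if (boxes_overlap(&a[i], &b[j]))
                    return 1;
        return 0;
    }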
  • Another 3D printing application may include generation of support cages that isolate a 3D object or a group of 3D objects from a powder bed to protect the objects from abrasive finishing processes. The support cages may be designed to allow powder outflow and provide sufficient padding while efficiently using the build volume with high packing density. The shape profiles disclosed herein may enable automated generation of a cage skeleton used for printing the cage. In one example, a contour defined by the virtual stacked boxes from the exposed edges of the shape profile, which may include the superset polygons of the NL layers, is identified, and the cage skeleton may be generated from the contour of the stacked boxes. A padding may also be incorporated within the cage skeleton by enlarging the superset polygons for each of the stacked boxes. Different levels of caging resolution may be available depending on the application. For instance, higher powder flowability may call for coarser cages, while finer cages provide better protection against abrasive finishing procedures. The cages may also facilitate higher packing density since the cages are generated from the shape profiles and hence may better fit the parts that they enclose. A cage may be generated for a single part or for a group of parts in accordance with the examples discussed herein. The procedure to generate the cage for a group of parts involves a Boolean union of all the part shapes prior to applying the shape profile.
  • Another 3D printing application that may be enhanced by incorporation of the shape profiles disclosed herein may be build bed packing preview, which may allow visual inspection of the placement of a 3D object in the build envelope. Hundreds of 3D objects may potentially be included within the build envelope. When a visual inspection service is provided as a cloud service, the shape profiles for the 3D objects as disclosed herein may allow significant geometrical entity reduction. This may reduce the data size required to transmit a view of the contents of the build envelope, thereby reducing the network bandwidth required to transmit the view. Moreover, transmitting smaller data sizes may also reduce latency, which may result in more responsive web interfaces. The shape profiles disclosed herein, when implemented in cloud-based visual inspection applications, may reduce the cost of the visual inspection services while enhancing the user experience. The flexibility afforded by implementing the shape profiles disclosed herein may not only enable providing previews of the contents of the build envelope but may also enable cloud-based editing of the build envelope. Users previewing the contents of the build envelope may be enabled to provide alternative arrangements for shapes within the build envelope.
  • Although described specifically throughout the entirety of the instant disclosure, representative examples of the present disclosure have utility over a wide range of applications, and the above discussion is not intended and should not be construed to be limiting, but is offered as an illustrative discussion of aspects of the disclosure.
  • What has been described and illustrated herein is an example of the disclosure along with some of its variations. The terms, descriptions and figures used herein are set forth by way of illustration only and are not meant as limitations. Many variations are possible within the spirit and scope of the disclosure, which is intended to be defined by the following claims—and their equivalents—in which all terms are meant in their broadest reasonable sense unless otherwise indicated.

Claims (15)

What is claimed is:
1. A computing apparatus comprising:
a processing device;
a non-transitory computer readable storage medium on which are stored instructions that when executed by the processing device cause the processing device to:
access a three dimensional (3D) model of a 3D object;
slice the 3D model along a first direction to generate a plurality of layers in parallel planes defined across a second direction and a third direction, wherein the first, second, and third directions are orthogonal to each other, and wherein each of the plurality of layers is composed of respective polygons representing portions of the 3D model;
partition the plurality of layers into a plurality of stacked boxes containing the respective polygons;
for each stacked box of the plurality of stacked boxes,
assemble the polygons in the stacked box into a superset polygon; and
partition the superset polygon into bounding shapes; and
generate a shape profile of the 3D object using the bounding shapes.
2. The computing apparatus according to claim 1, wherein the polygons representing portions of the 3D model represent contours of the 3D model, and wherein contours represent one of an interior portion and a hole of the 3D model.
3. The computing apparatus according to claim 1, wherein to partition the plurality of layers into the plurality of stacked boxes, the instructions are further to cause the processing device to:
initialize a vector of a first bounding shape that is defined by a first layer and a last layer of the plurality of layers forming the stacked boxes;
compute a volume error of the first bounding shape;
partition the first bounding shape into two bounding shapes, wherein a difference in volume error between the two bounding shapes is minimized;
replace the first bounding shape in the stacked boxes with the two bounding shapes;
compute a volume error for each of the two bounding shapes;
locate the bounding shape having the highest volume error;
partition the located bounding shape into two additional bounding shapes, wherein a difference in volume error between the two additional bounding shapes is minimized; and
replace the located bounding shape in the stacked boxes with the two additional bounding shapes.
4. The computing apparatus according to claim 3, wherein the instructions are further to cause the processing device to compute volume errors for each of the plurality of layers, to compute the volume error for the first bounding shape by adding the computed volume errors of the layers contained in the first bounding shape together, and to compute the volume errors for each of the two bounding shapes using the computed volume errors of the layers respectively contained in the two bounding shapes.
5. The computing apparatus according to claim 3, wherein the instructions are further to cause the processing device to:
continue to compute volume errors, partition bounding shapes having the largest volume errors, and replace the partitioned bounding shapes with further partitioned bounding shapes until a size of the vector falls below a predetermined threshold number of partitions.
6. The computing apparatus according to claim 1, wherein to partition the superset polygon into bounding shapes, the instructions are further to cause the processing device to:
initialize a vector of a first bounding shape;
partition the first bounding shape into two bounding shapes along the second direction;
compute a volume error for each of the two bounding shapes;
locate the bounding shape having the largest volume error;
partition the located bounding shape having the largest volume error into two additional bounding shapes along the second direction; and
replace the first bounding shape with the partitioned bounding shape and the additional partitioned bounding shapes.
7. The computing apparatus according to claim 6, wherein the instructions are further to cause the processing device to:
further partition the partitioned bounding shapes along the third direction;
compute a volume error for each of the further partitioned bounding shapes;
locate the further partitioned bounding shape having the largest volume error;
partition the located further partitioned bounding shape having the largest volume error into two additional further partitioned bounding shapes along the third direction; and
replace the partitioned bounding shapes with the further partitioned bounding shape and the additional further partitioned bounding shapes.
8. The computing apparatus according to claim 6, wherein the instructions are further to cause the processor to:
for each of the stacked boxes, map a grid of cells over each of the layers in the stacked box and implement a process to determine locations in the grid of the respective polygons, wherein the cells in the grid form partitioned bounding shapes along the second and third directions containing the respective polygons.
9. A method comprising:
accessing a three dimensional (3D) model of a 3D object;
slicing, by a processing device, the 3D model along a first direction to generate a plurality of layers in parallel planes defined across a second direction and a third direction, wherein the first, second, and third directions are orthogonal to each other, and wherein each of the plurality of layers is composed of respective polygons representing portions of the 3D model;
partitioning by the processing device, the plurality of layers into a plurality of stacked boxes containing the respective polygons;
for each stacked box of the plurality of stacked boxes, by the processing device:
assembling the polygons in the stacked box into a superset polygon; and
partitioning the superset polygon into bounding shapes; and
generating, by the processing device, a shape profile of the 3D object using the bounding shapes.
10. The method according to claim 9, wherein partitioning the superset polygons into bounding shapes further comprises:
iteratively partitioning the bounding shapes based upon computed volume errors of the iteratively partitioned bounding shapes until a size of the bounding shapes in each of the first, second, and third directions falls below a predetermined threshold value.
11. The method according to claim 10, wherein iteratively partitioning the bounding shapes further comprises iteratively partitioning the bounding shapes having the largest volume errors in each of the first, second, and third directions until the iteratively partitioned bounding shapes have a dimension that falls below the predetermined threshold value in each of the first, second, and third directions.
12. The method according to claim 11, wherein the predetermined threshold value is a user defined resolution value and wherein the bounding shapes comprise shapes selected from cubes, rectangular boxes, honeycombs, and tetrahedrons.
13. The method according to claim 9, further comprising:
for each stacked box, applying a padding to enlarge the superset polygon, and wherein partitioning the superset polygon further comprises partitioning the enlarged superset polygon.
14. A non-transitory computer readable storage medium on which are stored machine readable instructions that when executed by a processing device, cause the processing device to:
access a three dimensional (3D) model of a 3D object;
slice the 3D model along a first direction to generate a plurality of layers in parallel planes defined across a second direction and a third direction, wherein the first, second, and third directions are orthogonal to each other, and wherein each of the plurality of layers is composed of respective polygons representing portions of the 3D model;
partition the plurality of layers into a plurality of stacked boxes containing the respective polygons;
for each stacked box of the plurality of stacked boxes,
assemble the polygons in the stacked box into a superset polygon; and
partition the superset polygon into bounding shapes; and
generate a shape profile of the 3D object using the bounding shapes.
15. The non-transitory computer readable storage medium according to claim 14, wherein the machine readable instructions are further to cause the processing device to:
iteratively partition the bounding shapes based upon computed volume errors of the iteratively partitioned bounding shapes until a size of the bounding shapes in each of the first, second, and third directions falls below a predetermined threshold value.
US16/072,277 2016-05-16 2016-05-16 Generating a shape profile for a 3D object Active US11043042B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/032740 WO2017200527A1 (en) 2016-05-16 2016-05-16 Generating a shape profile for a 3d object

Publications (2)

Publication Number Publication Date
US20190026953A1 true US20190026953A1 (en) 2019-01-24
US11043042B2 US11043042B2 (en) 2021-06-22

Family

ID=60325416

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/072,277 Active US11043042B2 (en) 2016-05-16 2016-05-16 Generating a shape profile for a 3D object

Country Status (2)

Country Link
US (1) US11043042B2 (en)
WO (1) WO2017200527A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210109499A1 (en) * 2016-10-27 2021-04-15 Desprez Llc System and method for generating a quote for fabrication of a part to be fabricated
US11043042B2 (en) * 2016-05-16 2021-06-22 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3D object
CN113414987A (en) * 2021-06-23 2021-09-21 哈尔滨理工大学 3D printing self-adaptive layering thickness method
US11282288B2 (en) 2019-11-20 2022-03-22 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding data in notched shapes
CN116681841A (en) * 2023-08-03 2023-09-01 中国科学院长春光学精密机械与物理研究所 Quality evaluation method for tomographic reconstruction and storage medium
US11900629B2 (en) * 2019-12-11 2024-02-13 Nvidia Corporation Surface profile estimation and bump detection for autonomous machine

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109866418A (en) 2017-12-04 2019-06-11 三纬国际立体列印科技股份有限公司 The 3D printer and its gradation Method of printing of graded printing
DE102018201739A1 (en) * 2018-02-05 2019-08-08 Eos Gmbh Electro Optical Systems Method and apparatus for providing a control instruction set
US20210012049A1 (en) * 2018-02-16 2021-01-14 Coventor, Inc. System and method for multi-material mesh generation from fill-fraction voxel data
WO2019177606A1 (en) 2018-03-14 2019-09-19 Hewlett-Packard Development Company, L.P. Three dimensional model categories
WO2020076285A1 (en) * 2018-10-08 2020-04-16 Hewlett-Packard Development Company, L.P. Validating object model data for additive manufacturing

Citations (129)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517602A (en) * 1992-12-03 1996-05-14 Hewlett-Packard Company Method and apparatus for generating a topologically consistent visual representation of a three dimensional surface
US5673377A (en) * 1994-09-02 1997-09-30 Ray Dream, Inc. Method and system for displaying a representation of a three-dimensional object with surface features that conform to the surface of the three-dimensional object
US6429864B1 (en) * 1999-11-10 2002-08-06 Create.It Services Ag Method for traversing a binary space partition or octree and image processor for implementing the method
US20020158867A1 (en) * 2001-04-30 2002-10-31 Bloomenthal Jules I. Method to compute the medial axis/surface of a three-dimensional object
US20020186216A1 (en) * 2001-06-11 2002-12-12 Baumberg Adam Michael 3D computer modelling apparatus
US20020190986A1 (en) * 2001-06-12 2002-12-19 Minolta Co., Ltd. Method, apparatus, and computer program for generating three-dimensional shape data or volume data
US20030001836A1 (en) * 2001-03-12 2003-01-02 Ernst Fabian Edgar Reconstructor for and method of generating a three-dimensional representation and image display apparatus comprising the reconstructor
US20030035061A1 (en) * 2001-08-13 2003-02-20 Olympus Optical Co., Ltd. Shape extraction system and 3-D (three dimension) information acquisition system using the same
US20030052875A1 (en) * 2001-01-05 2003-03-20 Salomie Ioan Alexandru System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US6580426B1 (en) * 1999-03-03 2003-06-17 Canon Kabushiki Kaisha Computer graphics apparatus for processing of data defining a three-dimensional computer model to partition the three-dimensional space into a plurality of sectors
US20040170255A1 (en) * 2003-02-27 2004-09-02 Shimadzu Corporation Radiographic X-ray device
US20040193392A1 (en) * 2001-02-28 2004-09-30 Williams Richard Andrew Object interaction simulation
US20040267400A1 (en) * 2001-08-16 2004-12-30 Hitoshi Ohmori Die machining method and device by v-cad data
US20050017971A1 (en) * 2003-07-24 2005-01-27 Cleve Ard Ray tracing hierarchy
US20050068317A1 (en) * 2002-06-28 2005-03-31 Fujitsu Limited Program, method, and device for comparing three-dimensional images in voxel form
US20050151735A1 (en) * 2003-12-20 2005-07-14 Martijn Boekhorst Method for determining the bounding voxelisation of a 3D polygon
US20060056726A1 (en) * 2004-08-17 2006-03-16 Konica Minolta Medical & Graphic, Inc. Image processing device, and program
US20060071932A1 (en) * 2002-11-21 2006-04-06 Koninklijke Philips Electronics N.V. Method and apparatus for visualizing a sequece of volume images
US20060147106A1 (en) * 2004-12-22 2006-07-06 Lining Yang Using temporal and spatial coherence to accelerate maximum/minimum intensity projection
US20060152510A1 (en) * 2002-06-19 2006-07-13 Jochen Dick Cross-platform and data-specific visualisation of 3d data records
US20060274065A1 (en) * 2003-08-18 2006-12-07 Georgiy Buyanovskiy Method and system for adaptive direct volume rendering
US20060274061A1 (en) * 2005-06-02 2006-12-07 Hongwu Wang Four-dimensional volume of interest
US20060290695A1 (en) * 2001-01-05 2006-12-28 Salomie Ioan A System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US20070014480A1 (en) * 2005-07-13 2007-01-18 General Electric Company Method and apparatus for creating a multi-resolution framework for improving medical imaging workflow
US7206987B2 (en) * 2003-04-30 2007-04-17 Hewlett-Packard Development Company, L.P. Error detection and correction in a layered, 3-dimensional storage architecture
US20080118118A1 (en) * 2006-11-22 2008-05-22 Ralf Berger Method and Apparatus for Segmenting Images
US20080238919A1 (en) * 2007-03-27 2008-10-02 Utah State University System and method for rendering of texel imagery
US20080259075A1 (en) * 2007-04-19 2008-10-23 David Keith Fowler Dynamically Configuring and Selecting Multiple Ray Tracing Intersection Methods
US20080292169A1 (en) * 2007-05-21 2008-11-27 Cornell University Method for segmenting objects in images
US20090060309A1 (en) * 2007-08-30 2009-03-05 Canon Kabushiki Kaisha Radiation image processing apparatus and method thereof
US20090096787A1 (en) * 2007-04-12 2009-04-16 Fujifilm Corporation Method and apparatus for processing three dimensional images, and recording medium having a program for processing three dimensional images recorded therein
US20090167763A1 (en) * 2000-06-19 2009-07-02 Carsten Waechter Quasi-monte carlo light transport simulation by efficient ray tracing
US20090208075A1 (en) * 2006-05-19 2009-08-20 Koninklijke Philips Electronics N. V. Error adaptive functional iimaging
US20090292206A1 (en) * 2008-05-20 2009-11-26 Toshiba Medical Systems Corporation Image processing apparatus and computer program product
US20090312996A1 (en) * 2008-06-13 2009-12-17 Schlumberger Technology Corporation Feedback control using a simlator of a subterranean structure
US7747055B1 (en) * 1998-11-25 2010-06-29 Wake Forest University Health Sciences Virtual endoscopy with improved image segmentation and lesion detection
US20100191757A1 (en) * 2009-01-27 2010-07-29 Fujitsu Limited Recording medium storing allocation control program, allocation control apparatus, and allocation control method
US20110043521A1 (en) * 2009-08-18 2011-02-24 Dreamworks Animation Llc Ray-aggregation for ray-tracing during rendering of imagery
US7898540B2 (en) * 2005-09-12 2011-03-01 Riken Method and program for converting boundary data into cell inner shape data
US20110193859A1 (en) * 2010-02-09 2011-08-11 Samsung Electronics Co., Ltd Apparatus and method for generating octree based 3D map
US20110285710A1 (en) * 2010-05-21 2011-11-24 International Business Machines Corporation Parallelized Ray Tracing
US20110316855A1 (en) * 2010-06-24 2011-12-29 International Business Machines Corporation Parallelized Streaming Accelerated Data Structure Generation
US8130223B1 (en) * 2008-09-10 2012-03-06 Nvidia Corporation System and method for structuring an A-buffer to support multi-sample anti-aliasing
US20120180000A1 (en) * 2011-01-10 2012-07-12 Compal Electronics, Inc. Method and system for simulating three-dimensional operating interface
US20130066812A1 (en) * 2011-09-13 2013-03-14 Stratasys, Inc. Solid identification grid engine for calculating support material volumes, and methods of use
US20130187903A1 (en) * 2012-01-24 2013-07-25 Pavlos Papageorgiou Image processing method and system
US20130321414A1 (en) * 2009-08-07 2013-12-05 Cherif Atia Algreatly Converting a 3d model into multiple matrices
US20140067333A1 (en) * 2012-09-04 2014-03-06 Belcan Corporation CAD-Based System for Product Definition, Inspection and Validation
US20140176545A1 (en) * 2012-12-21 2014-06-26 Nvidia Corporation System, method, and computer program product implementing an algorithm for performing thin voxelization of a three-dimensional model
US20140231266A1 (en) * 2011-07-13 2014-08-21 Nuvotronics, Llc Methods of fabricating electronic and mechanical structures
US20140313195A1 (en) * 2008-02-29 2014-10-23 Cherif Atia Algreatly 3D Model Mapping
US20140324204A1 (en) * 2013-04-18 2014-10-30 Massachusetts Institute Of Technology Methods and apparati for implementing programmable pipeline for three-dimensional printing including multi-material applications
US20140330796A1 (en) * 2013-05-03 2014-11-06 Nvidia Corporation Compressed pointers for cell structures
US20140327667A1 (en) * 2013-05-02 2014-11-06 Samsung Medison Co., Ltd. Medical imaging apparatus and control method for the same
US20150084953A1 (en) * 2012-04-19 2015-03-26 Thomson Licensing Method and apparatus for estimating error metrics for multi-component 3d models
US20150138201A1 (en) * 2013-11-20 2015-05-21 Fovia, Inc. Volume rendering color mapping on polygonal objects for 3-d printing
US20150148930A1 (en) * 2013-11-27 2015-05-28 Adobe Systems Incorporated Method and apparatus for preserving structural integrity of 3-dimensional models when printing at varying scales
US20150161786A1 (en) * 2013-12-06 2015-06-11 Siemens Aktiengesellschaft Query-specific generation and retrieval of medical volume images
US9076219B2 (en) * 2011-10-07 2015-07-07 Electronics And Telecommunications Research Institute Space segmentation method for 3D point clouds
US9111385B2 (en) * 2011-11-25 2015-08-18 Samsung Electronics Co., Ltd. Apparatus and method for rendering volume data
US20150262416A1 (en) * 2014-03-13 2015-09-17 Pixar Importance sampling of sparse voxel octrees
US20160101570A1 (en) * 2014-10-09 2016-04-14 Autodesk, Inc. Multi-material three dimensional models
US20160303803A1 (en) * 2015-04-14 2016-10-20 Shapeways, Inc. Multi-part counting system for three-dimensional printed parts
US20160337549A1 (en) * 2015-05-14 2016-11-17 Xerox Corporation 3d printer steganography
US20160332388A1 (en) * 2015-05-12 2016-11-17 Seoul National University R&Db Foundation Method of forming transparent 3d object and transparent 3d object formed thereby
US20160334964A1 (en) * 2013-12-31 2016-11-17 Samsung Electronics Co., Ltd. User interface system and method for enabling mark-based interaction for images
US20160358384A1 (en) * 2015-06-08 2016-12-08 Airbus Operations (S.A.S.) Damage detection and repair system and method using enhanced geolocation
US9552664B2 (en) * 2014-09-04 2017-01-24 Nvidia Corporation Relative encoding for a block-based bounding volume hierarchy
US20170039759A1 (en) * 2014-04-17 2017-02-09 3D Slash Three dimensional modeling
US9582607B2 (en) * 2014-09-04 2017-02-28 Nvidia Corporation Block-based bounding volume hierarchy
US20170091965A1 (en) * 2015-09-29 2017-03-30 Yandex Europe Ag Method of and system for generating simplified borders of graphical objects
US20170113414A1 (en) * 2014-08-29 2017-04-27 Hewlett-Packard Development Company, L.P. Generation of three-dimensional objects
US20170246812A1 (en) * 2014-10-29 2017-08-31 Hewlett-Packard Development Company, L.P. Converting at least a portion of a 3-d object into a format suitable for printing
US20170249782A1 (en) * 2015-01-30 2017-08-31 Hewlett-Packard Development Company, L.P. Generating slicing data from a tree data structure
US20170323436A1 (en) * 2016-05-06 2017-11-09 L-3 Communications Security & Detection Systems, Inc. Systems and methods for generating projection images
US9842424B2 (en) * 2014-02-10 2017-12-12 Pixar Volume rendering using adaptive buckets
US20170365095A1 (en) * 2015-04-16 2017-12-21 Hewlett-Packard Development Company, L.P. Three-dimensional threshold matrix for three-dimensional halftoning
US20170364316A1 (en) * 2015-01-30 2017-12-21 Hewlett-Packard Development Company, L.P. Material volume coverage representation of a three-dimensional object
US20170372513A1 (en) * 2015-01-30 2017-12-28 Hewlett-Packard Development Company, L.P. Generating slice data from a voxel representation
US20170371318A1 (en) * 2015-01-30 2017-12-28 Hewlett-Packard Development Company, L.P. Generating control data for sub-objects
US20170368755A1 (en) * 2016-06-22 2017-12-28 Massachusetts Institute Of Technology Methods and Apparatus for 3D Printing of Point Cloud Data
US20180001566A1 (en) * 2015-01-30 2018-01-04 Hewlett-Packard Development Company, L.P. Three-dimensional object substructures
US20180032060A1 (en) * 2015-04-20 2018-02-01 Hewlett-Packard Development Company, L.P. Creating a voxel representation of a three dimensional (3-d) object
US20180046167A1 (en) * 2016-08-12 2018-02-15 Microsoft Technology Licensing, Llc 3D Printing Using 3D Video Data
US20180052447A1 (en) * 2015-04-28 2018-02-22 Hewlett-Packard Development Company, L.P. Structure using three-dimensional halftoning
US20180052947A1 (en) * 2015-04-23 2018-02-22 Hewlett-Packard Development Company, L.P. Lattice structure representation for a three-dimensional object
US20180151254A1 (en) * 2016-11-30 2018-05-31 Electronics And Telecommunications Research Institute High-speed similar case search method and device through reduction of large scale multi-dimensional time series health data to multiple dimensions
US10054932B2 (en) * 2013-03-11 2018-08-21 Autodesk, Inc. Techniques for two-way slicing of a 3D model for manufacturing
US20180240269A1 (en) * 2017-02-20 2018-08-23 My Virtual Reality Software As Method for visualizing three dimensional data
US20180264722A1 (en) * 2015-04-14 2018-09-20 Hewlett-Packard Development Company, L.P. Marking build material
US20180272429A1 (en) * 2015-02-06 2018-09-27 Dresser-Rand Company Methods for Additive Manufacturing of a Single Piece Piston
US20180276887A1 (en) * 2016-12-16 2018-09-27 University Of Manitoba Medial Axis Extraction for Complex 3D Objects
US20180276316A1 (en) * 2017-03-23 2018-09-27 Autodesk, Inc. Creating gradients of different materials for three-dimensional models in computer aided design applications
US20180281294A1 (en) * 2016-08-26 2018-10-04 Wacker Chemie Ag Method for producing shaped bodies
US20180299869A1 (en) * 2016-01-14 2018-10-18 Ricoh Company, Ltd. Modeling process device, modeling process system, and medium
US20180300947A1 (en) * 2015-04-30 2018-10-18 Saudi Arabian Oil Company Three-Dimensional Fluid Micromodels
US20180307443A1 (en) * 2016-01-29 2018-10-25 Hewlett-Packard Development Company, L.P. Error diffusion
US20180307207A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180307206A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180304550A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180319087A1 (en) * 2017-05-08 2018-11-08 Autodesk, Inc. Estimating physical property of 3d printed parts
US20180321658A1 (en) * 2016-01-29 2018-11-08 Hewlett-Packard Development Company, L.P. Identify a model that matches a 3d object
US20180348735A1 (en) * 2017-06-02 2018-12-06 Autodesk, Inc. Agent-based slicing
US20180349220A1 (en) * 2017-06-05 2018-12-06 International Business Machines Corporation Proximity correction in three-dimensional manufacturing
US20180354196A1 (en) * 2017-06-07 2018-12-13 Xyzprinting, Inc. Warpage prevented printing method of 3d printer
US20180365518A1 (en) * 2016-03-29 2018-12-20 Tencent Technology (Shenzhen) Company Limited Target object presentation method and apparatus
US20180361729A1 (en) * 2015-12-21 2018-12-20 Ord Solutions Inc. Large format 3d printing augmented with 3d scanning and anomaly tracking feedback
US20180373227A1 (en) * 2017-06-22 2018-12-27 Autodesk, Inc. Building and attaching support structures for 3d printing
US20190001574A1 (en) * 2017-06-30 2019-01-03 Autodesk, Inc. Systems and methods for determining dynamic forces in a liquefier system in additive manufacturing
US20190039368A1 (en) * 2016-09-26 2019-02-07 Hewlett-Packard Development Company, L.P. 3D Print Selection Based On Voxel Property Association and Conflict Resolution
US20190056717A1 (en) * 2017-01-27 2019-02-21 Hewlett-Packard Development Company, L.P. Predicting distributions of values of layers for three-dimensional printing
US20190054700A1 (en) * 2017-08-15 2019-02-21 Cincinnati Incorporated Machine learning for additive manufacturing
US20190056716A1 (en) * 2017-01-25 2019-02-21 Hewlett-Packard Development Company, L.P. Producing instructions that control three-dimensional printing from voxels
US20190066391A1 (en) * 2017-08-30 2019-02-28 Go Ghost, LLC Method of modifying ray tracing samples after rendering and before rasterizing
US20190061239A1 (en) * 2017-08-22 2019-02-28 Huilin LIU Novel separable 3d printer
US20190061231A1 (en) * 2017-08-31 2019-02-28 Xyzprinting, Inc. 3d printing method using strengthened auxiliary wall
US20190061277A1 (en) * 2017-08-31 2019-02-28 Xyzprinting, Inc. Slicing and printing method for colour 3d physical model with protective film
US20190111590A1 (en) * 2017-10-17 2019-04-18 Autodesk, Inc. Conformal cooling molds with lattice structures for injection molding
US20190122427A1 (en) * 2016-07-26 2019-04-25 Hewlett-Packard Development Company, L.P. Indexing voxels for 3d printing
US20190130525A1 (en) * 2017-03-29 2019-05-02 Zhijing George Mou Methods and systems for real time 3d-space search and point-cloud registration using a dimension-shuffle transform
US20190130642A1 (en) * 2016-05-24 2019-05-02 Technion Research & Development Foundation Limited Systems and methods for generating volumetric models
US20190138670A1 (en) * 2016-04-27 2019-05-09 Within Technologies Ltd. Methods and systems for generating lattice recommendations in computer-aided design applications
US20190180503A1 (en) * 2017-12-12 2019-06-13 Fujitsu Limited Estimation apparatus, estimation method, and non-transitory computer-readable storage medium for storing estimation program
US20190228578A1 (en) * 2016-10-14 2019-07-25 Hewlett-Packard Development Company, L.P. Rebuilding three-dimensional models to provide simplified three-dimensional models
US20190236850A1 (en) * 2016-09-08 2019-08-01 Sony Corporation Information processing device and information processing method
US20190243336A1 (en) * 2016-09-22 2019-08-08 The University Of British Columbia Geometric modelling for facilitating simulation for manufacturing operations
US20190251743A1 (en) * 2016-11-01 2019-08-15 Panasonic Intellectual Property Corporation Of America Display method and display device
US20200105058A1 (en) * 2018-01-30 2020-04-02 Gaia3D, Inc. Method for processing 3d data for use in web service and system using the same
US10661506B2 (en) * 2015-01-30 2020-05-26 Hewlett-Packard Development Company, L.P. Indexing cells of N-dimensional objects

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4888583A (en) * 1988-03-14 1989-12-19 Ligocki Terry J Method and apparatus for rendering an image from data arranged in a constructive solid geometry format
US5506785A (en) * 1993-02-11 1996-04-09 Dover Systems Corporation Method and apparatus for generating hollow and non-hollow solid representations of volumetric data
US7054029B1 (en) * 1999-03-09 2006-05-30 Canon Kabushiki Kaisha Image processing apparatus and method, and storage medium
KR100356016B1 (en) * 1999-12-21 2002-10-18 한국전자통신연구원 Automatic parcel volume capture system and volume capture method using parcel image recognition
US6930682B1 (en) * 2001-11-20 2005-08-16 Hewlett-Packard Development Company, L.P. Subdivision operation and user interface for volume sculpting
JP2004094663A (en) * 2002-08-30 2004-03-25 Fujitsu Ltd Conversion check device, conversion check method, program and storage medium
US7290221B2 (en) * 2003-04-16 2007-10-30 Hewlett-Packard Development Company, L.P. User interface, method and apparatus for providing three-dimensional object fabrication status
US7542622B1 (en) * 2003-06-02 2009-06-02 The Trustees Of Columbia University In The City Of New York Spatio-temporal treatment of noisy images using brushlets
US20050113680A1 (en) * 2003-10-29 2005-05-26 Yoshihiro Ikeda Cerebral ischemia diagnosis assisting apparatus, X-ray computer tomography apparatus, and apparatus for aiding diagnosis and treatment of acute cerebral infarct
JP2005249820A (en) * 2004-03-01 2005-09-15 Seiko Epson Corp Color correcting circuit and image display device with the same
GB2418827B (en) * 2004-09-28 2010-11-10 British Broadcasting Corp Method and system for providing a volumetric representation of a 3-Dimensional object
WO2006069496A1 (en) 2004-12-31 2006-07-06 Fujitsu Limited A search method of 3d model and device thereof
US9251585B2 (en) * 2007-07-12 2016-02-02 Siemens Aktiengesellschaft Coregistration and analysis of multi-modal images obtained in different geometries
US20090027380A1 (en) * 2007-07-23 2009-01-29 Vivek Rajan 3-D visualization
US8190585B2 (en) * 2010-02-17 2012-05-29 Lockheed Martin Corporation Supporting multiple different applications having different data needs using a voxel database
US9020223B2 (en) * 2010-06-16 2015-04-28 A2 Surgical Method for determining bone resection on a deformed bone surface from few parameters
WO2012109658A2 (en) * 2011-02-11 2012-08-16 Emory University Systems, methods and computer readable storage mediums storing instructions for segmentation of medical images
US9105116B2 (en) * 2011-09-22 2015-08-11 Xerox Corporation System and method employing variable size binding elements in virtual rendering of a print production piece
EP2852932A1 (en) 2012-05-22 2015-04-01 Telefónica, S.A. A method and a system for generating a realistic 3d reconstruction model for an object or being
US9224235B2 (en) * 2013-03-20 2015-12-29 Nvidia Corporation System, method, and computer program product for compression of a bounding volume hierarchy
US9248611B2 (en) 2013-10-07 2016-02-02 David A. Divine 3-D printed packaging
US9744726B2 (en) 2013-11-25 2017-08-29 Xerox Corporation 3D print manufacturing of packages with personalized labeling technology
EP3094471B1 (en) * 2014-01-16 2021-06-02 Hewlett-Packard Development Company, L.P. Processing three-dimensional object data of an object to be generated by an additive manufacturing process
US9434108B2 (en) 2014-03-10 2016-09-06 Microsoft Technology Licensing, Llc Fabricating full color three-dimensional objects
US9946816B2 (en) * 2014-03-18 2018-04-17 Palo Alto Research Center Incorporated System for visualizing a three dimensional (3D) model as printed from a 3D printer
US10061870B2 (en) * 2014-03-18 2018-08-28 Palo Alto Research Center Incorporated Automated metrology and model correction for three dimensional (3D) printability
US9747394B2 (en) * 2014-03-18 2017-08-29 Palo Alto Research Center Incorporated Automated design and manufacturing feedback for three dimensional (3D) printability
EP3126122B1 (en) * 2014-03-31 2019-09-04 Hewlett-Packard Development Company, L.P. Generating three-dimensional objects
US9833948B2 (en) 2014-05-08 2017-12-05 Adobe Systems Incorporated 3D printing of colored models on multi-head printers
EP3204921B1 (en) * 2014-10-08 2020-10-07 Hewlett-Packard Development Company, L.P. Diffusing an error in three-dimensional contone model data
EP3234922A4 (en) * 2015-04-21 2018-12-05 Hewlett-Packard Development Company, L.P. Octree serialization
US10474134B2 (en) * 2015-04-29 2019-11-12 University Of Southern California Systems and methods for compensating for 3D shape deviations in additive manufacturing
US10559086B1 (en) * 2015-05-15 2020-02-11 4DMobile, LLC System for volume dimensioning via holographic sensor fusion
WO2017011009A1 (en) * 2015-07-15 2017-01-19 Hewlett-Packard Development Company, L.P. Processing object part data for a three-dimensional object
US10091490B2 (en) * 2015-10-08 2018-10-02 Hewlett-Packard Development Company, L.P. Scan recommendations
WO2017095376A1 (en) * 2015-11-30 2017-06-08 Hewlett-Packard Development Company, L.P. Parameter adjustments based on strength change
US10335995B2 (en) * 2015-12-16 2019-07-02 Xerox Corporation System and method for compensating for dissimilar shrinkage rates in different materials used to form a three-dimensional printed object during additive manufacturing
WO2017200527A1 (en) * 2016-05-16 2017-11-23 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3d object
US10395372B2 (en) * 2016-06-28 2019-08-27 University Of Cincinnati Systems, media, and methods for pre-processing and post-processing in additive manufacturing
WO2018021064A1 (en) * 2016-07-29 2018-02-01 ソニー株式会社 Image processing device and image processing method
US10748326B2 (en) * 2016-08-19 2020-08-18 Movidius Ltd. Rendering operations using sparse volumetric data
JP2018036771A (en) * 2016-08-30 2018-03-08 ローランドディー.ジー.株式会社 Stereoscopic data processing device and stereoscopic data processing program
US10891786B2 (en) * 2016-10-11 2021-01-12 Hewlett-Packard Development Company, L.P. Generating data for a three-dimensional (3D) printable object, including a truss structure
EP3469546B1 (en) * 2016-10-12 2022-04-27 Hewlett-Packard Development Company, L.P. Sub-volume octrees
WO2018136025A1 (en) * 2017-01-17 2018-07-26 Hewlett-Packard Development Company, L.P. Fabricating a replacement component
US11059229B2 (en) * 2017-01-27 2021-07-13 Hewlett-Packard Development Company, L.P. Rules for printing three-dimensional parts
US11232853B2 (en) * 2017-04-21 2022-01-25 Cubisme, Inc. System and method for creating, querying, and displaying a MIBA master file
EP3716218A4 (en) * 2017-11-22 2021-03-31 Panasonic Intellectual Property Corporation of America Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device and three-dimensional data decoding device

Patent Citations (135)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517602A (en) * 1992-12-03 1996-05-14 Hewlett-Packard Company Method and apparatus for generating a topologically consistent visual representation of a three dimensional surface
US5673377A (en) * 1994-09-02 1997-09-30 Ray Dream, Inc. Method and system for displaying a representation of a three-dimensional object with surface features that conform to the surface of the three-dimensional object
US7747055B1 (en) * 1998-11-25 2010-06-29 Wake Forest University Health Sciences Virtual endoscopy with improved image segmentation and lesion detection
US6580426B1 (en) * 1999-03-03 2003-06-17 Canon Kabushiki Kaisha Computer graphics apparatus for processing of data defining a three-dimensional computer model to partition the three-dimensional space into a plurality of sectors
US6429864B1 (en) * 1999-11-10 2002-08-06 Create.It Services Ag Method for traversing a binary space partition or octree and image processor for implementing the method
US20090167763A1 (en) * 2000-06-19 2009-07-02 Carsten Waechter Quasi-monte carlo light transport simulation by efficient ray tracing
US20030052875A1 (en) * 2001-01-05 2003-03-20 Salomie Ioan Alexandru System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US20060290695A1 (en) * 2001-01-05 2006-12-28 Salomie Ioan A System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US20040193392A1 (en) * 2001-02-28 2004-09-30 Williams Richard Andrew Object interaction simulation
US20030001836A1 (en) * 2001-03-12 2003-01-02 Ernst Fabian Edgar Reconstructor for and method of generating a three-dimensional representation and image display apparatus comprising the reconstructor
US20020158867A1 (en) * 2001-04-30 2002-10-31 Bloomenthal Jules I. Method to compute the medial axis/surface of a three-dimensional object
US20020186216A1 (en) * 2001-06-11 2002-12-12 Baumberg Adam Michael 3D computer modelling apparatus
US20020190986A1 (en) * 2001-06-12 2002-12-19 Minolta Co., Ltd. Method, apparatus, and computer program for generating three-dimensional shape data or volume data
US20030035061A1 (en) * 2001-08-13 2003-02-20 Olympus Optical Co., Ltd. Shape extraction system and 3-D (three dimension) information acquisition system using the same
US20040267400A1 (en) * 2001-08-16 2004-12-30 Hitoshi Ohmori Die machining method and device by v-cad data
US20060152510A1 (en) * 2002-06-19 2006-07-13 Jochen Dick Cross-platform and data-specific visualisation of 3d data records
US20050068317A1 (en) * 2002-06-28 2005-03-31 Fujitsu Limited Program, method, and device for comparing three-dimensional images in voxel form
US20060071932A1 (en) * 2002-11-21 2006-04-06 Koninklijke Philips Electronics N.V. Method and apparatus for visualizing a sequence of volume images
US20040170255A1 (en) * 2003-02-27 2004-09-02 Shimadzu Corporation Radiographic X-ray device
US7206987B2 (en) * 2003-04-30 2007-04-17 Hewlett-Packard Development Company, L.P. Error detection and correction in a layered, 3-dimensional storage architecture
US20050017971A1 (en) * 2003-07-24 2005-01-27 Cleve Ard Ray tracing hierarchy
US8040350B2 (en) * 2003-08-18 2011-10-18 Fovia, Inc. Method and system for adaptive direct volume rendering
US20060274065A1 (en) * 2003-08-18 2006-12-07 Georgiy Buyanovskiy Method and system for adaptive direct volume rendering
US7173616B2 (en) * 2003-12-20 2007-02-06 International Business Machines Corporation Method for determining the bounding voxelisation of a 3D polygon
US20050151735A1 (en) * 2003-12-20 2005-07-14 Martijn Boekhorst Method for determining the bounding voxelisation of a 3D polygon
US20060056726A1 (en) * 2004-08-17 2006-03-16 Konica Minolta Medical & Graphic, Inc. Image processing device, and program
US20060147106A1 (en) * 2004-12-22 2006-07-06 Lining Yang Using temporal and spatial coherence to accelerate maximum/minimum intensity projection
US20060274061A1 (en) * 2005-06-02 2006-12-07 Hongwu Wang Four-dimensional volume of interest
US20070014480A1 (en) * 2005-07-13 2007-01-18 General Electric Company Method and apparatus for creating a multi-resolution framework for improving medical imaging workflow
US7898540B2 (en) * 2005-09-12 2011-03-01 Riken Method and program for converting boundary data into cell inner shape data
US20090208075A1 (en) * 2006-05-19 2009-08-20 Koninklijke Philips Electronics N. V. Error adaptive functional imaging
US20080118118A1 (en) * 2006-11-22 2008-05-22 Ralf Berger Method and Apparatus for Segmenting Images
US20080238919A1 (en) * 2007-03-27 2008-10-02 Utah State University System and method for rendering of texel imagery
US20090096787A1 (en) * 2007-04-12 2009-04-16 Fujifilm Corporation Method and apparatus for processing three dimensional images, and recording medium having a program for processing three dimensional images recorded therein
US20080259075A1 (en) * 2007-04-19 2008-10-23 David Keith Fowler Dynamically Configuring and Selecting Multiple Ray Tracing Intersection Methods
US20080292169A1 (en) * 2007-05-21 2008-11-27 Cornell University Method for segmenting objects in images
US20090060309A1 (en) * 2007-08-30 2009-03-05 Canon Kabushiki Kaisha Radiation image processing apparatus and method thereof
US20140313195A1 (en) * 2008-02-29 2014-10-23 Cherif Atia Algreatly 3D Model Mapping
US20090292206A1 (en) * 2008-05-20 2009-11-26 Toshiba Medical Systems Corporation Image processing apparatus and computer program product
US20090312996A1 (en) * 2008-06-13 2009-12-17 Schlumberger Technology Corporation Feedback control using a simulator of a subterranean structure
US8130223B1 (en) * 2008-09-10 2012-03-06 Nvidia Corporation System and method for structuring an A-buffer to support multi-sample anti-aliasing
US20100191757A1 (en) * 2009-01-27 2010-07-29 Fujitsu Limited Recording medium storing allocation control program, allocation control apparatus, and allocation control method
US20130321414A1 (en) * 2009-08-07 2013-12-05 Cherif Atia Algreatly Converting a 3d model into multiple matrices
US20110043521A1 (en) * 2009-08-18 2011-02-24 Dreamworks Animation Llc Ray-aggregation for ray-tracing during rendering of imagery
US20110193859A1 (en) * 2010-02-09 2011-08-11 Samsung Electronics Co., Ltd Apparatus and method for generating octree based 3D map
US20110285710A1 (en) * 2010-05-21 2011-11-24 International Business Machines Corporation Parallelized Ray Tracing
US20110316855A1 (en) * 2010-06-24 2011-12-29 International Business Machines Corporation Parallelized Streaming Accelerated Data Structure Generation
US20120180000A1 (en) * 2011-01-10 2012-07-12 Compal Electronics, Inc. Method and system for simulating three-dimensional operating interface
US20140231266A1 (en) * 2011-07-13 2014-08-21 Nuvotronics, Llc Methods of fabricating electronic and mechanical structures
US20130066812A1 (en) * 2011-09-13 2013-03-14 Stratasys, Inc. Solid identification grid engine for calculating support material volumes, and methods of use
US9076219B2 (en) * 2011-10-07 2015-07-07 Electronics And Telecommunications Research Institute Space segmentation method for 3D point clouds
US9111385B2 (en) * 2011-11-25 2015-08-18 Samsung Electronics Co., Ltd. Apparatus and method for rendering volume data
US20130187903A1 (en) * 2012-01-24 2013-07-25 Pavlos Papageorgiou Image processing method and system
US20150084953A1 (en) * 2012-04-19 2015-03-26 Thomson Licensing Method and apparatus for estimating error metrics for multi-component 3d models
US20140067333A1 (en) * 2012-09-04 2014-03-06 Belcan Corporation CAD-Based System for Product Definition, Inspection and Validation
US20140176545A1 (en) * 2012-12-21 2014-06-26 Nvidia Corporation System, method, and computer program product implementing an algorithm for performing thin voxelization of a three-dimensional model
US10054932B2 (en) * 2013-03-11 2018-08-21 Autodesk, Inc. Techniques for two-way slicing of a 3D model for manufacturing
US20140324204A1 (en) * 2013-04-18 2014-10-30 Massachusetts Institute Of Technology Methods and apparati for implementing programmable pipeline for three-dimensional printing including multi-material applications
US20140327667A1 (en) * 2013-05-02 2014-11-06 Samsung Medison Co., Ltd. Medical imaging apparatus and control method for the same
US20140330796A1 (en) * 2013-05-03 2014-11-06 Nvidia Corporation Compressed pointers for cell structures
US20150138201A1 (en) * 2013-11-20 2015-05-21 Fovia, Inc. Volume rendering color mapping on polygonal objects for 3-d printing
US20150148930A1 (en) * 2013-11-27 2015-05-28 Adobe Systems Incorporated Method and apparatus for preserving structural integrity of 3-dimensional models when printing at varying scales
US20150161786A1 (en) * 2013-12-06 2015-06-11 Siemens Aktiengesellschaft Query-specific generation and retrieval of medical volume images
US20160334964A1 (en) * 2013-12-31 2016-11-17 Samsung Electronics Co., Ltd. User interface system and method for enabling mark-based interaction for images
US9842424B2 (en) * 2014-02-10 2017-12-12 Pixar Volume rendering using adaptive buckets
US20150262416A1 (en) * 2014-03-13 2015-09-17 Pixar Importance sampling of sparse voxel octrees
US20170039759A1 (en) * 2014-04-17 2017-02-09 3D Slash Three dimensional modeling
US20170113414A1 (en) * 2014-08-29 2017-04-27 Hewlett-Packard Development Company, L.P. Generation of three-dimensional objects
US9552664B2 (en) * 2014-09-04 2017-01-24 Nvidia Corporation Relative encoding for a block-based bounding volume hierarchy
US9582607B2 (en) * 2014-09-04 2017-02-28 Nvidia Corporation Block-based bounding volume hierarchy
US20160101570A1 (en) * 2014-10-09 2016-04-14 Autodesk, Inc. Multi-material three dimensional models
US20190030816A1 (en) * 2014-10-09 2019-01-31 Autodesk, Inc. Multi-material three dimensional models
US20170246812A1 (en) * 2014-10-29 2017-08-31 Hewlett-Packard Development Company, L.P. Converting at least a portion of a 3-d object into a format suitable for printing
US20180009168A1 (en) * 2015-01-30 2018-01-11 Hewlett-Packard Development Company, L.P. Three-dimensional object substructures
US20170372513A1 (en) * 2015-01-30 2017-12-28 Hewlett-Packard Development Company, L.P. Generating slice data from a voxel representation
US20170249782A1 (en) * 2015-01-30 2017-08-31 Hewlett-Packard Development Company, L.P. Generating slicing data from a tree data structure
US20180001566A1 (en) * 2015-01-30 2018-01-04 Hewlett-Packard Development Company, L.P. Three-dimensional object substructures
US10661506B2 (en) * 2015-01-30 2020-05-26 Hewlett-Packard Development Company, L.P. Indexing cells of N-dimensional objects
US20170371318A1 (en) * 2015-01-30 2017-12-28 Hewlett-Packard Development Company, L.P. Generating control data for sub-objects
US20170364316A1 (en) * 2015-01-30 2017-12-21 Hewlett-Packard Development Company, L.P. Material volume coverage representation of a three-dimensional object
US20180272429A1 (en) * 2015-02-06 2018-09-27 Dresser-Rand Company Methods for Additive Manufacturing of a Single Piece Piston
US20180264722A1 (en) * 2015-04-14 2018-09-20 Hewlett-Packard Development Company, L.P. Marking build material
US20160303803A1 (en) * 2015-04-14 2016-10-20 Shapeways, Inc. Multi-part counting system for three-dimensional printed parts
US20170365095A1 (en) * 2015-04-16 2017-12-21 Hewlett-Packard Development Company, L.P. Three-dimensional threshold matrix for three-dimensional halftoning
US20180032060A1 (en) * 2015-04-20 2018-02-01 Hewlett-Packard Development Company, L.P. Creating a voxel representation of a three dimensional (3-d) object
US20180052947A1 (en) * 2015-04-23 2018-02-22 Hewlett-Packard Development Company, L.P. Lattice structure representation for a three-dimensional object
US20180052447A1 (en) * 2015-04-28 2018-02-22 Hewlett-Packard Development Company, L.P. Structure using three-dimensional halftoning
US20180300947A1 (en) * 2015-04-30 2018-10-18 Saudi Arabian Oil Company Three-Dimensional Fluid Micromodels
US20160332388A1 (en) * 2015-05-12 2016-11-17 Seoul National University R&Db Foundation Method of forming transparent 3d object and transparent 3d object formed thereby
US20160337549A1 (en) * 2015-05-14 2016-11-17 Xerox Corporation 3d printer steganography
US20160358384A1 (en) * 2015-06-08 2016-12-08 Airbus Operations (S.A.S.) Damage detection and repair system and method using enhanced geolocation
US20170091965A1 (en) * 2015-09-29 2017-03-30 Yandex Europe Ag Method of and system for generating simplified borders of graphical objects
US20180361729A1 (en) * 2015-12-21 2018-12-20 Ord Solutions Inc. Large format 3d printing augmented with 3d scanning and anomaly tracking feedback
US20180299869A1 (en) * 2016-01-14 2018-10-18 Ricoh Company, Ltd. Modeling process device, modeling process system, and medium
US20180307443A1 (en) * 2016-01-29 2018-10-25 Hewlett-Packard Development Company, L.P. Error diffusion
US20180321658A1 (en) * 2016-01-29 2018-11-08 Hewlett-Packard Development Company, L.P. Identify a model that matches a 3d object
US20180365518A1 (en) * 2016-03-29 2018-12-20 Tencent Technology (Shenzhen) Company Limited Target object presentation method and apparatus
US20190138670A1 (en) * 2016-04-27 2019-05-09 Within Technologies Ltd. Methods and systems for generating lattice recommendations in computer-aided design applications
US20170323436A1 (en) * 2016-05-06 2017-11-09 L-3 Communications Security & Detection Systems, Inc. Systems and methods for generating projection images
US20190130642A1 (en) * 2016-05-24 2019-05-02 Technion Research & Development Foundation Limited Systems and methods for generating volumetric models
US20170368755A1 (en) * 2016-06-22 2017-12-28 Massachusetts Institute Of Technology Methods and Apparatus for 3D Printing of Point Cloud Data
US20190122427A1 (en) * 2016-07-26 2019-04-25 Hewlett-Packard Development Company, L.P. Indexing voxels for 3d printing
US10394221B2 (en) * 2016-08-12 2019-08-27 Microsoft Technology Licensing, Llc 3D printing using 3D video data
US20180046167A1 (en) * 2016-08-12 2018-02-15 Microsoft Technology Licensing, Llc 3D Printing Using 3D Video Data
US20180281294A1 (en) * 2016-08-26 2018-10-04 Wacker Chemie Ag Method for producing shaped bodies
US20190236850A1 (en) * 2016-09-08 2019-08-01 Sony Corporation Information processing device and information processing method
US20190243336A1 (en) * 2016-09-22 2019-08-08 The University Of British Columbia Geometric modelling for facilitating simulation for manufacturing operations
US20190039368A1 (en) * 2016-09-26 2019-02-07 Hewlett-Packard Development Company, L.P. 3D Print Selection Based On Voxel Property Association and Conflict Resolution
US20190228578A1 (en) * 2016-10-14 2019-07-25 Hewlett-Packard Development Company, L.P. Rebuilding three-dimensional models to provide simplified three-dimensional models
US20190251743A1 (en) * 2016-11-01 2019-08-15 Panasonic Intellectual Property Corporation Of America Display method and display device
US20180151254A1 (en) * 2016-11-30 2018-05-31 Electronics And Telecommunications Research Institute High-speed similar case search method and device through reduction of large scale multi-dimensional time series health data to multiple dimensions
US20180276887A1 (en) * 2016-12-16 2018-09-27 University Of Manitoba Medial Axis Extraction for Complex 3D Objects
US20190056716A1 (en) * 2017-01-25 2019-02-21 Hewlett-Packard Development Company, L.P. Producing instructions that control three-dimensional printing from voxels
US20190056717A1 (en) * 2017-01-27 2019-02-21 Hewlett-Packard Development Company, L.P. Predicting distributions of values of layers for three-dimensional printing
US20180240269A1 (en) * 2017-02-20 2018-08-23 My Virtual Reality Software As Method for visualizing three dimensional data
US20180276316A1 (en) * 2017-03-23 2018-09-27 Autodesk, Inc. Creating gradients of different materials for three-dimensional models in computer aided design applications
US20190130525A1 (en) * 2017-03-29 2019-05-02 Zhijing George Mou Methods and systems for real time 3d-space search and point-cloud registration using a dimension-shuffle transform
US20180307207A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180307206A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180304550A1 (en) * 2017-04-24 2018-10-25 Autodesk, Inc. Closed-loop robotic deposition of material
US20180319087A1 (en) * 2017-05-08 2018-11-08 Autodesk, Inc. Estimating physical property of 3d printed parts
US20180348735A1 (en) * 2017-06-02 2018-12-06 Autodesk, Inc. Agent-based slicing
US20180349220A1 (en) * 2017-06-05 2018-12-06 International Business Machines Corporation Proximity correction in three-dimensional manufacturing
US20180354196A1 (en) * 2017-06-07 2018-12-13 Xyzprinting, Inc. Warpage prevented printing method of 3d printer
US20180373227A1 (en) * 2017-06-22 2018-12-27 Autodesk, Inc. Building and attaching support structures for 3d printing
US20190001574A1 (en) * 2017-06-30 2019-01-03 Autodesk, Inc. Systems and methods for determining dynamic forces in a liquefier system in additive manufacturing
US20190054700A1 (en) * 2017-08-15 2019-02-21 Cincinnati Incorporated Machine learning for additive manufacturing
US20190061239A1 (en) * 2017-08-22 2019-02-28 Huilin LIU Novel separable 3d printer
US20190066391A1 (en) * 2017-08-30 2019-02-28 Go Ghost, LLC Method of modifying ray tracing samples after rendering and before rasterizing
US20190061277A1 (en) * 2017-08-31 2019-02-28 Xyzprinting, Inc. Slicing and printing method for colour 3d physical model with protective film
US20190061231A1 (en) * 2017-08-31 2019-02-28 Xyzprinting, Inc. 3d printing method using strengthened auxiliary wall
US20190111590A1 (en) * 2017-10-17 2019-04-18 Autodesk, Inc. Conformal cooling molds with lattice structures for injection molding
US20190180503A1 (en) * 2017-12-12 2019-06-13 Fujitsu Limited Estimation apparatus, estimation method, and non-transitory computer-readable storage medium for storing estimation program
US20200105058A1 (en) * 2018-01-30 2020-04-02 Gaia3D, Inc. Method for processing 3d data for use in web service and system using the same
US20200167997A1 (en) * 2018-01-30 2020-05-28 Gaia3D, Inc. Method of providing 3d gis web service

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11043042B2 (en) * 2016-05-16 2021-06-22 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3D object
US20210109499A1 (en) * 2016-10-27 2021-04-15 Desprez Llc System and method for generating a quote for fabrication of a part to be fabricated
US11754996B2 (en) * 2016-10-27 2023-09-12 Desprez Llc System and method for generating fabrication parameter of a part to be fabricated
US20230376005A1 (en) * 2016-10-27 2023-11-23 Desprez, Llc System and method for generating a quote for fabrication of a part to be fabricated
US11282288B2 (en) 2019-11-20 2022-03-22 Shape Matrix Geometric Instruments, LLC Methods and apparatus for encoding data in notched shapes
US11900629B2 (en) * 2019-12-11 2024-02-13 Nvidia Corporation Surface profile estimation and bump detection for autonomous machine
CN113414987A (en) * 2021-06-23 2021-09-21 哈尔滨理工大学 3D printing self-adaptive layering thickness method
CN116681841A (en) * 2023-08-03 2023-09-01 中国科学院长春光学精密机械与物理研究所 Quality evaluation method for tomographic reconstruction and storage medium

Also Published As

Publication number Publication date
WO2017200527A1 (en) 2017-11-23
US11043042B2 (en) 2021-06-22

Similar Documents

Publication Publication Date Title
US11043042B2 (en) Generating a shape profile for a 3D object
CN107635750B (en) Method and apparatus for determining the arrangement of components to be printed in a build envelope
KR102527255B1 (en) Ortho-projection-based texture atlas packing of 3D meshes
US10656625B2 (en) Method and apparatus for preserving structural integrity of 3-dimensional models when printing at varying scales
US11335074B2 (en) Arrangement determination for 3D fabricated parts
US9972128B2 (en) Methods and systems for generating polycubes and all-hexahedral meshes of an object
US10417821B2 (en) Method of simplifying a geometry model
US9390556B2 (en) Systems and methods for generating a large scale polygonal mesh
US10186037B2 (en) Object data representations for additive manufacturing
US10839598B2 (en) Indexing voxels for 3D printing
Lozano et al. An efficient algorithm to generate random sphere packs in arbitrary domains
US20190137974A1 (en) Method, assistance system and 3d-printer for computer-aided design of objects for additive manufacturing
US9311744B2 (en) System and method for generating an outer layer representation of an object
Wang et al. Cost-effective printing of 3D objects with self-supporting property
US11221609B2 (en) Determining object volumes in virtual object space
CN114043726B (en) Method and apparatus for 3D printing, storage medium, and program product
CN116301673A (en) Print volume estimation method, apparatus and readable medium for three-dimensional print model
CN110832551A (en) Associating object property data with a location
US11328100B2 (en) Regular grid recognition in a CAD model
Conti et al. Generation of oriented three‐dimensional Delaunay grids suitable for the control volume integration method
JP2005078416A (en) Method, device and program for generating analysis model and its recording medium
US10035297B2 (en) Apparatus and method for generating bitmap of 3-dimensional model
CN117799171A (en) Color three-dimensional model printing method and device and electronic equipment
Rodrigues et al. ECLES: A general method for local editing of parameters with linear constraints
WO2022081173A1 (en) 3d fabricated objects with lattice structures having tubes

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: HP PRINTING AND COMPUTING SOLUTIONS, S.L.U., SPAIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CORTES I HERMS, SEBASTIA;REEL/FRAME:047940/0435

Effective date: 20181109

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DEL ANGEL, ANA PATRICIA;ZENG, JUN;WHITE, SCOTT A;AND OTHERS;SIGNING DATES FROM 20160513 TO 20181116;REEL/FRAME:047940/0506

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE