WO2022227357A1 - Slicing method, printing method, apparatus and device for a textured three-dimensional model - Google Patents
- Publication number
- WO2022227357A1 (PCT/CN2021/115458)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- texture
- textured
- vertex
- model
- polygon
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B22—CASTING; POWDER METALLURGY
- B22F—WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
- B22F10/00—Additive manufacturing of workpieces or articles from metallic powder
- B22F10/80—Data acquisition or data processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
- B29C64/393—Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
- B33Y50/02—Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/008—Cut plane or projection plane definition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- the embodiments of the present application relate to the technical field of three-dimensional printing, and in particular, to a slicing method, a printing method, an apparatus, and a device for mapping a three-dimensional model.
- the 3D printing system can print based on the 3D model designed by the designer to obtain the corresponding 3D object.
- the data of the 3D model can be sliced, each slice can define a part of a material layer, and the material layer will be solidified or agglomerated by the 3D printing system based on the relevant definition of the slice.
- existing slicing algorithms cannot slice a textured 3D model, so the textured 3D model cannot be printed in color; users must manually color the textured 3D model according to their needs, the printing integrity of the textured 3D model is poor, the efficiency is low, and user needs cannot be met.
- the embodiments of the present application provide a slicing method, printing method, device and device for a textured 3D model.
- in the slicing method for a textured 3D model, the color corresponding to the contour area of each slice plane is determined from the texture information in the model file, thereby realizing color printing of the textured 3D model with high printing integrity and high efficiency.
- an embodiment of the present application provides a method for slicing a textured 3D model, the method comprising: parsing a textured 3D model file to obtain each polygonal patch used to represent the textured 3D model, wherein at least one of the polygonal patches includes the texture information of the textured 3D model; slicing each of the polygonal patches with a slice plane to obtain at least one outer contour line corresponding to the slice plane; obtaining a contour area according to the at least one outer contour line and filling the contour area; and generating, according to the texture information, full-color image data for coloring the filled contour area.
- obtaining the contour area according to the at least one outer contour line includes: shrinking each of the outer contour lines based on the set texture thickness to obtain an inner contour line; and performing an XOR operation on each outer contour line and its corresponding inner contour line to obtain each of the contour areas.
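The shrink-and-XOR construction above can be sketched on a rasterized grid; a minimal illustration in which the grid size, the margins, and the texture thickness in cells are all hypothetical values, not parameters from the patent:

```python
def filled_square(size, margin):
    """Boolean grid that is True inside a square inset by `margin` cells."""
    return [[margin <= x < size - margin and margin <= y < size - margin
             for x in range(size)] for y in range(size)]

def xor_regions(a, b):
    """Cell-wise XOR of two filled regions of equal size."""
    return [[ca != cb for ca, cb in zip(ra, rb)] for ra, rb in zip(a, b)]

texture_thickness = 2                             # in grid cells (illustrative)
outer = filled_square(10, 1)                      # region bounded by the outer contour line
inner = filled_square(10, 1 + texture_thickness)  # region after shrinking by the texture thickness
ring = xor_regions(outer, inner)                  # the contour area: a ring of the set thickness
```

Every True cell of `ring` lies inside the outer contour but outside the inner one, which is exactly the ring-shaped contour area of the set texture thickness.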
- the method further includes: for each outer contour line, based on the set texture thickness and the set slice thickness, obtaining the associated outer contour lines corresponding to a preset number of slice planes above and below the slice plane corresponding to that outer contour line.
- performing an XOR operation on each of the outer contour lines and their corresponding inner contour lines to obtain each of the contour areas includes: performing an XOR operation on the inner contour line with each of the associated outer contour lines respectively, and then performing an XOR operation with the corresponding outer contour line to obtain each of the contour areas.
- generating full-color picture data for coloring the filled contour area according to the texture information includes: mapping the filled contour area to the picture according to the resolution of the picture, to obtain the pixel coordinates of each pixel of the filled contour area in the picture; determining the spatial coordinates of each point of the filled contour area according to the pixel coordinates corresponding to each point of the filled contour area; determining the color value of each pixel of the filled contour area in the picture according to the spatial coordinates of each point in the filled contour area and the texture information; and generating, according to the color value of each pixel of the filled contour area in the picture, the full-color picture data for coloring the filled contour area.
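The pixel-to-space mapping step can be sketched as follows; `origin` and `pixel_size` are hypothetical parameters derived from the picture resolution and the printable area, not values taken from the patent:

```python
def space_to_pixel(x, y, origin, pixel_size):
    """Map a model-space XY point to integer pixel coordinates in the slice picture."""
    return (round((x - origin[0]) / pixel_size), round((y - origin[1]) / pixel_size))

def pixel_to_space(px, py, origin, pixel_size):
    """Map a pixel back to model-space XY, where the texture color is sampled."""
    return (origin[0] + px * pixel_size, origin[1] + py * pixel_size)

origin = (-50.0, -50.0)   # lower-left corner of the printable area, in mm (illustrative)
pixel_size = 0.1          # mm per pixel, i.e. the picture resolution (illustrative)
```

Round-tripping a point through `space_to_pixel` and `pixel_to_space` recovers the original coordinates up to the pixel size, which is what allows the color value at each pixel to be looked up in the texture.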
- parsing the textured 3D model file includes: acquiring the textured 3D model file, where the textured 3D model file includes a texture library of the textured 3D model, the vertex coordinates of each vertex, and a polygon index; determining the spatial coordinates of each vertex of each of the polygon patches according to the vertex coordinates and the polygon index; and traversing the texture library according to the polygon index to obtain the texture information of each vertex of each of the polygon patches.
- traversing the texture library according to the polygon index to obtain the texture information of each vertex of each of the polygon patches includes:
- slicing each of the polygonal patches with the slice plane to obtain at least one outer contour line corresponding to the slice plane includes: performing an intersection operation on each of the polygonal patches and the slice plane to obtain each intersection point; and connecting the intersection points end to end to obtain the at least one outer contour line corresponding to the slice plane.
- before the intersection operation is performed on the polygon patches and the slice plane to obtain each intersection point, the method further includes: grouping and sorting each of the polygon patches according to the spatial coordinates, in the layering direction, corresponding to the polygon patches, to obtain a grouping relation matrix, wherein the grouping relation matrix includes each group of the polygon patches corresponding to each of the slice planes.
- performing an intersection operation on the polygon patches and the slice plane to obtain each intersection point includes: performing an intersection operation on each group of the polygon patches and the corresponding slice plane to obtain each intersection point.
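One plausible reading of the grouping relation matrix is a per-slice-plane list of candidate patches, bucketed by each patch's extent along the layering (here Z) direction, so that each slice plane only intersects its own group. A hedged sketch with illustrative names and values:

```python
def group_patches(patches, z0, layer_h, n_layers):
    """For each slice plane index k (the plane at z0 + k * layer_h), list the
    indices of the triangular patches whose Z-range spans that plane."""
    groups = [[] for _ in range(n_layers)]
    for i, tri in enumerate(patches):
        zs = [v[2] for v in tri]
        lo = max(0, int((min(zs) - z0) // layer_h))
        hi = min(n_layers - 1, int((max(zs) - z0) // layer_h))
        for k in range(lo, hi + 1):
            groups[k].append(i)
    return groups

# two triangles: one spanning z in [0, 2], one lying flat near z = 0.5
patches = [((0, 0, 0), (1, 0, 0), (0, 1, 2)),
           ((0, 0, 0.5), (1, 0, 0.5), (0, 1, 0.5))]
groups = group_patches(patches, z0=0.0, layer_h=1.0, n_layers=3)
```

Intersecting each slice plane only with its group avoids testing every patch against every plane.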
- the method further includes: sampling each of the polygonal patches to obtain a point cloud information set, wherein the point cloud information set includes the spatial coordinates and texture information of each sampling point.
- generating the full-color image data for coloring the filled contour area according to the texture information includes: determining the color value of each point in the filled contour area according to the point cloud information set and the spatial coordinates corresponding to each point in the filled contour area; and generating, according to the color value of each point in the filled contour area, the full-color image data for coloring the filled contour area.
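One simple way to realize the point-cloud color lookup described above is a nearest-neighbour search over the sampled points; a hypothetical sketch in which the `(xyz, rgb)` layout of `cloud` is an assumption, not the patent's data structure:

```python
def color_at(point, cloud):
    """Nearest-neighbour lookup: `cloud` is a list of (xyz, rgb) sample tuples
    taken from the polygonal patches; returns the rgb of the closest sample."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(cloud, key=lambda sample: dist2(sample[0], point))[1]

cloud = [((0.0, 0.0, 0.0), (255, 0, 0)),   # a red sample point
         ((1.0, 1.0, 1.0), (0, 0, 255))]   # a blue sample point
```

A real implementation would index the cloud (e.g. with a spatial tree) rather than scan it linearly, but the lookup per contour point is the same.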
- the method further includes: for each polygonal patch, determining each adjacent polygonal patch of that polygonal patch, wherein an adjacent polygonal patch shares one edge with the polygonal patch; and generating a polygon topology structure according to each polygonal patch and each corresponding adjacent polygonal patch.
- slicing each of the polygonal patches with the slice plane to obtain at least one outer contour line corresponding to the slice plane includes: slicing each of the polygonal patches with the slice plane to obtain the intersection points of the slice plane and each of the polygonal patches; and determining the connection sequence of the intersection points according to the polygon topology structure, so as to obtain the at least one outer contour line corresponding to the slice plane.
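A common way to build such a polygon topology structure is an edge-to-patch map in which two patches sharing an edge are adjacent; a hedged sketch for triangular patches (names are illustrative):

```python
def build_topology(faces):
    """Map each undirected edge (a sorted vertex-index pair) to the indices of
    the polygonal patches that share it; patches sharing an edge are adjacent."""
    edge_to_faces = {}
    for fi, (a, b, c) in enumerate(faces):
        for e in ((a, b), (b, c), (c, a)):
            edge_to_faces.setdefault(tuple(sorted(e)), []).append(fi)
    return edge_to_faces

# two triangles sharing the edge (1, 2)
faces = [(0, 1, 2), (1, 2, 3)]
topology = build_topology(faces)
```

When chaining intersection points into an outer contour line, the next point after the one produced by a patch is found on one of that patch's adjacent patches, which fixes the connection sequence without any search.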
- filling the outline area includes: filling the outline area based on an inverse color filling algorithm.
- an embodiment of the present application further provides a printing method for a textured 3D model, the method comprising: acquiring slice image data, wherein the slice image data includes the full-color picture data generated by the slicing method for a textured 3D model provided by any embodiment corresponding to the first aspect of the present application; and generating printing control data according to the slice image data, so as to print the textured 3D model based on the printing control data.
- an embodiment of the present application further provides a slicing apparatus for a textured 3D model, the apparatus comprising: a parsing module for parsing a textured 3D model file to obtain each polygonal patch used to represent the textured 3D model, wherein at least one of the polygonal patches includes texture information of the textured 3D model; a contour line acquisition module for slicing each of the polygonal patches with a slice plane to obtain at least one outer contour line corresponding to the slice plane; a contour filling module for obtaining the contour area according to the at least one outer contour line and filling the contour area; and a full-color image generation module for generating, according to the texture information, full-color image data for coloring the filled contour area.
- the contour filling module includes: a contour area acquiring unit, configured to obtain a contour area according to the at least one outer contour line; and a filling unit, configured to fill the contour area.
- the contour area acquiring unit includes: a shrinking subunit for shrinking each of the outer contour lines based on the set texture thickness to obtain an inner contour line; and a contour area acquiring subunit for performing an XOR operation on each outer contour line and its corresponding inner contour line to obtain each of the contour areas.
- the apparatus further includes: an associated contour obtaining module configured to, after each of the outer contour lines is shrunk based on the set texture thickness to obtain the inner contour line, obtain, for each outer contour line, based on the set texture thickness and the set slice thickness, the associated outer contour lines corresponding to a preset number of slice planes above and below the slice plane corresponding to that outer contour line.
- the contour area acquisition subunit is specifically configured to: perform an XNOR operation on the inner contour line with each of the associated outer contour lines respectively, and then perform an XOR operation with the corresponding outer contour line to obtain each of the contour areas.
- the full-color picture generation module is specifically configured to: map the filled contour area to the picture according to the resolution of the picture, to obtain the pixel coordinates of each pixel of the filled contour area in the picture; determine, according to the texture information, the color value of each pixel of the filled contour area in the picture; and generate, according to the color value of each pixel of the filled contour area in the picture, the full-color picture data for coloring the filled contour area.
- the parsing module includes: a file acquisition unit for acquiring the textured 3D model file, wherein the textured 3D model file includes a texture library of the textured 3D model, the vertex coordinates of each vertex, and a polygon index; an indexing unit configured to determine the spatial coordinates of each vertex of each of the polygon patches according to the vertex coordinates and the polygon index; and a texture information determination unit configured to traverse the texture library according to the polygon index to obtain the texture information of each vertex of each of the polygon patches.
- the texture information determination unit is specifically configured to:
- the contour line obtaining module includes: an intersection obtaining unit configured to perform an intersection operation between each of the polygonal patches and the slice plane to obtain each intersection point; and a contour line obtaining unit for connecting the intersection points end to end to obtain the at least one outer contour line corresponding to the slice plane.
- the apparatus further includes: a grouping and sorting module configured to, before the intersection operation is performed on the polygon patches and the slice plane to obtain each intersection point, group and sort each of the polygonal patches according to the spatial coordinates, in the layering direction, corresponding to the polygon patches, to obtain a grouping relationship matrix, wherein the grouping relationship matrix includes each group of the polygonal patches corresponding to each of the slice planes.
- the intersection obtaining unit is specifically configured to: perform an intersection operation between each group of the polygon patches and the corresponding slice plane to obtain each intersection point.
- the apparatus further includes: a point cloud information set acquisition module configured to, after each polygonal patch used to represent the textured 3D model is acquired, sample each of the polygonal patches to obtain a point cloud information set, wherein the point cloud information set includes the spatial coordinates and texture information of each sampling point.
- the full-color image generation module is specifically configured to: determine the color value of each point in the filled contour area according to the point cloud information set and the spatial coordinates corresponding to each point in the filled contour area; and generate, according to the color value of each point in the filled contour area, the full-color image data for coloring the filled contour area.
- the apparatus further includes: a topology generation module configured to, after each polygonal patch used to represent the textured three-dimensional model is obtained, determine, for each polygonal patch, each adjacent polygonal patch of that polygonal patch, wherein an adjacent polygonal patch shares one edge with the polygonal patch, and generate a polygon topology structure according to each polygonal patch and each corresponding adjacent polygonal patch.
- the contour line acquisition module is specifically configured to: slice each of the polygonal patches with the slice plane to obtain the intersection points of the slice plane and each of the polygonal patches; and determine the connection sequence of the intersection points according to the polygon topology structure, so as to obtain the at least one outer contour line corresponding to the slice plane.
- the filling unit is specifically used for: filling the contour area based on an inverse color filling algorithm.
- an embodiment of the present application further provides a three-dimensional printing apparatus, including a memory and at least one processor; the memory stores computer-executable instructions, and the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the method for slicing a textured 3D model provided by any embodiment of the present application.
- embodiments of the present application further provide a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium and, when executed by a processor, are used to implement the method for slicing a textured 3D model provided by any embodiment of the present application.
- an embodiment of the present application further provides a computer program product, including a computer program, which, when executed by a processor, implements the method for slicing a textured 3D model provided by any embodiment of the present application.
- in the embodiments of the present application, each polygon patch is obtained by parsing the textured 3D model file, at least one polygon patch including the texture information of the textured 3D model; the slice plane is then used to slice each polygonal patch, and the intersection points are connected end to end to obtain the outer contour line corresponding to the slice plane; the outer contour line is shrunk to obtain the contour area and the contour area is filled; full-color image data for coloring each contour area is then generated based on the above texture information, and 3D printing is performed based on the full-color image data.
- automatic color printing improves the printing efficiency and quality of textured 3D models.
- FIG. 1 is an application scenario diagram of a method for slicing a textured three-dimensional model provided by an embodiment of the present application;
- FIG. 2 is a flowchart of a method for slicing a textured 3D model according to an embodiment of the present application
- FIG. 3 is a schematic diagram of a polygonal patch in the embodiment shown in FIG. 2 of the application;
- FIG. 4 is a schematic diagram of the intersection of the polygonal patch P and the slice plane in the embodiment shown in FIG. 3 of the application;
- FIG. 5 is a flowchart of step S201 in the embodiment shown in FIG. 2 of the present application.
- FIG. 6 is a schematic diagram of polygon patch sampling in the embodiment shown in FIG. 3 of the present application.
- FIG. 7 is a flowchart of a method for slicing a textured 3D model provided by another embodiment of the present application.
- FIG. 8 is a schematic diagram of a contour area in the embodiment shown in FIG. 7 of the present application.
- FIG. 9 is a schematic diagram of the contour area in the embodiment shown in FIG. 7 of the application.
- FIG. 10 is a flowchart of a method for printing a textured 3D model provided by an embodiment of the application
- FIG. 11 is a schematic structural diagram of a slicing apparatus for a textured three-dimensional model provided by an embodiment of the present application;
- FIG. 12 is a schematic structural diagram of a three-dimensional printing apparatus according to an embodiment of the present application.
- FIG. 1 is an application scenario diagram of the slicing method for a textured three-dimensional model provided by an embodiment of the present application.
- a three-dimensional printing system 100 generally includes a modeling platform 110 and a three-dimensional printer 120; modeling software, such as 3dmax software, and slicing software, such as s3d software, are installed on the modeling platform 110; the designer can generate 3D model data through the modeling software, the slicing software then slices the 3D model data into printing data that the 3D printer 120 can recognize and execute, and printing of the 3D model is thus realized.
- the modeling software and/or the slicing software can also be integrated in the 3D printer to realize functions such as modeling, slicing, and printing in an integrated manner.
- for a 3D model without texture mapping, such as a white 3D model, the model is often printed first and then manually colored or mapped to obtain a colored 3D model.
- the printing efficiency of such a textured 3D model is poor, the texturing takes a long time, and it is impossible to directly print the textured 3D model in color.
- the slicing method for a textured 3D model obtains each polygonal patch, including its texture information, by parsing the textured 3D model file; each polygonal patch is sliced to obtain each outer contour line, the contour area is then obtained from the outer contour line, and the above texture information is combined to generate the full-color image data for the contour area, so that full-color printing of the textured 3D model is realized based on the full-color image data corresponding to each contour area, improving the degree of automation and the efficiency of textured 3D model printing.
- FIG. 2 is a flowchart of a method for slicing a textured 3D model according to an embodiment of the present application.
- the slicing method for a textured 3D model can be applied to a 3D printing device or a 3D printing system. As shown in FIG. 2 , the slicing method for a textured 3D model provided by this embodiment includes the following steps:
- Step S201 parsing the textured 3D model file to obtain each polygonal patch used to represent the textured 3D model.
- At least one of the polygon patches includes texture information of the textured 3D model.
- texture information is information used to describe the color of each point, such as a grayscale value or a color value, and can be represented by color models such as RGB and YUV.
- the textured 3D model file may be stored in the memory of the 3D printing device, or externally transmitted to the 3D printing device, for example, saved in the memory of the 3D printing device through a network, an interface, or the like.
- the textured 3D model file can define at least part of the 3D geometric model of the object, including the shape and extent, in a 3D coordinate system, of at least a part of the object, such as its solid part.
- the polygon patch is used to describe the texture and outline of the surface of the textured 3D model, and the polygon patch may include spatial coordinates and texture information of the corresponding part of the textured 3D model.
- the shape of the polygonal patch can be a triangle, a quadrilateral or the like.
- the keywords to be parsed in the textured 3D model file include vertex space coordinate v, vertex texture coordinate vt, polygon index f, texture library mtllib, diffuse light kd, and the like.
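These keywords follow the Wavefront OBJ convention, so the parsing step might be sketched as a minimal reader of the `v` / `vt` / `f` records (an illustrative sketch, not the patent's actual parser; `mtllib` and `kd` handling is omitted):

```python
def parse_obj(lines):
    """Minimal parse of the v / vt / f records of a Wavefront-OBJ-style file."""
    verts, uvs, faces = [], [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'v':
            verts.append(tuple(float(s) for s in parts[1:4]))
        elif parts[0] == 'vt':
            uvs.append(tuple(float(s) for s in parts[1:3]))
        elif parts[0] == 'f':
            # each face entry is "v_index/vt_index", 1-based in OBJ files
            faces.append([tuple(int(i) - 1 for i in p.split('/')[:2])
                          for p in parts[1:]])
    return verts, uvs, faces

sample = ["v 0 0 0", "v 1 0 0", "v 0 1 0",
          "vt 0 0", "vt 1 0", "vt 0 1",
          "f 1/1 2/2 3/3"]
verts, uvs, faces = parse_obj(sample)
```

The polygon index `f` ties each vertex of a patch to both its spatial coordinates and its texture coordinates, which is what the texture mapping of the next paragraph relies on.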
- a texture mapping between the spatial coordinates of each point on each polygonal patch of the textured 3D model and the corresponding texture information can be established, thereby facilitating the determination of the texture information of each point in subsequent steps.
- Step S202 using the slice plane to slice each of the polygonal patches to obtain at least one outer contour line corresponding to the slice plane.
- the slice direction corresponding to the slice plane may be the X, Y or Z axis direction, and there is a certain interval between adjacent slice planes.
- Each point on the outer contour line is the intersection of the corresponding slice plane and the polygonal patch.
- the slice planes define the slices to be generated, each fabricated layer by layer as a single material layer, and the contour defines the shape, extent, and texture information of the solid part of the slice.
- the above-mentioned polygon patches are sliced through each slice plane to obtain the intersection points of the slice plane and the polygon patches, and each intersection point is connected end to end to obtain each outer contour line corresponding to each slice plane.
- one slice plane may correspond to one outer contour line, or may correspond to multiple outer contour lines.
- slicing each of the polygonal patches based on the slice plane to obtain at least one outer contour line includes: obtaining each intersection of the slice plane with each polygon patch, and connecting the intersection points end to end to obtain the outer contour line corresponding to the polygon patches and the slice plane.
- FIG. 3 is a schematic diagram of a polygonal patch in the embodiment shown in FIG. 2 of the present application.
- taking a triangular patch as an example of a polygonal patch, the triangular patch P includes three vertices P0, P1 and P2;
- the spatial coordinates of vertex P0 are (X0, Y0, Z0) and its texture coordinates are (U0, V0);
- the spatial coordinates of P1 are (X1, Y1, Z1) and its texture coordinates are (U1, V1);
- the spatial coordinates of P2 are (X2, Y2, Z2) and its texture coordinates are (U2, V2);
- FIG. 4 shows the polygonal patch P being sliced by the slice plane;
- the slice direction of the slice plane in FIG. 4 is the Z-axis direction, and the polygonal patch P intersects the slice plane Q at point M and point N; connecting the intersections of the slice plane with the multiple polygonal patches end to end yields the outer contour line corresponding to the slice plane.
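Intersection points such as M and N can be obtained by linearly interpolating along each triangle edge that crosses the slice plane; interpolating the texture coordinates with the same parameter carries the texture information onto the outer contour line. A sketch, packing each vertex as (x, y, z, u, v):

```python
def edge_plane_intersection(pa, pb, z):
    """Intersect the segment pa->pb with the plane Z = z.
    Points are (x, y, z, u, v) tuples; returns the interpolated point, or None
    if the edge does not cross the plane."""
    za, zb = pa[2], pb[2]
    if za == zb or (za - z) * (zb - z) > 0:
        return None                      # edge parallel to, or on one side of, the plane
    t = (z - za) / (zb - za)             # interpolation parameter along the edge
    return tuple(a + t * (b - a) for a, b in zip(pa, pb))

p0 = (0.0, 0.0, 0.0, 0.0, 0.0)           # vertex with spatial and texture coordinates
p1 = (2.0, 0.0, 2.0, 1.0, 1.0)
m = edge_plane_intersection(p0, p1, 1.0)  # intersection with the plane Z = 1
```

Applying this to the two crossing edges of a triangle yields the two points (M and N in FIG. 4) contributed by that patch to the outer contour line.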
- Step S203 obtaining a contour area according to the at least one outer contour line, and filling the contour area.
- the outer contour line defines the texture representation at the texture thickness of only a single voxel; describing the textured 3D model with the contour line alone is incomplete, so the contour line needs to be expanded to obtain the corresponding contour area.
- the contour area defines the texture representation of the textured 3D model corresponding to the slice with the set texture thickness. The contour area is used to describe the textured 3D model, which ensures the integrity of the textured pattern and improves the printing quality.
- the texture thicknesses corresponding to different contour regions may be the same, that is, each contour region corresponds to a constant texture thickness.
- the texture thicknesses corresponding to different contour areas may be different, that is, each outer contour line is expanded into each contour area with variable texture thickness.
- the filled contour area is the outer layer corresponding to the texture of the textured 3D model.
- the contour area may be filled based on a preset algorithm, wherein the preset algorithm may be an algorithm such as a direction-parallel filling algorithm, a contour-parallel filling algorithm, and an inverse color filling algorithm.
- each contour area can be filled based on an inverse color filling algorithm.
- the specific steps are: distinguish the contour area from the non-contour area, color-fill the contour area, and fill the non-contour area corresponding to the contour area, namely the filling area, with the inverse of the color of the contour area.
- the contour area may be filled with black, and the non-contour area may be filled with white.
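As a sketch of the inverse-color filling just described, the contour area can be rasterized to a boolean mask, with the contour filled black and its complement filled with the inverse color, white. The mask representation and the 0/255 gray values are assumptions for illustration only.

```python
def inverse_color_fill(mask):
    """mask[y][x] is True for pixels inside the contour area.

    Returns a grayscale raster: the contour area is filled black (0)
    and the complementary non-contour (filling) area is filled with
    the inverse color, white (255).
    """
    return [[0 if inside else 255 for inside in row] for row in mask]
```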
- Step S204 generating full-color image data for coloring the filled contour area according to the texture information.
- the texture information of each point can be determined from the spatial coordinates of each point in the contour area, and the full-color image data can then be generated from that texture information; coloring the contour area based on the full-color image data and performing 3D printing yields a full-color textured 3D model.
- the method further includes: determining the size and resolution of the full-color picture according to hardware parameters of the 3D printing system, where the hardware parameters include the size of the printable area, the resolution of the 3D printer, and the arrangement and placement of the inkjet heads and print heads. The contour area is then mapped onto the full-color picture based on its size and resolution to obtain the full-color picture data corresponding to the picture.
- each polygon patch is obtained by parsing the textured 3D model file, with at least one polygon patch including the texture information of the textured 3D model; each polygonal patch is then sliced with the slice plane, and the intersection points are connected end to end to obtain the outer contour line corresponding to the slice plane. Shrinking the outer contour line yields the contour area, which is then filled.
- full-color image data for coloring each contour area is generated, and 3D printing based on that full-color image data directly produces the colored, textured 3D model entity, realizing automatic full-color printing of textured 3D models and improving printing efficiency and quality.
- FIG. 5 is a flowchart of step S201 in the embodiment shown in FIG. 2 of the present application.
- step S201 includes the following steps:
- Step S2011 acquiring the textured 3D model file.
- the textured 3D model file includes a texture library of the textured 3D model, vertex coordinates of each vertex, and a polygon index.
- the format of the textured 3D model file can be OBJ (Wavefront object format), PLY (Polygon File Format), AMF (Additive Manufacturing File Format), or other formats.
- the textured 3D model file can be automatically obtained through a preset interface of the modeling software.
- the modeling software can be installed on the modeling platform of the 3D printing system, or the user can manually upload the textured 3D model file to the 3D printing system or a 3D printer.
- Step S2012 Determine the spatial coordinates of each vertex of each of the polygon patches according to the vertex coordinates and the polygon index.
- the vertex coordinates may include the spatial coordinates of the vertex, and may also include the texture coordinates of the vertex.
- Space coordinates are used to describe the position information of each vertex in the textured 3D model.
- Texture coordinates are used to describe the position information of each vertex in the texture image of the textured 3D model.
- the polygon index is used to index the vertices corresponding to each polygon patch, and the spatial coordinates and texture coordinates of the vertices of each polygon can be obtained according to the polygon index.
- Step S2013 traverse the texture library according to the polygon index to obtain the texture information of each vertex of each of the polygon patches.
- the texture library includes texture files or texture information corresponding to each vertex of each polygon facet.
- the texture library can be traversed according to the spatial coordinates of each vertex based on the polygon index, so as to find texture information matching the spatial coordinates of the vertexes.
- the texture library can be searched based on the polygon index to determine whether there is a texture file corresponding to the current polygon patch; if not, the color value of each vertex of the polygon patch is set to white; if so, the color value of each vertex of the polygonal patch is determined based on the texture file.
- the texture file is a file including texture information describing each vertex.
- the texture file corresponding to each vertex can be determined based on the spatial coordinates or vertex coordinates of the vertex, and the above-mentioned polygon index.
- when the vertex coordinates of a vertex include vertex texture coordinates, the corresponding texture image is extracted from the texture file directly based on those vertex texture coordinates, and the texture information of the vertex is determined from the texture image; if the vertex coordinates do not include vertex texture coordinates, the texture information of that point is not specified in the textured 3D model, and default texture information is used as the texture information of the vertex, that is, the diffuse light color value in the corresponding texture file is used as the texture information of the vertex.
- the mapping relationship among the spatial coordinates, vertex texture coordinates and color value of each vertex is determined, so that in subsequent steps the color value of each point can be quickly located from the spatial coordinates of the vertex, improving the speed of generating full-color image data and thereby the printing speed.
- the textured 3D model file is parsed, including:
- extract texture information: if the keyword is mtllib, open the texture file with the suffix .mtl and read its content line by line. If the keyword at the beginning of a line is the first keyword, the texture image following the first keyword is loaded into memory to determine the texture information based on the texture image; if the keyword at the beginning of the line is the second keyword, the diffuse light color value after the second keyword is loaded into memory to determine the color value of the corresponding vertex, namely the diffuse light color value.
- if the keyword is v, the line represents the spatial coordinates of a vertex, so the three floating-point numbers after the keyword are taken as the X, Y and Z values of the vertex's spatial coordinates and stored in the spatial coordinate container.
- if the keyword is f, the line represents a polygon patch, so the multiple v/vt pairs after the keyword are parsed into the spatial coordinate index and texture coordinate index of each vertex constituting the polygon patch; the spatial coordinates of each vertex are determined from the spatial coordinate index, and the vertex texture coordinates from the texture coordinate index.
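The keyword handling described in the preceding bullets can be illustrated with a minimal OBJ reader. This is a hedged sketch, not a complete parser: it handles only the `v`, `f` and `mtllib` records, assumes `v/vt` index pairs on faces, and converts OBJ's 1-based indices to 0-based; the function name and return shape are illustrative.

```python
def parse_obj(lines):
    """Collect vertex space coordinates from 'v' lines, per-face
    (vertex index, texture index) pairs from 'f' lines, and the
    referenced .mtl texture file name from the 'mtllib' line."""
    vertices, faces = [], []
    mtllib = None
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'v':            # spatial coordinates of a vertex
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == 'f':          # polygon patch: v/vt index pairs
            face = []
            for pair in parts[1:]:
                idx = pair.split('/')
                v_idx = int(idx[0]) - 1          # OBJ indices are 1-based
                vt_idx = int(idx[1]) - 1 if len(idx) > 1 and idx[1] else None
                face.append((v_idx, vt_idx))
            faces.append(face)
        elif parts[0] == 'mtllib':     # texture (.mtl) file reference
            mtllib = parts[1]
    return vertices, faces, mtllib
```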
- the method further includes: sampling each of the polygonal patches to obtain a point cloud information set, wherein the point cloud information set includes the spatial coordinates and texture information of each sampling point.
- generating full-color image data for coloring the filled contour area according to the texture information includes: determining the color value of each point in the filled contour area according to the point cloud information set and the spatial coordinates of the corresponding points in the filled contour area; and generating, from the color value of each point in the filled contour area, the full-color image data for coloring the filled contour area.
- the point cloud information set includes spatial coordinates and texture information of each sampling point.
- each polygon patch can be divided and sampled, including sampling of each edge of the polygon patch and sampling of its interior; the point cloud sampling algorithms for the edges and for the interior can differ.
- FIG. 6 is a schematic diagram of polygon patch sampling in the embodiment shown in FIG. 3 of the application.
- the spatial coordinates of the vertices P0, P1 and P2 of the triangular patch P are obtained through the parsing process above; a preset length interval, such as 0.05 mm, 0.03 mm or another value, is then used to sample the edges of the triangular patch with a linear interpolation algorithm, so as to obtain the spatial coordinates of each edge sampling point.
- for the interior, the spatial coordinates of each sampling point are computed as follows: find the smallest rectangle that can contain the triangular patch, divide the rectangle according to the sampling length interval, judge whether each grid point lies inside or on the triangular patch, and for each such point calculate its spatial coordinates.
- the color value of each sampling point is then obtained, determined from its spatial coordinates and the texture library. If the texture library does not include a texture file for the polygon patch, or the vertex coordinates of the polygon patch do not include vertex texture coordinates, the color value of each sampling point in the patch is the same as the color value of the patch's vertices; if a texture file for the polygon patch exists in the texture library and the patch's vertices include vertex texture coordinates, the vertex texture coordinates can be used to determine the texture coordinates of the sampling points, so that, based on those texture coordinates, the corresponding color value is determined from the texture image as the texture information of each sampling point.
- the texture coordinates of a sampling point p are calculated as follows: first calculate the area S of the triangular patch P = P0P1P2; then calculate the barycentric components of the sampling point p, i.e. the normalized sub-triangle areas S0 = area(p, P1, P2)/S, S1 = area(P0, p, P2)/S and S2 = area(P0, P1, p)/S. The spatial coordinates of the sampling point p are then (X0·S0 + X1·S1 + X2·S2, Y0·S0 + Y1·S1 + Y2·S2, Z0·S0 + Z1·S1 + Z2·S2), and the sampling point texture coordinates are (U0·S0 + U1·S1 + U2·S2, V0·S0 + V1·S1 + V2·S2).
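A small Python sketch of this barycentric interpolation (illustrative only; it works on 2D projected coordinates and assumes the sampling point lies inside the patch):

```python
def tri_area(a, b, c):
    """Area of triangle abc from 2D coordinates via the cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def sample_texture_coords(p, verts, uvs):
    """Barycentric interpolation as described above: the weights S0, S1,
    S2 are the sub-triangle areas opposite each vertex, normalized by
    the full patch area; the UV of sampling point p is the weighted sum."""
    p0, p1, p2 = verts
    s = tri_area(p0, p1, p2)
    s0 = tri_area(p, p1, p2) / s
    s1 = tri_area(p0, p, p2) / s
    s2 = tri_area(p0, p1, p) / s
    u = uvs[0][0] * s0 + uvs[1][0] * s1 + uvs[2][0] * s2
    v = uvs[0][1] * s0 + uvs[1][1] * s1 + uvs[2][1] * s2
    return (u, v)
```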
- the method further includes: for each polygonal patch, determining each adjacent polygonal patch, where an adjacent polygonal patch shares one edge with the polygonal patch; a polygon topology structure is then generated according to each polygonal patch and its corresponding adjacent polygonal patches.
- slicing each of the polygonal patches based on the slice plane to obtain at least one outer contour line includes: slicing each of the polygonal patches with the slice plane to obtain the intersection points of the slice plane and each of the polygonal patches; and then determining, according to the polygon topology, the connection sequence of the intersection points, so as to obtain at least one outer contour line corresponding to the slice plane.
- the intersection of an initial polygonal patch with a given slice plane can be obtained first, and the intersections of each polygonal patch with the slice plane are then connected in turn in the order of adjacent polygonal patches. If the textured 3D model is manifold, the intersections can be chained directly; if the model has holes and other structures, the intersection points of the polygonal patches need to be connected end to end to obtain each outer contour line corresponding to the slice plane, as shown in FIG. 4.
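The end-to-end connection of intersection points can be sketched as a segment-chaining routine. This illustration matches endpoints exactly rather than walking the polygon adjacency topology, so it is a simplification of the method described above; real inputs would need a floating-point tolerance or the topology-driven ordering.

```python
def chain_segments(segments):
    """Chain unordered intersection segments end to end into closed loops.

    Each segment is a pair of points (tuples). Exact endpoint equality
    is used, so coordinates must be produced consistently.
    """
    segs = [list(s) for s in segments]
    loops = []
    while segs:
        start, nxt = segs.pop(0)
        loop = [start, nxt]
        while loop[-1] != loop[0]:
            for i, (p, q) in enumerate(segs):
                if p == loop[-1]:
                    loop.append(q)
                    segs.pop(i)
                    break
                if q == loop[-1]:
                    loop.append(p)
                    segs.pop(i)
                    break
            else:
                break  # open chain: non-manifold input, stop this loop
        loops.append(loop)
    return loops
```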
- FIG. 7 is a flowchart of a method for slicing a textured 3D model provided by another embodiment of the present application.
- the method for slicing a textured 3D model provided by this embodiment is a further refinement of the foregoing embodiment; as shown in FIG. 7, the slicing method of the textured 3D model provided by this embodiment includes the following steps:
- Step S701 parsing the textured 3D model file to obtain each polygonal patch used to represent the textured 3D model.
- Step S702 group and sort each of the polygonal patches according to their spatial coordinates in the layering direction, to obtain a grouping relationship matrix.
- the grouping relationship matrix includes each group of the polygon patches corresponding to each of the slice planes.
- the layering direction is usually the Z-axis direction, and the X-axis or Y-axis direction can also be used.
- the polygon patches intersecting with the slice plane are grouped and sorted to obtain the grouping of polygon patches corresponding to the slice plane.
- the sequence number of each polygonal patch within its corresponding group can be determined from the value of its spatial coordinate in the layering direction: the smaller the coordinate value, the smaller the sequence number.
- the above-mentioned grouping relationship matrix can be established with the layer number of the slice plane and the sequence number of the group corresponding to the slice plane as indexes.
- Each polygon patch in the grouping corresponding to each slice plane is described in the grouping relation matrix.
- Table 1 is the grouping relationship matrix table in the embodiment shown in FIG. 7 of the present application.
- in the grouping relationship matrix of Table 1, the groups of polygon patches corresponding to the slice plane of each layer are described, and each group of polygonal patches is sorted in ascending order of the spatial coordinate value in the slice direction, that is, the Z value of polygonal patch f_ia is less than the Z value of f_ib, where a is less than b.
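A minimal sketch of building this grouping relationship (illustrative; the function name and parameters are assumptions): for each slice plane, collect the patches whose Z range spans the plane and sort them by their minimum Z value, mirroring the ascending order described for Table 1.

```python
def group_patches_by_layer(patches, z0, dz, n_layers):
    """For each slice plane Z = z0 + k*dz, collect the indices of the
    triangular patches whose Z range spans that plane, sorted in
    ascending order of the patch's minimum Z value."""
    groups = []
    for k in range(n_layers):
        z = z0 + k * dz
        idx = [i for i, tri in enumerate(patches)
               if min(v[2] for v in tri) <= z <= max(v[2] for v in tri)]
        idx.sort(key=lambda i: min(v[2] for v in patches[i]))
        groups.append(idx)
    return groups
```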
- Step S703 Perform an intersection operation between each group of the polygonal patches and the corresponding slice plane to obtain each intersection point.
- Step S704 connecting each of the intersection points end to end to obtain at least one of the outer contour lines corresponding to the slice plane.
- Step S705 based on the set texture thickness, shrink each of the outer contour lines to obtain inner contour lines.
- the set texture thickness may be set by the user, or a default value may be adopted, and the value of the set texture thickness is not limited in this application.
- each outer contour line is shrunk inward by the set texture thickness, yielding an inner contour line corresponding to each outer contour line.
- Step S706 Perform an exclusive OR operation on each of the outer contour lines and their corresponding inner contour lines to obtain each of the contour regions.
- the contour area is bounded by a paired inner contour line and outer contour line, and can be obtained by performing an XOR operation on the area corresponding to the outer contour line and the area corresponding to the inner contour line.
- FIG. 8 is a schematic diagram of the contour area in the embodiment shown in FIG. 7 of the application.
- the outer contour line corresponding to the intersection of the polygonal patches and the slice plane is C1; shrinking the outer contour line C1 inward yields the inner contour line C2, and the contour area A1 is the area between the outer contour line C1 and the inner contour line C2.
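On a raster, the contour area between C1 and C2 reduces to an XOR of the two filled regions. The following sketch assumes boolean masks of equal size; a production slicer would perform the equivalent operation with polygon clipping rather than pixels.

```python
def contour_region_xor(outer_mask, inner_mask):
    """Region between the outer contour C1 and the shrunken inner
    contour C2, computed as the per-pixel XOR of the two filled
    regions (True where exactly one of the two masks is set)."""
    return [[o != i for o, i in zip(orow, irow)]
            for orow, irow in zip(outer_mask, inner_mask)]
```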
- the method further includes: for each outer contour line, based on the set texture thickness and the set slice thickness, obtaining the associated outer contour lines corresponding to a preset number of slice planes above and below the slice plane corresponding to the outer contour line.
- performing an XOR operation on each of the outer contour lines and its corresponding inner contour line to obtain each of the contour regions includes: performing an XNOR operation on the inner contour line and each of the associated outer contour lines respectively, and then performing an XOR operation with the corresponding outer contour line to obtain each of the contour regions.
- the preset number may be the number of slice planes corresponding to the set texture thickness, and may be a positive integer corresponding to the ratio of the set texture thickness to the slice thickness.
- an associated outer contour line is the outer contour line of another slice plane near the current outer contour line that lies within the same set texture thickness.
- the inner contour line is combined with each associated outer contour line by an XNOR operation, correcting the inner contour line into a new inner contour line whose inward shrinkage may vary; an XOR operation is then performed between this new inner contour line and the outer contour line to obtain the contour area.
- since the textured 3D model is formed by stacking multiple slice layers, when the texture thickness is insufficient or the contour ranges of the upper and lower slices differ too much, the interior of the object is easily exposed, that is, the texture pattern is incomplete.
- the outer contour lines of adjacent slice planes, that is, the associated outer contour lines, are therefore used to adjust the inner contour line of the current layer's slice plane, so as to obtain a contour area with variable texture thickness.
- FIG. 9 is a schematic diagram of the contour area in the embodiment shown in FIG. 7 of the present application. As can be seen from FIG. 8 and FIG. 9, in this embodiment, in addition to the inner contour line C2 and the outer contour line C1 of FIG. 8, there is an associated outer contour line C3, which is the outer contour line where the slice plane of the layer above or below the slice plane of C1 intersects the polygonal patches. After an XNOR operation between the area corresponding to the inner contour line C2 and the area corresponding to the associated outer contour line C3, a new inner contour line is obtained, and from it a new contour area A2. Compared with the contour area A1 in FIG. 8, the contour area A2 includes the additional contour area A3.
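The XNOR-then-XOR composition can likewise be sketched on raster masks. This is an illustrative reading of the embodiment, with the region semantics simplified to per-pixel boolean operations; the function name and mask representation are assumptions.

```python
def variable_thickness_region(outer, inner, associated):
    """Correct the inner region by XNOR with each associated outer
    region, then XOR the corrected inner region with the current
    outer region to obtain the variable-thickness contour area."""
    new_inner = inner
    for assoc in associated:
        # XNOR: a pixel stays set only where the two masks agree
        new_inner = [[a == b for a, b in zip(r1, r2)]
                     for r1, r2 in zip(new_inner, assoc)]
    return [[o != i for o, i in zip(orow, irow)]
            for orow, irow in zip(outer, new_inner)]
```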
- Step S707 filling the contour area based on an inverse color filling algorithm.
- Step S708 Map the filled contour area to the picture according to the resolution of the picture.
- the picture is used for printing the textured 3D model, and may be a picture provided by the 3D printing system that can accommodate the maximum outer contour of the textured 3D model.
- each contour region is mapped to the picture based on the resolution of the picture.
- the size and resolution of the picture can be determined according to the hardware parameters of the three-dimensional printing system.
- the hardware parameters include parameters such as the size of the printable area, the resolution of the three-dimensional printer, and the arrangement and placement of the inkjet head and the print head.
- the contour area may be mapped to the picture based on the position of the polygon patch corresponding to the contour area and the resolution of the picture.
- Step S709 Obtain the pixel coordinates of each pixel of the filled contour region in the picture.
- the pixel coordinates are the positions, in the picture, of the pixels corresponding to each point of the contour area.
- Step S710 Determine the space coordinates of each point in the filled outline area according to the pixel coordinates corresponding to each point in the filled outline area.
- according to the pixel value or color value of each pixel, it can be determined whether the pixel corresponds to a point in the contour area.
- the spatial coordinates of the contour-area point corresponding to a pixel can then be determined according to the pixel coordinates of the pixel and the resolution of the picture.
- Step S711 according to the spatial coordinates of each point in the filled contour area and the texture information, determine the color value of each pixel of the filled contour area in the picture.
- for each pixel, a target point with matching spatial coordinates is located, and the texture information or color value of the target point is set as the color value of the pixel.
- the point cloud information set can be searched to find the color value of each pixel.
- otherwise, the color value of the pixel may be set to the color value corresponding to white.
- Step S712 Generate full-color image data for coloring the filled contour area according to the color value of each pixel of the filled contour area in the picture.
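The pixel-to-space conversion of steps S708 to S710 can be sketched as a simple scaling by the printer resolution. The dots-per-millimetre parameters and the origin offset are assumptions for illustration; a real mapping would also account for the head arrangement parameters mentioned above.

```python
def pixel_to_space(px, py, z, dots_per_mm_x, dots_per_mm_y, origin=(0.0, 0.0)):
    """Map a pixel (px, py) of the slice picture at slice height z back
    to the space coordinates of the corresponding contour-area point."""
    x = origin[0] + px / dots_per_mm_x  # millimetres along X
    y = origin[1] + py / dots_per_mm_y  # millimetres along Y
    return (x, y, z)
```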
- each polygonal patch is sliced by the slice plane to obtain the outer contour lines, and each outer contour line is shrunk to obtain its corresponding inner contour line; the inner contour line can also be corrected based on the outer contour lines of multiple adjacent slice planes, so as to realize a contour area with variable texture thickness.
- each contour area can be colored to obtain a full-color textured 3D model, which realizes the color and automatic printing of the textured 3D model, with high printing efficiency and high accuracy.
- FIG. 10 is a flowchart of a method for printing a textured 3D model provided by an embodiment of the present application.
- the method for printing a textured 3D model provided by this embodiment is applicable to a 3D printer.
- the printing method includes the following steps:
- Step S1001 acquiring slice image data.
- the sliced image data includes full-color image data generated according to the method for slicing a textured 3D model provided in any of the embodiments corresponding to FIG. 2, FIG. 5, and FIG. 7 of the present application.
- Step S1002 generating print control data according to the slice image data, so as to print the textured 3D model based on the print control data.
- the method for printing a textured 3D model provided by the embodiments of the present application realizes full-color and automatic printing of a textured 3D model, with high printing efficiency and high accuracy.
- FIG. 11 is a schematic structural diagram of a slicing device for a textured 3D model provided by an embodiment of the present application.
- the slicing device for a textured 3D model provided by this embodiment includes: a parsing module 1110, a contour line obtaining module 1120, a contour filling module 1130 and a full-color picture generation module 1140.
- the parsing module 1110 is configured to parse the textured 3D model file to obtain each polygonal patch used to represent the textured 3D model, wherein at least one of the polygonal patches includes texture information of the textured 3D model;
- the contour line obtaining module 1120 is used for slicing each of the polygonal patches with the slice plane to obtain at least one outer contour line corresponding to the slice plane;
- the contour filling module 1130 is used for obtaining a contour area according to the at least one outer contour line, and filling the contour area;
- the full-color picture generation module 1140 is configured to generate, according to the texture information, full-color picture data for coloring the filled contour area.
- the contour filling module 1130 includes:
- a contour area obtaining unit used for obtaining a contour area according to the at least one outer contour line; a filling unit, used for filling the contour area.
- the contour area acquisition unit includes:
- a shrinking subunit, used for shrinking each of the outer contour lines based on the set texture thickness to obtain inner contour lines; and a contour area acquiring subunit, used for performing an XOR operation on each of the outer contour lines and its corresponding inner contour line to obtain each of the contour regions.
- the device further includes:
- the associated contour acquisition module is used for, after shrinking each of the outer contour lines to obtain the inner contour line based on the set texture thickness, for each outer contour line, based on the set texture thickness and the set slice thickness , and acquire the associated outer contour lines corresponding to a preset number of slice planes above and below the slice plane corresponding to the outer contour line.
- the contour area acquiring subunit is specifically used for:
- the full-color image generation module 1140 is specifically used for:
- map the filled contour area into the picture according to the resolution of the picture; obtain the pixel coordinates of each pixel of the filled contour area in the picture; determine the spatial coordinates of each point in the filled contour area according to the pixel coordinates corresponding to each point; determine the color value of each pixel of the filled contour area in the picture according to the spatial coordinates of each point in the filled contour area and the texture information; and generate, according to the color value of each pixel of the filled contour area in the picture, full-color picture data for coloring the filled contour area.
- the parsing module 1110 includes:
- a file acquisition unit, configured to acquire the textured 3D model file, wherein the textured 3D model file includes a texture library of the textured 3D model, vertex coordinates of each vertex, and a polygon index; a spatial coordinate determination unit, configured to determine the spatial coordinates of each vertex of each of the polygonal patches according to the vertex coordinates and the polygon index; and a texture information determination unit, used for traversing the texture library according to the polygon index to obtain the texture information of each vertex of each of the polygonal patches.
- the texture information determination unit is specifically used for:
- the contour line obtaining module 1120 includes:
- an intersection acquisition unit, used to perform an intersection operation between each of the polygonal patches and the slice plane to obtain each intersection point; and a contour line acquisition unit, used to connect each of the intersection points end to end to obtain at least one of the outer contour lines.
- the device further includes:
- a grouping and sorting module, used to group and sort each of the polygonal patches according to their spatial coordinates in the layering direction, before the intersection operation between the polygonal patches and the slice plane is performed to obtain each intersection point, so as to obtain a grouping relationship matrix, wherein the grouping relationship matrix includes each group of the polygon patches corresponding to each of the slice planes.
- the intersection acquisition unit is specifically used for:
- Each group of the polygon patches is intersected with the corresponding slice plane to obtain each intersection point.
- the device further includes:
- the point cloud information set acquisition module is used for, after each polygon patch used to represent the textured 3D model is obtained, sampling each polygon patch to obtain a point cloud information set, wherein the point cloud information set includes the spatial coordinates and texture information of each sampling point.
- the full-color image generation module 1140 is specifically used for:
- the device further includes:
- a topology generation module, configured to, after each polygonal patch used to represent the textured 3D model is obtained, determine for each polygonal patch its adjacent polygonal patches, wherein an adjacent polygonal patch shares one edge with the polygonal patch; and to generate a polygon topology structure according to each polygonal patch and its corresponding adjacent polygonal patches.
- the contour line obtaining module 1120 is specifically used for:
- the filling unit is specifically used for:
- the contour area is filled based on an inverse color filling algorithm.
- the slicing device for a textured 3D model provided by this embodiment of the present application can execute the slicing method for a textured 3D model provided by any of the embodiments corresponding to FIG. 2, FIG. 5, and FIG. 7 of the present application, and has the functional modules and beneficial effects corresponding to the executed method.
- FIG. 12 is a schematic structural diagram of a three-dimensional printing device provided by an embodiment of the present application. As shown in FIG. 12 , the three-dimensional printing device includes: a memory 1210 , a processor 1220 and a computer program.
- the computer program is stored in the memory 1210 and is configured to be executed by the processor 1220 to implement the slicing method for a textured 3D model provided by any of the embodiments corresponding to FIG. 2, FIG. 5 and FIG. 7 of the present application, and/or the method for printing a textured 3D model provided by the embodiment shown in FIG. 10.
- the memory 1210 and the processor 1220 are connected through a bus 1230 .
- An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; the computer program is executed by a processor to implement the slicing method for a textured 3D model provided by any of the embodiments corresponding to FIG. 2, FIG. 5, and FIG. 7 of the present application, and/or the method for printing a textured 3D model provided by the embodiment shown in FIG. 10.
- the computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
- An embodiment of the present application provides a computer program product, including a computer program.
- the computer program is executed by a processor of a 3D printing device to control a slicing device for a textured 3D model to implement the slicing method for a textured 3D model provided by any of the embodiments corresponding to FIG. 2, FIG. 5 and FIG. 7 of the present application, and/or the method for printing a textured 3D model provided by the embodiment shown in FIG. 10.
- the processor may be an integrated circuit chip with signal processing capability.
- the above-mentioned processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, referred to as: CPU), a network processor (Network Processor, referred to as: NP), and the like.
- the methods, steps, and logic block diagrams disclosed in the embodiments of this application can be implemented or executed.
- a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
- the disclosed apparatus and method may be implemented in other manners.
- the device embodiments described above are only illustrative.
- the division of modules is only a logical function division; in actual implementation there may be other division methods, for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or modules, and may be in electrical, mechanical or other forms.
Landscapes
- Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Materials Engineering (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Mechanical Engineering (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Architecture (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
Description
Claims (15)
- A slicing method for a textured three-dimensional model, characterized in that the method comprises: parsing a textured 3D model file to obtain polygon patches representing the textured 3D model, wherein at least one of the polygon patches includes texture information of the textured 3D model; slicing the polygon patches with a slicing plane to obtain at least one outer contour corresponding to the slicing plane; obtaining a contour region according to the at least one outer contour, and filling the contour region; and generating, according to the texture information, full-color image data for coloring the filled contour region.
- The method according to claim 1, wherein obtaining a contour region according to the at least one outer contour comprises: shrinking each outer contour based on a set texture thickness to obtain an inner contour; and performing an XOR operation on each outer contour and its corresponding inner contour to obtain each contour region.
- The method according to claim 2, wherein after shrinking each outer contour based on the set texture thickness to obtain the inner contour, the method further comprises: for each outer contour, obtaining, based on the set texture thickness and a set layer thickness, the associated outer contours corresponding to a preset number of slicing planes above and below the slicing plane of that outer contour; correspondingly, performing an XOR operation on each outer contour and its corresponding inner contour to obtain each contour region comprises: performing an XNOR operation on the inner contour with each associated outer contour, and then performing an XOR operation with the corresponding outer contour, to obtain each contour region.
- The method according to claim 1, wherein generating, according to the texture information, full-color image data for coloring the filled contour region comprises: mapping the filled contour region into an image according to the resolution of the image; obtaining the pixel coordinates of each pixel of the filled contour region in the image; determining the spatial coordinates of each point of the filled contour region according to the pixel coordinates corresponding to that point; determining the color value of each pixel of the filled contour region in the image according to the spatial coordinates of each point in the filled contour region and the texture information; and generating, according to the color values of the pixels of the filled contour region in the image, the full-color image data for coloring the filled contour region.
- The method according to claim 1, wherein parsing the textured 3D model file comprises: obtaining the textured 3D model file, wherein the textured 3D model file includes a texture library of the textured 3D model, the vertex coordinates of each vertex, and a polygon index; determining the spatial coordinates of each vertex of each polygon patch according to the vertex coordinates and the polygon index; and traversing the texture library according to the polygon index to obtain the texture information of each vertex of each polygon patch.
- The method according to claim 5, wherein traversing the texture library according to the polygon index to obtain the texture information of each vertex of each polygon patch comprises: determining the texture file corresponding to each polygon patch according to the polygon index; for each vertex of each polygon patch, when the vertex coordinates of the vertex include vertex texture coordinates, obtaining a texture image from the texture file, and determining the texture information of the vertex from the texture image according to the vertex texture coordinates of the vertex; and/or, for each vertex of each polygon patch, when the vertex coordinates of the vertex do not include vertex texture coordinates, obtaining a diffuse color value from the texture file, and determining the texture information of the vertex to be that diffuse color value.
- The method according to any one of claims 1-6, wherein slicing the polygon patches with a slicing plane to obtain at least one outer contour corresponding to the slicing plane comprises: performing an intersection operation between each polygon patch and the slicing plane to obtain intersection points; and connecting the intersection points end to end to obtain the at least one outer contour corresponding to the slicing plane.
- The method according to claim 7, wherein before performing the intersection operation between the polygon patches and the slicing plane to obtain the intersection points, the method further comprises: grouping and sorting the polygon patches according to their spatial coordinates along the layering direction to obtain a grouping relation matrix, wherein the grouping relation matrix includes the group of polygon patches corresponding to each slicing plane; correspondingly, performing the intersection operation between the polygon patches and the slicing plane to obtain the intersection points comprises: performing the intersection operation between each group of polygon patches and its corresponding slicing plane to obtain the intersection points.
- The method according to any one of claims 1-6, wherein after obtaining the polygon patches representing the textured 3D model, the method further comprises: sampling the polygon patches to obtain a point cloud information set, wherein the point cloud information set includes the spatial coordinates and texture information of each sampling point; correspondingly, generating, according to the texture information, full-color image data for coloring the filled contour region comprises: determining the color value of each point in the filled contour region according to the point cloud information set and the spatial coordinates corresponding to each point in the filled contour region; and generating, according to the color values of the points in the filled contour region, the full-color image data for coloring the filled contour region.
- The method according to any one of claims 1-6, wherein after obtaining the polygon patches representing the textured 3D model, the method further comprises: for each polygon patch, determining the adjacent polygon patches of that polygon patch, wherein an adjacent polygon patch shares one edge with the polygon patch; and generating a polygon topology structure according to each polygon patch and its adjacent polygon patches; correspondingly, slicing the polygon patches with a slicing plane to obtain at least one outer contour corresponding to the slicing plane comprises: slicing the polygon patches with the slicing plane to obtain the intersection points of the slicing plane with each polygon patch; and determining the connection order of the intersection points according to the polygon topology structure, to obtain the at least one outer contour corresponding to the slicing plane.
- The method according to any one of claims 1-6, wherein filling the contour region comprises: filling the contour region based on an inverse-color fill algorithm.
- A printing method for a textured three-dimensional model, characterized in that the method comprises: obtaining slice image data, wherein the slice image data includes the full-color image data generated by the slicing method for a textured 3D model according to any one of claims 1-11; and generating print control data according to the slice image data, so that the textured 3D model is printed based on the print control data.
- A slicing device for a textured three-dimensional model, characterized in that the device comprises: a parsing module, configured to parse a textured 3D model file to obtain polygon patches representing the textured 3D model, wherein at least one of the polygon patches includes texture information of the textured 3D model; a contour obtaining module, configured to slice the polygon patches with a slicing plane to obtain at least one outer contour corresponding to the slicing plane; a contour filling module, configured to obtain a contour region according to the at least one outer contour and fill the contour region; and a full-color image generating module, configured to generate, according to the texture information, full-color image data for coloring the filled contour region.
- A three-dimensional printing device, characterized by comprising a memory and at least one processor, the memory storing computer-executable instructions; the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the slicing method for a textured 3D model according to any one of claims 1-11, and/or the printing method for a textured 3D model according to claim 12.
- A computer-readable storage medium, characterized in that computer-executable instructions are stored in the computer-readable storage medium, and the computer-executable instructions, when executed by a processor, implement the slicing method for a textured 3D model according to any one of claims 1-11, and/or the printing method for a textured 3D model according to claim 12.
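The shrink-and-XOR step of claim 2 can be sketched with a minimal raster model: contour regions become sets of pixels, contour shrinking by the set texture thickness is approximated by morphological erosion, and the XOR is a set symmetric difference. The `erode`/`texture_shell` names and the raster framing are illustrative assumptions, not the claimed implementation.

```python
def erode(region, thickness):
    """Erode a set of (x, y) cells by `thickness` 4-neighbour steps."""
    for _ in range(thickness):
        region = {
            (x, y) for (x, y) in region
            if all(n in region for n in
                   ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)))
        }
    return region

def texture_shell(outer_region, thickness):
    """Ring of cells within `thickness` of the boundary: outer XOR inner."""
    inner_region = erode(outer_region, thickness)
    return outer_region ^ inner_region  # set symmetric difference

# A 6x6 filled square; a thickness-1 shell keeps only the border cells.
outer = {(x, y) for x in range(6) for y in range(6)}
shell = texture_shell(outer, 1)
```

The resulting shell is the thin colored band near the surface, so only this region needs full-color data while the interior can be filled with plain material.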
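The pixel-to-space mapping of claim 4 can be sketched as follows: the filled contour region is rasterised at a given print resolution, each pixel coordinate converts to a spatial coordinate by linear scaling, and a color is then looked up for that spatial point. The linear scaling and the `texture_lookup` callback are illustrative assumptions.

```python
def pixel_to_space(px, py, dpi_scale, layer_z):
    """Convert pixel coordinates to spatial coordinates at one layer height."""
    return (px * dpi_scale, py * dpi_scale, layer_z)

def render_pixels(pixels, dpi_scale, layer_z, texture_lookup):
    """Color each (px, py) pixel of the region from the texture information."""
    return {
        (px, py): texture_lookup(pixel_to_space(px, py, dpi_scale, layer_z))
        for (px, py) in pixels
    }

# Toy texture: color depends on which half of the part a point lies in.
lookup = lambda p: (255, 0, 0) if p[0] < 1.0 else (0, 0, 255)
image = render_pixels([(0, 0), (50, 0)], 0.04, 0.2, lookup)
```

With a 0.04 mm pixel pitch, pixel (50, 0) maps to the spatial point (2.0, 0.0, 0.2) and is colored from the texture at that location.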
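The per-vertex texture resolution of claim 6 can be sketched as a two-branch lookup: when a vertex carries UV texture coordinates, a texel is sampled (nearest neighbour here) from the patch's texture image; otherwise the texture file's diffuse color is used as a fallback. The nearest-neighbour sampling and the `vertex_color` name are illustrative assumptions.

```python
def vertex_color(vertex_uv, texture_image, diffuse_color):
    """Sample the texture at (u, v), or fall back to the diffuse color."""
    if vertex_uv is None:
        return diffuse_color          # vertex has no texture coordinates
    u, v = vertex_uv
    h, w = len(texture_image), len(texture_image[0])
    # clamp-and-round nearest-texel lookup
    x = min(w - 1, max(0, round(u * (w - 1))))
    y = min(h - 1, max(0, round(v * (h - 1))))
    return texture_image[y][x]

texture = [[(0, 0, 0), (255, 255, 255)],
           [(255, 0, 0), (0, 255, 0)]]   # 2x2 texture image
```

In an OBJ-style model file (an assumed framing), the UV branch would correspond to `vt` entries and the fallback to the material's diffuse (`Kd`) color.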
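The intersection operation of claim 7 can be sketched for a single triangular patch and a horizontal slicing plane z = h: each edge that crosses the plane contributes one linearly interpolated point, a crossed triangle yields a segment, and chaining such segments end to end produces the outer contours. Degenerate touches (a vertex exactly on the plane) are ignored in this simplified sketch.

```python
def intersect_triangle(tri, h):
    """Intersection points of triangle `tri` with the plane z = h
    (only edges strictly straddling the plane are considered)."""
    points = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - h) * (z2 - h) < 0:          # edge crosses the plane
            t = (h - z1) / (z2 - z1)         # interpolation factor along edge
            points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return points

tri = ((0.0, 0.0, 0.0), (2.0, 0.0, 2.0), (0.0, 2.0, 2.0))
seg = intersect_triangle(tri, 1.0)
```

For this triangle the plane z = 1 cuts two edges, giving the segment from (1, 0) to (0, 1) in the slice.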
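The grouping relation matrix of claim 8 can be sketched as per-plane buckets: patches are binned by their z-extent so that each slicing plane only intersects the patches that can actually cross it, instead of testing every patch against every plane. Modelling the matrix as a list of index buckets is an illustrative simplification.

```python
def group_patches(patches, layer_height, num_layers):
    """Bucket triangle indices by the slicing planes their z-range spans."""
    groups = [[] for _ in range(num_layers)]
    for idx, tri in enumerate(patches):
        zs = [v[2] for v in tri]
        lo = max(0, int(min(zs) / layer_height))            # first plane touched
        hi = min(num_layers - 1, int(max(zs) / layer_height))  # last plane touched
        for layer in range(lo, hi + 1):
            groups[layer].append(idx)
    return groups

patches = [
    ((0.0, 0.0, 0.0), (1.0, 0.0, 0.5), (0.0, 1.0, 1.0)),   # spans layers 0-1
    ((0.0, 0.0, 1.5), (1.0, 0.0, 2.0), (0.0, 1.0, 2.5)),   # spans layers 1-2
]
groups = group_patches(patches, 1.0, 3)
```

Each slicing plane then intersects only its own group, which keeps the per-layer work proportional to the patches near that layer.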
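The point-cloud coloring of claim 9 can be sketched in two parts: patches are sampled (barycentrically here) into a point cloud carrying position and color, and each point of a filled contour region is then colored from its nearest sample. A brute-force nearest-neighbour search stands in for whatever spatial index a real slicer would use; all names are illustrative.

```python
def sample_triangle(tri, color, n=3):
    """Barycentric samples of a colored triangle -> (position, color) pairs."""
    cloud = []
    a, b, c = tri
    for i in range(n + 1):
        for j in range(n + 1 - i):
            u, v = i / n, j / n
            w = 1.0 - u - v
            p = tuple(w * a[k] + u * b[k] + v * c[k] for k in range(3))
            cloud.append((p, color))
    return cloud

def color_at(point, cloud):
    """Color of the nearest sample point (brute force)."""
    def dist2(p, q):
        return sum((pi - qi) ** 2 for pi, qi in zip(p, q))
    return min(cloud, key=lambda s: dist2(s[0], point))[1]

cloud = sample_triangle(((0, 0, 0), (1, 0, 0), (0, 1, 0)), (255, 0, 0))
cloud += sample_triangle(((10, 0, 0), (11, 0, 0), (10, 1, 0)), (0, 0, 255))
```

A query point inside the contour region inherits the color of the nearest surface sample, which is how the slice image picks up the texture of the nearby model surface.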
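One common reading of the inverse-color fill of claim 11 is a scanline fill that inverts the inside/outside state at every contour crossing, i.e. even-odd parity filling; the claim's algorithm itself is not detailed here, so the sketch below is only an illustrative stand-in under that assumption.

```python
def fill_scanline(width, crossings):
    """Fill pixels between paired contour crossings on one scanline:
    each crossing inverts the inside/outside state (even-odd parity)."""
    row, inside = [0] * width, False
    xs = sorted(crossings)
    k = 0
    for x in range(width):
        while k < len(xs) and xs[k] == x:
            inside = not inside       # invert state at the contour boundary
            k += 1
        row[x] = 1 if inside else 0
    return row

# One scanline of width 10 whose contour crosses at x = 2 and x = 7.
row = fill_scanline(10, [2, 7])
```

Running this over every scanline of the slice fills the whole contour region without explicitly tracing its interior.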
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21938821.2A EP4331813A1 (en) | 2021-04-27 | 2021-08-30 | Slicing method, printing method, and device and apparatus for mapping three-dimensional model |
JP2023566695A JP2024515867A (ja) | 2021-04-27 | 2021-08-30 | マッピング三次元モデルのスライス方法、印刷方法、装置及びデバイス |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110462298.8A CN113183469B (zh) | 2021-04-27 | 2021-04-27 | 贴图三维模型的切片方法、打印方法、装置及设备 |
CN202110462298.8 | 2021-04-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022227357A1 (zh) | 2022-11-03 |
Family
ID=76979724
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/115458 WO2022227357A1 (zh) | 2021-04-27 | 2021-08-30 | 贴图三维模型的切片方法、打印方法、装置及设备 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4331813A1 (zh) |
JP (1) | JP2024515867A (zh) |
CN (1) | CN113183469B (zh) |
WO (1) | WO2022227357A1 (zh) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116372189A (zh) * | 2023-03-17 | 2023-07-04 | 南京航空航天大学 | 砂型增材制造多模型分割与图案填充打印方法 |
CN116543091A (zh) * | 2023-07-07 | 2023-08-04 | 长沙能川信息科技有限公司 | 输电线路的可视化方法、系统、计算机设备和存储介质 |
CN116824055A (zh) * | 2023-06-30 | 2023-09-29 | 深圳市魔数智擎人工智能有限公司 | 一种基于图片快速三维化交互式建模方法 |
CN117058299A (zh) * | 2023-08-21 | 2023-11-14 | 云创展汇科技(深圳)有限公司 | 基于射线检测模型内长方形长宽实现快速贴图的方法 |
CN117838306A (zh) * | 2024-02-01 | 2024-04-09 | 南京诺源医疗器械有限公司 | 基于成像仪的目标图像处理方法及系统 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113183469B (zh) * | 2021-04-27 | 2023-05-30 | 珠海赛纳三维科技有限公司 | 贴图三维模型的切片方法、打印方法、装置及设备 |
CN113715338B (zh) * | 2021-08-30 | 2023-08-18 | 深圳市纵维立方科技有限公司 | 三维模型的切片方法、打印方法及相关设备 |
CN113844034B (zh) * | 2021-09-30 | 2024-01-05 | 深圳市纵维立方科技有限公司 | 三维模型打孔处理方法、打印方法、相关设备和存储介质 |
CN114670450A (zh) * | 2022-04-07 | 2022-06-28 | 深圳拓竹科技有限公司 | 用于3d打印的方法、装置、设备、存储介质和程序产品 |
CN114834043B (zh) * | 2022-05-09 | 2023-09-05 | 华中科技大学鄂州工业技术研究院 | 一种激光三维加工模型切片数据处理方法 |
CN115592954B (zh) * | 2022-09-22 | 2024-05-17 | 哈尔滨工业大学 | 一种粘结剂喷射3d打印的切片生成方法 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ZA972205B (en) * | 1996-03-21 | 1997-09-17 | Real Time Geometry Corp | System and method for rapid shape digitizing and adaptive mesh generation. |
WO2015007770A1 (de) * | 2013-07-16 | 2015-01-22 | Schultheiss Gmbh | Verfahren und vorrichtung zum herstellen eines dreidimensionalen objekts sowie belichtungsmaskenerzeugungseinrichtung |
TW201838797A (zh) * | 2017-04-20 | 2018-11-01 | 三緯國際立體列印科技股份有限公司 | 彩色3d物件的著色輪廓內縮方法 |
WO2018197531A1 (de) * | 2017-04-28 | 2018-11-01 | Schmid Rhyner Ag | Vorrichtung und verfahren zum erzeugen einer texturierten beschichtung |
EP3599076A1 (en) * | 2018-07-25 | 2020-01-29 | OCE Holding B.V. | Method of printing with gloss control |
CN110757804A (zh) * | 2018-07-26 | 2020-02-07 | 中国科学院沈阳自动化研究所 | 一种基于纹理贴图三维模型的全彩色分层切片算法 |
CN111383351A (zh) * | 2018-12-29 | 2020-07-07 | 上海联泰科技股份有限公司 | 三维纹理贴图方法及装置、计算机可读存储介质 |
CN112102460A (zh) * | 2020-09-17 | 2020-12-18 | 上海复志信息技术有限公司 | 3d打印切片方法、装置、设备和存储介质 |
CN113183469A (zh) * | 2021-04-27 | 2021-07-30 | 珠海赛纳三维科技有限公司 | 贴图三维模型的切片方法、打印方法、装置及设备 |
- 2021
- 2021-04-27 CN CN202110462298.8A patent/CN113183469B/zh active Active
- 2021-08-30 EP EP21938821.2A patent/EP4331813A1/en active Pending
- 2021-08-30 JP JP2023566695A patent/JP2024515867A/ja active Pending
- 2021-08-30 WO PCT/CN2021/115458 patent/WO2022227357A1/zh active Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116372189A (zh) * | 2023-03-17 | 2023-07-04 | 南京航空航天大学 | 砂型增材制造多模型分割与图案填充打印方法 |
CN116372189B (zh) * | 2023-03-17 | 2023-12-15 | 南京航空航天大学 | 砂型增材制造多模型分割与图案填充打印方法 |
CN116824055A (zh) * | 2023-06-30 | 2023-09-29 | 深圳市魔数智擎人工智能有限公司 | 一种基于图片快速三维化交互式建模方法 |
CN116543091A (zh) * | 2023-07-07 | 2023-08-04 | 长沙能川信息科技有限公司 | 输电线路的可视化方法、系统、计算机设备和存储介质 |
CN116543091B (zh) * | 2023-07-07 | 2023-09-26 | 长沙能川信息科技有限公司 | 输电线路的可视化方法、系统、计算机设备和存储介质 |
CN117058299A (zh) * | 2023-08-21 | 2023-11-14 | 云创展汇科技(深圳)有限公司 | 基于射线检测模型内长方形长宽实现快速贴图的方法 |
CN117838306A (zh) * | 2024-02-01 | 2024-04-09 | 南京诺源医疗器械有限公司 | 基于成像仪的目标图像处理方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
CN113183469B (zh) | 2023-05-30 |
EP4331813A1 (en) | 2024-03-06 |
JP2024515867A (ja) | 2024-04-10 |
CN113183469A (zh) | 2021-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022227357A1 (zh) | 贴图三维模型的切片方法、打印方法、装置及设备 | |
US8711143B2 (en) | System and method for interactive image-based modeling of curved surfaces using single-view and multi-view feature curves | |
EP2969483B1 (en) | Slicing and/or texturing for three-dimensional printing | |
CN104134234B (zh) | 一种全自动的基于单幅图像的三维场景构建方法 | |
CN110757804B (zh) | 一种基于纹理贴图三维模型的全彩色分层切片算法 | |
CN107123164A (zh) | 保持锐利特征的三维重建方法及系统 | |
CN105989604A (zh) | 一种基于kinect的目标物体三维彩色点云生成方法 | |
CN110533770B (zh) | 一种面向隐式表达医学模型的3d打印切片方法 | |
CN103761397A (zh) | 用于面曝光增材成型的3d模型切片及投影面生成方法 | |
CN103279989A (zh) | 一种三维激光成像系统平面点云数据三角化处理方法 | |
CN110956699B (zh) | 一种三角形网格模型gpu并行切片方法 | |
CN115661374B (zh) | 一种基于空间划分和模型体素化的快速检索方法 | |
JP2002288687A (ja) | 特徴量算出装置および方法 | |
Petersson | Hole-cutting for three-dimensional overlapping grids | |
CN108537887A (zh) | 基于3d打印的草图与模型库三维视图匹配方法 | |
KR20180073914A (ko) | 3d 프린팅 시간 단축을 위한 상하 레이어 폴리라인 병합 기반 가변 슬라이싱 방법 | |
CN110400370B (zh) | 一种构建三维cad模型的语义级部件模板的方法 | |
CN109448093B (zh) | 一种风格图像生成方法及装置 | |
CN103238170B (zh) | 显示处理方法以及装置 | |
CN114290660A (zh) | 曲面分层式3d打印方法及系统 | |
US11348261B2 (en) | Method for processing three-dimensional point cloud data | |
CN115937466B (zh) | 一种融合gis的三维模型生成方法、系统及存储介质 | |
Nguyen et al. | High-definition texture reconstruction for 3D image-based modeling | |
CN108010084A (zh) | 一种深度摄像机重建及自动标定的方法、系统、设备 | |
US6518964B1 (en) | Apparatus, system, and method for simplifying annotations on a geometric surface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21938821 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2023566695 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 2021938821 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2021938821 Country of ref document: EP Effective date: 20231127 |