CN110191806B - Three-dimensional printing method and system for image segmentation - Google Patents


Info

Publication number: CN110191806B
Application number: CN201780083026.9A
Authority: CN (China)
Other versions: CN110191806A
Prior art keywords: printer, data, dimensional, creating, mask
Legal status: Active (granted)
Inventors: 奥伦·考利斯曼, 丹·普里-塔尔, 罗伊·波拉特, 亚龙·瓦克斯曼
Original and Current Assignee: Simbionix Ltd
Application filed by Simbionix Ltd

Classifications

    • G06K15/1848: Generation of the printable image (conditioning data for presentation to the physical printing elements; arrangements for producing a permanent visual presentation of output data using printers)
    • B29C64/10: Processes of additive manufacturing
    • B29C64/386: Data acquisition or data processing for additive manufacturing
    • B33Y50/00: Data acquisition or data processing for additive manufacturing
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • H04N1/40068: Modification of image resolution, i.e. determining the values of picture elements at new relative positions

Abstract

A system and method for converting imaging data (e.g., medical imaging data) into three-dimensional printer data. Imaging data may be received that describes, for example, a three-dimensional volume of a subject or patient. Using printer definition data describing a particular printer, 3D printer input data may be created from imaging data describing at least a portion of a three-dimensional volume.

Description

Three-dimensional printing method and system for image segmentation
Technical Field
The present invention relates to three-dimensional (3D) or solid-state printing systems, and more particularly to creating 3D printing definitions or data from 3D or other image data that is to be segmented, and to printing from that data.
Background
A 3D printing system typically creates a three-dimensional object by combining successive layers of material into a cross-sectional pattern along the Z-axis (typically up and down in the direction of the layer formation) based on input computer data or definitions to form the three-dimensional object. Solid state printing systems include systems that build components by, for example, stereolithography, laser sintering, fused deposition modeling, selective deposition modeling (e.g., inkjet deposition), film transfer printing, and others.
The 3D printer may employ input data defining the objects to be printed. Examples of 3D printer input data or 3D definition file formats include polygon data (approximating a surface with triangles), such as STL (stereolithography) format data, AMF (additive manufacturing file format) data, and VRML (virtual reality modeling language) format data.
It may be desirable to create a 3D object from image data such as 3D medical image data, e.g., Digital Imaging and Communications in Medicine (DICOM) data, CT (computed tomography) data, MR or MRI (magnetic resonance imaging) data, ultrasound data, or a stack of image slices. For example, a patient may be imaged, and it may be desirable to print a 3D model of all or part of the patient's vascular system, heart, spine, etc. The conversion of 3D image data to 3D printer data may not be straightforward: for example, different 3D printers have different characteristics, and the conversion may not take these characteristics into account.
Disclosure of Invention
A system and method for converting imaging data (e.g., medical imaging data) into three-dimensional printer data. Imaging data may be received that describes, for example, a three-dimensional volume of a subject or patient. Using printer definition data describing a particular printer, 3D printer input data may be created from imaging data describing at least a portion of a three-dimensional volume.
Drawings
The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. For simplicity and clarity of illustration, elements illustrated in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. The specification, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
FIG. 1 is a schematic block diagram illustrating a computing device according to an embodiment of the present invention.
FIG. 2 is a schematic block diagram illustrating a system according to an embodiment of the present invention.
FIG. 3 is a flow chart describing a method according to one embodiment of the present invention.
Fig. 4A and 4B illustrate a two-dimensional (2D) representation of a 3D volume having voids shown in fig. 4A filled to produce the volume shown in fig. 4B, according to one embodiment of the present invention.
Fig. 5A and 5B illustrate a 2D representation of a 3D volume having a boundary between a tooth and a mandible as shown in fig. 5A removed to create a gap in fig. 5B according to one embodiment of the present invention.
Fig. 6A and 6B illustrate a 2D representation of a 3D volume with bone segments according to one embodiment of the present invention.
Figs. 7A, 7B and 7C show a model before and after inflation according to one embodiment of the invention.
It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of some embodiments. However, it will be understood by those of ordinary skill in the art that some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, units and/or circuits have not been described in detail so as not to obscure the discussion.
Discussion herein using terms such as, for example, "processing," "computing," "calculating," "determining," "establishing", "analyzing", "checking", or the like, may refer to an operation and/or process of a computer, computing platform, computing system, or other electronic computing device that manipulates and/or transforms data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information storage medium (which may store instructions to perform the operation and/or process).
References to "one embodiment," "an embodiment," "example embodiment," "various embodiments," etc., indicate that the embodiment(s) so described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Further, repeated use of the phrase "in one embodiment" does not necessarily refer to the same embodiment, although it may. Some embodiments may combine features from different described embodiments: for example, the systems and methods may use thickening and/or gap creation and/or resolution adjustment, etc.
The logic, modules, devices, and interfaces described herein may perform functions that may be implemented in hardware and/or code. The hardware and/or code may include software, firmware, microcode, processors, state machines, chipsets, or combinations thereof designed to perform the functions.
Embodiments of the invention may perform "print-oriented" segmentation ("POS") or data conversion. Embodiments may convert imaging data (e.g., representing a 3D volume) into 3D printer data by inputting imaging data describing a 3D volume of a subject (e.g., a human patient) and, in a POS process, using or according to printer definition data (e.g., describing a particular printer desired for printing, printer settings for a particular print job, printing materials for the job, or other parameters), creating from the imaging data 3D printer input data describing at least a portion of the three-dimensional volume. The imaging data may be segmented, which typically involves converting the medical imaging data to a mask: a 3D matrix comprising, for example, 1's where 3D material should be printed and, for example, 0's where material should not be printed. Other suitable data structures may be used. Segmentation may include segmenting the volume into portions to be printed, and/or selecting certain portions to be printed (e.g., based on user input). The segmentation may be performed, for example, by thresholding and/or by specific known tools or algorithms designed for specific types of structures. For example, one known tool or module may segment the imaging data into different bones, and another known tool or module may segment the imaging data into different soft-tissue organs. The segmentation may result in one or more masks, e.g., one mask per vertebra or one mask per tooth. Known conversion or segmentation methods include, for example, tooth segmentation (e.g., "Individual tooth segmentation from CT images using level set method with shape and intensity prior", Hui Gao and Oksam Chae, Pattern Recognition, vol. 43, no. 7, July 2010, pages 2406-2417) and vessel segmentation (e.g., "A review of 3D vessel lumen segmentation techniques").
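As a non-authoritative sketch of the mask idea described above (this code is not from the patent), the following converts a tiny 3D volume, held as nested lists, into a 3D matrix of 1's and 0's by thresholding. The function name, volume values, and threshold range are illustrative assumptions:

```python
# Hypothetical sketch: threshold segmentation of a 3D volume into a mask,
# where 1 marks voxels where material should be printed and 0 where it
# should not. Names and values are illustrative, not the patent's.

def threshold_segment(volume, lo, hi):
    """Return a mask of the same shape: 1 where lo <= value <= hi, else 0."""
    return [[[1 if lo <= v <= hi else 0 for v in row]
             for row in plane]
            for plane in volume]

# A 2x2x3 volume of grayscale-like intensity values.
volume = [
    [[10, 150, 900], [40, 500, 1200]],
    [[0, 120, 80], [300, 999, 20]],
]
mask = threshold_segment(volume, 100, 1000)
```

In practice the segmentation would run over a full CT or MR volume and would typically be combined with the structure-specific tools the text mentions; this sketch only shows the mask data structure itself.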
Segmentation is typically performed before conversion into printer data; the printer then prints and creates the 3D object using the printer input data. Prior-art segmentation and data conversion are generally performed without regard to whether the segmented data is to be 3D printed and without taking printer characteristics into account, resulting in poor print quality. For example, a segmentation may turn out to be insufficient only when the printer receives the print job, because, for example, a part is too thin or a connecting part is missing. In contrast, embodiments of the present invention may perform a segmentation or data conversion process that takes into account that the output is for a particular 3D printer with a set of specifications. A 3D model can be created that uses the 3D printer's specifications (constraints and/or advantages) to achieve the most appropriate print model in an efficient manner. This may reduce or eliminate problems such as weak portions, inaccurate parts, or "pixelated" models, which may occur if the particular printer is not considered. Embodiments of the invention are particularly advantageous when used with medical-grade volumes such as those defined by DICOM, MR, or CT data, but may work with other data, such as non-medical imaging data.
FIG. 1 depicts a system according to one embodiment of the present invention. The imaging system 90 may image a subject 92 (e.g., a human patient) to produce image data 94 (representing the subject or a portion of the subject), such as 3D medical image data, e.g., CT data, MR or MRI data, ultrasound data, or image slice stacks, and may be, for example, a CT system or an MRI system. Other medical or non-medical imaging systems may be used. Although medical imaging or imaging data is described herein, other imaging systems may be used, and the imaging data used herein may be created manually without imaging a particular object. For example, the input data may be a 3D object. The 3D object may include 3D imaging data, mesh data, volumetric objects, polygonal mesh objects, point clouds, functional representations of 3D objects, CAD files, 3D PDF files, STL files, and/or any input that may represent a 3D object. The 3D imaging data may include medical imaging data including CT data, cone beam CT ("CBCT") imaging data, MRI data, MRA imaging data (e.g., MRI with contrast agent), or ultrasound imaging data. The 3D object may be anatomy (e.g., complex anatomy), industrial data, or any 3D object.
The imaging system 90 may send or transmit the image data 94 to a computer 20, such as a personal computer, workstation, or the like. The computer 20 may process the image data 94 in view of or according to the printer definition data 22 and output 3D printer input data 11 (e.g., 3D mesh, 3D definition file, PLY (polygonal file format), STL (stereolithography), AMF (ASTM additive manufacturing file format), VRML (virtual reality modeling language)) to the 3D printer 10, which may print the data as a physical real world 3D printed object 12. Although image data 94, printer definition data 22, and 3D printer input data 11 are shown as being stored by computer 20 (e.g., in memory 40 of fig. 2), such data may be stored or shared by the devices of fig. 1, including imaging system 90 and printer 10, if appropriate. For example, the imaging system 90 may generate image data 94 (which may be generated in another manner and may be from another source) and may send or transmit the image data 94 to the computer 20, and the computer 20 may create the 3D printer input data 11 and send or transmit the data 11 to the 3D printer 10.
The 3D printer 10 may use various methods to produce a physical, real-world 3D printed object 12 using input or build materials (e.g., photopolymers or other materials), the printing of which may be defined or discussed with respect to the X/Y/Z axis, which defines the vertical relationship of a set of deposited material layers.
Each of the imaging system 90, computer 20, and 3D printer 10 may include computing components, such as some version of the components described in fig. 2, and may be connected via, for example, a network 60. In other embodiments, not all of the components shown may be used. For example, imaging data may not be input directly from the imaging system 90 to the computer 20, but rather, for example, via a flash drive, disk, or the like. The components may be combined: for example, the functionality of the computer 20 described as converting the image or imaging data 94 into 3D printer data 11 may be incorporated into the 3D printer 10.
In one embodiment, the 3D printer 10 may include, for example, a cartridge 14, the cartridge 14 including a supply of object creation build material 16 that may be selectively dispensed onto a tray 18. Tray 18 holds solid printed material for building 3D object 12 during the build process. In one embodiment, the solid printing material may be any photocurable material known in the art or hereafter devised. Examples of printing or object creating materials suitable for use with various embodiments of the present invention include, for example, photocurable materials, plastics, and metals. Solid printing material 16 may be dispensed from cartridge 14 through one or more dispensers, inkjet or selectively openable valves 15 on the bottom wall of the cartridge. A moving device, such as shuttle 19, may move the dispenser or valve 15 back and forth generally along the x-axis and/or y-axis of the solid state printing device 10, and may move the dispenser or valve 15 up and down along the z-axis to create a layer. Other embodiments of the present invention include alternative means for dispensing the printing material into the tray. Other embodiments do not require the use of selective deposition or the particular components shown; for example, the 3D printer 10 may use a system to deposit the binder material layer by layer onto a powder bed with an inkjet print head, or SLS (selective laser sintering), where a laser sinters the powder material.
FIG. 2 depicts computing components that may be part of, for example, the imaging system 90, the computer 20, and the 3D printer 10. Computing device 30 may include a controller 32, which may be, for example, a central processing unit (CPU), a chip, or any suitable computing or computational device, a graphics processing unit (GPU) 33, an operating system 50, memory 34, storage 36, an input device 38, and an output device 40.
Operating system 50 may be or include any code segment designed and/or configured to perform tasks related to coordinating, scheduling, arbitrating, supervising, controlling, or otherwise managing operations of computing device 30, such as the execution of a scheduler. The memory 34 may be or include, for example, random access memory (RAM), read-only memory (ROM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate (DDR) memory chips, flash memory, volatile memory, non-volatile memory, cache memory, buffers, short-term memory units, long-term memory units, or other suitable memory or storage units. The memory 34 may be or may include a plurality of, possibly different, memory units. The executable code or software 52 may be any executable code, such as an application, program, process, task, or script. The executable code or software 52 may include, for example, segmentation algorithms or processes, processes that convert image data to 3D printing specification data, and the like. Executable code 52 may be executed by controller 32 under the control of operating system 50. For example, the executable code 52 may be an application program that performs the methods discussed herein. The controller 32 may be configured to perform the methods discussed herein, for example, by executing code 52.
The storage device 36 may be or include, for example, a hard disk drive, a floppy disk drive, a Compact Disk (CD) drive, a CD recordable (CD-R) drive, a Universal Serial Bus (USB) device, or other suitable removable and/or fixed storage unit.
The input device 38 may be or include a mouse, keyboard, touch screen or pad or any suitable input device. It should be appreciated that any suitable number of input devices may be connected. Output device 40 may include, for example, one or more displays, speakers, and/or any other suitable output device. It will be appreciated that any suitable number of output devices may be operably connected.
Embodiments of the invention may include an article such as a computer or processor non-transitory readable medium, or a computer or processor non-transitory storage medium such as, for example, memory, a disk drive, or a USB flash memory, encoding, including, or storing instructions, e.g., computer-executable instructions, that when executed by a processor or controller, cause the processor or controller to perform the methods disclosed herein.
FIG. 3 is a flow chart depicting a method according to one embodiment of the invention. In operation 300, imaging information and/or printer information may be input. For example, a 3D volume (e.g., image data 94, such as 3D image or imaging data) and printer specification or definition data 22 (e.g., describing a particular printer, materials used, etc.) may be input. The input may also include, for example, data describing the image data, such as its resolution. In operation 310, an initial segmentation, processing, or conversion may be performed. The initial segmentation may convert the 3D volume into one or more masks 24, which may be initial masks. Segmentation may include identifying features such as in-body features (e.g., different bones, organs, teeth, etc.) and creating one or more masks according to the different features. Various segmentation algorithms and specific tools, such as threshold segmentation, may be used. The initial processing 320 may include processing such as resolution changes, filling in voids, eroding, or removing boundaries between touching masks. Certain operations may be combined: for example, the segmentation 310 may incorporate resolution changes. The initial processing 320 may take into account the printer specification or definition data 22.
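One of the initial-processing steps named above, filling in voids (as in the transition from Fig. 4A to Fig. 4B), can be sketched in 2D for brevity. This is a hedged illustration, not the patent's implementation: a void is taken to be any background region not reachable from the slice border, found with a pure-Python flood fill:

```python
# Hypothetical sketch of void filling in one 2D mask slice: every 0-region
# that cannot be reached from the border of the slice is enclosed by
# material, so it is filled with 1's. Illustrative only.

def fill_voids_2d(mask):
    """Set to 1 every 0-region not reachable from the slice border."""
    h, w = len(mask), len(mask[0])
    outside = [[False] * w for _ in range(h)]
    # Seed the flood fill with every background voxel on the border.
    stack = [(y, x) for y in range(h) for x in range(w)
             if (y in (0, h - 1) or x in (0, w - 1)) and not mask[y][x]]
    for y, x in stack:
        outside[y][x] = True
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny][nx] \
                    and not outside[ny][nx]:
                outside[ny][nx] = True
                stack.append((ny, nx))
    # Keep material, fill enclosed background, preserve exterior background.
    return [[1 if mask[y][x] or not outside[y][x] else 0 for x in range(w)]
            for y in range(h)]

# A ring of material with an enclosed void: the void is filled, the
# exterior background is kept.
slice_ = [
    [0, 1, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 1],
]
filled = fill_voids_2d(slice_)
```

A production version would run in 3D over the whole mask volume, but the reachable-from-border criterion is the same.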
In operation 330, the output of the previous operation (e.g., one or more masks) may be evaluated for, for example, quality, strength, and/or printability. In operation 340, if the mask or a portion of the mask is not printable, or fails certain quality tests, the mask may be adjusted, changed, or fixed, possibly affecting accuracy; for example, a "fix" may be to match the mask resolution to that of the printer. The evaluation and the changing or adjusting may be iterative or repeated: typically, evaluation 330 is performed again after adjustment 340.
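A minimal sketch of one such printability check (an assumption for illustration; the patent does not specify a test): measure the thinnest contiguous run of solid voxels along one axis and compare it against a minimum feature width implied by the printer definition data:

```python
# Hypothetical printability evaluation: a mask fails if any solid run along
# the x-axis is thinner than the printer's minimum printable feature,
# expressed here in voxels. Names and the rule itself are illustrative.

def min_run_length(mask_row):
    """Length of the shortest contiguous run of 1s in a row (0 if none)."""
    runs, count = [], 0
    for v in mask_row:
        if v:
            count += 1
        elif count:
            runs.append(count)
            count = 0
    if count:
        runs.append(count)
    return min(runs) if runs else 0

def printable(mask, min_voxels):
    """True if every solid run along the x-axis is at least min_voxels wide."""
    for plane in mask:
        for row in plane:
            m = min_run_length(row)
            if 0 < m < min_voxels:
                return False
    return True

# A 3-voxel-wide part next to an isolated 1-voxel sliver: too thin to print
# if the printer needs at least 2 voxels of material.
mask = [[[1, 1, 1, 0, 1]]]
```

A full implementation would check all three axes (and real distances, using the voxel spacing), and a failing mask would then be passed to the adjustment of operation 340.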
In operation 350, if the mask is printable, post-processing may be performed on the mask, for example known operations such as smoothing or morphological operations (such as thinning or erosion). In some embodiments, the output of the post-processing may be evaluated again (operation 330) to determine whether the post-processing resulted in a printable or non-printable mask. In operation 360, the mask may be converted into data describing at least a portion of the 3D volume that may be input or sent to a printer; for example, the mask may be converted to a mesh. In operation 370, the mesh may be printed by a 3D printer.
In one embodiment of the invention, image data 94 (e.g., 3D image or imaging data) may be converted to 3D printer input data 11 (e.g., a mesh) while taking into account, or according to, the printer specification or definition data 22. For example, printer input data may be created from (e.g., converted from) image or imaging data 94, and may describe at least a portion of the 3D volume or allow a 3D printer to print at least a portion of the 3D volume. The imaging data may be, for example, a volume including 3D color or grayscale data that may need to be segmented (e.g., a DICOM image from CT or MR, or a series of two-dimensional (2D) images). The 3D printer may then print one or more 3D objects corresponding to the printer input data and thus to the imaging data or 3D volume. The printer specification or definition data 22 generally describes the particular printer or printer type that the user intends to use for a particular print job. The printer specification or definition data 22 may describe the operating parameters, resolution, tolerances, accuracy, etc. of a particular 3D printer and/or a particular model of printer. The printer specification or definition data 22 may include data on the particular printing material used by the target printer (e.g., the printer intended to be used or the printer selected by the user), as a given printer may use different materials. Thus, the printer specification or definition data 22 may include data describing a particular printer and the variation of material intended for a particular job. The conversion may include segmentation: features such as in-body features (e.g., organs, vascular systems, portions of organs, bones, etc.) are identified and, for example, one or more masks 24 are created. Each mask 24 may represent a different object within, for example, a volume to be printed: an organ, a body part to be printed, a bone, etc.
The image data 94 may describe or depict a three-dimensional volume of a subject (e.g., a patient, although in some embodiments a non-biological object may be the subject). However, the image data 94 need not be complete 3D data; for example, a set of 2D slices may form the image data 94 and may describe a 3D volume. For example, CT data may be input as a stack of 2D slices, where the distance or spacing between slices is known.
The image data may be divided or segmented into one or more portions (e.g., organs, tissues) desired to be printed prior to conversion into printer data; other image data (e.g., internal body volume that is not to be printed, such as an organ or tissue not of interest to the user) may be excluded. The division or segmentation may result in one or more imaging data segments or portions (e.g., masks), and each of these may be converted into printer input, specification, or definition data, each segment representing an object to be printed. A mask 24 or other data structure may be used, or may be generated or created as the output of the segmentation. The mask 24 may be such that only the portion of the imaging data defined by the mask is converted to 3D printer input data. For example, the image data may be a volume, such as a patient's chest, including tissue that the user wants to convert to a 3D object (e.g., the heart) and tissue that the user does not (e.g., lungs, connective tissue, bone, skin, etc.). The mask 24 may define the object or organ to be printed. In one embodiment, the mask may be a data structure representing a volume or region having the same size or extent as that described by image data 94, with a Boolean value (e.g., 1/0, true/false, although other values may be used) for each voxel in image data 94 indicating whether that voxel should be converted to part of the 3D object. In one embodiment, the mask may initially include all false/0 values, and the voxels/parts to be printed may have their values changed to true/1. If voxels are not used, mask 24 may correspond to image data 94 in another manner, such as including Boolean values corresponding to portions of image data 94, or another method. For example, a 1 may indicate inclusion in the 3D object and a 0 may indicate exclusion.
Multiple masks 24 may be used, each mask corresponding to one object: for example, one mask may correspond to and result in printing one chamber of the heart, while another mask may correspond to and result in printing another chamber of the heart.
In one embodiment, a workstation such as computer 20 may receive printer specifications such as printer definition data 22 from printer 10, or may include printer definition data 22, or sets of printer definition data 22, for example, with a set of printer definition data 22 for each of a plurality of specific printers or printer models. The computer 20 (or the means for performing segmentation) may have a segmentation tool or algorithm, such as automatic bone segmentation, for example stored in the memory 34 or the storage device 36.
The computer 20 may include, or may receive, printer definition data 22, for example receiving printer definition data 22 from the printer 10 using, for example, a driver installed on the computer 20. Files or software stored on computer 20 may include data that may be installed or updated, for example, for each supported printer; such data need not be received from the printer. Printer definition data 22 may include, for each printer that a user may use or select, data such as spatial resolution, layer thickness / Z-axis resolution data (e.g., distance between layers printed by the printer), printing tolerances (e.g., accuracy), printer model or type, printer technology used by the target printer (e.g., SLA, digital part materialization (DPM), fused deposition modeling (FDM), etc.), and materials used (from which material strength may be derived or looked up; material strength may alternatively be a separate parameter).
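The printer definition data just listed could be held in a simple record; the following is a hedged sketch of such a container (field names, values, and the minimum-feature rule of thumb are assumptions for illustration, not the patent's data layout):

```python
from dataclasses import dataclass

# Hypothetical container for printer definition data 22.
@dataclass
class PrinterDefinition:
    model: str                 # printer model or type
    technology: str            # e.g. "SLA", "FDM", "DPM"
    xy_resolution_mm: float    # in-plane spatial resolution
    layer_thickness_mm: float  # Z-axis resolution (distance between layers)
    tolerance_mm: float        # printing tolerance / accuracy
    material: str              # selected build material for this job

    def min_feature_mm(self):
        # Assumed rule of thumb: the smallest reliably printable feature is
        # a small multiple of the in-plane resolution.
        return 2 * self.xy_resolution_mm

printer = PrinterDefinition(
    model="ExampleSLA-1", technology="SLA",
    xy_resolution_mm=0.05, layer_thickness_mm=0.1,
    tolerance_mm=0.1, material="photopolymer",
)
```

A set of such records, one per supported printer (or printer-and-material combination), would match the "sets of printer definition data 22" described above, with the selected record driving the POS processing.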
A user providing input to the computer 20 may select a certain 3D printer and its corresponding printer definition data 22 (e.g., the system may receive user input via, for example, input device 38), may in some embodiments input an individual material selection to be used with the printer, or may input printer definition data 22 directly.
The computer 20 may receive a volume to be processed and/or segmented, such as image data 94, from, for example, the imaging system 90 or from another source. In one embodiment, the data conversion, including the segmentation, from the image data 94 to the printer input data 11 may be performed on the computer 20, but in other embodiments, the processing may be performed on, for example, the 3D printer 10. Computer 20 may use segmentation or partitioning tools and other algorithms (e.g., executed by controller 32 or on controller 32) and output one or more masks 24.
In one embodiment, the user may choose how the segmentation is to proceed: for example, the user may decide that the heart should be segmented to produce one physical 3D object, or may decide that the heart should be segmented by part to produce a plurality of physical 3D objects. The user may, for example, click (using a pointing device) on a displayed area to provide seed points, and known algorithms, such as a connected-component algorithm, may use the seeds to select features such as organs or bones. Using known algorithms, masks may be created, for example one for each organ, bone, or organ part (e.g., heart chamber). Other user input to the segmentation process may be received; a user may provide such input to the computer 20, and the system may receive it, via, for example, the input device 38.
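The seed-based connected-component selection mentioned above can be sketched as a flood fill over the segmented mask. This is an illustrative pure-Python version (6-connectivity, names assumed), not the patent's algorithm:

```python
from collections import deque

# Hypothetical sketch: given a segmented mask and a user-provided seed voxel,
# return a new mask containing only the 6-connected component of the seed
# (e.g., one bone out of several touching the same threshold range).

def connected_component(mask, seed):
    zs, ys, xs = len(mask), len(mask[0]), len(mask[0][0])
    out = [[[0] * xs for _ in range(ys)] for _ in range(zs)]
    if not mask[seed[0]][seed[1]][seed[2]]:
        return out  # seed is not on material: empty selection
    queue = deque([seed])
    out[seed[0]][seed[1]][seed[2]] = 1
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if 0 <= nz < zs and 0 <= ny < ys and 0 <= nx < xs \
                    and mask[nz][ny][nx] and not out[nz][ny][nx]:
                out[nz][ny][nx] = 1
                queue.append((nz, ny, nx))
    return out

# Two separate "bones" in one slice; a seed click picks only one of them.
mask = [[[1, 1, 0, 1],
         [0, 1, 0, 1]]]
bone = connected_component(mask, (0, 0, 0))
```

Running this once per seed would yield one mask per selected object, matching the one-mask-per-organ/bone description above.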
Various known segmentation algorithms or processes may be used. For example, thresholding may be used, where each voxel in the input volume is segmented (e.g., marked as true or "print" in the mask) if its value is within a desired range, or above or below a threshold. Other suitable segmentation algorithms may be used.
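The thresholding described above can be sketched in a few lines. The following is a minimal illustration (not the patent's implementation), assuming NumPy; the function name and threshold values are illustrative:

```python
import numpy as np

def threshold_mask(volume, min_val, max_val):
    """Return a boolean mask marking voxels whose values fall in [min_val, max_val]."""
    return (volume >= min_val) & (volume <= max_val)

# Toy 3x3x3 "CT" volume; values are illustrative, not real HU data.
vol = np.zeros((3, 3, 3))
vol[1, 1, 1] = 500.0   # a single "bone-like" voxel
mask = threshold_mask(vol, 100.0, 1000.0)
print(mask.sum())  # -> 1: only the single voxel falls in the chosen range
```

The resulting boolean array plays the role of the mask 24: true where material should be printed, false elsewhere.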
The mask 24 may be the output of the segmentation or segmentation process and may include, for example, 3D boolean data that may indicate where the 3D model or model portion exists with reference to the volume.
Each mask 24 may be, for example, a data object representing an object that may be in a set of related objects. For example, the first mask may be a first portion of a heart and the second mask may be a second portion of the heart. In some embodiments, no mask or segmentation is required. For example, the conversion process discussed herein may operate on data without segmentation.
Computer 20 may use or apply a mask to image data 94 and convert the data to a 3D print volume definition or 3D model construct, e.g., create or compute a 3D print volume definition, 3D model construct, mesh, etc., e.g., in the form of 3D printer input data 11, and send the data to printer 10. The printer may then print the 3D object using, for example, the 3D printer input data 11.
One or more operations may be used, alone, in combination, or simultaneously, to convert the image data to printer input data. For example, POS processing may include increasing resolution, adjusting layer thickness, padding or border thickening, processing complex mask structures, separating structures that are connected in the real world, and connecting components for printing.
In various embodiments, as part of the initial segmentation (e.g., operation 310), a tool-specific bone segmentation or other segmentation may be performed. For example, known thresholding algorithms may use bone values (e.g., MinThreshVal = 100 HU and MaxThreshVal = 1000 HU for the CT modality, where HU denotes Hounsfield units) to segment voxels in the imaging data. As part of the segmentation process, the bone segmentation algorithm may return or identify an internal portion of the bone. The segmentation process for such results may use, for example, an alpha expansion algorithm on a graph to distinguish the desired bone, other bones, and background. The alpha expansion algorithm may include algorithms that optimize the separation of data (e.g., into "bone" and "non-bone").
Fig. 6A shows a 2D representation of a 3D volume 600 with identified bone segments 610, the result of threshold-based segmentation with a connected component selected from the results. Fig. 6B shows a 2D representation of the same 3D volume 600 with identified bone segments 610, illustrating a print-oriented segmentation in which gaps and holes 605 are filled to create a more printable and robust object, according to one embodiment.
Other segmentations may be performed, such as separating portions of an organ (e.g., a chamber of the heart) or separating different organs.
Adjusting resolution
In various embodiments, as part of the initial processing (e.g., operation 320 above), the 3D printer input data may be given a modified or desired resolution (e.g., volumetric resolution) related to the capabilities and resolution of the printer. For example, the printer and its associated input data may have a resolution described, for example, by printer tolerances, inter-layer distance, voxel size, number of x, y, or z lines or voxels, and the like. Creating 3D printer input data may include adjusting the resolution of the output mask according to printer definition data describing the printer or printer settings. For example, after the segmentation is obtained, the segmentation results may be smoothed at a higher resolution (e.g., the same as or higher than the printer precision).
The initial input volume may be, for example, medical input data or image data 94 (e.g., CT, MR data). The volume may be composed of voxels having a particular size (e.g., depending on modality, medical protocol, scanning apparatus, etc.). The resolution of the initial medical input data volume or image data 94 may be changed, which in turn may change the resolution of the mask.
Printer tolerances may be taken into account when changing the resolution. In one embodiment, the process may consider the printer tolerance (e.g., the accuracy of printing in the print plane) and the layer thickness (e.g., the accuracy of the printer in the z-direction) as the printer resolution. A 3D printer typically prints layer by layer based on, for example, mesh or voxel data. The height of each layer is the layer thickness. The 3D printer should place material at a specific x, y position, e.g., according to the voxel. However, a 3D printer may not be exact, instead placing the material at x + T, y + T, where T is a tolerance (e.g., T may be in the range of millimeters or micrometers). A tolerance may also exist in the z/thickness/layer direction and may have a different value than the tolerances of x and y. The tolerances for a particular printer or printer model (e.g., a set of tolerances for X/Y/Z) are typically known. The tolerances and layer thicknesses may be known or estimated by the user and may be taken into account.
The layer thickness and tolerance may constitute a "printer resolution", similar to the volume resolution. The 3D printer should have information on, or "know", whether material should be printed at any particular point in space. The minimum size (or resolution) of a point in space (in the 3D print build area) is given by the layer thickness and tolerance. Better POS can be achieved if the input medical volume resolution is "tuned" to, or conforms with, the printer resolution (e.g., layer thickness and tolerance), which is more favorable for printing.
For example, if the 3D printer selected or described by the printer definition data has a higher resolution than the volume to be imaged, interpolation may be performed to increase the resolution of the volume so that each voxel in the volume is the same or higher precision than the printer.
A function such as ResizeFun(VolumeInput, ResizeVector) may be used, which may be any function known in the art that interpolates medical imaging volumes or masks to increase or decrease resolution (e.g., decrease if the input volume has a higher resolution than the printer). An example is the multi-modal non-rigid image registration demonstration available on mathworks.com.
Such a function may accept as input VolumeInput, which may be the volume to be resized (a 3D mask or a 3D volume), and ResizeVector, which may be a resizing parameter indicating the change in each dimension.
The ResizeVector may be or include dimensionless numbers indicating the resolution factor between the printer resolution and the volume resolution. For example, if the printer resolution is (1, 1, 0.5) in units of mm (other dimensions are of course possible) and the volume resolution is (2, 2, 2) in units of mm, then the ResizeVector is (2, 2, 4), without dimension (no units).
In one embodiment, ResizeVector = VolumeResolution / PrinterResolution.
In one embodiment, if it is assumed that:
VolumeInput has resolutions Rx, Ry, Rz, as well as width, depth, and height (the number of rows, columns, and slices);
the data comes from, for example, DICOM files:
Rx and Ry refer to DICOM tag (0028, 0030) PixelSpacing [mm];
Rz may be DICOM tag (0018, 0050) SliceThickness, or the difference between slice positions as described by DICOM tag (0020, 1041) SliceLocation (both in mm).
The printer tolerance PrTol and layer thickness PrLay may be in [mm], but other units, such as inches or microns, may be used.
In such an example, ResizeVector = [Rx/PrTol, Ry/PrTol, Rz/PrLay] (there may be a different value for each dimension).
In one embodiment, ResizeFun may be a known function, such as the Matlab function Vq = interp3(X, Y, Z, VolumeInput, Xq, Yq, Zq).
where, for example:
[X, Y, Z] = meshgrid(0:Rx:width*Rx, 0:Ry:depth*Ry, 0:Rz:height*Rz);
[Xq, Yq, Zq] = meshgrid(0:ResizeVector(1):width*Rx, 0:ResizeVector(2):depth*Ry, 0:ResizeVector(3):height*Rz);
in one embodiment, [X, Y, Z] = meshgrid(x, y, z) returns 3-D grid coordinates based on the coordinates contained in vectors x, y, and z. The grid represented by X, Y, and Z has size length(y)-by-length(x)-by-length(z). meshgrid may be a known function, such as the Matlab function described above, required as preprocessing for the interp3 function.
In one embodiment, VolumeInput may have resolutions Rx, Ry, Rz. The printer tolerance may be PrTol (e.g., the printer resolution in the print plane) and the layer thickness PrLay (e.g., the printer resolution in the z-direction, the printer's Rz). In one implementation:
ResizeVector=[Rx/PrTol,Ry/PrTol,Rz/PrLay]
there may be a different value for each dimension. Other or different formulas, data structures, and sequences may be used in place of any of the formulas, data structures, and particular sequences described herein.
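The resolution adjustment above can also be sketched outside Matlab. The following is a hedged illustration using NumPy and SciPy's zoom function in place of interp3/meshgrid; the sample resolutions mirror the example values used above, and zoom is an assumed stand-in, not the patent's function:

```python
import numpy as np
from scipy.ndimage import zoom

# Volume resolution (mm) and printer resolution (mm); values are illustrative.
Rx, Ry, Rz = 2.0, 2.0, 2.0    # voxel spacing of the imaging volume
PrTol, PrLay = 1.0, 0.5       # printer in-plane tolerance and layer thickness

# ResizeVector = VolumeResolution / PrinterResolution (dimensionless)
resize_vector = np.array([Rx / PrTol, Ry / PrTol, Rz / PrLay])  # -> [2. 2. 4.]

volume = np.random.rand(10, 10, 10)
# order=1 selects trilinear interpolation, comparable to interp3's 'linear'
resampled = zoom(volume, resize_vector, order=1)
print(resampled.shape)  # -> (20, 20, 40)
```

Each output voxel of the resampled volume then corresponds to one printer resolution unit (tolerance in-plane, layer thickness along z).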
Filling or boundary thickening/printability evaluation/fixing or changing mask
In various embodiments, creating the 3D printer input data may include evaluating the 3D mask to determine whether it is printable according to, or in view of, the printer definition data, and/or fixing detected problems, for example by increasing or decreasing the thickness of a portion of the printer input data, or of a feature or structure to be printed (increasing or decreasing the thickness of the corresponding portion of the 3D object). Creating the 3D printer input data may include creating voids or holes in solid areas according to the printer definition data, for example to reduce the amount of material used.
When hollow or thin regions are segmented or printed, the 3D printed model in these regions may not be strong enough. Embodiments of the present invention may detect such regions and may perform automatic morphological operations to increase model strength.
The 3D printed object 12 should not break or collapse. If the boundary (wall thickness) of the 3D printed object 12 is too thin, and/or the object is made of a material that cannot support itself in a particular model (related to material strength), the model may break during or shortly after printing. If the strength and/or other properties of the printed material 16 are known during segmentation or data conversion, the width or thickness of portions of the 3D printed object 12 may be determined, for example, during the POS. Similar processes of "wall thickness analysis" and boundary thinning or widening may exist on the mesh or 3D printer data 11. This is a known and fast process, and in some embodiments may be performed in the POS, on the mask, rather than later on the mesh. A strength analysis performed as a POS process, e.g., on the mask, may yield results that better match the raw data, having undergone fewer transformations, than a strength analysis performed on a mesh.
The printer definition data 22 (e.g., printer technology) may affect the material. The printer definition data 22 may affect padding and other processes in which, for example, thickening, thinning, adding and removing material may be done in increments of printer resolution.
A strength analysis (e.g., StrAn, discussed below) may be performed on the mask, model, or volume to be printed (e.g., CompleteMask, which may be defined as the final mask to be printed), typically after segmentation.
Printability evaluations may be performed at different levels of accuracy. For example, a general model printability check may be used (e.g., a "yes" or "no" answer: whether or not the model can be printed), or the results may be location specific, e.g., a data structure such as a "heat map" indicating which regions of the model can be printed. Known algorithms may use known characteristics of the build material to determine whether a part would be too weak after construction. For example, a portion of a model of a certain thickness may be determined to be too fragile if made of plastic, but not if made of metal. A part of the model connected to a larger part may be determined to be too weak if made of plastic, but not if made of metal; this may of course depend on the size of the connecting part.
The mathematical model of the heat map may be, for example: MaskInput may be a Boolean mask, e.g., an initial mask, used as input to a strength analysis process (e.g., StrAn, discussed below) that returns a data structure such as a "heat map", called, for example, HeatMap, describing whether the input mask is printable and, if not, in which areas.
HeatMap = StrAn(MaskInput). HeatMap may be a volume of the same size as MaskInput, with values such as 0 where there is no material and floating point numbers indicating strength, e.g., determined based in part on the amount of material in neighboring voxels and on the local structure.
In some embodiments, HeatMap(~MaskInput) = 0 (anywhere there is no mask, the heat map is zero: not "interesting").
As part of the example output, any voxel for which HeatMap > StrThres can be, or is allowed to be, printed. The model-level "yes" (printable) or "no" (cannot be printed) output has the same mathematical model: the model can be printed only if HeatMap > StrThres for all voxels.
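The model-level printability test described above reduces to a single all-voxels check. The following sketch assumes a precomputed heat map; the helper name printable is a hypothetical illustration, not from the patent:

```python
import numpy as np

def printable(heat_map, mask_input, str_thres):
    """Model-level yes/no: printable only if every masked voxel exceeds the
    strength threshold. The heat map is zero wherever there is no material."""
    return bool(np.all(heat_map[mask_input] > str_thres))

# A tiny 2x2 heat map: 0.0 means no material; other values are strengths.
heat = np.array([[0.0, 2.0], [3.0, 0.5]])
mask = heat > 0                       # material exists where the heat map is nonzero
print(printable(heat, mask, 1.0))     # -> False: the 0.5 voxel is too weak
print(printable(heat, mask, 0.4))     # -> True
```

The location-specific variant simply keeps the per-voxel comparison (heat_map > str_thres) instead of collapsing it with np.all.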
StrAn may be any known strength analysis function or process, such as, for example, wall thickness analysis, finite element analysis (FEA, also known as the finite element method (FEM)), or other tools. Such methods include known techniques and may operate on a mesh.
A different approach for the StrAn processing may include evaluating printability by making a mesh from the mask data and performing a strength analysis using algorithms known in the art. An example evaluation flow may include, for example:
converting the initial mask to a mesh (e.g., using known methods such as marching cubes); and performing a mesh analysis (e.g., using known methods such as wall thickness analysis or FEA).
The strength analysis may depend on the model (structure, wall width, etc.) and on the printer (printer type, printer material). These may be evaluated during StrAn, and StrThres may be determined by these factors.
In one embodiment, a new or additional mask (e.g., a WeakMask) may be created to mark areas for printing that were not marked for printing in the original print definition but were found to border weak areas. Thus, weak areas below a certain threshold (e.g., a thickness threshold or a predicted strength threshold) may be marked as a new mask, for example:
WeakMask=StrAn<StrThres
the WeakMask may be a new mask defined by each voxel of StrAn (or "HeatMap", which may indicate which regions of the model can be printed and which regions need to be adjusted due to, for example, strength problems) that is below the StrThres value.
Morphological dilation (MorphoFun) can be defined as, for example:
MorphoMask=DilationFun(WeakMask)
the MorphoMask may be the mask generated after performing the dilation process (DilationFun) on the above-defined WeakMask.
And the MorphoMask may be combined or combined with CompleteMask:
CompleteMask=MorphoMask U CompleteMask
to perform dilation for each true voxel (e.g., indicating build material that should be deposited) in the mask or print data, each false voxel surrounding or adjacent to that true voxel is set to true; this may be done to a certain thickness, so in each direction, integer X voxels adjacent to true voxels are set to true. In dilation, typically, the voxels or regions that are altered to indicate that material should be printed are connected to, abut, or are adjacent to the relevant voxel or boundary. Conversely, for a fill function, the voxels or regions that are altered to indicate that material should be printed do not necessarily connect to, abut, or be adjacent to the relevant voxels or boundaries.
In one embodiment, this may be iterative: the resulting Completemask can be analyzed again for weaknesses and the process can be performed again until all models are above StrThres.
Another embodiment of the enhancement mask may be similar to the process discussed above, but with the following types of expansion functions:
DilationFun(WeakMask,StrAn)
the process may dilate the WeakMask according to StrAn (the weaker the region, the more dilation it "receives").
Another embodiment of the enhancement mask may be similar to the process discussed above, but with MorphoFun defined as a morphological fill function rather than as a dilation function, the morphological fill function being defined as:
MorphoMask=Fillfun(WeakMask)
a simple embodiment may enhance the model by thickening the borders. One embodiment may define the expansion of a spherical element having a radius R as a (+) R. To obtain a model that is as close as possible to the anatomy on the one hand and printable on the other hand, the inflation may be in a direction that is "unimportant" to the model or to the use of the model.
Fig. 7A, 7B and 7C show a model before dilation (fig. 7A, initial) and after dilation (fig. 7B, inward dilation; fig. 7C, outward dilation). For the initially thin model 700 shown in fig. 7A, one option for thickening the model is to dilate it inward. In this way, the model remains accurate in terms of its outer boundary while also being thick enough. This approach may be used if the external structure of the model is important (in the example of fig. 7, the size and structure of the aorta are important). Conversely, outward dilation is possible, so that the internal structure is accurate, as shown in fig. 7C. This approach is required if a device (e.g., a medical tool) should be inserted into the printed model and only the dimensions of the internal structure need be preserved.
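The inward-versus-outward thickening choice illustrated by Fig. 7 can be sketched with morphological operations. The helpers below are hypothetical illustrations (assuming SciPy), clipping the dilation either to the model's interior (preserving the outer boundary) or away from it (preserving the inner boundary):

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_fill_holes

def thicken_inward(shell):
    """Dilate a thin shell but clip the growth to its interior, so the
    outer boundary (e.g., the outside of an aorta) is preserved."""
    interior = binary_fill_holes(shell)            # shell plus enclosed cavity
    return (binary_dilation(shell) & interior) | shell

def thicken_outward(shell):
    """Dilate a thin shell away from its enclosed cavity, preserving the
    inner boundary (e.g., a lumen a tool must still fit through)."""
    cavity = binary_fill_holes(shell) & ~shell     # the enclosed cavity only
    return (binary_dilation(shell) & ~cavity) | shell

# A hollow 2D ring as a cross-section of a thin vessel wall.
ring = np.zeros((9, 9), dtype=bool)
ring[2:7, 2:7] = True
ring[3:6, 3:6] = False
inward = thicken_inward(ring)
outward = thicken_outward(ring)
print(inward.sum() > ring.sum(), outward.sum() > ring.sum())  # -> True True
```

The same clipping idea extends directly to 3D masks; a spherical structuring element of radius R would reproduce the A ⊕ R dilation described above.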
Filling gaps
In one embodiment, the initial processing or segmentation (e.g., operation 310 or operation 320) or the adjustment or post-processing (e.g., operation 340 or operation 350) may include filling a void, which may be a hollow portion of the 3D object, and where no printing material is deposited. For example, a padding algorithm may be used to obtain or create a more robust structure. Fig. 4A and 4B illustrate an example filling action showing a 2D representation of a 3D volume 400 having voids 410, as shown in fig. 4A, filled to produce the volume shown in fig. 4B. The mask representing the volume 400 may have its data changed to convert the void data to print material data.
In some embodiments, a user and/or a known segmentation tool or algorithm may decide which region to fill, and then a filling algorithm or method may fill the region, e.g., as part of operation 310 or operation 320 described above. For example, if a module designed to segment a particular type of structure, such as a bone, is used, and the module determines that there is a void in the segmented bone, the module may use a known filling algorithm. The user may click on or select the region to be filled using a graphical user interface, and then a known filling algorithm may fill the region.
In some embodiments, after segmentation or initial segmentation (e.g., operation 310 above) and possibly other processing (e.g., initial processing 320 described above), an automated process may determine (e.g., evaluation 330 described above) areas that need to be filled, e.g., to increase intensity. The filling may then be done as part of the above adjustment operation 340, for example.
In one exemplary embodiment, the fill mask process may be performed by separately filling each of a plurality of slices or portions (e.g., flat or relatively flat portions that make up the volume), using any function known in the art for image filling. Such a function may be, for example, the Matlab function imfill: BW2 = imfill(BW, 'holes'). The function fills holes in the input binary image BW, where a hole is a set of background pixels that cannot be reached by filling in the background from the edge of the image. BW may be the original mask (e.g., a 2D mask) in a particular slice.
While 3D filling may be a simple action (e.g., filling all current voids), it may be difficult for a user to predict the result if it is performed on the entire 3D volume: if a void has a small opening, the result of 3D filling may be that the entire volume is filled. Alternatively, embodiments in which 2D filling is performed per slice may be more controllable, filling only the regions of interest. This action can be performed in any direction, e.g., filling axially and traversing all z-slices, or filling sagittally. One embodiment fills in the axial orientation, as for some applications this works well in most situations.
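A slice-by-slice axial fill, analogous to applying imfill(BW, 'holes') per slice, might look like the following sketch, assuming SciPy's binary_fill_holes; the function name is illustrative:

```python
import numpy as np
from scipy.ndimage import binary_fill_holes

def fill_slices_axial(mask):
    """2D hole filling applied slice by slice along the axial (z) direction,
    so only holes enclosed within a single slice are filled."""
    filled = mask.copy()
    for z in range(mask.shape[2]):
        filled[:, :, z] = binary_fill_holes(mask[:, :, z])
    return filled

vol = np.zeros((5, 5, 3), dtype=bool)
vol[1:4, 1:4, 1] = True
vol[2, 2, 1] = False                     # a one-voxel hole in the middle slice
print(fill_slices_axial(vol)[2, 2, 1])   # -> True: the hole was filled
```

Applying binary_fill_holes to the whole 3D array instead would reproduce the less predictable 3D fill discussed above.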
Removing or creating boundaries, erosion
In one embodiment, the initial processing, segmentation, or post-processing may include removing boundaries, or creating gaps or isolated regions between touching masks, for example by eroding the masks at their boundaries, or creating gap regions, e.g., of the width of the printer resolution or according to the printer definition data. As with other processes, the removal of material may be accomplished by changing the mask so that a "print" portion becomes "no print", e.g., so that the gap is defined by material that is not printed (e.g., zero). Printer definition data may be used as input to determine the minimum printer resolution unit, which determines the gap width. This may be done, for example, when features of different masks contact or abut but it is desired that the features not touch, or be separable, after printing: such portions may then be separated manually by manipulating the data. It may also be done when there is actually a gap or physical separation in the imaged object, but the printer resolution could cause the components of the masks to touch or abut, and it is desired that the components not touch, or be separable, after printing. Masks that contact each other can be identified and their material eroded to prevent contact. In one embodiment, an algorithm may determine that there is a gap between different regions of a mask, and thus determine that the gap should be preserved in the printed 3D model.
In some embodiments, a user and/or a known segmentation tool or algorithm may decide which boundary or material to remove, and then an appropriate known algorithm or method may remove or erode the material, e.g., as part of operation 310 or operation 320 described above. For example, if a module designed to segment a particular type of structure, such as a bone, is used, and the module determines that adjacent regions are different bones, the module may use a known erosion algorithm. The user may click on or select the regions to be separated using a graphical user interface, and then a known erosion algorithm may separate the regions. The erosion may be performed as part of post-processing 350.
The mask may have a complex structure such that small gaps exist between adjacent regions. This may also be the case if different masks (e.g., one for the teeth and another for the mandible) are printed from the same model and are very close to, or in contact with, each other. The result may be a 3D model with connected or touching masks, or connected or touching portions of the same mask, in areas where connections should not occur if a separable model is desired (e.g., different parts of the heart). If the 3D printer resolution is known, the minimum separation between the segmented components can be determined. In the final 3D printed object, physically adjacent heart chambers may be separated by a narrow gap where no build material is deposited, so that heart chambers that are physically connected (in real life) can be disassembled by the person examining the final build job.
Layer thickness and printer tolerances may constitute the printer resolution. If it is desired to separate components printed as a batch or print job, the minimum separation between the masks should, in one embodiment, be the printer resolution. If this is known during the POS process, the distance between the segmented components (in the imaged volume: e.g., in the voxel world, the volume, or the imaged world) may be changed. In some embodiments, default parameters (e.g., printer parameters, materials, etc.) may be used without binding to an actual printer or material, which may improve segmentation and/or 3D printing relative to processes that do not consider the printer or material at all.
Fig. 5A and 5B show a 2D representation of a 3D volume 500 having a boundary 510 (shown in fig. 5A) between a tooth 520 and a mandible 530, removed or eroded in fig. 5B to create a gap region, the gap representing a portion of the 3D model with no printed material. The mask representing the volume 500 may have its data changed to convert the boundary data to non-print material or gap data.
For example, if MaskA and MaskB are two adjacent masks (e.g., the teeth and mandible of fig. 5), both masks may be printed with a printer having a spatial resolution of R = Rx, Ry, Rz. In one embodiment, Rx and Ry are printer accuracies or tolerances, and Rz is the layer thickness. Printer precision is typically defined in the X and Y directions. The mask may not be aligned, in terms of rotation and its x/y/z axes, with the printer axes, and may require correction or rotation, for example using a rotation transformation matrix T. For example, the X direction of the printer may be aligned with the Z direction of the mask. Because the erosion function (e.g., the function that removes material on the mask and creates a gap) may use R (from the printer), erroneous parameters could otherwise be obtained. A more precise formulation may be: the printer has a spatial resolution of R' = R'x, R'y, R'z, and R = T(R'), where T is the rotation transformation matrix between the mask and the printer.
To generate a print model separating MaskA and MaskB, a morphological operation may be performed. For example, the morphological operations may employ a mask and create a new mask by convolution with some object (in the specific example below a sphere with radius R), as is well known in the art.
For example, in one embodiment, if defined:
erosion with a spherical element of radius R (denoted herein as "A Θ R"):
MaskA' = MaskA Θ R
MaskB' = MaskB Θ R
MaskA' and MaskB' are the masks closest or most similar to MaskA and MaskB that can be printed with a printer having a tolerance R. For example, given the specification of a particular printer, a minimal amount of material is removed from MaskA and/or MaskB at their boundaries so that they do not physically touch when printed; MaskA' is then the mask most similar to MaskA, for the given printer, with that material removed. As with other alterations discussed herein, this generally means modifying a data structure (e.g., a mask) that defines an object. In one embodiment, operations such as the following may be used:
DistX = bwdist(MaskX) may be defined according to the Matlab function bwdist, which creates a distance map DistX from the edges of MaskX. bwdist computes the Euclidean distance transform of the binary image BW: for each pixel in BW, the distance transform assigns a number that is the distance between that pixel and the nearest nonzero pixel of BW.
R may be the tolerance of the printer (assuming equal tolerances in all directions)
MaskA and MaskB are two masks, which should be separated by a distance R.
To create two masks MaskA 'and MaskB' separated by R, the following operations are performed:
DistA=bwdist(MaskA)
DistB=bwdist(MaskB)
MaskA' = MaskA & DistB > R/2 (& is AND operation)
MaskB'=MaskB&DistA>R/2
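The distance-transform separation above can be mirrored with SciPy, where distance_transform_edt applied to the complement of a mask plays the role of Matlab's bwdist (which measures distance to the nearest nonzero pixel). This is a sketch; the function name separate is illustrative:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def separate(mask_a, mask_b, r):
    """Erode two touching masks so they end up at least r voxels apart,
    mirroring MaskA' = MaskA & (DistB > R/2) from the text."""
    dist_a = distance_transform_edt(~mask_a)  # distance to the nearest MaskA voxel
    dist_b = distance_transform_edt(~mask_b)  # distance to the nearest MaskB voxel
    return mask_a & (dist_b > r / 2), mask_b & (dist_a > r / 2)

a = np.zeros((1, 8), dtype=bool); a[0, :4] = True   # two abutting 1-D masks
b = np.zeros((1, 8), dtype=bool); b[0, 4:] = True
a2, b2 = separate(a, b, r=2.0)
print(a2.sum(), b2.sum())  # -> 3 3: one voxel eroded from each touching face
```

In the tooth/mandible example of Fig. 5, r would be derived from the printer tolerance in the printer definition data 22, after any rotation correction T.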
Creating gaps/refinements
In one embodiment, the segmentation 310 or initial processing 320 may include creating 3D printer input data, which may include, for example, creating gaps or hollow regions in solid areas according to the printer definition data, thinning regions according to the printer definition data, or otherwise removing material. Typically, material is removed to reduce the amount of material used (e.g., to reduce cost, reduce waste, or make printing faster), to reduce weight, or for other reasons. The material may be removed only to an extent that does not compromise strength or stability.
Known algorithms may be used to detect or identify regions that should be made hollow or with voids to identify connected component regions, which may include any blocks or regions of the 3D object that do not contact other blocks or regions of the mask. In one embodiment, any such block or region should be hollow.
If the segmented regions, masks, or areas of the 3D printer input data are solid or completely filled, and solid areas of material are not needed, printing material may be wasted on interior regions. Typically, such unnecessary or redundant filled areas are internal and do not border the external areas of the printed object. If it is known that the model will be printed, already-filled areas can be hollowed or voided (the created space emptied of build material) to reduce the build material to a minimum thickness, for example as follows:
the CompleteMask may become or be converted into a shell mask (e.g., shellMask) with a small boundary thickness (e.g., initThick).
The CompleteMask may be the mask after "regular" segmentation, which it is desired to hollow out. The ShellMask (e.g., a Boolean mask or matrix) is the shell of the CompleteMask, e.g., it is true (e.g., 1) only on the shell of the CompleteMask. The wall thickness of the ShellMask may be InitThick (e.g., 1 mm) at the start of the process. The ShellMask can be dilated, or made wider or thicker, as necessary (e.g., depending on a strength analysis such as StrAn).
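Converting a CompleteMask into a ShellMask can be sketched as the mask minus its morphological erosion. This is a hedged illustration assuming SciPy; here InitThick is expressed in erosion iterations (voxels) rather than mm:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def shell_mask(complete_mask, init_thick=1):
    """Hollow a solid mask into a shell init_thick voxels thick: the mask
    minus its erosion leaves only the boundary layer(s)."""
    core = binary_erosion(complete_mask, iterations=init_thick)
    return complete_mask & ~core

solid = np.zeros((7, 7, 7), dtype=bool)
solid[1:6, 1:6, 1:6] = True              # a solid 5x5x5 block
shell = shell_mask(solid)
print(solid.sum(), shell.sum())  # -> 125 98: the 3x3x3 core was emptied
```

The shell could then be fed to a strength analysis and selectively re-thickened inward, as described below, until it passes StrThres.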
A strength analysis (e.g., StrAn) can then be performed on the ShellMask.
In some embodiments, after thinning and evaluation (e.g., operation 330), if the model is determined to be too weak or unprintable, corrective thickening or dilation may be performed, e.g., using dilation via the DilationFun function, the methods disclosed herein for increasing the thickness of portions of the printer input data, or other methods, but typically only on interior portions of the mask (e.g., portions of the model that do not face the outside world).
One option for knowing what is inside the model is to take the CompleteMask and compute the gradient of its distance transform at the edges.
D = bwdist(~CompleteMask) (e.g., the Matlab bwdist function applied to the logical complement of CompleteMask).
The Matlab gradient function applied to the distance transform, gradient(D), gives the outward direction; the opposite direction is inward.
One embodiment includes removing portions of the inner mask that are not critical to the structure of the model, which reduces the printed material while maintaining the strength of the model. Further, removing material, adding material, or other processing may be performed repeatedly or iteratively in conjunction with quality, printability, or strength evaluations to determine how much material to remove. In this way, a "simulation" can be performed. For example, an aorta filled according to an initial model or mask may have a portion of its interior removed. This may be fed back into operation 330 described above, where the output or mask may be evaluated for strength, for example. This may be repeated iteratively until the model strength falls below a threshold, e.g., there is at least one region where HeatMap < StrThres. In such embodiments, the previous iteration or version of the model (e.g., when all of HeatMap was still >= StrThres) is the model with as little material as possible that still maintains the required strength and structure. The printer definition data may be an input to the gap creation process or to the thickening or thinning process (and thus these processes may be performed according to the printer definition data) by way of the strength or printability analysis, which uses the printer definition data and occurs after the gap creation, thickening, or thinning.
In some embodiments, the iteration in the removal process (e.g., thinning, creating gaps, or creating voids) may include iterating until the evaluation (e.g., operation 330 above) determines that the model or mask is too weak, and then using the previous iteration, which was determined to be printable, as the model. In other embodiments, the iteration may continue until the evaluation determines that the model or mask is too weak, after which a thickening process is used in conjunction with the evaluation process to create a printable mask or model.
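The iterate-until-too-weak strategy can be sketched as follows. `strength`, `remove_interior_layer`, and `STRENGTH_THRESHOLD` are illustrative stand-ins for the patent's heat-map evaluation (operation 330), its removal step, and StrThres; the toy strength metric is not the patent's.

```python
# Sketch: repeatedly remove interior material, evaluating strength after
# each pass, and keep the previous (last-known-printable) iteration once
# the candidate falls below the threshold.

STRENGTH_THRESHOLD = 3  # illustrative stand-in for StrThres

def strength(mask):
    """Toy strength metric: total remaining material."""
    return sum(map(sum, mask))

def remove_interior_layer(mask):
    """Peel one layer of cells fully surrounded by other mask cells."""
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                surrounded = all(
                    0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if surrounded:
                    out[y][x] = 0
    return out

def minimize_material(mask):
    """Iterate until too weak (or no change); return the previous version."""
    current = mask
    while True:
        candidate = remove_interior_layer(current)
        if strength(candidate) < STRENGTH_THRESHOLD or candidate == current:
            return current
        current = candidate

solid = [[1] * 5 for _ in range(5)]   # fully filled 5x5 block
light = minimize_material(solid)      # hollowed shell, still "strong enough"
```

On the 5x5 solid block, one pass hollows out the 3x3 core, leaving a shell; a second pass changes nothing, so the shell is returned as the minimal-material model.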
Other methods may be used.
While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents may occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention. Various embodiments have been presented. Each of these embodiments may, of course, include features from the other embodiments presented, and embodiments not specifically described may include various features described herein.

Claims (12)

1. A method for converting imaging data into three-dimensional printer data, the method comprising:
receiving imaging data descriptive of a three-dimensional volume of a subject;
segmenting the imaging data to produce one or more three-dimensional masks, each three-dimensional mask representing an object to be printed, each three-dimensional mask comprising a three-dimensional matrix indicating locations where material should be printed and locations where material should not be printed; and
creating 3D printer input data describing at least a portion of the three-dimensional volume from the one or more three-dimensional masks using printer definition data describing a particular printer, wherein creating the 3D printer input data comprises creating voids in solid areas according to the printer definition data.
2. The method of claim 1, comprising printing a 3D object using the printer and the 3D printer input data.
3. The method of claim 1, wherein creating 3D printer input data comprises adjusting, according to the printer definition data, one or both of: a resolution of the imaging data; and a resolution of the one or more three-dimensional masks.
4. The method of claim 1, wherein creating 3D printer input data comprises increasing or decreasing a thickness of a portion to be printed according to the printer definition data.
5. The method of claim 1, wherein creating 3D printer input data comprises creating gap regions between regions according to the printer definition data.
6. The method of claim 1, wherein creating 3D printer input data comprises evaluating the one or more three-dimensional masks using the printer definition data, and changing the three-dimensional masks in an iterative manner if the evaluation determines that the one or more masks need to be changed.
7. A system for converting imaging data into three-dimensional printer data, the system comprising:
a memory; and
a processor configured to:
receiving imaging data descriptive of a three-dimensional volume of a subject;
segmenting the imaging data to produce one or more three-dimensional masks, each three-dimensional mask representing an object to be printed, each three-dimensional mask comprising a three-dimensional matrix indicating locations where material should be printed and locations where material should not be printed; and
creating three-dimensional (3D) printer input data describing at least a portion of the three-dimensional volume from the one or more three-dimensional masks using printer definition data describing a particular printer, wherein creating the 3D printer input data comprises creating voids in solid areas according to the printer definition data.
8. The system of claim 7, comprising a 3D printer to print 3D objects using the 3D printer input data.
9. The system of claim 7, wherein creating 3D printer input data comprises adjusting, according to the printer definition data, one or both of: a resolution of the imaging data; and a resolution of the one or more three-dimensional masks.
10. The system of claim 7, wherein creating 3D printer input data comprises increasing or decreasing a thickness of a portion to be printed according to the printer definition data.
11. The system of claim 7, wherein creating 3D printer input data comprises creating gap regions between regions according to the printer definition data.
12. The system of claim 7, wherein creating 3D printer input data comprises evaluating one or more masks using the printer definition data, and changing the masks in an iterative manner if the evaluation determines that the one or more masks need to be changed.
CN201780083026.9A 2016-11-23 2017-11-23 Three-dimensional printing method and system for image segmentation Active CN110191806B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662425948P 2016-11-23 2016-11-23
US62/425,948 2016-11-23
PCT/IL2017/051278 WO2018096536A1 (en) 2016-11-23 2017-11-23 Method and system for three-dimensional print oriented image segmentation

Publications (2)

Publication Number Publication Date
CN110191806A CN110191806A (en) 2019-08-30
CN110191806B true CN110191806B (en) 2022-11-15

Family

ID=62147639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780083026.9A Active CN110191806B (en) 2016-11-23 2017-11-23 Three-dimensional printing method and system for image segmentation

Country Status (6)

Country Link
US (2) US10885407B2 (en)
EP (1) EP3544817A4 (en)
JP (2) JP7339887B2 (en)
CN (1) CN110191806B (en)
IL (1) IL266829B (en)
WO (1) WO2018096536A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10723079B2 (en) * 2017-08-23 2020-07-28 Lawrence Livermore National Security, Llc Fast, efficient direct slicing method for lattice structures
US11584368B2 (en) * 2018-09-24 2023-02-21 Intel Corporation Evaluating risk factors of proposed vehicle maneuvers using external and internal data
US11195418B1 (en) * 2018-10-04 2021-12-07 Zoox, Inc. Trajectory prediction on top-down scenes and associated model
US11169531B2 (en) * 2018-10-04 2021-11-09 Zoox, Inc. Trajectory prediction on top-down scenes
US11127138B2 (en) * 2018-11-20 2021-09-21 Siemens Healthcare Gmbh Automatic detection and quantification of the aorta from medical images
GB2581957B (en) * 2019-02-20 2022-11-09 Imperial College Innovations Ltd Image processing to determine object thickness
CN111805894B (en) * 2020-06-15 2021-08-03 苏州大学 STL model slicing method and device

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5326659A (en) * 1992-03-05 1994-07-05 Regents Of The University Of California Method for making masks
US6259962B1 (en) * 1999-03-01 2001-07-10 Objet Geometries Ltd. Apparatus and method for three dimensional model printing
US6248426B1 (en) * 1999-05-26 2001-06-19 Russell G. Olson Construction paper for constructing a three-dimensional shape from a printable foldable surface
AU2003279508A1 (en) * 2002-11-12 2004-06-03 Objet Geometries Ltd. Three-dimensional object printing
US20060054039A1 (en) * 2002-12-03 2006-03-16 Eliahu Kritchman Process of and apparratus for three-dimensional printing
US7589868B2 (en) * 2002-12-11 2009-09-15 Agfa Graphics Nv Method and apparatus for creating 3D-prints and a 3-D printing system
FR2902218A1 (en) * 2006-06-07 2007-12-14 Gen Electric METHOD FOR PROCESSING TOMOSYNTHESIS IMAGES FOR DETECTION OF RADIOLOGICAL SIGNS
CA2605234C (en) * 2007-10-03 2015-05-05 Semiconductor Insights Inc. A method of local tracing of connectivity and schematic representations produced therefrom
US9870629B2 (en) * 2008-06-20 2018-01-16 New Bis Safe Luxco S.À R.L Methods, apparatus and systems for data visualization and related applications
ES2436421T3 (en) * 2008-12-19 2014-01-02 Agfa Graphics N.V. Method to reduce image quality defects in three-dimensional printing
JP5541652B2 (en) * 2009-03-31 2014-07-09 キヤノン株式会社 Recording apparatus and recording method
JP2011194730A (en) * 2010-03-19 2011-10-06 Fujifilm Corp Printing apparatus
US9529939B2 (en) * 2010-12-16 2016-12-27 Autodesk, Inc. Surfacing algorithm for designing and manufacturing 3D models
JP5408207B2 (en) * 2011-08-25 2014-02-05 コニカミノルタ株式会社 Solid object shaping apparatus and control program
KR102058955B1 (en) * 2011-11-17 2019-12-26 스트라타시스 엘티디. System and method for fabricating a body part model using multi-material additive manufacturing
US10025882B2 (en) * 2012-08-14 2018-07-17 Disney Enterprises, Inc. Partitioning models into 3D-printable components
KR101315032B1 (en) * 2012-11-08 2013-10-08 주식회사 메가젠임플란트 Method and system for generating implant image
EP2925525B1 (en) * 2012-11-29 2018-10-24 Hewlett-Packard Development Company, L.P. Methods for printing with a printhead
US9129435B2 (en) * 2012-12-04 2015-09-08 Fuji Xerox Co., Ltd. Method for creating 3-D models by stitching multiple partial 3-D models
KR101378875B1 (en) * 2013-03-19 2014-03-27 사회복지법인 삼성생명공익재단 Method, apparatus and system for manufacturing phantom customized to patient
JP6075809B2 (en) 2013-07-29 2017-02-08 Necソリューションイノベータ株式会社 3D printer device, 3D printing method, and manufacturing method of three-dimensional structure
US9636871B2 (en) * 2013-08-21 2017-05-02 Microsoft Technology Licensing, Llc Optimizing 3D printing using segmentation or aggregation
US9579850B2 (en) * 2013-09-05 2017-02-28 The Boeing Company Three dimensional printing of parts
US9846949B2 (en) * 2013-11-27 2017-12-19 Hewlett-Packard Development Company, L.P. Determine the shape of a representation of an object
US9588726B2 (en) * 2014-01-23 2017-03-07 Accenture Global Services Limited Three-dimensional object storage, customization, and distribution system
KR20150091945A (en) * 2014-02-04 2015-08-12 삼성메디슨 주식회사 Apparatus and method for processing medical images and computer-readable recording medium
US9669585B2 (en) * 2014-02-11 2017-06-06 Adobe Systems Incorporated Method and apparatus for embedding a 2-dimensional image in a 3-dimensional model
JP6385148B2 (en) 2014-06-10 2018-09-05 キヤノン株式会社 Three-dimensional printing apparatus, information processing apparatus, and control method
TWI531920B (en) * 2014-08-08 2016-05-01 三緯國際立體列印科技股份有限公司 Dividing method of three-dimension object and computer system
JP2016099648A (en) * 2014-11-18 2016-05-30 大日本印刷株式会社 Data reduction device for three-dimensional object molding
KR102332927B1 (en) * 2014-12-05 2021-11-30 주식회사 케이티 Method for recommending 3d printing according to slicing direction in cloud environment, server and computing device
JP6438290B2 (en) 2014-12-12 2018-12-12 キヤノン株式会社 Imaging apparatus and control method thereof
US9840045B2 (en) * 2014-12-31 2017-12-12 X Development Llc Voxel 3D printer
US10421238B2 (en) 2014-12-31 2019-09-24 Makerbot Industries, Llc Detection and use of printer configuration information
US10589466B2 (en) * 2015-02-28 2020-03-17 Xerox Corporation Systems and methods for implementing multi-layer addressable curing of ultraviolet (UV) light curable inks for three dimensional (3D) printed parts and components
GB2536062A (en) 2015-03-06 2016-09-07 Sony Computer Entertainment Inc System, device and method of 3D printing
DE102015204237A1 (en) 2015-03-10 2016-09-15 Siemens Healthcare Gmbh Method and device for producing a model finding object
WO2016152356A1 (en) 2015-03-24 2016-09-29 国立大学法人 筑波大学 Model, manufacturing system, information processing device, manufacturing method, information processing method, program, and recording medium
CN104772905B (en) * 2015-03-25 2017-04-05 北京工业大学 A kind of ADAPTIVE MIXED supporting construction generation method under distance guiding
US9589327B2 (en) * 2015-06-10 2017-03-07 Samsung Electronics Co., Ltd. Apparatus and method for noise reduction in depth images during object segmentation
KR20170029204A (en) * 2015-09-07 2017-03-15 한국전자통신연구원 Method of 3d printing for larger object than output space and apparatus using the same
CN106553345B (en) * 2015-09-29 2018-11-30 珠海赛纳打印科技股份有限公司 A kind of Method of printing and print control unit of more material 3D objects
EP3156926B1 (en) * 2015-10-16 2020-11-25 Accenture Global Services Limited 3-d printing protected by digital rights management
JP6643044B2 (en) * 2015-10-30 2020-02-12 キヤノン株式会社 Information processing apparatus, control method, and program
JP6723733B2 (en) * 2015-11-28 2020-07-15 キヤノン株式会社 Control device, management system, control method, and program
US10761497B2 (en) * 2016-01-14 2020-09-01 Microsoft Technology Licensing, Llc Printing 3D objects with automatic dimensional accuracy compensation
CN105946244B (en) * 2016-06-03 2018-05-18 湖南华曙高科技有限责任公司 Improve method, system and the three-dimensional body manufacturing equipment of the three-dimensional body accuracy of manufacture
CN107506650A (en) * 2016-06-14 2017-12-22 索尼公司 Message processing device and information processing method

Also Published As

Publication number Publication date
IL266829B (en) 2021-04-29
EP3544817A1 (en) 2019-10-02
US20210089848A1 (en) 2021-03-25
JP2020501954A (en) 2020-01-23
CN110191806A (en) 2019-08-30
US20180144219A1 (en) 2018-05-24
WO2018096536A1 (en) 2018-05-31
JP7339887B2 (en) 2023-09-06
EP3544817A4 (en) 2020-07-22
IL266829A (en) 2019-07-31
JP2023116734A (en) 2023-08-22
US10885407B2 (en) 2021-01-05
US11334777B2 (en) 2022-05-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant