WO2022086608A1 - Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects - Google Patents
- Publication number: WO2022086608A1 (PCT/US2021/041359)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual
- dataset
- tangible
- right eye
- left eye
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
- B29C64/393—Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2004—Aligning objects, relative positioning of parts
Definitions
- 3D printing is useful in medicine because it provides the surgeon with a patient-specific tangible anatomical model.
- This patent teaches a method, software and apparatus for improving analysis of a 3D printed object.
- the method comprises: performing a scan of a structure to yield a 3D dataset; selecting a first portion of the 3D dataset for 3D printing; performing 3D printing of the first portion to yield a 3D printed object; performing geo-registration of the 3D dataset to the 3D printed object; determining a second portion of the 3D dataset to be displayed on a head display unit; and displaying, in the head display unit (HDU), an image for the left eye based on the left eye view point, the left eye viewing angle, and the second portion, and an image for the right eye based on the right eye view point, the right eye viewing angle, and the second portion, wherein a user viewing the left eye image and right eye image sees a 3D image.
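The claimed sequence of steps can be sketched as a toy pipeline. This is a hedged illustration only: the function name, the threshold-based portion selection, and the identity pose are placeholders, not the patent's method.

```python
import numpy as np

def run_pipeline(volume):
    """Toy walk-through of the claimed step ordering on a small volume."""
    # Step 1: the scan has already been performed; `volume` is the 3D dataset.
    # Step 2: select a first portion for 3D printing (here: bright voxels).
    first_portion = volume > volume.mean()
    # Step 3: 3D print the first portion (stand-in: we just keep its mask).
    # Step 4: geo-register the dataset to the printed object
    # (stand-in: an identity pose; a real system solves for this transform).
    pose = np.eye(4)
    # Step 5: choose a second portion to display on the HDU (the remainder).
    second_portion = ~first_portion
    return first_portion, second_portion, pose

first, second, pose = run_pipeline(np.arange(27.0).reshape(3, 3, 3))
```

The key invariant is that the printed portion and the virtually displayed portion partition the dataset: no voxel is both printed and overlaid, and every voxel belongs to one or the other.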
- HDU: head display unit
- the rendering process is further described in US Patent 8,384,771, METHOD AND APPARATUS FOR THREE DIMENSIONAL VIEWING OF IMAGES, which is incorporated by reference in its entirety.
- Some embodiments comprise modifying the geo-registered 3D dataset by using a georegistered tool. Some embodiments comprise modifying the geo-registered 3D dataset by using a virtual tool. Some embodiments comprise modifying the geo-registered 3D dataset by performing voxel manipulation. Some embodiments comprise modifying the geo-registered 3D dataset by incorporating a flow visualization feature.
- Some embodiments comprise wherein the second portion consists of a subset of the first portion. Some embodiments comprise performing alteration of the visual representation of the second portion by changing at least one of the group consisting of: color; grayscale; and transparency.
- Some embodiments comprise wherein the HDU displays annotations for the 3D printed object. Some embodiments comprise wherein the first portion represents a movable object and wherein the HDU displays the first portion dynamically.
- Some embodiments comprise wherein a tangible feature on the 3D printed object is marked and the corresponding digital feature on the 3D dataset is localized.
- Some embodiments comprise performing a deformation of the 3D dataset prior to performing 3D printing of the first portion.
- Some embodiments comprise registering a second 3D dataset of the structure wherein the second 3D dataset is acquired at a different time point onto the 3D printed object; and displaying the second 3D dataset on the HDU.
- Some embodiments comprise registering a second 3D dataset of the structure; and displaying the second 3D dataset on the HDU.
- Some embodiments comprise wherein the second 3D dataset comprises an anatomic feature.
- the second 3D dataset comprises a pathologic feature.
- the second 3D dataset comprises surgical hardware registered onto the 3D printed object.
- Some embodiments comprise wherein the second 3D dataset comprises a modified version of the 3D dataset.
- Some embodiments comprise wherein the second 3D dataset comprises a simulated dataset.
- Some embodiments comprise a computer-readable storage device comprising:
- instructions which, when executed by a computer, cause the computer to carry out the steps of: utilizing a 3D dataset, wherein the 3D dataset is acquired via a scan of a structure; selecting a first portion of the 3D dataset for 3D printing; sending a file containing the first portion to a 3D printer to yield a 3D printed object; performing geo-registration of the 3D dataset to the 3D printed object; determining a second portion of the 3D dataset to be displayed on a head display unit; and displaying, in the head display unit (HDU), an image for the left eye based on the left eye view point, the left eye viewing angle, and the second portion, and an image for the right eye based on the right eye view point, the right eye viewing angle, and the second portion, wherein a user viewing the left eye image and right eye image sees a 3D image.
- Some embodiments comprise an apparatus comprising: an IO device; and an image processor in communication with the IO device, the image processor comprising a program stored on computer-readable non-transitory media, the program comprising: instructions that utilize a 3D dataset, wherein the 3D dataset is acquired via a scan of a structure; instructions that select a first portion of the 3D dataset for 3D printing; instructions that send a file containing the first portion to a 3D printer to yield a 3D printed object; instructions that perform geo-registration of the 3D dataset to the 3D printed object; instructions that determine a second portion of the 3D dataset to be displayed on a head display unit; and instructions that cause the head display unit (HDU) to display an image for the left eye based on the left eye view point, the left eye viewing angle, and the second portion, and an image for the right eye based on the right eye view point, the right eye viewing angle, and the second portion, wherein a user viewing the left eye image and right eye image sees a 3D image.
- Figure 1 illustrates the processing strategy for this patent.
- Figure 2 illustrates viewing options.
- Figure 3A illustrates a 3D printed model of the heart.
- Figure 3B illustrates what a user would see when looking at the heart through a head display unit.
- Figure 3C illustrates what a user would see when looking at the heart through a head display unit with the display of a modified volume.
- Figure 4A illustrates a 3D printed model of the heart and a geo-registered tool.
- Figure 4B illustrates what a user would see when looking at the heart through a head display unit.
- Figure 5A illustrates a processing block for rendering voxels of a 3D dataset, which lie outside of the 3D printed object.
- Figure 5B illustrates the 3D printed object and the geo-registered virtual object wherein the geo-registered object lies completely outside of the 3D printed object.
- Figure 6A illustrates a processing block for displaying voxels on the HDU which lie partially inside and partially outside of the 3D printed object.
- Figure 6B illustrates the 3D printed object and the geo-registered virtual object wherein the geo-registered object lies partially inside and partially outside of the 3D printed object.
- Figure 7A illustrates a processing block for displaying voxels on the HDU which lie entirely inside of the 3D printed object.
- Figure 7B illustrates the 3D printed object and the geo-registered virtual object wherein the virtual object lies entirely inside of the 3D printed object.
- Figure 8A illustrates a processing block for performing geo-registration of 4D datasets to a 3D printed object.
- Figure 8B illustrates a 3D printed object and a geo-registered virtual object.
- Figure 9A illustrates viewing the geo-registered tool and 3D printed object without looking through the HDU.
- Figure 9B illustrates viewing the geo-registered tool and 3D printed object through the HDU.
- Figure 10 illustrates a 3D printed object from a first volume co-registered to a 3D printed object from a second volume.
- Figure 11 illustrates registering a range of virtual objects to a 3D printed object.
- the flow diagrams do not depict the syntax of any particular programming language. Rather, the flow diagrams illustrate the functional information one of ordinary skill in the art requires to fabricate circuits or to generate computer software to perform the processing required in accordance with the present invention. It should be noted that many routine program elements, such as initialization of loops and variables and the use of temporary variables, are not shown. It will be appreciated by those of ordinary skill in the art that unless otherwise indicated herein, the particular sequence of steps described is illustrative only and can be varied without departing from the spirit of the invention. Thus, unless otherwise stated the steps described below are unordered meaning that, when possible, the steps can be performed in any convenient or desirable order.
- Figure 1 illustrates the processing strategy for this patent.
- 100 illustrates obtaining a scan of a structure (e.g., CT scan of a knee) and storing the 3D dataset.
- 101 illustrates selecting a portion of the 3D dataset for 3D printing (e.g., femur, tibia, fibula, ACL, PCL).
- 102 illustrates performing 3D printing of the selected portion.
- 103 illustrates performing geo-registration of the 3D dataset to the 3D printed object.
- 104 illustrates determining at least one unselected portion (i.e., not included in the 3D printed object) to be displayed on a head display unit (HDU) (e.g., patella, MCL, LCL complex).
- 105 illustrates displaying, in the head display unit (HDU), an image for the left eye based on the left eye view point, the left eye viewing angle, and the at least one unselected portion, and an image for the right eye based on the right eye view point, the right eye viewing angle, and the at least one unselected portion, wherein a user viewing the left eye image and right eye image sees a 3D image (e.g., see US Patents 8,384,771; 9,349,183; 9,473,766; and 10,795,457).
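Step 105's stereoscopic display requires a distinct view point per eye. A minimal sketch of deriving the two eye view points from a head position and gaze direction follows; the function name and the 64 mm default interpupillary distance are assumptions, not values from the patent.

```python
import numpy as np

def stereo_viewpoints(head_pos, gaze_dir, up, ipd=0.064):
    """Compute left/right eye view points from head position, gaze
    direction, and interpupillary distance (IPD, in metres).

    Each eye is offset half the IPD along the head's right vector;
    both eyes share the viewing angle (gaze direction)."""
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze /= np.linalg.norm(gaze)
    right = np.cross(gaze, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)
    head = np.asarray(head_pos, dtype=float)
    left_eye = head - 0.5 * ipd * right
    right_eye = head + 0.5 * ipd * right
    return left_eye, right_eye, gaze

# Example: head at the origin, looking down -z, y-up.
left, right, gaze = stereo_viewpoints([0, 0, 0], [0, 0, -1], [0, 1, 0])
```

Rendering the second portion once from each of the two returned view points yields the left and right eye images that fuse into the 3D image described above.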
- Figure 2 illustrates viewing options.
- the user can rotate and move the 3D printed object.
- the user can move the head, turn the head and change gaze direction. While this is being performed, the HDU will provide a rendered image of the virtual object, which is geo-registered to the tangible 3D printed object.
- the user can modify the 3D dataset based on a variety of conventional viewing strategies, such as modifying the visual representation (e.g., changing the color and transparency of a segmented structure).
- the user can perform filtering of the 3D dataset, which is described in US Patent 8,384,771, which is incorporated by reference in its entirety. For example, the user can view the virtual image in grayscale and the 3D printed object in color.
- the user can modify the virtual object through a range of advanced viewing strategies. Ths user can implement a double windowing technique via US Patent 10,586,400, PROCESSING 3D MEDICAL IMAGES TO ENHANCE VISUALIZATION, which is incorporated by reference in its entirety.
- the user can implement an interaction of the 3D dataset with geo-registered tools, as described in US Patent 10,712,837, USING GEO-REGISTERED TOOLS TO MANIPULATE THREE-DIMENSIONAL MEDICAL IMAGES, which is incorporated by reference in its entirety. Examples of geo-registered tools include, but are not limited to, the following: knife; scissors; platform; forceps; staples; and a wide range of other types of surgical tools.
- the user can perform interaction of the 3D dataset with virtual tools, as described in PCT/US19/47891, A VIRTUAL TOOL KIT FOR 3D IMAGING, which is incorporated by reference in its entirety.
- the user can perform "ghost imaging" per US Patent Application 16/010,925, INTERACTIVE PLACEMENT OF A 3D DIGITAL REPRESENTATION OF A SURGICAL DEVICE OR ANATOMIC FEATURE INTO A 3D RADIOLOGIC IMAGE FOR PRE-OPERATIVE PLANNING, which is incorporated by reference in its entirety.
- the user can insert flow visualization features, as described in US Patent Applications 16/506,073, A METHOD FOR ILLUSTRATING DIRECTION OF BLOOD FLOW VIA POINTERS, and 16/779,658, 3D IMAGING OF VIRTUAL FLUIDS AND VIRTUAL SOUNDS, which are incorporated by reference in their entirety.
- the user can perform voxel manipulation strategies, per US Patent Application 16/195,251, INTERACTIVE VOXEL MANIPULATION IN VOLUMETRIC MEDICAL IMAGING FOR VIRTUAL MOTION, DEFORMABLE TISSUE, AND VIRTUAL RADIOLOGICAL DISSECTION, which is incorporated by reference in its entirety.
- Figure 3A illustrates a 3D printed model of the heart.
- 300 illustrates the 3D printed model of the heart.
- 301 illustrates the aortic valve.
- 302 illustrates the pulmonary valve.
- Figure 3B illustrates what a user would see when looking at the heart through a head display unit.
- 300 illustrates the 3D printed model of the heart, which is the real world, tangible object.
- 303 illustrates the pulmonary artery, which is a virtual object seen only when looking through the head display unit.
- 304 illustrates the aorta, which is a virtual object seen only when looking through the head display unit.
- Figure 3C illustrates what a user would see when looking at the heart through a head display unit with the display of a modified volume.
- 300 illustrates the 3D printed model of the heart, which is the real world, tangible object.
- 305 illustrates a geo-registered tool of a knife, which contains a non-cutting portion 306 and a cutting portion 307.
- the geo-registered knife is described further in US Patent 10,712,837, USING GEO-REGISTERED TOOLS TO MANIPULATE THREE-DIMENSIONAL MEDICAL IMAGES, which is incorporated by reference in its entirety. This object is held in one's hand and is used to modify the virtual image, which is geo-registered to the 3D printed object.
- 308 illustrates the shifted position of the pulmonary artery, which is a virtual object seen only when looking through the head display unit. Note that it is shifted due to the fact that it was virtually cut by the geo-registered tool.
- 309 illustrates the shifted position of the aorta, which is a virtual object seen only when looking through the head display unit. Note that it is shifted due to the fact that it was virtually cut by the geo-registered tool.
- Figure 4A illustrates a 3D printed model of the heart and a geo-registered tool. 400 illustrates the 3D printed model of the heart.
- 401 illustrates the aortic valve.
- 402 illustrates the pulmonary valve.
- 403 illustrates the geo-registered tool.
- Figure 4B illustrates what a user would see when looking at the heart through a head display unit.
- 400 illustrates the 3D printed model of the heart, which is the real world, tangible object.
- 401 illustrates the aortic valve.
- 402 illustrates the pulmonary valve.
- 403 illustrates the geo-registered tool of a pointer, which is described further in US Patent 10,712,837, USING GEO-REGISTERED TOOLS TO MANIPULATE THREE-DIMENSIONAL MEDICAL IMAGES, which is incorporated by reference in its entirety.
- This object is held in one's hand and is used to modify the virtual image, which is geo-registered to the 3D printed object.
- 404 illustrates a virtual hole, which allows peering in through the surface of the 3D printed object to see deeper structures. Note that the deeper structures can only be seen when looking through the augmented reality head display unit (HDU). This is further described in PCT/US19/47891, A VIRTUAL TOOL KIT FOR 3D IMAGING, and US Patent Application 15/949,202, SMART OPERATING ROOM EQUIPPED WITH SMART SURGICAL DEVICES, both of which are incorporated by reference in their entirety.
- 405 illustrates a virtual image of the mitral valve.
- the HDU can display virtual images located inside or outside of the 3D printed object wherein the virtual images are geo-registered to the 3D printed object.
- the virtual objects move along with the 3D printed object, just as if they were part of the object.
- the user has the ability to manipulate the virtual objects during the course of inspection of the geo-registered object. Doing this would be useful because the user would be able to better understand both the anatomy inside the 3D printed object and the anatomy outside the 3D printed object.
- Figure 5A illustrates a processing block for rendering voxels of a 3D dataset, which lie outside of the 3D printed object.
- 500 illustrates a processing block for performing georegistration such that a 3D printed object and the 3D dataset are geo-registered.
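Geo-registration of the dataset to the printed object (block 500) can be done, for example, by marking paired fiducial points on both and solving for the rigid transform between them. The Kabsch/Procrustes least-squares solution sketched below is one standard way to do this, not necessarily the patent's method; the function name is an assumption.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch algorithm): find R, t
    such that dst ~= R @ src + t, given paired fiducial points in the
    3D dataset (src) and on the 3D printed object (dst)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: recover a known 90-degree rotation about z plus a shift.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
dst = src @ R_true.T + t_true
R_est, t_est = rigid_register(src, dst)
```

At least three non-collinear fiducial pairs are needed for the rotation to be uniquely determined.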
- the 3D dataset could be a CT scan of the abdomen.
- the 3D printed object could be a kidney mass.
- 501 illustrates a processing block for excluding voxels in the 3D dataset that correspond to the 3D printed object. To continue this example, the kidney mass has already been printed. Therefore, those voxels in the 3D dataset corresponding to the kidney mass can be excluded.
- 502 illustrates a processing block for displaying the remaining portion (or a subset of the remaining portion) on the HDU.
- the remaining structures are displayed.
- the displayed structure could be the spleen, which lies completely outside and away from the kidney.
- Figure 5B illustrates the 3D printed object and the geo-registered virtual object wherein the geo-registered object lies completely outside of the 3D printed object.
- 503 illustrates a left kidney mass, which is the 3D printed object.
- 504 illustrates a spleen, which is the portion of the 3D dataset displayed on the HDU. Note that the spleen is completely outside and away from the kidney.
- the amount of transparency could be adjusted per user preference.
- the spleen could be rendered partially transparent.
- the spleen could be rendered nontransparent. This would provide the user who is studying a 3D printed object with context of the adjacent organs, which is therefore useful.
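Blocks 501-502 above (excluding the printed voxels and displaying the remainder) can be sketched with a label mask. This is a numpy sketch under stated assumptions: the label-volume representation and the function name are illustrative, not from the patent.

```python
import numpy as np

def virtual_portion(volume, labels, printed_label):
    """Return a copy of the 3D dataset with the voxels that correspond
    to the already-printed structure removed (set to 0), so that only
    the remaining structures are rendered as virtual objects on the HDU."""
    out = volume.copy()
    out[labels == printed_label] = 0   # block 501: exclude printed voxels
    return out                         # block 502: display this remainder

# Hypothetical example: label 1 marks the printed kidney mass.
volume = np.ones((2, 2, 2))
labels = np.zeros((2, 2, 2), dtype=int)
labels[0] = 1                          # first slab was 3D printed
remainder = virtual_portion(volume, labels, printed_label=1)
```

Because the dataset is geo-registered to the printed object, the surviving voxels render exactly where they belong relative to the object in the user's hand.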
- Figure 6A illustrates a processing block for displaying voxels on the HDU which lie partially inside and partially outside of the 3D printed object.
- 600 illustrates a processing block for performing geo-registration such that a 3D printed object and the 3D dataset are georegistered.
- the 3D dataset could be a CT scan of the chest.
- the 3D printed object could be a lung mass.
- 601 illustrates a processing block for keeping at least some voxels in the 3D dataset that correspond to the 3D printed object.
- the lung has already been printed and it is partially transparent (e.g., tangible clear material).
- 602 illustrates a processing block for displaying a selected group of voxels on the HDU.
- the lung mass is displayed on the HDU. Note that in this example a portion of the lung mass is located inside of the 3D printed object and a portion of the lung mass is located outside of the 3D printed object. This can occur if a lung cancer originates inside of the lung and grows outside of the lung and into the chest wall.
- Figure 6B illustrates the 3D printed object and the geo-registered virtual object wherein the geo-registered object lies partially inside and partially outside of the 3D printed object.
- 603 illustrates a lung, which is a 3D printed transparent object.
- 604 illustrates a lung mass, which is the virtual object displayed in the HDU.
- Another innovative idea is to print a generic transparent organ (e.g., a model organ) and then have patient-specific pathology be displayed on the HDU. This would save time by not having to print a patient specific organ every time. It would also improve communications between physicians and patients. It would also help in preoperative planning. The amount of transparency could be adjusted (both for the 3D printed object and the 3D virtual object displayed on the HDU) per user preference.
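The transparency adjustment described above amounts to simple alpha compositing of the virtual overlay against the printed object behind it. A minimal sketch follows; the function name and RGB convention are assumptions, not from the patent.

```python
def blend(printed_rgb, virtual_rgb, alpha):
    """Composite a virtual overlay colour over the printed object's
    colour. alpha=0.0 shows only the printed object; alpha=1.0 shows
    only the virtual object; values between mix the two."""
    return tuple((1.0 - alpha) * p + alpha * v
                 for p, v in zip(printed_rgb, virtual_rgb))

# A half-transparent blue overlay on a red printed surface:
mixed = blend((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5)
```

Exposing `alpha` as a user preference, separately for the printed object's material and the HDU overlay, gives the adjustable transparency described above.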
- Figure 7A illustrates a processing block for displaying voxels on the HDU which lie entirely inside of the 3D printed object.
- 700 illustrates a processing block for performing georegistration such that a 3D printed object and the 3D dataset are geo-registered.
- the 3D dataset could be a CT scan of the chest.
- the 3D printed object could be a lung mass.
- 701 illustrates a processing block for displaying a subset of voxels in the 3D dataset that lie within the 3D printed object.
- the lung has already been printed and it is partially transparent (e.g., tangible clear material).
- 702 illustrates a processing block for displaying a selected subset of voxels on the HDU. To continue this example, the lung nodule is displayed on the HDU.
- Figure 7B illustrates the 3D printed object and the geo-registered virtual object wherein the virtual object lies entirely inside of the 3D printed object.
- 703 illustrates a lung, which is a 3D printed transparent object.
- 704 illustrates a lung nodule, which is the virtual object displayed only on the HDU.
- Other examples include performing 3D printing of a femur and adding on virtual objects (e.g., fracture, tumor, infection) visualized on the HDU.
- Figure 8A illustrates a processing block for performing geo-registration of 4D datasets to a 3D printed object.
- 800 illustrates a processing block of performing 3D printing of a structure during one time point of a 4D dataset.
- 801 illustrates a processing block of performing geo-registration of the 4D dataset with the 3D printed object. Note that the preferred embodiment is to perform geo-registration of the virtual object at the same phase that was printed.
- 802 illustrates a processing block of displaying the remainder of the time points of the structure on the extended reality display.
- Figure 8B illustrates a 3D printed object and a geo-registered virtual object.
- 803 illustrates the 3D printed object, which is a heart (e.g. from a CT Scan).
- the tangible 3D printed object will be printed in the phase where the object (e.g., heart) is at its smallest size.
- 804 illustrates the pulmonic valve.
- 805 illustrates the aorta.
- 806 illustrates the outer contour of the virtual object of the heart in an enlarged phase, which is displayed on the HDU.
- 807 illustrates the outer contour of the virtual object of the heart in a maximally enlarged phase (end diastole), which is displayed on the HDU.
- the user can hold (via hand or via tool) the 3D printed heart and watch it enlarge over multiple heart beats on the HDU.
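Cycling the virtual contours through the 4D phases while the printed (smallest) phase stays fixed in the user's hand can be sketched as a phase-index generator. This is a toy sketch; the function and parameter names are assumptions.

```python
def phase_sequence(n_phases, printed_phase, beats=1):
    """Yield the order of 4D phase indices to overlay on the HDU while
    the user holds the object printed at `printed_phase` (typically the
    smallest phase). Each beat starts at the printed phase and wraps
    around through the remaining phases."""
    order = list(range(printed_phase, n_phases)) + list(range(printed_phase))
    for _ in range(beats):
        for phase in order:
            yield phase

# Example: 4 cardiac phases, printed at phase 1, two heart beats.
two_beats = list(phase_sequence(4, printed_phase=1, beats=2))
```

At each yielded index the HDU would render that phase's geo-registered contour (e.g., 806 and 807 above), so the held heart appears to beat.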
- Figure 9A illustrates viewing the geo-registered tool and 3D printed object without looking through the HDU.
- 900 illustrates the 3D printed object, which is the heart.
- 901 illustrates the aortic valve portion of the 3D printed object. Note that the aorta was not printed.
- 902 illustrates the pulmonic valve portion of the 3D printed object. Note that the pulmonary artery was not printed.
- 903 illustrates the geo-registered tool, which in this case is a geo-registered platform.
- Figure 9B illustrates viewing the geo-registered tool and 3D printed object through the HDU.
- 904 illustrates the HDU with a left eye image and right eye image.
- 905 illustrates the left eye view of the 3D printed object of the heart (e.g., printed from a CT scan) as it would be seen through the transparent HDU. Note that the heart is seen with and without the HDU because it is an actual tangible object.
- 906 illustrates the left eye view of the pulmonary artery, which is a virtual object geo-registered to the 3D printed object. Note that the pulmonary artery virtual object is only visible when looking through the HDU.
- 907 illustrates the left eye view of the aorta, which is a virtual object geo-registered to the 3D printed object.
- the aorta virtual object is only visible when looking through the HDU.
- 908 illustrates the left eye view of a georegistered tool, which in this case is a platform.
- 909 illustrates the right eye view of the 3D printed object of the heart (e.g., printed from a CT scan) as it would be seen through the transparent HDU. Note that the heart is seen with and without the HDU because it is an actual tangible object.
- 910 illustrates the right eye view of the pulmonary artery, which is a virtual object geo-registered to the 3D printed object. Note that the pulmonary artery virtual object is only visible when looking through the HDU.
- 911 illustrates the right eye view of the aorta, which is a virtual object geo-registered to the 3D printed object.
- aorta virtual object is only visible when looking through the HDU.
- 912 illustrates the right eye view of a georegistered tool, which in this case is a platform.
- this invention could work with a wide range of HDUs (e.g., HoloLens, Magic Leap and others).
- for new displays (e.g., contact lenses), this registration technique is equally applicable.
- Figure 10 illustrates a 3D printed object from a first volume co-registered to a 3D printed object from a second volume.
- 1000 illustrates the tangible 3D printed object, which in this example is a breast tumor obtained from a first volume (e.g., after treatment with chemotherapy).
- 1001 illustrates the intangible virtual object of the breast tumor obtained from a second volume (e.g., prior to treatment with chemotherapy).
- the preferred embodiment would be to display the virtual object partially transparent. This is useful because it allows a user to hold a 3D printed object of the patient specific imaging finding (e.g., patient’s own breast cancer) in their hand and visualize on the HDU what it looked like at a different time point.
- this example is of a single imaging finding at two different time points. Multiple time points can be used. Additionally, datasets from different patients can be used. For example, a femur can be printed from a first patient. Then, pathology (e.g., fracture) from a different patient can be registered to and superimposed on the 3D printed object, such as is further explained in the next figure.
- Figure 11 illustrates registering a range of virtual objects to a 3D printed object.
- the purpose of this figure is to teach that a variety of datasets and virtual objects can be used to improve image analysis of the 3D printed object.
- a first example is registering an anatomic feature from a first 3D dataset to the 3D printed object made from a second dataset.
- An example would be registering an anatomic feature of a tendon (e.g., Achilles tendon) from a first patient onto a 3D printed calcaneus from a second patient.
- US Patent Application 16/010,925 INTERACTIVE PLACEMENT OF A 3D DIGITAL REPRESENTATION OF A SURGICAL DEVICE OR ANATOMIC FEATURE INTO A 3D RADIOLOGIC IMAGE FOR PRE-OPERATIVE PLANNING.
- a second example is registering a pathologic feature from a first 3D dataset to the 3D printed object made from a second dataset.
- An example would be registering a pathologic feature of a ruptured tendon (e.g., ruptured Achilles tendon) from a first patient onto a 3D printed calcaneus from a second patient.
- US Patent Application 16/010,925 INTERACTIVE PLACEMENT OF A 3D DIGITAL REPRESENTATION OF A SURGICAL DEVICE OR ANATOMIC FEATURE INTO A 3D RADIOLOGIC IMAGE FOR PREOPERATIVE PLANNING.
- a third example is registering surgical hardware from a first 3D dataset to the 3D printed object made from a second dataset.
- An example would be registering a side plate and screws from a first 3D dataset onto a 3D printed femur from a second patient.
- a fourth example is generating a modified version of the original 3D dataset and registering the modified version to the 3D printed object made from the original dataset.
- An example would be printing a 3D object (e.g., kidney) from a patient's CT scan of the abdomen, and then performing image processing of the CT scan of the abdomen such that the dataset is modified.
- Examples of image processing include techniques described in: US Patent Application 16/752,691, IMPROVING IMAGE QUALITY BY INCORPORATING DATA UNIT ASSURANCE MARKERS, which is incorporated by reference in its entirety and US Patent Application 16/195,251, INTERACTIVE VOXEL MANIPULATION IN VOLUMETRIC MEDICAL IMAGING FOR VIRTUAL MOTION, DEFORMABLE TISSUE, AND VIRTUAL RADIOLOGICAL DISSECTION, which is incorporated by reference in its entirety.
- a fifth example is generating a simulated dataset and then performing registration of the simulated dataset onto a 3D printed object.
- a simulated dataset can be generated by methods described in US Patent Application 16/703,629, RADIOLOGIST-ASSISTED MACHINE LEARNING WITH INTERACTIVE, VOLUME SUBTENDING 3D CURSOR, which is incorporated by reference in its entirety and US Patent Application 16/736,731, RADIOLOGIST-ASSISTED MACHINE LEARNING WITH INTERACTIVE, VOLUME SUBTENDING 3D CURSOR, which is incorporated by reference in its entirety.
Abstract
This patent discloses a method and apparatus for improving analysis of a 3D printed object. A first portion of a 3D dataset is selected for 3D printing. The 3D dataset is geo-registered to the 3D printed object. A second portion of the 3D dataset is displayed on a head display unit (HDU). Additional advanced features are also disclosed. An example would be performing 3D printing of a heart from a CT scan and displaying the aorta and inferior vena cava on the HDU, geo-registered to the 3D printed heart.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/075,799 US11090873B1 (en) | 2020-02-02 | 2020-10-21 | Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects |
US17/075,799 | 2020-10-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022086608A1 true WO2022086608A1 (fr) | 2022-04-28 |
Family
ID=81290026
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/041359 WO2022086608A1 (fr) | 2020-10-21 | 2021-07-13 | Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022086608A1 (fr) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140307067A1 (en) * | 2006-12-28 | 2014-10-16 | David Byron Douglas | Method and apparatus for three dimensional viewing of images |
US20180168730A1 (en) * | 2015-03-25 | 2018-06-21 | Zaxis Labs | System and method for medical procedure planning |
2021
- 2021-07-13 WO PCT/US2021/041359 patent/WO2022086608A1/fr active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140307067A1 (en) * | 2006-12-28 | 2014-10-16 | David Byron Douglas | Method and apparatus for three dimensional viewing of images |
US20180168730A1 (en) * | 2015-03-25 | 2018-06-21 | Zaxis Labs | System and method for medical procedure planning |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9956054B2 (en) | Dynamic minimally invasive surgical-aware assistant | |
Kersten-Oertel et al. | The state of the art of visualization in mixed reality image guided surgery | |
Abhari et al. | Training for planning tumour resection: augmented reality and human factors | |
Bichlmeier et al. | Contextual anatomic mimesis hybrid in-situ visualization method for improving multi-sensory depth perception in medical augmented reality | |
US9848186B2 (en) | Graphical system with enhanced stereopsis | |
Vidal et al. | Principles and applications of computer graphics in medicine | |
JP2021523784A (ja) | Registration of a patient's image data with the patient's actual view using an optical code affixed to the patient | |
US20210072844A1 (en) | Interactive 3d cursor for use in medical imaging | |
Barcali et al. | Augmented reality in surgery: a scoping review | |
CN101809628A (zh) | Visualization of anatomical data | |
US20180310907A1 (en) | Simulated Fluoroscopy Images with 3D Context | |
EP2803044B1 (fr) | Appareil de traitement d'images | |
US11395701B1 (en) | Method of selecting a specific surgical device for preoperative planning | |
Kockro et al. | Image-guided neurosurgery with 3-dimensional multimodal imaging data on a stereoscopic monitor | |
US20220346888A1 (en) | Device and system for multidimensional data visualization and interaction in an augmented reality virtual reality or mixed reality environment | |
Halabi et al. | Virtual and augmented reality in surgery | |
Tietjen et al. | Enhancing slice-based visualizations of medical volume data. | |
US11090873B1 (en) | Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects | |
US20200261157A1 (en) | Aortic-Valve Replacement Annotation Using 3D Images | |
US20230054394A1 (en) | Device and system for multidimensional data visualization and interaction in an augmented reality virtual reality or mixed reality image guided surgery | |
Glombitza et al. | Virtual surgery in a (tele-) radiology framework | |
WO2022086608A1 (fr) | Optimizing analysis of a 3D printed object through integration of geo-registered virtual objects | |
Schenkenfelder et al. | Elastic registration of abdominal MRI scans and RGB-D images to improve surgical planning of breast reconstruction | |
Meehan et al. | Virtual 3D planning and guidance of mandibular distraction osteogenesis | |
Preim et al. | 3D-Interaction Techniques for Planning of Oncologic Soft Tissue Operations. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21883474; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21883474; Country of ref document: EP; Kind code of ref document: A1 |