US11382603B2 - System and methods for performing biomechanically driven image registration using ultrasound elastography - Google Patents
- Publication number
- US11382603B2 (application US15/972,324)
- Authority
- US
- United States
- Prior art keywords
- operative
- volume
- intra
- target organ
- ultrasound
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5246—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/485—Diagnostic techniques involving measuring strain or elastic properties
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G06T7/344—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/37—Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2063—Acoustic tracking systems, e.g. using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
- A61B5/4222—Evaluating particular parts, e.g. particular organs
- A61B5/4244—Evaluating particular parts, e.g. particular organs liver
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/43—Detecting, measuring or recording for evaluating the reproductive systems
- A61B5/4375—Detecting, measuring or recording for evaluating the reproductive systems for evaluating the male reproductive system
- A61B5/4381—Prostate evaluation or disorder diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52036—Details of receivers using analysis of echo signal for target characterisation
- G01S7/52042—Details of receivers using analysis of echo signal for target characterisation determining elastic properties of the propagation medium or of the reflective target
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10104—Positron emission tomography [PET]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
- G06T2207/10136—3D ultrasound image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30056—Liver; Hepatic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30081—Prostate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the present invention relates to registration of pre-operative volumetric medical images to intra-operative ultrasound images, and more particularly, to registering pre-operative volumetric medical images and intra-operative ultrasound images using a biomechanically driven image registration based on both B-mode ultrasound and ultrasound elastography.
- Ultrasound is used to guide minimally invasive surgical procedures.
- ultrasound can be used to guide interventions such as prostate needle biopsies and staging for liver lesion resection.
- information provided by ultrasound can be enhanced by fusing the ultrasound with high resolution pre-operative medical images, such as computed tomography (CT) or magnetic resonance (MR).
- ultrasound guided prostate needle biopsies can be improved by intra-operative registration with MR findings of suspected cancer nodules.
- ultrasound guided staging for liver lesion resection can be improved by fusion with pre-operative CT images.
- In order for pre-operative CT or MR to be effectively used to supplement ultrasound guidance, a quantitative registration transformation must be computed to fuse the images into a common coordinate space. In clinical practice, the images are often merely aligned mentally by the clinician after consulting both images visually. This can lead to inaccuracies and high inter-user and intra-user variability. Accordingly, computer-based techniques to compute the registration between pre-operative CT or MR and intra-operative ultrasound are advantageous.
- computer-based registration of pre-operative volumes (such as CT or MR) to intra-operative ultrasound volumes is a challenging task due to organ deformation which occurs as a result of soft tissue motion, for example from the application of the ultrasound probe, such that correspondence between features in the pre-operative volumes and features in the ultrasound volumes can be difficult to achieve.
- the present invention provides a method and system for registration of pre-operative volumetric medical images to intra-operative ultrasound images.
- Embodiments of the present invention register pre-operative volumetric medical images and intra-operative ultrasound images using a biomechanically driven image registration based on both B-mode ultrasound and ultrasound elastography.
- a method for registering a pre-operative 3D medical image volume of a target organ to intra-operative ultrasound images comprises: acquiring an intra-operative 3D B-mode ultrasound volume and an intra-operative 3D ultrasound elastography volume; determining patient-specific boundary conditions for a biomechanical tissue model of a target organ using the intra-operative 3D B-mode volume; determining patient-specific material properties for the biomechanical tissue model of the target organ using the 3D ultrasound elastography volume; and deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model with the patient-specific material properties with the deformation of the target organ in the pre-operative 3D medical image volume constrained by the patient-specific boundary conditions.
- acquiring an intra-operative 3D B-mode ultrasound volume and an intra-operative 3D ultrasound elastography volume comprises: simultaneously acquiring a stream of 2D B-mode ultrasound images and a stream of 2D ultrasound elastography images by interleaving frames acquired using a B-mode ultrasound acquisition protocol and frames acquired using an ultrasound elastography acquisition protocol during a sweep of the target organ with an ultrasound probe; generating the intra-operative 3D B-mode ultrasound volume by compounding the stream of 2D B-mode ultrasound images; and generating the intra-operative 3D ultrasound elastography volume by compounding the stream of 2D ultrasound elastography images.
- the method further comprises tracking the ultrasound probe in 3D space using a tracking system, resulting in tracking information associated with each frame acquired by the ultrasound probe, wherein the intra-operative 3D B-mode ultrasound volume is generated by compounding the stream of 2D B-mode ultrasound images using the tracking information associated with each frame of the stream of 2D B-mode ultrasound images and the intra-operative 3D ultrasound elastography volume is generated by compounding the stream of 2D ultrasound elastography images using the tracking information associated with each frame of the stream of 2D ultrasound elastography images.
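The compounding of tracked 2D frames into a 3D volume described above can be sketched as follows. The patent does not give an implementation, so this is only a minimal illustration assuming nearest-voxel splatting with a 4×4 tracking transform per frame; the function name `compound_frames` and the averaging scheme are assumptions, not the claimed method:

```python
import numpy as np

def compound_frames(frames, poses, grid_shape, spacing):
    """Compound tracked 2D frames into a 3D volume by averaging the
    samples that fall into each voxel (nearest-voxel splatting)."""
    vol = np.zeros(grid_shape)
    count = np.zeros(grid_shape)
    for frame, pose in zip(frames, poses):
        h, w = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Homogeneous in-plane pixel coordinates (z = 0 in the probe frame).
        pts = np.stack([xs.ravel() * spacing, ys.ravel() * spacing,
                        np.zeros(h * w), np.ones(h * w)])
        world = pose @ pts  # 4x4 tracking transform for this frame
        idx = np.round(world[:3] / spacing).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)[:, None]), axis=0)
        np.add.at(vol, tuple(idx[:, ok]), frame.ravel()[ok])
        np.add.at(count, tuple(idx[:, ok]), 1)
    # Average overlapping samples; voxels with no samples stay zero.
    return np.where(count > 0, vol / np.maximum(count, 1), 0.0)
```

In practice the same routine would be run twice, once on the B-mode frame stream and once on the elastography frame stream, using the shared tracking information.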
- acquiring an intra-operative 3D B-mode ultrasound volume and an intra-operative 3D ultrasound elastography volume comprises acquiring the intra-operative 3D B-mode ultrasound volume and the intra-operative ultrasound elastography volume using a 3D ultrasound probe.
- the biomechanical tissue model is constructed on a pre-operative tissue domain defined by a segmentation of the target organ in the pre-operative medical image volume.
- determining patient-specific boundary conditions for a biomechanical tissue model of a target organ using the intra-operative 3D B-mode volume comprises: establishing surface correspondences on a surface of the target organ segmented in the pre-operative medical image volume and a surface of the target organ segmented in the intra-operative 3D B-mode ultrasound volume; and designating boundary conditions for the biomechanical tissue model that constrain the deformation of the target organ in the pre-operative 3D medical image volume based on displacements between the surface correspondences on the surface of the target organ segmented in the pre-operative medical image volume and the surface of the target organ segmented in the intra-operative 3D B-mode ultrasound volume.
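One simple way to establish the surface correspondences described above is a closest-point search between the two segmented surfaces. The claims do not prescribe a correspondence scheme, so the nearest-neighbor rule below (and the function name `surface_displacements`) is only an illustrative assumption:

```python
import numpy as np

def surface_displacements(preop_surface, intraop_surface):
    """For each pre-operative surface point, find the closest point on the
    intra-operative (B-mode) surface; the difference vectors can then serve
    as displacement boundary conditions for the biomechanical model."""
    # Pairwise distances between the two surface point clouds.
    d = np.linalg.norm(preop_surface[:, None, :] - intraop_surface[None, :, :],
                       axis=2)
    nearest = intraop_surface[np.argmin(d, axis=1)]
    return nearest - preop_surface  # per-point displacement vectors
```

More robust correspondence schemes (e.g., iterative closest point with outlier rejection) would typically be preferred on real segmentations.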
- determining patient-specific material properties for the biomechanical tissue model of the target organ using the 3D ultrasound elastography volume comprises: (a) warping the pre-operative tissue domain using the biomechanical tissue model with a current tissue material property distribution for the pre-operative tissue domain based on the patient-specific boundary conditions, resulting in a model-deformed distribution of the tissue material property; (b) calculating an incremental update to the tissue material property distribution based on a difference between the model-deformed distribution of the tissue material property and a distribution of the tissue material property in the intra-operative 3D ultrasound elastography volume; and (c) updating the current tissue material property distribution for the pre-operative tissue domain based on the calculated incremental update to the tissue material property distribution.
- determining patient-specific material properties for the biomechanical tissue model of the target organ using the 3D ultrasound elastography volume further comprises: repeating steps (a), (b), and (c) for a plurality of iterations.
- repeating steps (a), (b), and (c) for a plurality of iterations comprises repeating steps (a), (b), and (c) for a plurality of iterations until the difference between the model-deformed distribution of the tissue material property and the distribution of the tissue material property in the intra-operative 3D ultrasound elastography volume is less than a threshold value.
- repeating steps (a), (b), and (c) for a plurality of iterations comprises repeating steps (a), (b), and (c) for a plurality of iterations until a predetermined maximum number of iterations is reached.
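Steps (a)-(c) with both stopping criteria can be sketched as a fixed-point iteration. The warp operator, the step size, and the function name `estimate_stiffness` are hypothetical stand-ins for the biomechanical model described in the claims:

```python
import numpy as np

def estimate_stiffness(target_use_stiffness, warp_model, init_stiffness,
                       step=0.5, tol=1e-3, max_iters=100):
    """Iteratively refine a stiffness distribution: (a) warp it with the
    biomechanical model, (b) compare against the elastography-derived
    distribution, (c) apply an incremental update.  Stops when the
    difference falls below a threshold or a maximum iteration count is
    reached, matching the two termination criteria in the claims."""
    stiffness = init_stiffness.astype(float)
    for _ in range(max_iters):
        deformed = warp_model(stiffness)            # step (a)
        residual = target_use_stiffness - deformed  # step (b)
        if np.max(np.abs(residual)) < tol:
            break
        stiffness = stiffness + step * residual     # step (c)
    return stiffness
```

With an identity warp this reduces to damped fixed-point iteration toward the elastography-derived distribution; the real warp operator would come from the finite-element tissue model.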
- the tissue material property distribution is a tissue stiffness distribution.
- deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model with the patient-specific material properties with the deformation of the target organ in the pre-operative 3D medical image volume constrained by the patient-specific boundary conditions comprises: deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model with the current tissue material property distribution for the pre-operative tissue domain resulting from a final iteration of the plurality of iterations with the deformation of the target organ in the pre-operative 3D medical image volume constrained by the patient-specific boundary conditions.
- the method further comprises: performing an initial rigid alignment of the pre-operative medical image volume and the intra-operative 3D B-mode ultrasound volume prior to determining the patient-specific boundary conditions, determining the patient-specific material properties, and deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model.
- the method further comprises: overlaying the deformed target organ in the pre-operative 3D medical image volume on at least one of the intra-operative 3D B-mode ultrasound volume or one or more intra-operative 2D B-mode ultrasound images.
- the method further comprises: deforming a map of target points defined for the target organ in the pre-operative medical image volume using a non-rigid transformation corresponding to the deformation of the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model; and overlaying the deformed map of target points defined for the target organ on at least one of the intra-operative 3D B-mode ultrasound volume or one or more intra-operative 2D B-mode ultrasound images.
- the method further comprises: mapping the B-mode ultrasound image to a coordinate system of the pre-operative medical image volume based on a non-rigid transformation corresponding to the deformation of the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model.
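Applying the resulting non-rigid transformation to map points (such as the target points or B-mode image coordinates above) can be illustrated with a dense displacement-field lookup. Nearest-voxel sampling is an assumption made here for brevity; trilinear interpolation would be typical in practice:

```python
import numpy as np

def map_points(points, displacement_field, spacing):
    """Map points through a dense displacement field (shape X x Y x Z x 3)
    by nearest-voxel lookup."""
    pts = np.asarray(points, float)
    idx = np.round(pts / spacing).astype(int)
    disp = displacement_field[idx[:, 0], idx[:, 1], idx[:, 2]]
    return pts + disp
```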
- the pre-operative medical image volume is one of a computed tomography volume or a magnetic resonance imaging volume.
- the target organ is one of a liver or a prostate.
- an apparatus for registering a pre-operative 3D medical image volume of a target organ to intra-operative ultrasound images comprises a processor and a memory storing computer program instructions, which when executed by the processor cause the processor to perform operations comprising: acquiring an intra-operative 3D B-mode ultrasound volume and an intra-operative 3D ultrasound elastography volume; determining patient-specific boundary conditions for a biomechanical tissue model of a target organ using the intra-operative 3D B-mode volume; determining patient-specific material properties for the biomechanical tissue model of the target organ using the 3D ultrasound elastography volume; and deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model with the patient-specific material properties with the deformation of the target organ in the pre-operative 3D medical image volume constrained by the patient-specific boundary conditions.
- a non-transitory computer readable medium stores computer program instructions for registering a pre-operative 3D medical image volume of a target organ to intra-operative ultrasound images.
- the computer program instructions when executed by a processor, cause the processor to perform operations comprising: acquiring an intra-operative 3D B-mode ultrasound volume and an intra-operative 3D ultrasound elastography volume; determining patient-specific boundary conditions for a biomechanical tissue model of a target organ using the intra-operative 3D B-mode volume; determining patient-specific material properties for the biomechanical tissue model of the target organ using the 3D ultrasound elastography volume; and deforming the target organ in the pre-operative 3D medical image volume using the biomechanical tissue model with the patient-specific material properties with the deformation of the target organ in the pre-operative 3D medical image volume constrained by the patient-specific boundary conditions.
- FIG. 1 illustrates an overview of an intra-operative workflow for registering a pre-operative medical imaging volume to intra-operative ultrasound (US) images according to an embodiment of the present invention.
- FIG. 2 illustrates a method for registering pre-operative medical image data and intra-operative US data according to an embodiment of the present invention.
- FIG. 3 illustrates a method of determining patient-specific boundary conditions and material properties for a biomechanical tissue model of a target organ and performing non-rigid registration driven by the biomechanical tissue model according to an embodiment of the present invention.
- FIG. 4 illustrates a method of iteratively computing a patient-specific material property distribution according to an embodiment of the present invention.
- FIG. 5 is a high-level block diagram of a computer capable of implementing the present invention.
- the present invention relates to a method and system for computer-based registration of pre-operative volumetric medical images to intra-operative ultrasound images.
- Embodiments of the present invention are described herein to give a visual understanding of the registration method.
- a digital image is often composed of digital representations of one or more objects (or shapes).
- the digital representation of an object is often described herein in terms of identifying and manipulating the objects.
- Such manipulations are virtual manipulations accomplished in the memory or other circuitry/hardware of a computer system. Accordingly, it is to be understood that embodiments of the present invention may be performed within a computer system using data stored within the computer system.
- Embodiments of the present invention register pre-operative medical imaging data, such as computed tomography (CT) or magnetic resonance (MR) volumes, with intra-operative ultrasound (US) images.
- One simple existing technique for registering pre-operative medical imaging data and intra-operative US involves computing a rigid transformation to align the target organ using point or surface features.
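Such a point-feature rigid alignment is commonly computed with the Kabsch algorithm, a least-squares rotation-plus-translation fit via SVD of the cross-covariance matrix. The sketch below assumes paired corresponding points, which the text above does not specify:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid (rotation + translation) alignment of paired
    point sets via the Kabsch algorithm, so that R @ src_i + t ~ dst_i."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

As the surrounding text notes, such a rigid fit cannot account for the soft-tissue deformation induced by the ultrasound probe, which motivates the deformable registration below.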
- a rigid transformation is typically insufficient to accurately align organ structures in the pre-operative medical image data and the intra-operative US due to soft tissue deformation which occurs during US imaging.
- More sophisticated methods utilize some kind of deformable registration, such as a biomechanical tissue model.
- Embodiments of the present invention provide an improvement over existing methods for registering pre-operative medical images and intra-operative US images by using a biomechanical tissue model whose patient-specific boundary conditions and patient-specific material properties are automatically determined from compounded B-mode US and ultrasound elastography (USE) image streams. This provides an advantage of improved accuracy as compared with previous registration techniques.
- embodiments of the present invention provide an improvement of generating the two US image streams by interleaving the B-mode and USE image frames while sweeping the target organ intra-operatively, so that the two US image streams are implicitly co-registered due to the high sampling rate of US.
- this registration/fusion of a pre-operative medical imaging volume and an intra-operative US volume can be performed in two phases: 1) an initial rigid alignment, and 2) a non-rigid alignment/deformation.
- Embodiments of the present invention perform the non-rigid deformation by establishing point-based correspondences between geometry of a target organ in the pre-operative medical imaging volume and geometry of the target organ from an intra-operative B-mode US volume and then deforming the pre-operative organ geometry using a biomechanical tissue model while constraining the deformation such that the correspondences in the deformed pre-operative organ geometry match those in the B-mode US volume.
- Embodiments of the present invention utilize patient-specific tissue stiffness values for the biomechanical tissue model, which are acquired from a 3D USE volume and used to guide the model deformation.
- the computer-based registration method takes as input a pre-operative medical image volume (e.g., CT or MR) and a set of tracked ultrasound frames and outputs a non-rigid transformation which encodes the alignment of the pre-operative medical image data with the intra-operative ultrasound data, which can be used to enable fused image displays in real-time during an intervention procedure.
- FIG. 1 illustrates an overview of an intra-operative workflow for registering a pre-operative medical imaging volume to intra-operative US images according to an embodiment of the present invention. As illustrated in the embodiment of FIG. 1 , at step 100 , tracked B-mode US images and US elastography (USE) images are acquired.
- a user sweeps a tracked ultrasound probe 102 across a target organ to acquire 2D B-mode US images and 2D USE images.
- the B-mode US images and USE images can be acquired simultaneously by the US probe 102 acquiring an interleaved stream of B-mode and USE images during the sweep. Due to the high sampling rate of US, this results in compounded B-mode and USE volumes that are implicitly co-registered.
- the US probe 102 is tracked in 3D space using a tracking system. For example, an optical tracker or electromagnetic tracker may be used to track the 3D position and orientation of the US probe 102 .
- the B-mode images are compounded based on the tracking information to generate a 3D B-mode US volume and the USE images are compounded to generate a 3D USE volume.
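The compounding step can be sketched in miniature: each tracked 2D frame carries a pose that maps its pixels into 3D space, and overlapping samples are averaged into voxels. The sketch below is a hypothetical toy (dict-based sparse grid, averaging compounding, names invented for illustration), not the patented implementation; real systems interpolate and handle far denser data.

```python
def compound_frames(frames, voxel_size):
    """Average tracked 2D frame samples into a sparse 3D voxel grid.

    `frames` is a list of (pose, image) pairs: `pose` maps in-plane pixel
    coordinates (u, v) to a 3D world point (tracker-derived), and `image`
    is a dict {(u, v): intensity}. Assumes non-negative coordinates.
    Toy structures for illustration, not the patented implementation.
    """
    acc = {}  # voxel index -> (intensity sum, sample count)
    for pose, image in frames:
        for (u, v), intensity in image.items():
            x, y, z = pose(u, v)  # tracking info places the pixel in 3D
            idx = (int(x / voxel_size), int(y / voxel_size), int(z / voxel_size))
            s, c = acc.get(idx, (0.0, 0))
            acc[idx] = (s + intensity, c + 1)
    # average overlapping contributions from different frames
    return {idx: s / c for idx, (s, c) in acc.items()}
```

The same routine can be run once on the B-mode frames and once on the USE frames to produce the two co-registered volumes.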
- an initial rigid alignment between the pre-operative medical image volume and the 3D B-mode US volume is calculated.
- the tissue deformation of the target organ in the pre-operative medical image volume is computed using a biomechanical tissue model to non-rigidly register the pre-operative medical image data to the intra-operative US data.
- Patient-specific boundary conditions for the biomechanical tissue model that are used to constrain the deformation of the pre-operative target are determined using the 3D B-mode US volume.
- Patient-specific material properties, such as a spatially varying distribution of tissue stiffness values, are determined using the 3D USE volume.
- FIG. 2 illustrates a method for registering pre-operative medical image data and intra-operative US data according to an embodiment of the present invention.
- the method of FIG. 2 can be used to provide computer-based registration and fusion of pre-operative medical image data and intra-operative US data during an US guided intervention on a target organ.
- the target organ may be the liver or the prostate, but the present invention is not limited thereto and may be used for any organ for which an US guided intervention is performed.
- US guided interventions for which this method may be applied include US guided staging for liver lesion resection and US guided prostate needle biopsies, but the present invention is not limited thereto.
- a pre-operative medical image volume of the target organ of the patient is received.
- the pre-operative medical image volume typically provides dense anatomical or functional data.
- the pre-operative 3D medical image volume is acquired prior to the surgical procedure.
- the 3D medical image volume can be acquired using any imaging modality, such as computed tomography (CT), magnetic resonance (MR), or positron emission tomography (PET).
- the pre-operative 3D medical image volume can be received directly from an image acquisition device, such as a CT scanner or MR scanner, or can be received by loading a previously stored 3D medical image volume from a memory or storage of a computer system.
- In a pre-operative planning phase, the pre-operative 3D medical image volume can be acquired using the image acquisition device and stored in the memory or storage of the computer system.
- the pre-operative 3D medical image volume can then be loaded from the memory or storage during the surgical procedure.
- the pre-operative medical image volume may be acquired directly before or even during a procedure, for example using a C-arm image acquisition device or intra-operative MR.
- the pre-operative 3D medical image volume includes a target anatomical object, such as a target organ.
- the target organ can be the liver or the prostate.
- the pre-operative volumetric imaging data can provide a more detailed view of the target anatomical object, as compared to intra-operative US images.
- the target organ is segmented in the pre-operative 3D medical image volume and used to construct a 3D point representation of the tissue domain in the pre-operative coordinates, typically as a mesh or a 3D point cloud.
- the 3D point representation is used to define the domain of the biomechanical tissue model.
- a machine learning based segmentation algorithm may be used to segment the target organ in the pre-operative medical image volume.
- a marginal space learning (MSL) based framework may be employed, e.g., using the method described in U.S. Pat. No. 7,916,919, entitled “System and Method for Segmenting Chambers of a Heart in a Three Dimensional Image,” which is incorporated herein by reference in its entirety.
- a semi-automatic segmentation technique such as, e.g., graph cut or random walker segmentation can be used.
- a deep-learning based segmentation technique can be used, such as a deep-learning based segmentation technique described in International Patent Publication No. WO 2018/015414 A1, entitled “Method and System for Artificial Intelligence Based Medical Image Segmentation,” which is incorporated herein by reference in its entirety.
- an intra-operative 3D B-mode US volume and an intra-operative 3D USE volume are acquired.
- the intra-operative 3D B-mode US volume and an intra-operative 3D USE volume can be acquired by acquiring a stream of intra-operative 2D B-mode US images and a stream of intra-operative 2D USE images using a tracked US probe and generating the intra-operative 3D B-mode US volume and intra-operative 3D USE volume by compounding the 2D B-mode US images and the 2D USE images, respectively.
- Any type of US probe can be used to acquire the intra-operative US images, depending on the target organ.
- the US probe may be an external US probe, an intravascular ultrasound (IVUS) probe, an endorectal ultrasound (ERUS) probe, etc.
- a tracking system is used to track the US probe in 3D space.
- an optical tracker or an electromagnetic tracker can be used to track the US probe.
- the tracking system provides tracking information corresponding to the position and orientation of the US probe for each frame (US image) acquired by the US probe.
- a sweep of the target organ is performed using the US probe to obtain a stream of 2D US images that cover the target organ.
- the stream of 2D B-mode US images and the stream of 2D USE images are acquired simultaneously in a single interleaved stream by the US probe during the sweep of the target organ.
- the US probe interleaves the B-mode frame acquisition protocol and the USE frame acquisition protocol, such that the stream of US images received from the US probe alternates B-mode and USE frames.
- the intra-operative 3D B-mode US volume is generated by compounding the stream of 2D B-mode US images based on the tracking information associated with the 2D B-mode US images.
- the intra-operative 3D USE volume is generated by compounding the stream of 2D USE images based on the tracking information associated with the 2D USE images.
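Because the B-mode and USE frames arrive as a single alternating stream, each frame tagged with its own tracking pose, splitting them back into two implicitly co-registered streams is a simple demultiplexing step. A minimal sketch, assuming each frame is a (frame_type, pose, data) tuple — an illustrative structure, not a real probe API:

```python
def demultiplex(stream):
    """Split an interleaved B-mode/USE frame stream into two streams.

    `stream` is a list of (frame_type, tracking_pose, frame_data) tuples
    in acquisition order, alternating 'bmode' and 'use' frames. Each
    output list keeps (pose, data) pairs in the original order, so both
    streams inherit the tracking information needed for compounding.
    """
    bmode, use = [], []
    for frame_type, pose, data in stream:
        (bmode if frame_type == 'bmode' else use).append((pose, data))
    return bmode, use
```

Because consecutive B-mode and USE frames are acquired milliseconds apart, the two resulting streams sample essentially the same probe trajectory, which is what makes the compounded volumes implicitly co-registered.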
- the US probe may be a 3D US probe that acquires 3D US volumes.
- the intra-operative 3D B-mode US volume and intra-operative 3D USE volume are received from the US probe, which captures the intra-operative 3D B-mode US volume and the intra-operative 3D USE volume as native 3D volumes.
- an initial rigid alignment of the pre-operative medical image volume and the intra-operative 3D B-mode US volume is performed.
- the target organ is segmented in the 3D B-mode US volume to provide a 3D point representation of the target organ in the US coordinates.
- the segmented 3D point representation can be a 3D mesh or a 3D point cloud.
- a machine learning based segmentation algorithm or any other segmentation algorithm may be used to segment the target organ in the 3D B-mode US volume.
- a marginal space learning (MSL) based framework may be employed, e.g., using the method described in U.S. Pat. No. 7,916,919, referenced above.
- a semi-automatic segmentation technique such as, e.g., graph cut or random walker segmentation can be used.
- a deep-learning based segmentation technique can be used, such as a deep-learning based segmentation technique described in International Patent Publication No. WO 2018/015414 A1, entitled “Method and System for Artificial Intelligence Based Medical Image Segmentation,” which is incorporated herein by reference in its entirety.
- the initial rigid alignment may be performed using known rigid registration techniques to find a rigid registration that minimizes differences between corresponding point or surface features on the segmented target organ in the pre-operative medical image volume and the segmented target organ in the intra-operative 3D B-mode US volume.
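For intuition, the rigid step can be illustrated with the classic closed-form least-squares alignment of corresponding points. The sketch below is a 2D simplification (the patent operates on 3D point/surface features, where an SVD-based Kabsch/Procrustes solution is typical); in 2D the optimal rotation has a closed form via atan2:

```python
import math

def rigid_align_2d(src, dst):
    """Least-squares rigid (rotation + translation) alignment in 2D.

    Returns (theta, tx, ty) such that R(theta) @ p + t best matches the
    corresponding dst points in the least-squares sense. 2D illustration
    only; 3D alignment would use an SVD of the cross-covariance matrix.
    """
    n = len(src)
    sx = sum(p[0] for p in src) / n; sy = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n; dy = sum(q[1] for q in dst) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        ax, ay = px - sx, py - sy          # centered source point
        bx, by = qx - dx, qy - dy          # centered target point
        num += ax * by - ay * bx           # cross (sine) terms
        den += ax * bx + ay * by           # dot (cosine) terms
    theta = math.atan2(num, den)           # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = dx - (c * sx - s * sy)            # translation from centroids
    ty = dy - (s * sx + c * sy)
    return theta, tx, ty
```

Minimizing point-to-point distances this way gives the initial pose; the residual soft-tissue mismatch is then handled by the non-rigid step.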
- non-rigid registration is performed using a biomechanical tissue model with patient-specific boundary conditions and patient-specific material properties based on the intra-operative 3D B-mode US volume and the intra-operative 3D USE volume.
- the biomechanical tissue model simulates biomechanical deformation of the tissue of the target organ.
- the biomechanical tissue model is used to drive the non-rigid registration by computing deformation between the target organ in the pre-operative medical image volume and the intra-operative US acquisition.
- Patient-specific boundary conditions for the biomechanical tissue model are computed based on the intra-operative 3D B-mode US volume and are used to constrain the simulated deformation of the pre-operative target organ such that positions of surface correspondences match on the deformed pre-operative target organ and the intra-operative target organ.
- Patient-specific material properties for the biomechanical tissue model, such as a distribution of spatially varying values of a tissue property, are determined for the pre-operative tissue domain based on the intra-operative 3D USE volume.
- For example, the spatially varying tissue property may be a distribution of tissue stiffness (Young's modulus) values.
- FIG. 3 illustrates a method of determining patient-specific boundary conditions and material properties for a biomechanical tissue model of a target organ and performing non-rigid registration driven by the biomechanical tissue model according to an embodiment of the present invention.
- the method of FIG. 3 can be used to implement step 208 of FIG. 2 .
- patient-specific boundary conditions are determined for the biomechanical tissue model using the intra-operative 3D B-mode US volume.
- point correspondences between the target organ segmentations in the pre-operative medical image volume and the 3D B-mode US volume are used as boundary conditions to the biomechanical tissue model.
- the biomechanical tissue model is implemented using a finite element model (FEM) solution for the equations of motions describing the organ deformation.
- the organ domain for the biomechanical tissue model is discretized as a tetrahedral mesh from the geometry of the segmented surface of the target organ in the pre-operative medical image volume.
- the biomechanical tissue model can use 3D Navier-Cauchy equations for the tissue displacement field at static equilibrium:
- E/(2(1+ν)(1−2ν)) ∇(∇·μ) + E/(2(1+ν)) ∇²μ + F = 0  (1)
- where E is Young's modulus, ν is Poisson's ratio, μ is the 3D displacement vector at a point in the tissue, and F is the applied body force distribution.
- the biomechanical tissue model seeks to solve for the displacements at each point of the tissue domain such that this equation is satisfied.
- In the resulting linear system Kμ=f, K is the stiffness matrix containing contributions from the material properties and constitutive equation (1), μ is the vector of mesh nodal displacements, and f contains a vector of applied boundary conditions.
- patient-specific boundary conditions for f are generated using the information in the B-mode US volume and also patient-specific material properties are generated using the information in the USE volume.
- displacement conditions are calculated for boundary nodes in f based on vectors mapping from boundary point locations in the segmentation of the target organ in the pre-operative medical image volume to the locations of the corresponding boundary points in the segmentation of the target organ in the intra-operative 3D B-mode US volume.
- the boundary conditions constrain the deformation of the pre-operative target organ computed by the biomechanical tissue model such that the locations of the point/surface correspondences of the deformed pre-operative target organ match those on the intra-operative target organ.
- patient-specific material properties are determined using the intra-operative 3D USE volume.
- the material property distribution in the pre-operative imaging is typically unknown and must be guessed initially.
- For example, a homogeneous material property (e.g., a uniform Young's modulus) can be assumed as the initial guess.
- the patient-specific boundary conditions are then used in the biomechanical tissue model to warp the pre-operative tissue domain to match the US coordinates, after which the warped pre-operative distribution of the tissue properties can be directly compared to the known tissue property distribution in the intra-operative 3D USE volume.
- the pre-operative properties are then updated to better match the data in the 3D USE volume and the model deformation is recomputed using the updated pre-operative tissue property distribution. These steps can be iterated until the warped pre-operative tissue property distribution matches the spatial distribution of the tissue property in the USE volume.
- FIG. 4 illustrates a method of iteratively computing a patient-specific material property distribution according to an embodiment of the present invention.
- In the embodiment of FIG. 4, the patient-specific material property is a distribution of tissue stiffness (i.e., Young's modulus).
- an US sweep is performed with interleaved B-mode and USE frame acquisition protocols in order to acquire a stream of 2D B-mode US images and a stream of 2D USE images.
- the 2D B-mode images are compounded to generate the 3D US B-mode volume 406 and the 2D USE images are compounded to generate the 3D USE volume 408 .
- Prior to computing the patient-specific stiffness property distribution for the biomechanical tissue model, the patient-specific boundary conditions are computed based on the US B-mode volume 406 , as described above in connection with step 302 of FIG. 3 . Steps 410 , 412 , and 414 of FIG. 4 are then iterated to compute the patient-specific stiffness property distribution for the pre-operative tissue domain.
- a model-deformed elastography image is generated by warping the current distribution for the stiffness property in the pre-operative tissue domain using the biomechanical model based on the patient-specific boundary conditions.
- a predetermined initial tissue stiffness distribution can be used to initialize the tissue stiffness distribution in the pre-operative tissue domain.
- the initial tissue stiffness distribution may be a uniform distribution with a value for tissue stiffness selected from literature used at all spatial locations (nodes) of the pre-operative tissue domain.
- the embodiment of FIG. 4 shows the example of solving for the distribution of Young's modulus, E. Let E_p^i be the current guess for the stiffness distribution in the pre-operative domain at iteration i.
- E_d^i = T(E_p^i, θ^i)  (4)
- where T is the warping operation using the model displacement field and θ^i includes all other parameters in the model (boundary conditions, etc.).
- the model-deformed tissue stiffness distribution is compared to the measured USE volume 408 .
- the current tissue stiffness distribution in the domain of the pre-operative medical image volume is updated based on the difference ⁇ E i between the model-deformed tissue stiffness distribution and the actual tissue stiffness distribution in the USE volume.
- the method then returns to step 410 and repeats steps 410 , 412 , and 414 until a stop condition indicating convergence of the tissue stiffness distribution is reached.
- the stop condition can be reached when the difference ΔE^i between the model-deformed tissue stiffness distribution and the actual tissue stiffness distribution in the USE volume is less than a certain threshold value.
- the stop condition can also be reached when a predetermined maximum number of iterations has been reached. If the stop condition is not yet reached, the method performs another iteration.
- When the stop condition is reached, the method proceeds to step 416 , at which at least a portion of the pre-operative medical image volume is deformed using the biomechanical model with the patient-specific boundary conditions and the patient-specific stiffness values.
- the target organ in the pre-operative medical image volume is deformed using the biomechanical model with the patient-specific material properties (determined in step 304 ) and the patient-specific boundary conditions (determined in step 302 ).
- the biomechanical tissue model with the patient-specific material properties and the patient-specific boundary conditions is used to perform a final biomechanical warping of the pre-operative data.
- the nodal displacements from the biomechanical model solution provide a non-rigid transformation which describes the organ deformation between the pre-operative and US imaging scans.
- the registration results are output.
- the resulting non-rigid transformation can be used to transform the target organ structure from the pre-operative medical image volume to the coordinate system of the intra-operative US scans.
- the transformed target organ from the pre-operative medical image volume can then be overlaid on the 3D B-mode US volume or on one or more 2D US images acquired during the US guided intervention.
- the target organ can be automatically overlaid on newly acquired intra-operative US images in real time as the US images are acquired by the US probe using the non-rigid transformation.
- the displacement field computed for the target organ can also be used to transform target maps of the target organ, such as maps of suspected nodules or planning annotations, from the pre-operative coordinate system to the US coordinate system, in order to overlay such target maps on the 3D B-mode US volume or on one or more 2D US images acquired during the US guided intervention.
- target maps may be automatically overlaid on 2D US images as the US images are acquired.
- the displacements may also be used to directly warp image information itself from the pre-operative medical image volume to be overlaid on the 3D B-mode US volume or on one or more 2D US images.
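Applying the computed displacement field to pre-operative annotations can be sketched as follows. This toy uses nearest-node displacement lookup rather than proper barycentric interpolation over the tetrahedral mesh, and all data structures are illustrative:

```python
def transform_points(points, nodes, displacements):
    """Map pre-operative annotation points into US coordinates.

    `nodes` are mesh node positions in the pre-operative coordinate
    system and `displacements` the corresponding nodal displacements
    from the biomechanical model solution. Each annotation point is
    moved by the displacement of its nearest node (a crude stand-in
    for interpolation over the containing tetrahedron).
    """
    out = []
    for p in points:
        # nearest mesh node by squared Euclidean distance
        i = min(range(len(nodes)),
                key=lambda j: sum((p[d] - nodes[j][d]) ** 2 for d in range(3)))
        out.append(tuple(p[d] + displacements[i][d] for d in range(3)))
    return out
```

Running this on lesion annotations at interactive rates is what enables the real-time overlays described above; inverting the lookup direction corresponds to the inverse transformation mentioned next.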
- An inverse of the computed non-rigid transformation may be used to transform the target organ and/or image data from the US scans to the coordinate system of the pre-operative medical image volume to be overlaid on the pre-operative medical image volume.
- Fused images resulting from overlaying information from the pre-operative medical image data onto the US images, or from overlaying information from the intra-operative US data onto the pre-operative medical image data, can be displayed on a display device of a computer system and used to guide the US guided intervention.
- Computer 502 contains a processor 504 , which controls the overall operation of the computer 502 by executing computer program instructions which define such operation.
- the computer program instructions may be stored in a storage device 512 (e.g., magnetic disk) and loaded into memory 510 when execution of the computer program instructions is desired.
- 1, 2, 3, and 4 may be defined by the computer program instructions stored in the memory 510 and/or storage 512 and controlled by the processor 504 executing the computer program instructions.
- One or more image acquisition devices 520 , such as an ultrasound probe, CT scanner, MR scanner, PET scanner, etc., can be connected to the computer 502 to input image data to the computer 502 . It is possible that the image acquisition device 520 and the computer 502 communicate wirelessly through a network.
- the computer 502 also includes one or more network interfaces 506 for communicating with other devices via a network.
- the computer 502 also includes other input/output devices 508 that enable user interaction with the computer 502 (e.g., display, keyboard, mouse, speakers, buttons, etc.).
- Such input/output devices 508 may be used in conjunction with a set of computer programs as an annotation tool to annotate volumes received from the image acquisition device 520 .
- FIG. 5 is a high level representation of some of the components of such a computer for illustrative purposes.
Description
where E is Young's modulus, ν is Poisson's ratio, μ is the 3D displacement vector at a point in the tissue, and F is the applied body force distribution. The biomechanical tissue model seeks to solve for the displacements at each point of the tissue domain such that this equation is satisfied. In an advantageous implementation, linear basis functions defined on the tetrahedral elements of the target organ mesh (in the pre-operative domain) can be used and the Galerkin weighted residual method can be performed to construct a linear system of equations with the form:
Kμ=f (2)
where K is the stiffness matrix containing contributions from the material properties and constitutive equation (1), μ is the vector of mesh nodal displacements, and f contains a vector of applied boundary conditions. According to an advantageous embodiment of the present invention, patient-specific boundary conditions for f are generated using the information in the B-mode US volume and also patient-specific material properties are generated using the information in the USE volume.
Aμ=b (3)
which is solved for the nodal displacements which satisfy the boundary conditions and material properties. By defining the displacement boundary conditions using displacements between point/surface correspondences (i.e., corresponding boundary nodes) on the pre-operative target organ segmentation and the intra-operative target organ segmentation, and solving for the nodal displacements which satisfy these boundary conditions, the boundary conditions constrain the deformation of the pre-operative target organ computed by the biomechanical tissue model such that the locations of the point/surface correspondences of the deformed pre-operative target organ match those on the intra-operative target organ.
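The structure of the linear system Kμ=f, and how a prescribed displacement boundary condition is imposed before solving, can be seen in a one-dimensional analogue. The sketch below assembles a tiny FEM system for a uniform elastic bar (a stand-in only; the patent's model is a 3D tetrahedral discretization of the Navier-Cauchy equations):

```python
def solve_bar(n_elem, E, A, L, u0, f_end):
    """1D analogue of the FEM system K mu = f (sketch, not the 3D model).

    A bar of length L with Young's modulus E and cross-section A, with a
    prescribed displacement u0 at the left node (a displacement boundary
    condition) and a point force f_end at the right node. Assembles the
    tridiagonal stiffness matrix and solves for nodal displacements.
    """
    n = n_elem + 1
    k = E * A * n_elem / L                      # per-element stiffness
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elem):                     # assemble element matrices
        K[e][e] += k;  K[e][e + 1] -= k
        K[e + 1][e] -= k;  K[e + 1][e + 1] += k
    f = [0.0] * n
    f[-1] = f_end
    # impose the displacement boundary condition at node 0
    K[0] = [1.0] + [0.0] * (n - 1)
    f[0] = u0
    for j in range(1, n):                       # move known u0 to the RHS
        f[j] -= K[j][0] * u0
        K[j][0] = 0.0
    # naive Gaussian elimination (fine for tiny systems)
    for i in range(n):
        piv = K[i][i]
        for j in range(i + 1, n):
            m = K[j][i] / piv
            for c in range(i, n):
                K[j][c] -= m * K[i][c]
            f[j] -= m * f[i]
    u = [0.0] * n
    for i in range(n - 1, -1, -1):              # back-substitution
        u[i] = (f[i] - sum(K[i][c] * u[c] for c in range(i + 1, n))) / K[i][i]
    return u
```

In the patented method the analogous roles are played by the tetrahedral stiffness matrix K, the correspondence-derived displacement boundary conditions in f, and the nodal displacement vector μ.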
E_d^i = T(E_p^i, θ^i)  (4)
where T is the warping operation using the model displacement field and θ^i includes all other parameters in the model (boundary conditions, etc.). Accordingly, in this step, the biomechanical model and boundary conditions are used to warp the current guess for the tissue stiffness distribution in the pre-operative domain to the coordinate system of the US.
ΔE^i = E_d^i − E_a^i  (5)
where E_a^i is the actual stiffness distribution in the intra-operative 3D USE volume.
E_u^i = k_1 ΔE^i + k_2 (ΔE^i − ΔE^{i−1})  (6)
where the parameters k_1 and k_2 are selected to control convergence speed and stability of the algorithm. The tissue stiffness distribution in the pre-operative model domain is then updated to set the tissue stiffness distribution in the pre-operative domain for the next iteration as:
E_p^{i+1} = T^{−1}(E_p^i + E_u^i, θ^i).  (7)
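The update scheme of Eqs. (4)-(7) can be sketched as a loop. In this toy, `warp`/`unwarp` stand in for T and T^{−1} acting on a per-node stiffness list, and the gain k_1 is chosen negative (under the ΔE = E_d − E_a sign convention of Eq. (5)) so the correction shrinks the mismatch; this illustrates the shape of the iteration, not the patented solver:

```python
def estimate_stiffness(E_a, warp, unwarp, E_init, k1=-0.5, k2=0.0, n_iter=40):
    """Iterate Eqs. (4)-(7) on a per-node stiffness list (toy setting).

    E_a is the measured stiffness distribution in the USE volume;
    warp/unwarp play the roles of T and its inverse. With dE = E_d - E_a
    (Eq. (5)), a negative k1 makes the correction oppose the mismatch.
    """
    E_p = list(E_init)
    dE_prev = [0.0] * len(E_a)
    for _ in range(n_iter):
        E_d = warp(E_p)                                   # Eq. (4): E_d = T(E_p, theta)
        dE = [d - a for d, a in zip(E_d, E_a)]            # Eq. (5): mismatch
        E_u = [k1 * e + k2 * (e - ep)                     # Eq. (6): update term
               for e, ep in zip(dE, dE_prev)]
        E_p = unwarp([p + u for p, u in zip(E_p, E_u)])   # Eq. (7): back to pre-op domain
        dE_prev = dE
    return E_p
```

With identity warps (boundary conditions already satisfied), the iteration reduces each node's error by a factor of |1 + k_1| per step, illustrating why k_1 and k_2 govern convergence speed and stability.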
Claims (19)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/972,324 US11382603B2 (en) | 2018-05-07 | 2018-05-07 | System and methods for performing biomechanically driven image registration using ultrasound elastography |
CN201910375461.XA CN110458872B (en) | 2018-05-07 | 2019-05-07 | System and method for performing biomechanically driven image registration using ultrasound elastography |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190336109A1 US20190336109A1 (en) | 2019-11-07 |
US11382603B2 true US11382603B2 (en) | 2022-07-12 |
Family
ID=68384355
Country Status (2)
Country | Link |
---|---|
US (1) | US11382603B2 (en) |
CN (1) | CN110458872B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10929708B2 (en) | 2018-12-10 | 2021-02-23 | International Business Machines Corporation | Deep learning network for salient region identification in images |
US11995854B2 (en) * | 2018-12-19 | 2024-05-28 | Nvidia Corporation | Mesh reconstruction using data-driven priors |
US20210174523A1 (en) * | 2019-12-10 | 2021-06-10 | Siemens Healthcare Gmbh | Method for registration of image data and for provision of corresponding trained facilities, apparatus for doing so and corresponding computer program product |
EP4111997A1 (en) * | 2019-12-18 | 2023-01-04 | Brainlab AG | Using a current workflow step for control of medical data processing |
CN112070731B (en) * | 2020-08-27 | 2021-05-11 | 佛山读图科技有限公司 | Method for guiding registration of human body model atlas and case CT image by artificial intelligence |
US20230181165A1 (en) * | 2021-12-15 | 2023-06-15 | GE Precision Healthcare LLC | System and methods for image fusion |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9761014B2 (en) * | 2012-11-15 | 2017-09-12 | Siemens Healthcare Gmbh | System and method for registering pre-operative and intra-operative images using biomechanical model simulations |
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6775404B1 (en) | 1999-03-18 | 2004-08-10 | University Of Washington | Apparatus and method for interactive 3D registration of ultrasound and magnetic resonance images based on a magnetic position sensor |
US20040234113A1 (en) * | 2003-02-24 | 2004-11-25 | Vanderbilt University | Elastography imaging modalities for characterizing properties of tissue |
US7916919B2 (en) | 2006-09-28 | 2011-03-29 | Siemens Medical Solutions Usa, Inc. | System and method for segmenting chambers of a heart in a three dimensional image |
US9521994B2 (en) | 2009-05-11 | 2016-12-20 | Siemens Healthcare Gmbh | System and method for image guided prostate cancer needle biopsy |
US9282933B2 (en) | 2010-09-17 | 2016-03-15 | Siemens Corporation | Magnetic resonance elastography for ultrasound image simulation |
US9098904B2 (en) | 2010-11-15 | 2015-08-04 | Dartmouth College | System and method for registering ultrasound and magnetic resonance images |
US20130324841A1 (en) | 2012-05-31 | 2013-12-05 | Ali Kamen | System and Method for Real-Time Ultrasound Guided Prostate Needle Biopsy Based on Biomechanical Model of the Prostate from Magnetic Resonance Imaging Data |
US9375195B2 (en) | 2012-05-31 | 2016-06-28 | Siemens Medical Solutions Usa, Inc. | System and method for real-time ultrasound guided prostate needle biopsy based on biomechanical model of the prostate from magnetic resonance imaging data |
US20160058424A1 (en) | 2014-08-26 | 2016-03-03 | Rational Surgical Solutions, Llc | Image registration for ct or mr imagery and ultrasound imagery using mobile device |
US20160242745A1 (en) | 2015-02-20 | 2016-08-25 | Emory University | Systems, Methods and Computer Readable Storage Media Storing Instructions for Image-Guided Interventions Based on Patient-Specific Models |
WO2018015414A1 (en) | 2016-07-21 | 2018-01-25 | Siemens Healthcare Gmbh | Method and system for artificial intelligence based medical image segmentation |
US20190200965A1 (en) * | 2016-09-12 | 2019-07-04 | Supersonic Imagine | Ultrasound imaging method and an apparatus implementing said method |
Non-Patent Citations (2)
Title |
---|
Khallaghi, S. et al., "Statistical Biomechanical Surface Registration: Application to MR-TRUS Fusion for Prostate Interventions," IEEE Trans. Med. Imaging, vol. 34, No. 12, pp. 2535-2549, Dec. 2015. |
Nir, G., et al., "Model-based registration of ex vivo and in vivo MRI of the prostate using elastography," IEEE Trans. Med. Imaging, vol. 32, No. 7, pp. 1349-1361, Jul. 2013.
Also Published As
Publication number | Publication date |
---|---|
CN110458872A (en) | 2019-11-15 |
US20190336109A1 (en) | 2019-11-07 |
CN110458872B (en) | 2023-09-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11382603B2 (en) | System and methods for performing biomechanically driven image registration using ultrasound elastography | |
US9761014B2 (en) | System and method for registering pre-operative and intra-operative images using biomechanical model simulations | |
Haouchine et al. | Image-guided simulation of heterogeneous tissue deformation for augmented reality during hepatic surgery | |
CN109589170B (en) | Left atrial appendage closure guidance in medical imaging | |
US11504095B2 (en) | Three-dimensional imaging and modeling of ultrasound image data | |
US20180150929A1 (en) | Method and system for registration of 2d/2.5d laparoscopic and endoscopic image data to 3d volumetric image data | |
US8145012B2 (en) | Device and process for multimodal registration of images | |
US20180189966A1 (en) | System and method for guidance of laparoscopic surgical procedures through anatomical model augmentation | |
US20130293578A1 (en) | Four Dimensional Image Registration Using Dynamical Model For Augmented Reality In Medical Applications | |
US11793484B2 (en) | Method, device and system for intracavity probe procedure planning | |
US9697600B2 (en) | Multi-modal segmentatin of image data | |
US20230260129A1 (en) | Constrained object correction for a segmented image | |
KR20240055167A (en) | Image-based probe positioning |
CN108430376B (en) | Providing a projection data set | |
CN111566699A (en) | Registration of static pre-procedural planning data to dynamic intra-procedural segmentation data | |
WO2017101990A1 (en) | Determination of registration accuracy | |
Peressutti et al. | A framework for automatic model-driven 2D echocardiography acquisition for robust respiratory motion estimation in image-guided cardiac interventions | |
Sermesant et al. | Biomechanical models for image analysis and simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC., PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PHEIFFER, THOMAS;KAPOOR, ANKUR;SIGNING DATES FROM 20180507 TO 20180509;REEL/FRAME:045800/0317 |
|
AS | Assignment |
Owner name: SIEMENS HEALTHCARE GMBH, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS MEDICAL SOLUTIONS USA, INC.;REEL/FRAME:046038/0445 Effective date: 20180515 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: SIEMENS HEALTHINEERS AG, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS HEALTHCARE GMBH;REEL/FRAME:066267/0346 Effective date: 20231219 |