WO2015087203A1 - Imaging methods and systems for monitoring treatment of tissue lesions - Google Patents

Imaging methods and systems for monitoring treatment of tissue lesions

Info

Publication number
WO2015087203A1
Authority
WO
WIPO (PCT)
Prior art keywords
lesion
ultrasound image
tissue
ultrasound
image
Prior art date
Application number
PCT/IB2014/066537
Other languages
English (en)
Inventor
James Robertson Jago
Thomas Patrice Jean Arsene Gauthier
Lars Jonas Olsson
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015087203A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/04 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
    • A61B18/12 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B18/14 Probes or electrodes therefor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06T7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/18 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A61B18/1815 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using microwaves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/00571 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
    • A61B2018/00577 Ablation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/00636 Sensing and controlling the application of energy
    • A61B2018/0066 Sensing and controlling the application of energy without feedback, i.e. open loop control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B2018/00994 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body combining two or more different kinds of non-mechanical energy or combining one or more non-mechanical energies with ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/06 Measuring blood flow
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/485 Diagnostic techniques involving measuring strain or elastic properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image

Definitions

  • The present invention relates to medical diagnostic ultrasound systems and, in particular, to imaging systems and methods for monitoring ablation of tissue lesions.
  • A common treatment technique is tissue ablation, in which the diseased tissue is destroyed by application of local tissue heating, cooling, or other means.
  • Ablation methods in common use are RF ablation, microwave ablation, HIFU, and cryoablation.
  • Imaging methods are typically used to attempt to verify that all the diseased tissue has been treated, but current imaging methods have limitations.
  • Computed tomography is often used to plan, guide and monitor ablation, but it is expensive, delivers potentially harmful ionizing radiation doses to the patient and the operators, and has relatively poor soft tissue contrast.
  • Magnetic resonance imaging (MRI) is expensive and not well suited to interventional procedures, and positron emission tomography (PET) has its own practical limitations.
  • Methods and systems are provided for monitoring treatment (e.g., ablation) of lesions using a combination of ultrasound imaging with another, non-ultrasound imaging modality.
  • The present invention includes imaging a lesion of interest using a modality other than ultrasound, such as MRI, CT, and/or PET imaging.
  • The non-ultrasound image, in particular, can be used to provide an accurate representation of the lesion boundary.
  • A first ultrasound image of the same lesion is generated and registered with the MRI, CT, and/or PET image.
  • The lesion can be treated, with or without a treatment plan, using known ablative therapies, chemotherapy, radiation therapies, and/or other treatment techniques.
  • A second ultrasound image is generated to identify the volume of the treated tissue; this image can be registered with the first ultrasound image, the non-ultrasound image, or both.
  • The ultrasound images, along with the non-ultrasound image, are also more accurately registered with reference to tissue features outside of the treatment volume.
  • Tissue differences and/or similarities between the lesion and the treated tissue can be identified by comparing data in the two ultrasound images within or around the lesion boundary. Portions of the lesion and the treated tissue that have similar tissue characteristics can indicate insufficiently treated lesion tissue, and because the images are spatially registered, the location of that tissue can be identified for further treatment.
  • Tissue lesions being treated include, e.g., tumors, cysts, and other tissues that can be treated using therapies such as known ablative therapies, chemotherapy, radiation therapies, and/or other treatment techniques, such as local injections of alcohol or other substances for killing the lesion tissue.
  • FIGURE 1 illustrates in block diagram form the use of three-dimensional ultrasonic imaging to guide or monitor treatment in an embodiment of the present invention.
  • FIGURE 2 illustrates in block diagram form the functional subsystems of a three-dimensional ultrasonic imaging system suitable for use in an embodiment of the present invention.
  • FIGURE 3 illustrates a workflow in accordance with the present invention for monitoring treatment of a lesion.
  • FIGURE 4 depicts an example registration of an ultrasound image and an MRI image using tissue features outside of a treatment volume.
  • FIGURE 5A illustrates a comparison of 3D ultrasound volumes of a lesion and of the treated tissue.
  • FIGURE 5B illustrates untreated tissue in need of further treatment, as identified with registration with an MRI image.
  • The present invention also includes imaging systems.
  • The present invention provides an imaging system for measuring a remaining volume of a lesion after an ablation treatment.
  • The system can receive a non-ultrasound image of a target region comprising a lesion.
  • The system can acquire a first ultrasound image comprising the lesion and register the first ultrasound image with the non-ultrasound image.
  • After treatment, the system can acquire a second ultrasound image that includes image data of the treated tissue.
  • The system can determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image. In certain embodiments, the system can determine the level of treatment by registering any combination of the non-ultrasound image, the first ultrasound image, and the second ultrasound image using imaged structures that are outside of the treated tissue. In some embodiments, the imaging system can include an ultrasonic diagnostic imaging system adapted to acquire the first and second ultrasound images.
  • Various non-ultrasound images can be used, such as, e.g., a magnetic resonance (MR) image, a computed tomography (CT) image, or a positron emission tomography (PET) image.
  • The first and second ultrasound images can include a 3D volume of the lesion and a 3D volume of the treated tissue, respectively. Determining if sufficient treatment has occurred can include subtracting the 3D volume of the lesion from the 3D volume of the treated tissue according to similar or different tissue characteristics.
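As an illustrative sketch of this subtraction step, assuming the lesion and the treated tissue have already been segmented into binary masks on a common registered voxel grid (the array names and toy sizes are hypothetical, not taken from the patent):

```python
import numpy as np

def untreated_lesion_mask(lesion_mask: np.ndarray, treated_mask: np.ndarray) -> np.ndarray:
    """Voxels inside the lesion boundary that were not covered by the treatment.

    Both inputs are boolean 3D arrays on the same registered voxel grid.
    """
    return lesion_mask & ~treated_mask

# Toy 3D example: a cubic lesion with two slices left untreated.
lesion = np.zeros((10, 10, 10), dtype=bool)
lesion[2:8, 2:8, 2:8] = True           # 6 x 6 x 6 lesion
treated = np.zeros_like(lesion)
treated[2:8, 2:8, 2:6] = True          # treatment misses slices z = 6..7

residual = untreated_lesion_mask(lesion, treated)
residual_voxels = int(residual.sum())  # 6 * 6 * 2 = 72 voxels still untreated
fully_treated = residual_voxels == 0
```

If `residual_voxels` is zero, the treated volume covers the entire lesion boundary and no further treatment would be indicated.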
  • The system can also register the non-ultrasound image, the first and second ultrasound images, and a treatment plan depicting a predicted tissue volume to be treated.
  • In some embodiments, the treated tissue is ablated tissue.
  • Structural components of the system can include processors and other well-known components used to carry out the described imaging and ultrasound methods and features.
  • In FIGURE 1, the use of three-dimensional ultrasonic imaging to monitor ablation with a tissue ablation probe is shown in partial block diagram form.
  • A three-dimensional (3D) ultrasonic imaging system includes an ultrasound probe 10 having a two-dimensional array transducer.
  • The transducer array transmits ultrasonic beams over a volumetric field of view 12, including a lesion 14, under control of an ultrasound acquisition subsystem 16 and receives echoes in response to the transmitted beams, which are coupled to and processed by the acquisition subsystem.
  • The echoes received by the elements of the transducer array are combined into coherent echo signals by the acquisition subsystem, and the echo signals, along with the coordinates from which they are received (r, θ, φ for a radial transmission pattern), are coupled to a 3D image processor 18.
  • The 3D image processor processes the echo signals into a three-dimensional ultrasonic image, which is displayed on a display 20.
  • The ultrasound system is controlled by a control panel 22, by which the user defines the characteristics of the ultrasound imaging.
  • The system of FIGURE 1 also includes an interventional device system for performing treatment, e.g., tissue ablation.
  • The interventional device system includes an ablation probe 24, the different types of which are well known in the art.
  • The ablation probe 24 is used to ablate a desired tissue region in a patient, and it can be manipulated by a physician (not shown) and/or a guidance subsystem 26, which may mechanically assist the maneuvering and placement of the interventional device within the body.
  • The ablation probe 24 is operated to ablate tissue under the control of an intervention subsystem 28, which can be operated via control panel 36 (or control panel 22, if only one control panel is used).
  • The intervention subsystem 28 can also receive information on the procedure being performed, such as optical or acoustic image data.
  • The ablation probe 24 and/or the ultrasound probe 10 may also have active position sensors that are used to provide information as to the location of the tip of the ablation probe along the insertion path 32 and/or the position of the transducer, which can be used to determine the position of the transducer imaging plane as well.
  • The active position sensors may operate by transmitting or receiving signals in the acoustic, optical, radio frequency or electromagnetic spectrum, and their output is coupled to a device position measurement subsystem 34.
  • Position information of the interventional device is coupled to the display processor 30 when appropriate for the processing or display of information concerning the position of the interventional device within the body.
  • Information pertinent to the functioning or operation of the ablation probe is displayed on the display 20.
  • Image data may be exchanged over a signal path 38 between the 3D image processor 18 of the ultrasound system and the display processor 30 of the interventional device system for the formation of a 3D image containing information from both systems.
  • The system in FIGURE 1 further includes a signal path 40 that connects the ultrasound acquisition subsystem 16 of the ultrasound system and the device position measurement subsystem 34 of the interventional device system to allow synchronization of the imaging system and the interventional device.
  • FIGURE 2 illustrates some of the components of the 3D ultrasound system of FIG. 1 in further detail.
  • The elements of a two-dimensional array transducer 42 are coupled to a plurality of microbeamformers 44.
  • The microbeamformers control the transmission of ultrasound by the elements of the array transducer 42 and partially beamform the echoes received by groups of the elements.
  • The microbeamformers 44 are preferably fabricated in integrated circuit form and located in the housing of the ultrasound probe 10 near the array transducer. Microbeamformers, or subarray beamformers as they are often called, are more fully described in U.S. Pat. Nos. 6,375,617 and 5,997,479, which are incorporated by reference herein in their entirety.
  • The ultrasound probe 10 may also include a position sensor 46 which provides signals indicative of the position of the probe 10 to a transducer position detector 48.
  • The sensor 46 may be a magnetic, electromagnetic, radio frequency, infrared, or other type of sensor, such as one which transmits a signal that is detected by a voltage impedance circuit.
  • The transducer position signal 50 produced by the detector 48 may be used by the ultrasound system or coupled to the interventional device system when useful for the formation of spatially coordinated images containing information from both systems.
  • The partially beamformed signals produced by the microbeamformers 44 are coupled to a beamformer 52, where the beam formation process is completed.
  • The resultant coherent echo signals along the beams are processed by filtering, amplitude detection, Doppler signal detection, and other processes by a signal processor 54.
  • The echo signals are then processed into image signals in the coordinate system of the probe (r, θ, φ, for example) by an image processor 56.
  • The image signals are converted to a desired image format (x, y, z Cartesian coordinates, for example) by a scan converter 58.
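The coordinate transform at the heart of scan conversion can be sketched minimally as follows. This assumes the ISO physics convention for (r, θ, φ) (polar angle measured from the beam axis, azimuth in the transverse plane), which is an assumption for illustration; a real scan converter also resamples and interpolates the echo data onto the Cartesian grid:

```python
import math

def spherical_to_cartesian(r: float, theta: float, phi: float) -> tuple:
    """Convert probe-relative spherical coordinates (r, theta, phi) to
    Cartesian (x, y, z). theta is the polar angle from the z axis and
    phi the azimuth in the x-y plane (ISO physics convention)."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)

# An echo sample 50 mm deep, straight along the beam (z) axis:
x, y, z = spherical_to_cartesian(50.0, 0.0, 0.0)  # -> (0.0, 0.0, 50.0)
```

Every (r, θ, φ) echo sample is mapped this way before being written into the Cartesian image volume used by the volume renderer.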
  • The three-dimensional image data is coupled to a volume renderer 60, which renders a three-dimensional view of the volumetric region 12 as seen from a selected look direction.
  • Volume rendering is well known in the art and is described, e.g., in U.S. Pat. No. 5,474,073, which is incorporated by reference herein in its entirety. Volume rendering may also be performed on image data which has not been scan converted.
  • For 2D imaging, the image plane data bypasses the volume renderer and is coupled directly to a video processor 62, which produces video drive signals compatible with the requirements of the display 64.
  • The volume rendered 3D images are also coupled to the video processor 62 for display.
  • The system can display individual volume rendered images or a series of volume rendered images.
  • For example, two volume renderings can be done of a volumetric data set from slightly offset look directions, and the two are displayed together to provide a perception of depth.
  • A graphics and registration processor 66 is used for analysis and registration of images, such as the registration of two ultrasound images, an ultrasound image with a non-ultrasound image, or two ultrasound images and a non-ultrasound image.
  • The graphics and registration processor 66 can receive images and data associated with a treatment plan for the ablation procedure, including an expected ablation region for the treatment. The treatment plan and the expected ablation region can also be registered with the other images.
  • The graphics and registration processor 66 receives either scan converted or non-scan converted image data.
  • The ultrasound imaging described above can be performed with a freehand approach or with an interventional device system such as the PercuNav system, elements of which are shown in FIGURE 2.
  • the PercuNav system provides imaging tools to assist clinicians in ablation procedures. It combines electromagnetic tracking of flexible or rigid instruments with patient images from multiple modalities (e.g., CT, MRI, PET and/or ultrasound) to create a real-time 3D map of the patient space that displays the instrument position, orientation, and trajectory, as well as anatomical landmarks. This map helps guide physicians to areas of interest, even when they are small, hard to visualize, difficult to access, or close to sensitive organs, vessels, or tissue. Furthermore, display of corresponding data from multiple imaging modalities can be overlaid and side-by-side images with areas of interest can be automatically marked on images. Easy localization and comparison of hard-to- find or ambiguous ultrasound targets can also be conducted by referring to related CT or MR images with corresponding areas of interest marked on the different modality images.
  • The PercuNav system has a field generator 68 which radiates an electromagnetic field permeating the site of the procedure and the surrounding space.
  • Sensors 46 located on the ultrasound probe 10, the ablation probe 24, and the patient (not shown) interact with the electromagnetic field and produce signals used to calculate the position and orientation of the 2D image plane of the ultrasound transducer, the tissue ablation probe, and the patient.
  • A coordinate generator 70 of the PercuNav system receives the transducer position signal 50 and signals from the ablation apparatus; orientation coordinates for the image plane of the probe are also coupled to the field generator for field registration purposes.
  • Coordinate information of the ablation probe and image plane is coupled to the graphics and registration processor 66, which produces graphics in response to operator control signals from the control panel 36 and uses the positional information provided by the PercuNav system to register the various images in accordance with implementations of the present invention.
  • The present invention also provides methods of identifying insufficiently treated portions of a lesion after treatment.
  • The methods can include, for example: generating a non-ultrasound image of a target region of tissue comprising the lesion; acquiring a first ultrasound image comprising the lesion; registering the first ultrasound image with the non-ultrasound image; treating tissue that comprises at least a portion of the lesion; acquiring a second ultrasound image comprising the treated tissue; and determining if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
  • FIGURE 3 is a flow chart showing the workflow 72 of an implementation of the present invention.
  • This workflow 72 begins with a step 74 that includes obtaining a non-ultrasound image, such as a CT, MR, and/or PET image that includes three-dimensional data of imaged tissue of a patient.
  • The non-ultrasound images can be acquired as part of the interventional procedure, or the CT, MR, and/or PET images can be acquired prior to the interventional procedure and uploaded for display in the system.
  • The field of view for the non-ultrasound image includes a lesion of interest (e.g., a liver tumor) that can be processed using known methods to generate a region of interest identifying an accurate representation of a lesion boundary in 2D or 3D.
  • The lesion boundary is used to accurately determine whether the entire lesion is sufficiently treated during a treatment procedure.
  • The non-ultrasound image also includes surrounding tissue that is outside a treatment volume (e.g., an ablation volume) in the vicinity of the lesion.
  • The surrounding tissue can include, e.g., tissue structures and/or blood vessels that will provide accurate registration of later-acquired ultrasound images with the non-ultrasound images.
  • Automated image-based systems for registering ultrasound volumes of tissue are well known. However, current algorithms that reference the lesion for registration will be confused or compromised by changes in the tissue due to ablation. These changes are, in fact, the very changes in the tissue that are detected and used to determine whether further treatment is needed.
  • Tissue that is outside the ablation region is therefore used by the registration algorithm for more accurate registration of images acquired before and after ablation, because the tissue outside the ablation region is unaffected. Identifying the tissue outside the ablation region can be done manually, through user selection of tissue spatially removed from the lesion, or via the expected ablation region defined by a treatment plan that is determined using the non-ultrasound images.
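A toy illustration of registering on tissue outside the ablation region, assuming 2D images, integer translations only, and a mean absolute difference score evaluated only on unaffected pixels (real systems use far more sophisticated 3D registration; all names here are hypothetical):

```python
import numpy as np

def register_outside_ablation(fixed, moving, ablation_mask, max_shift=3):
    """Find the integer (dy, dx) shift of `moving` that best matches `fixed`,
    scoring only pixels where ablation_mask is False, i.e. tissue that the
    treatment did not change. Returns the best shift."""
    best_shift, best_score = (0, 0), np.inf
    valid = ~ablation_mask
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            score = np.abs(fixed - shifted)[valid].mean()
            if score < best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# Synthetic example: the post-ablation image is shifted by (2, 1) and its
# ablated centre has been altered, so it cannot be used for matching.
rng = np.random.default_rng(0)
fixed = rng.random((32, 32))
moving = np.roll(np.roll(fixed, 2, axis=0), 1, axis=1)
mask = np.zeros((32, 32), dtype=bool)
mask[12:20, 12:20] = True      # expected ablation region (in fixed coords)
moving[14:22, 13:21] = 0.0     # ablation changed this tissue
shift = register_outside_ablation(fixed, moving, mask)  # -> (-2, -1)
```

Because the altered region is excluded from the score, the search still recovers the true shift even though the treated tissue no longer matches the pre-treatment image.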
  • A treatment plan can also be used to model how to treat the lesion.
  • The treatment plan depicts a predicted tissue volume to be treated.
  • The treatment plan and the expected tissue treatment (e.g., ablation) volume can also be registered with the non-ultrasound image and superimposed over ultrasound images of the patient.
  • Step 76 includes acquiring an ultrasound image that includes the lesion and surrounding tissue that is outside the treatment volume.
  • The ultrasound image can be acquired in 2D, or it can include 3D image data of the lesion that is processed and rendered with the volume renderer 60 to generate a 3D volume of the lesion.
  • The non-ultrasound image and the ultrasound image are registered and fused in step 78, thereby overlaying the 3D ultrasound image volume of the lesion with the lesion imaged with the non-ultrasound modality.
  • At this point, registration can be conducted using the lesion (as no tissue changes have yet occurred due to ablation) or by using tissue features surrounding the lesion, such as those outside the treatment volume.
  • Step 80 includes treating (e.g., ablating) the tissue that includes the lesion. Depending on the volume of the ablation, some or all of the lesion will be ablated.
  • A second ultrasound image is acquired after tissue ablation in step 82.
  • The second 3D ultrasound image can be acquired immediately or after a specified duration, such as a duration needed for dispersal of any gases associated with the treatment, such as gases generated with RF ablation.
  • The second ultrasound image also includes tissue that is outside of the predicted treatment volume. This ensures that the tissue within the treatment volume is registered accurately even though it has been affected by the treatment.
  • Step 84 includes determining if any portion of the lesion remaining after ablation was insufficiently treated.
  • The determination can be performed in a variety of ways that use the non-ultrasound image to provide a more accurate representation of the lesion boundary.
  • Registration of the ultrasound images, non-ultrasound images, and/or the treatment plan can be accomplished in different orders. But registration with the non-ultrasound image provides accurate data for defining the lesion boundary, and therefore allows for a more accurate determination of any remaining lesion tissue.
  • In one approach, the non-ultrasound image is registered with the first ultrasound image.
  • The second ultrasound image can be registered with the non-ultrasound image, the first ultrasound image, or both.
  • In each case, the registration algorithm can use tissue features outside the treatment volume.
  • In some embodiments, the treatment plan is also registered with one or more of the images and is used to define tissue that is outside the treatment volume.
  • Differences between treated and untreated tissue can be identified using, e.g., differences in elastography imaging, differences in image parameter analysis (e.g., grayscale differences), and/or differences identified with flow-based imaging.
  • Tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that were not ablated.
  • Rendered ultrasound volumes can be compared between the pre-ablation lesion and the ablated tissue.
  • The two rendered volumes of tissue can be subtracted from each other to highlight tissue that has changed as a result of the treatment.
  • A 3D volume of the lesion can be compared with a 3D volume of the treated tissue.
  • The lesion border, more accurately defined by the non-ultrasound modality, can be used to determine whether the treated (e.g., ablated) tissue (as determined by the pre- and post-ablation comparison algorithm) extends beyond the border of the target lesion in all orientations. In this way, the clinician can be confident that the entire lesion has been treated or, if not, can continue with the treatment for a further period of time. Any tissue that lies within the lesion border that has not been highlighted may then be considered as tissue that was not sufficiently ablated and that may require further ablation. If there is any lesion tissue remaining, then the steps can be repeated until the entire lesion has been fully treated.
  • In some cases, the treated tissue volume may be larger than the lesion volume, which would indicate that all of the lesion was treated and no further treatment is needed.
  • In operation, the ultrasound systems can perform any of the following steps: receive a non-ultrasound image of a target region comprising a lesion in a patient; acquire a first ultrasound image comprising the lesion; register the first ultrasound image with the non-ultrasound image; acquire a second ultrasound image comprising a region of treated tissue, the region of treated tissue comprising at least a portion of the lesion in the patient; and determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
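The sequence of operations above can be sketched as a simple control loop. All callables and the shrinking-lesion stub below are hypothetical stand-ins for the imaging and treatment subsystems, not part of the patent:

```python
def monitor_ablation_treatment(non_us_image, acquire_ultrasound, register,
                               treat, find_untreated, max_rounds=5):
    """Sketch of the monitoring loop: image, register, treat, re-image,
    compare, and repeat until no untreated lesion tissue remains.
    Returns the number of treatment rounds used, or None if incomplete."""
    first_us = acquire_ultrasound()
    register(first_us, non_us_image)
    for round_num in range(1, max_rounds + 1):
        treat()
        second_us = acquire_ultrasound()
        register(second_us, non_us_image)
        if not find_untreated(first_us, second_us, non_us_image):
            return round_num  # lesion fully treated after this many rounds
    return None

# Toy stubs: the "lesion" shrinks by 2 units per treatment round.
state = {"lesion": 5}
result = monitor_ablation_treatment(
    non_us_image=object(),
    acquire_ultrasound=lambda: state["lesion"],
    register=lambda a, b: None,
    treat=lambda: state.update(lesion=max(0, state["lesion"] - 2)),
    find_untreated=lambda first, second, ref: second > 0,
)
```

The loop mirrors the workflow of FIGURE 3: the non-ultrasound image stays fixed as the boundary reference while the treat/re-image/compare cycle repeats until the residual check passes.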
  • FIGURE 4 illustrates an embodiment of registering an ultrasound image 86 with a non-ultrasound image 88, such as an MRI image.
  • the ultrasound image 86 includes tissue features 90 that are outside of a region surrounding the lesion 92, i.e., tissue features that are spatially removed from the lesion.
  • the ultrasound and non-ultrasound images may be anatomically aligned in the same orientation and overlaid. Registration may be done using known image fusion techniques, such as the image fusion capability available on the PercuNav image guidance system from Philips Healthcare of Andover, MA.
  • Image matching techniques may also be used, such as those used to stitch digital photographs together to form a panoramic image or those used in medical diagnostic panoramic imaging, in which a sequence of images are stitched together as they are acquired.
  • Common image matching techniques use block matching, in which arrays of pixels from two images are compared to find the offset that yields a minimum sum of absolute differences (MSAD) fit between them. These techniques are useful for both 2D and 3D medical images, as described in US Pat. 6,442,289 (Olsson et al.), which is incorporated herein by reference.
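The MSAD block-matching idea can be illustrated with a minimal sketch, assuming grayscale images held in NumPy arrays; `msad_match` and its candidate-offset interface are illustrative names, not functions from the cited patent.

```python
import numpy as np

def msad_match(ref_block, search_image, candidates):
    """Return the top-left offset in search_image whose block minimizes the
    sum of absolute differences (SAD) against ref_block -- the MSAD fit."""
    h, w = ref_block.shape
    best_pos, best_sad = None, np.inf
    for (r, c) in candidates:
        block = search_image[r:r + h, c:c + w]
        sad = np.abs(block.astype(np.int64) - ref_block.astype(np.int64)).sum()
        if sad < best_sad:
            best_sad, best_pos = sad, (r, c)
    return best_pos, best_sad
```

For example, extracting an 8x8 patch from a random image and searching all valid offsets recovers the patch's original position with a SAD of zero; the same idea extends to 3D blocks for volume registration.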
  • FIGURES 5A and 5B illustrate an implementation of comparing ultrasound images of a lesion to determine whether further treatment is needed after one treatment (e.g., an ablation treatment).
  • ultrasound imaging data from a first ultrasound scan 94 can be used to render a 3D volume 96 of a lesion of interest.
  • a second 3D volume of the treated tissue volume 98 can be rendered from ultrasound imaging data from a second scan 100.
  • the tissue of the lesion and the treated volume can be compared using a variety of techniques that respond differently to treated vs. untreated tissue, such as using ultrasound contrast agents that show up differently in treated vs. untreated tissue.
  • tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that were not treated.
  • the volume of the lesion is larger than the treated tissue volume, which showed different characteristics than the untreated lesion tissue.
  • the portion having similar tissue characteristics is shown in black with respect to the lesion and can readily be determined for 2D images or 3D volumes.
  • the accurate location of the insufficiently treated tissue is identified by registering the compared ultrasound data with a non-ultrasound image, such as an MRI image 88.
  • non-ultrasound images can have higher contrast to show a lesion boundary more clearly.
  • registering the ultrasound images and/or compared ultrasound image data with the MRI image provides the physician with guidance on which region of a lesion will need further treatment (e.g., ablation).
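The volume comparison described above can be sketched as a subtraction of registered pre- and post-treatment volumes, with the MRI-derived lesion boundary used to isolate residual lesion tissue. This is a minimal sketch under stated assumptions: all three arrays share one registered voxel grid, and the intensity `threshold` of 20 is an arbitrary illustrative value, not a parameter from the disclosure.

```python
import numpy as np

def highlight_changes(pre_volume, post_volume, lesion_mask, threshold=20):
    """Subtract the registered pre- and post-treatment volumes; voxels inside
    the lesion boundary whose echo intensity did not change are flagged as
    candidate insufficiently treated tissue."""
    diff = np.abs(post_volume.astype(np.int16) - pre_volume.astype(np.int16))
    changed = diff > threshold
    treated = changed & lesion_mask       # lesion tissue that responded
    untreated = lesion_mask & ~changed    # lesion tissue left unchanged
    return treated, untreated

# toy volumes: treatment altered only half of a 4x4x4 lesion cube
pre = np.zeros((8, 8, 8), dtype=np.uint8)
post = pre.copy()
lesion = np.zeros((8, 8, 8), dtype=bool)
lesion[2:6, 2:6, 2:6] = True
post[2:6, 2:6, 2:4] = 100             # intensity change in treated half
treated, untreated = highlight_changes(pre, post, lesion)
```

Here the `untreated` mask plays the role of the "black" residual region: any nonzero voxel in it marks lesion tissue that may require a further treatment pass.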

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Otolaryngology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Plasma & Fusion (AREA)
  • Quality & Reliability (AREA)
  • Vascular Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The present invention relates to methods and systems for monitoring the treatment of lesions using a combination of ultrasound imaging and another, non-ultrasound imaging modality, such as CT, MRI and/or PET. An image of a lesion is acquired using a modality other than ultrasound. A first ultrasound image of the lesion is generated and registered with the MRI, CT and/or PET images. The lesion is treated, and a second ultrasound image is generated to identify the volume of the treated tissue. The two ultrasound images are registered with the non-ultrasound image to better identify the tissue that lies within a boundary of the lesion. The registered first and second ultrasound images are compared to determine whether any lesion tissue remains within the lesion boundary in the non-ultrasound image, thereby identifying lesion tissue that may be insufficiently treated.
PCT/IB2014/066537 2013-12-13 2014-12-03 Imaging methods and systems for monitoring treatment of tissue lesions WO2015087203A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361915657P 2013-12-13 2013-12-13
US61/915,657 2013-12-13

Publications (1)

Publication Number Publication Date
WO2015087203A1 true WO2015087203A1 (fr) 2015-06-18

Family

ID=52350159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/066537 WO2015087203A1 (fr) 2013-12-13 2014-12-03 Imaging methods and systems for monitoring treatment of tissue lesions

Country Status (1)

Country Link
WO (1) WO2015087203A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5474073A (en) 1994-11-22 1995-12-12 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic scanning for three dimensional display
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
WO2002009588A1 * 2000-08-01 2002-02-07 Tony Falco Method and apparatus for lesion localization, definition and verification
US6375617B1 (en) 2000-08-24 2002-04-23 Atl Ultrasound Ultrasonic diagnostic imaging system with dynamic microbeamforming
US6442289B1 (en) 1999-06-30 2002-08-27 Koninklijke Philips Electronics N.V. Extended field of view ultrasonic diagnostic imaging
US6723050B2 (en) 2001-12-19 2004-04-20 Koninklijke Philips Electronics N.V. Volume rendered three dimensional ultrasonic images with polar coordinates
WO2005010711A2 * 2003-07-21 2005-02-03 Johns Hopkins University Robotic 5-dimensional ultrasound system and method
US20050261591A1 (en) * 2003-07-21 2005-11-24 The Johns Hopkins University Image guided interventions with interstitial or transmission ultrasound
WO2013179221A1 * 2012-05-29 2013-12-05 Koninklijke Philips N.V. Elasticity imaging-based methods for improved gating efficiency and dynamic margin adaptation in radiation therapy

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11419682B2 (en) 2016-11-11 2022-08-23 Gynesonics, Inc. Controlled treatment of tissue and dynamic interaction with, and comparison of, tissue and/or treatment data
JP2020518385A (ja) * 2017-05-04 2020-06-25 Gynesonics, Inc. Methods for monitoring ablation progress with Doppler ultrasound
EP3638126A4 (fr) * 2017-05-04 2021-03-10 Methods for monitoring ablation progress with Doppler ultrasound
US11612431B2 (en) 2017-05-04 2023-03-28 Gynesonics, Inc. Methods for monitoring ablation progress with doppler ultrasound
WO2023222845A1 (fr) * 2022-05-20 2023-11-23 Koninklijke Philips N.V. Multi-modality image visualization for stroke detection
WO2024116002A1 (fr) * 2022-11-29 2024-06-06 Biosense Webster (Israel) Ltd. Tissue ablation assessment using an intracardiac ultrasound catheter

Similar Documents

Publication Publication Date Title
US8075486B2 (en) Enhanced ultrasound image display
CN107072736B (zh) 计算机断层扫描增强的荧光透视系统、装置及其使用方法
Wein et al. Automatic CT-ultrasound registration for diagnostic imaging and image-guided intervention
JP5345275B2 (ja) 超音波データと事前取得イメージの重ね合わせ
JP5265091B2 (ja) 2次元扇形超音波イメージの表示
JP4795099B2 (ja) 超音波を用いた電気解剖学的地図と事前取得イメージの重ね合わせ
JP5622995B2 (ja) 超音波システム用のビーム方向を用いたカテーテル先端部の表示
US10299753B2 (en) Flashlight view of an anatomical structure
JP2008535560A (ja) 身体ボリュームにおける誘導介入的医療デバイスのための3次元イメージング
KR20080053224A (ko) 초음파 데이터 획득을 표시하기 위한 전기해부학적 맵의컬러화
WO2005092198A1 (fr) Systeme pour guider un instrument medical dans le corps d'un patient
JP2006305358A (ja) 超音波輪郭再構築を用いた3次元心臓イメージング
CA2796067A1 (fr) Systemes et procedes d'imagerie amelioree d'objets dans image
JP2006305359A (ja) 超音波輪郭再構築を用いた3次元心臓イメージングのためのソフトウエア製品
WO2014031531A1 (fr) Système et procédé de procédures médicales guidées par des images
Mohareri et al. Automatic localization of the da Vinci surgical instrument tips in 3-D transrectal ultrasound
JP6088653B2 (ja) アブレーション治療のための超音波体積流量測定
WO2015087203A1 (fr) Imaging methods and systems for monitoring treatment of tissue lesions
Shahin et al. Ultrasound-based tumor movement compensation during navigated laparoscopic liver interventions
AU2013251245B2 (en) Coloring electroanatomical maps to indicate ultrasound data acquisition
Neshat et al. Development of a 3D ultrasound-guided system for thermal ablation of liver tumors
Caskey et al. Electromagnetically tracked ultrasound for small animal imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14827540

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14827540

Country of ref document: EP

Kind code of ref document: A1