WO2013134782A1 - Photoacoustic tracking and registration in interventional ultrasound - Google Patents

Photoacoustic tracking and registration in interventional ultrasound

Info

Publication number
WO2013134782A1
WO2013134782A1 (PCT/US2013/030273)
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
optical
registration
tracking system
image
Prior art date
Application number
PCT/US2013/030273
Other languages
French (fr)
Inventor
Emad M. Boctor
Russell H. Taylor
Jin U. KANG
Original Assignee
The Johns Hopkins University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Johns Hopkins University
Priority to US14/381,374 (US10758209B2)
Publication of WO2013134782A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0084 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
    • A61B5/0093 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A61B5/0095 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/02 Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computerised tomographs
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A61B8/13 Tomography
    • A61B8/14 Echo-tomography
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/4254 Details of probe positioning or probe attachment to the patient involving determining the position of the probe using sensors mounted on the probe
    • A61B8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4416 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A61B8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/445 Details of catheter construction
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2034/2046 Tracking techniques
    • A61B2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2090/306 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure, using optical fibres
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body, augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A61B2090/373 Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A61B2090/3937 Visible markers
    • A61B2090/3945 Active visible markers, e.g. light emitting diodes

Definitions

  • the field of the currently claimed embodiments of this invention relates to intraoperative registration and tracking systems, and more particularly to intraoperative registration and tracking systems that use photoacoustic tracking and registration in interventional ultrasound.
  • IOUS imaging can be extremely useful in surgery and other interventional procedures.
  • Ultrasound (US) systems provide real time 2D and 3D views into the patient's anatomy with relatively unobtrusive hardware and without the use of ionizing radiation.
  • An increasing variety of transcutaneous, laparoscopic, and catheter-based probes including both 2D and 3D arrays are available from multiple vendors; image quality is rapidly improving; and advanced image processing techniques are increasing the variety of information that can be provided to the surgeon. Consequently, there is increasing interest in using IOUS in both open and laparoscopic surgery [8-10], in providing guidance for biopsies [11], tumor ablation therapy [12, 13], brachytherapy [14], and recently robotic surgery [15, 16].
  • IOUS: intraoperative ultrasound
  • preoperative information such as CT or MRI images
  • computer-based information fusion and imaging systems have significant potential to overcome this limitation, but they should address three main challenges to be useful in minimally invasive surgery.
  • An intraoperative registration and tracking system includes an optical source configured to illuminate tissue intraoperatively with electromagnetic radiation at a substantially localized spot so as to provide a photoacoustic source at the substantially localized spot, an optical imaging system configured to form an optical image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the optical image, an ultrasound imaging system configured to form an ultrasound image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the ultrasound image, and a registration system configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.
  • FIG. 1 is a schematic illustration of a typical EM-based laparoscopic IOUS guidance and visualization system [3].
  • FIG. 2 is a schematic illustration of an intraoperative registration and tracking system according to an embodiment of the current invention.
  • FIGS. 3-5 are schematic illustrations of portions of an intraoperative registration and tracking system according to another embodiment of the current invention.
  • FIG. 6 is a schematic illustration of a phantom design to test some embodiments of the current invention.
  • FIG. 7 shows tests of an example for an embodiment of the current invention:
  • FIG. 8 shows an example of video overlay of preoperative model based on
  • FIGS. 9A and 9B are schematic illustrations for identification of tines using (FIG. 9A) flood illumination and (FIG. 9B) local illumination from fibers in the tool.
  • FIGS. 10A and 10B show A) the experimental setup and video overlay, and B) the PA signal image.
  • FIGS. 11A-11C show workflow for A) Data Collection, B) SC Segmentation, and C) US Segmentation.
  • FIG. 12 shows a typical imaging configuration of the dual-array TRUS probe.
  • FIG. 13 shows a photoacoustic tracking feasibility study: (left) photograph of the scanned laser, US, and stereo camera setup in a typical trial; (middle, right) PA image reconstructed from pseudo-spectral analysis and the k-space Green's function method for one spot.
  • FIG. 14 shows visualization of brachytherapy seeds in ex-vivo dog prostate using system.
  • FIG. 15 is a schematic illustration of an intraoperative registration and tracking system according to an embodiment of the current invention.
  • the terms "light” and “optical” are intended to have a broad meaning. They can include, but are not limited to, the visible regions of the electromagnetic spectrum. They can include nonvisible regions of the electromagnetic spectrum such as infrared and ultraviolet light, and even x-ray and microwave regions. As long as the electromagnetic radiation can deposit a localized spot of energy that generates ultrasound, and the spot can be detected along with a corresponding image, it can be included in some embodiments.
  • the term "thermoacoustic" is intended to have a broad definition, which can include photons at any energy suitable for the particular application that deposit energy that generates an acoustic signal in a body of interest. This is intended to be sufficiently broad to include photons of microwave energy.
  • thermoacoustic effect is often used with reference to microwave energies.
  • photoacoustic as used herein is intended to include thermoacoustic in the broad definition.
  • the term "body" refers generally to a mass, and not specifically to a human or animal body. In some applications, the body of interest can be a human or animal organ, or a portion thereof.
  • the term "spot" is intended to have a broad meaning. It can be point-like or a small circular or oval shape. However, it can also be a pattern, such as, but not limited to, an X shape, a V shape, a Z shape, an N shape, etc.
  • substantially localized spot means a spot of a size and of defined boundaries sufficient for the particular application. (In the case of a pattern, the localization can be with respect to one sub-feature of the pattern.) For example, most surgeries may require spot sizes from 0.5 to 2 mm. However, some surgeries may require more precision than other surgeries and the imaging geometries may vary. Consequently, the general concepts of the current invention are not limited to particular sizes and location precision of the spots.
  • interstitial means to be inserted into tissue, such as, but not limited to, a needle inserted into tissue with the inserted tip being surrounded by the tissue.
  • the term "real-time” is intended to mean that the images can be provided to the user during use of the system. In other words, any noticeable time delay between detection and image display to a user is sufficiently short for the particular application at hand. In some cases, the time delay can be so short as to be unnoticeable by a user.
  • 3DPA: 3D photoacoustic images
  • 3DUS: 3D ultrasound images or the overall system.
  • Since the same transducer can be used for both and both have the same coordinate system, we can use "3DUS coordinates" and "3DPA coordinates" interchangeably.
  • Intraoperative ultrasound (IOUS) imaging is being used increasingly as a guidance modality in interventional procedures - including open and laparoscopic surgery [8, 24], local ablations [13, 25-27], brachytherapy [14], intracardiac procedures [28, 29], etc.
  • IOUS images require the surgeon to relate them to the physical reality of the patient's anatomy and the motions of surgical tools.
  • IOUS images are increasingly "fused” or registered to preoperative data such as CT or MRI images or surgical planning (e.g., [17, 30]), and the results are used to help guide an intervention. Again, the surgeon must relate the registered images to the physical coordinate system associated with the patient's anatomy and other interventional tools.
  • Robotic systems such as the DaVinci surgical robot [31] typically relate the motions of tools to a "visualization" coordinate system associated with a video endoscope and provide means for displaying other images, but rely on the surgeon to relate this data to the video images.
  • EM systems are the most commonly used tracking systems for laparoscopic surgery, flexible endoscopy, and other MIS applications.
  • EM sensors may be attached to tools, IOUS probes, and cameras, and may be implanted into moving or deforming organs for use as fiducial landmarks (Fig. 1).
  • LPN: laparoscopic partial nephrectomies
  • EM sensors have a number of drawbacks that make them less than ideal for use in laparoscopic surgery, especially if a robot is involved.
  • EM systems have problems that apply more broadly to any laparoscopic or MIS environment. These include: intrusive integration of a large EM "base station" and other equipment into the surgical field; field distortion from other metallic objects or during electrocautery and ablation treatment; interference between tools working in close proximity to each other; dealing with EM sensor wire leads; increased cost in sterilizing tools with associated sensors; and overall cost-effectiveness, especially when we include the needed pre-operative calibration and preparation time for ultrasound probes, cameras, and tools.
  • Optical tracking systems such as [38, 39] avoid the field distortion problems associated with EM devices and may be slightly more accurate, but line of sight restrictions often make them impractical for use inside the body, and placing them outside the body can result in degraded tracking accuracy for long or flexible tools or laparoscopic imaging probes, even if the various tracking targets are all visible to the tracker.
  • PA imaging is based on the photoacoustic effect, originally discovered by Alexander Graham Bell.
  • an object is usually irradiated by a short-pulsed, non-ionizing laser beam.
  • Some of the delivered energy is absorbed, according to optical absorption properties of the target tissue or other material, and converted into heat, leading to transient thermoelastic expansion and thus wideband ultrasonic emission, which can be detected by conventional IOUS probes and processed to produce high contrast images. Since the effect is sensitive to tissue density, composition and properties such as hemoglobin oxygen saturation, it is useful both for anatomic and functional imaging.
  • Some embodiments of the current invention use PA methods to replace navigational trackers in ultrasound-guided surgery and other minimally invasive interventions, together with the systems and techniques for doing this.
  • Some embodiments use PA imaging to perform real-time registration between the ultrasound and video image spaces. This approach does not require a base station and does not require implanted fiducial markers to complete the video-to-ultrasound registration needed for providing "augmented reality" guidance and information support in endoscopic surgery. Further, since the image-to-image registration is more direct than tracker-based methods, it can be less likely to be affected by extraneous errors and can provide significant accuracy advantages [1].
  • small implanted PA fiducial markers can also be used to track tissue and organ motion after an initial registration. This can eliminate any dependence on implanted EM fiducials for tracking. This also can provide other advantages such as simplicity, directness, and accuracy in some embodiments.
  • the PA markers may themselves be used as registration fiducials in cases where they may have been implanted for preoperative imaging. Clinical investigators often perform tumor biopsy before operation. Small FDA approved particles can be injected to facilitate several follow up treatment steps including precise repeated biopsy, post-operative imaging and accurate guidance during surgery. Small particles within the resection margin need to be resected with the tumor and hence another use of PA imaging is to detect these markers after resection to assure negative margins.
  • Some embodiments of the current invention can include a safety system.
  • laser light is delivered through optical fibers into the patient's body for laparoscopic surgery. It is desirable to ensure that no laser light escapes from the patient's body, or at least that any escaping light remains below a safe threshold.
  • additional light can be added to the optical path, such as the optical fiber.
  • the optical fiber can be a multimode optical fiber, for example, to be able to transmit a plurality of beams of light at relatively spaced wavelengths.
  • a low energy infrared light source from an LED or laser may be used as sort of a guide channel, or safety signal. Detectors placed outside the body can detect this monitoring light even in cases when the higher power laser used to create the PA effect is turned off. Safety circuits or other safety monitoring devices can prevent the higher power laser from being turned on if the monitoring light is detected and/or a suitable warning signal such as an audible alarm can be triggered.
  • the monitoring light may be modulated at a known frequency, or with other suitable modulation, and matched to suitable detection circuits to increase sensitivity.
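As an illustration of the matched-detection idea above, the sketch below demodulates a simulated photodiode signal at the known modulation frequency of the monitoring light and gates the high-power laser on the result. The sampling rate, modulation frequency, threshold, and interlock logic are assumptions for illustration, not values or circuits specified in this disclosure.

```python
# Hedged sketch: software lock-in style detection of a modulated safety beacon.
# All numeric values below are illustrative assumptions.
import numpy as np

FS = 50_000        # photodiode sampling rate [Hz] (assumed)
F_MOD = 1_000      # known modulation frequency of the monitoring light [Hz] (assumed)
THRESHOLD = 0.01   # detection threshold on demodulated amplitude (assumed)

def monitoring_light_detected(samples: np.ndarray) -> bool:
    """Demodulate the photodiode signal at the known modulation frequency.

    Multiplying by quadrature references and averaging rejects ambient light
    and broadband noise, mimicking a matched (lock-in) detection circuit.
    """
    t = np.arange(samples.size) / FS
    i = np.mean(samples * np.cos(2 * np.pi * F_MOD * t))
    q = np.mean(samples * np.sin(2 * np.pi * F_MOD * t))
    amplitude = 2.0 * np.hypot(i, q)
    return amplitude > THRESHOLD

# Example interlock logic: keep the high-power pulsed laser disabled whenever
# the external detector still sees the modulated guide light escaping the body.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(5000) / FS
    leak = 0.05 * np.sin(2 * np.pi * F_MOD * t) + 0.02 * rng.standard_normal(t.size)
    if monitoring_light_detected(leak):
        print("Leakage detected: keep high-power laser disabled and raise alarm.")
```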
  • Figure 2 provides a schematic illustration of an intraoperative registration and tracking system 100 according to an embodiment of the current invention.
  • the intraoperative registration and tracking system 100 includes an optical source 102 configured to illuminate tissue 104 intraoperatively with electromagnetic radiation at a substantially localized spot (e.g., 106) so as to provide a photoacoustic source at the substantially localized spot, and an optical imaging system 108 configured to form an optical image of at least a portion of the tissue 104 and to detect and determine a position of the substantially localized spot (e.g., corresponding to 106) in the optical image.
  • the intraoperative registration and tracking system 100 also includes an ultrasound imaging system 110 configured to form an ultrasound image of at least a portion of the tissue 104 and to detect and determine a position of the substantially localized spot (e.g., corresponding to 106) in the ultrasound image; and a registration system 112 configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.
  • the optical source 102 can be configured to provide pulsed radiation and/or continuous wave radiation in some embodiments.
  • the pulsed radiation can be provided by a pulsed laser, as is illustrated in the example of Figure 2.
  • the optical source 102 can include one or more pulsed lasers.
  • the general concepts of the current invention are not limited to only the use of lasers.
  • a pulsed laser that is selectively frequency doubled can be used. This can permit the selective use of two different illuminating wavelengths.
  • Some embodiments can use two or more pulsed lasers, each of which could be selectively frequency doubled.
  • the optical source 102 can include a tunable pulsed laser to permit greater flexibility in selecting illumination wavelengths.
  • the pulsed radiation can be electromagnetic radiation at any suitable wavelength for the particular application. In many applications, pulsed radiation within the visible, infrared and/or ultraviolet wavelength ranges is suitable. However, the broad concepts of the current invention are not limited to pulsed sources only in these ranges. As long as the radiation provides photons that can be absorbed to provide an acoustic response with a sufficiently small focus for the application, it could be used in some embodiments.
  • the optical source 102 can include additional LEDs and/or lasers in some embodiments. For example, one or more continuous wave (cw) lasers can be included to provide illumination for the optical imaging system 108 to form images of the tissue.
  • the cw laser, or lasers can be within the visible, infrared and/or ultraviolet wavelength ranges.
  • the optical source 102 can also have a source of light for a leakage detection system. This can be a dual use of one of the cw lasers for both illuminating for imaging and light leakage detection, or a separate source. In some embodiments, an infrared laser is suitable for leakage detection.
  • the optical source 102 can further include an optical fiber 114. Although one optical fiber is illustrated in Figure 2, two, three or more optical fibers can be used.
  • the optical source 102 can include optical components to selectively couple light from the one or more lasers into the same optical fiber and/or to selectable optical fibers from a plurality of optical fibers.
  • the optical fiber 114, and any additional optical fibers can be a multimode optical fiber to be able to transmit different wavelengths of light either sequentially, or concurrently. In other words, multiplexing different wavelengths at the same time can be provided, but is not required in all embodiments.
  • the general concepts of the current invention do not preclude the use of single mode optical fibers in some embodiments.
  • the optical fiber 114 is attached to a structure containing pick-up elements of a video system. This can be, but is not limited to, an endoscope, for example. Alternatively, the optical fiber 114 could be attached to a surgical tool, for example. In some embodiments, there can be a plurality of optical fibers with one or more groups attached to different objects, such as, but not limited to, a plurality of surgical tools including an endoscope.
  • the optical source 102 can illuminate the tissue at two or more spots, either in sequence, or at the same time. Optical components can be included in the optical source 102 to scan the illumination beam according to some embodiments, such as, but not limited to a scanning mirror arrangement.
  • the registration system 112 is configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.
  • the images and coordinate transformations can be of any dimensions. For example, they can be one-dimensional, two-dimensional and/or three-dimensional images and coordinate transformations.
  • the registration system 112 can be implemented on a work station as is illustrated in Figure 2. These components can be implemented through software by programming the workstation, for example. However, in other embodiments, one or more of these components could be implemented on dedicated hardware, such as, but not limited to, ASICs and/or FPGAs. In addition, the workstation can have one or more CPUs and/or GPUs. Preoperative data can also be registered with the optical and ultrasound images in some embodiments. This can be implemented on workstation 112 through software, or could be implemented through dedicated hardware. A tracking system to track surgical tools and other objects can also be implemented on workstation 112, either through software, or hardware. In some embodiments, the registering of optical images, ultrasound images and/or preoperative images and/or tracking of objects can be done in real time during an operation.
  • the intraoperative registration and tracking system 100 can include one or more displays 116, 118, according to an embodiment of the current invention.
  • the displays 116 and/or 118 can be display screens as illustrated in Figure 2, or any other suitable display.
  • the displays can be the stereo console displays.
  • the ultrasound imaging system 110 includes an ultrasound probe 120 to at least operate as an ultrasound receiver to receive signals from the PA sources.
  • the ultrasound probe 120 may operate as both an ultrasound receiver to receive signals from the PA sources, and as a transmitter and receiver to supplement ultrasound images obtained by the PA sources.
  • the ultrasound imaging system 110 can be an imaging system of any dimension. For example, it can be a one-dimensional, two-dimensional and/or three-dimensional ultrasound imaging system. In the case of a one-dimensional system, it may be viewed as a system that provides range information along a line.
  • Figures 3-5 provide a schematic illustration of a portion of an intraoperative registration and tracking system 200 according to another embodiment of the current invention.
  • a stereo camera 202 and a plurality of optical fibers 204 for the optical source 206 are attached to a turret 208.
  • the turret can be part of a robotic system, or a stand-alone system. It can also be configured for use in open surgery, for example.
  • Figures 4 and 5 also show the ultrasound probe 210 and the tissue 212.
  • Although the interventional PA techniques can be broadly applicable, in the current example we use laparoscopic partial nephrectomies (LPN) and RF ablation of tumors in solid organs such as the liver and kidney.
  • a typical workflow would be as follows: Preoperative CT would be used to plan the surgical resection. Intraoperatively, the surgeon would position the kidney so that the tumor is close to the surface facing the surgeon and a 3DUS probe would be placed on the opposite side of the kidney in a position where the tumor, surrounding tissue, and organ surface are visible in the ultrasound. PA-to-video registration would be performed continuously using a system according to an embodiment of the current invention.
  • 3DUS-to-CT registration would be performed and overlay images would be generated on ultrasound and video images, showing the segmented tumor and resection plan.
  • the surgeon may use electrocautery to mark the borders of the resection region.
  • the surgeon will place several small fiducial markers within the planned resection volume, and another CT/IOUS registration will be done.
  • the markers can be located and tracked with PA imaging concurrently with PA-video registration.
  • the tracked markers can be used to maintain registration to the registered preoperative information and to generate visualization overlays to assist the surgeon in performing the resection. We note that many variations on this workflow are possible.
  • PA-to-video registration may still be used to generate overlay images for guidance, so long as the fiducials and organ surface remain visible to the US probe.
  • PA imaging may be used to detect any markers left behind, thus indicating possible inadequate margins.
  • preoperative biopsy is performed, then the surgeon may choose to leave behind markers that may be used for intraoperative targeting and tracking.
  • a workflow for resection of liver tumors would be similar.
  • PA-to-video registration may be used to enable image overlays to assist in placing the ablation probe.
  • the targeted tumors may be located either in 3DUS or through 3DUS-to-CT registration. In the latter case, small implanted fiducials may be used to assist in real time tracking of tumor targets. Photoacoustic imaging would be used to permit the surgeon to accurately visualize the small tines of the ablation probe relative to the chosen ablation target sites.
  • the system includes 1) a laser illumination system; 2) a 3D ultrasound (3DUS) system; and 3) a video camera system, all interfaced to a PC/GPU workstation.
  • either stereo or monoscopic video may be used.
  • the general concepts of the current invention are not limited to these examples.
  • the illumination system can direct a pattern of high-energy laser pulses along with co-located lower energy visible continuous wave (CW) laser light onto the surface of an organ within the field of view of the video system.
  • the illumination system may be mounted with the camera, as shown in the figure, or may be placed independently, so long as it shines spots where they are visible from the camera. Energy absorbed from the pulsed laser spots will generate PA signals that may be imaged by the ultrasound system, thus creating a pattern of "virtual" surface fiducial markers. Since the co-located CW beams are visible at all times in the video camera, the registration between video and ultrasound coordinate systems can be performed.
  • the ultrasound images can concurrently locate implanted fiducials or other structures within the target organ, registration and tracking of preoperative images or other data relative to the video coordinate system may likewise be accomplished.
  • surgical tools or catheters can be illuminated to generate PA signals that may be used to locate them relative to ultrasound images and, hence, to the video coordinate system as well.
  • Phantoms. For these examples, we will adapt previous work [3] to create an artificial kidney phantom shown schematically in Fig. 6.
  • the phantom body can be composed of PLASTISOLTM (MF Manufacturing Co., Inc., Fort Worth, TX) with ultrasonic scatterer materials.
  • Into this, we can implant gel-based fiducial spheres simulating tumors with diameters ranging from 5 mm to 10 mm, doped to be readily visible in CT, x-ray, and conventional 3DUS images. Small, brightly colored plastic spheres can be embedded on the surface of the phantom so as to be easily located in 3DUS and video images. Small photoacoustic marker objects can also be embedded in the phantom near the tumors.
  • a fiber-based light delivery system can be used, as illustrated schematically in Fig. 2.
  • the pulsed laser light can be combined with low power continuous wave (CW) light from a separate source and transmitted via optical fiber to a beam splitter using standard optics techniques.
  • Our current Q-switched laser source has two main wavelength outputs: 1064 nm and the frequency doubled 532 nm output.
  • the 532 nm laser light is green in color and can produce a strong photoacoustic signal since it is strongly absorbed by the soft tissues and blood [53, 54], typical of most surgeries [47, 53].
  • the 1064 nm light has greater penetration depth than the 532 nm light.
  • both Hb and HbO2 have a lower optical absorption coefficient at 1064 nm compared to 532 nm [47, 53, 54].
  • 532 nm light generates PA signals at the tissue-air interface. It is known that fat tissue has a relatively higher optical absorption coefficient in the NIR range compared to 532 nm [55].
  • However, intraoperative fat is often covered with a thin layer of blood or other material that is highly responsive at 532 nm, and fat is not always present.
  • 3DUS and Photoacoustic Imaging. We can draw upon our extensive experience and systems infrastructure for interventional ultrasound research (see, e.g., our lab web sites [56, 57] and papers [1, 2, 4, 20, 22, 48, 49, 58, 59]). For this example, we have chosen to use 3D IOUS, both because it is increasingly used in interventional applications, and because some of the registration and tracking problems are more straightforward.
  • Video System and Video-US Registration. We can work with both calibrated stereo and monoscopic camera setups and software libraries [60-63].
  • the bright spots projected onto the target surface can be located in video images and matched to corresponding bright spots in 3DPA images, whose positions relative to the 3DUS probe have been determined as described above.
  • standard methods can be used to determine the 3D positions of the points relative to camera coordinates, and standard 3D-3D registration methods (e.g., [64-67]) will be used to compute the transformation F_uc between ultrasound and camera coordinates.
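A minimal sketch of one standard 3D-3D registration approach (a least-squares SVD fit) is shown below; the text cites standard methods [64-67] without mandating a particular algorithm, so this choice and the variable names are assumptions.

```python
# Hedged sketch: least-squares rigid fit (SVD / Kabsch method) between matched
# PA spot positions in 3DUS coordinates and the same spots in camera coordinates.
import numpy as np

def rigid_registration(us_pts: np.ndarray, cam_pts: np.ndarray):
    """Return R (3x3) and t (3,) such that cam ≈ R @ us + t, given Nx3 matched points."""
    us_c = us_pts - us_pts.mean(axis=0)
    cam_c = cam_pts - cam_pts.mean(axis=0)
    H = us_c.T @ cam_c                                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cam_pts.mean(axis=0) - R @ us_pts.mean(axis=0)
    return R, t

def to_homogeneous(R, t):
    """Pack R, t into a 4x4 transform (here playing the role of F_uc)."""
    F = np.eye(4)
    F[:3, :3], F[:3, 3] = R, t
    return F
```

The RMS residual between cam_pts and the transformed us_pts gives a simple registration-quality check before the transform is used for overlays.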
  • Alternatively, standard 2D-3D photogrammetric methods (e.g., [68-70]) may be used.
  • the achievable bandwidth for measuring F_uc can be limited initially by the 10 Hz pulse rate permitted by our laser system and by the volume acquisition rate of our mechanically scanned 3DUS system probes, which can acquire 60 2D images per second.
  • the laser pulses can easily be time-multiplexed to 20 and 40 Hz to provide a greater bandwidth.
  • Intraoperative ultrasound has significant potential as a "bridge" between preoperative data such as surgical plans or CT and MRI images and intraoperative navigation and visualization systems.
  • preoperative data such as surgical plans or CT and MRI images and intraoperative navigation and visualization systems.
  • real-time, continuous tracking of registered preoperative models in 3DUS and video camera coordinates can be provided. Since the example above can provide accurate registration of 3DUS to video coordinates, the remaining barrier is finding a way to determine the 3DUS coordinates of targets that may not be directly visible in subsequent 3DUS images, once they have been determined by an initial registration step. As is often the case with tumors, we will assume that these targets and the immediately surrounding tissue are not highly deformable, so that their deformation may be modeled adequately by a rigid, affine, or similar low degree-of-freedom transformation.
  • We replace the EM tracker sensors with small biocompatible metal fiducial objects that may be located readily in PA images (much more easily than in conventional US). These markers would be implanted in or near the target anatomy and localized in 3DPA images taken concurrently with the 3DUS images used for registration.
  • the tracked positions of the markers can be used to continuously update the coordinate transformation between the preoperative model and 3DUS coordinates.
  • the results from above can be used to enable graphic overlays on the video images similar to Fig. 9.
  • (F_UCT is the registration transformation).
  • the positions f° of the PA fiducials near each tumor can be determined in 3DUS coordinates from the 3DPA images.
  • Boctor "A 3D-elastography-guided system for laparoscopic partial nephrectomies", in Medical Imaging 2010: Visualization, Image-Guided Procedures, and Modeling, San Diego, Feb 13-18, 2010. pp. 762511-762511-12.
  • the Sonix DAQ device, developed in collaboration between the University of Hong Kong and Ultrasonix, and the MUSiiC toolkit [15] are used to acquire pre-beamformed radiofrequency (RF) data directly from the US machine.
  • RF radiofrequency
  • a custom-built SC system containing two CMLN-13S2C cameras (Point Grey Research, Richmond, Canada) is used to capture images to be used for 3D triangulation.
  • the synthetic phantom is made using plastisol and black dye.
  • the ex vivo liver phantom is made using a gelatin solution and a freshly resected porcine liver. The surface of the liver is partially exposed and not covered by gelatin. Alternate tests with other surfaces such as porcine kidney tissue and fat were also successful in generating a PA signal.
  • the experiment can be split into a data collection phase and a data processing phase.
  • the data collection phase outputs SC image pairs, five frames for each camera, and a 3D RF US volume for each projected laser spot. The number of frames is arbitrary.
  • the data processing phase uses the data and generates a coordinate transformation from the SC frame to the US frame.
  • Figure 10A shows the experimental setup and an overlay of a US image representation using the inverse of the transformation.
  • Figure 11A shows the workflow of the data collection phase. First, a laser spot is projected onto the exposed surface of the ex vivo liver phantom.
  • In steps 3 and 4, the 3D US transducer motor is actuated and RF data are intermittently collected from the DAQ device to scan and acquire the RF data of the volume of interest.
  • the volume's field of view is 14.7° for the ex vivo tissue phantom and 19.6° for the synthetic phantom, and the motor step size is 0.49°. This iterative process is manual, but an automatic process is in development. This workflow is repeated for each of the thirty laser spots.
  • the data processing phase involves the segmentation of the SC images into 3D SC points, the segmentation of the 3D RF US volume data into 3D US points, and the computation of the transformation from the SC frame to the US frame.
  • Figure 11B shows the workflow for SC segmentation.
  • For each camera, we pick one SC image with the laser spot and one without the laser spot.
  • the background images without the laser spot are subtracted from the images with the laser spot. This step makes it significantly easier to segment the laser spot.
  • We then apply appropriate intensity and pixel-size thresholds such that the laser spot is segmented out. These thresholds are selected based on the laser beam diameter and the phantom's reflectance.
  • Calibration files for our specific SC allow us to triangulate the segmented point from each camera and obtain a single 3D point in the SC frame. This workflow is repeated for each laser spot projection. We use thirty sets of SC images.
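A hedged sketch of the triangulation step is given below using linear (DLT) triangulation from the two calibrated cameras; the projection-matrix form of the calibration data and the function name are assumptions, not details taken from the text.

```python
# Hedged sketch: linear (DLT) triangulation of the segmented laser spot from two
# calibrated cameras. P1 and P2 are assumed 3x4 camera projection matrices.
import numpy as np

def triangulate_spot(P1: np.ndarray, P2: np.ndarray, uv1, uv2) -> np.ndarray:
    """Return the 3D spot position in the SC frame from its two image centroids."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # homogeneous least squares
    X = Vt[-1]
    return X[:3] / X[3]              # dehomogenize to a 3D point
```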
  • Figure 11C shows the workflow for US segmentation. First, for each slice of a 3D RF US volume, the RF data is beamformed using the k-wave toolbox [16] in MATLAB. The dynamic range of the image is normalized with respect to the volume to decrease the size of the PA signal seen in each volume.
  • Figure 10B shows the k-wave beamformed PA signal image.
  • An appropriate intensity and pixel size threshold is then applied to this image.
  • An ellipse is fitted on the segmented region and an intensity-weighted centroid is computed resulting in lateral and elevational coordinates.
  • the PA signal originates from the surface and any penetration into the surface.
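The thresholding and intensity-weighted centroid step described above might look like the following sketch; the threshold fraction and minimum region size are illustrative assumptions.

```python
# Hedged sketch: segment the PA spot in one beamformed slice by intensity
# thresholding, then take an intensity-weighted centroid of the segmented pixels.
import numpy as np

def pa_spot_centroid(envelope: np.ndarray, thresh_frac=0.7, min_pixels=10):
    """Return the (row, col) intensity-weighted centroid of above-threshold pixels, or None."""
    img = envelope / (envelope.max() + 1e-12)   # normalize dynamic range within the slice
    mask = img >= thresh_frac
    if mask.sum() < min_pixels:                 # reject speckle-sized responses
        return None
    rows, cols = np.nonzero(mask)
    w = img[rows, cols]
    return float(np.sum(rows * w) / w.sum()), float(np.sum(cols * w) / w.sum())
```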
  • the transformation from the SC frame to the US frame can be computed with the 3D SC and 3D US point sets. Any registration method for computing the transformation between two 3D point sets can be used.
  • One of the main reasons for using coherent point drift is that it allows for data points to be missing from either dataset.
  • An assumption that we have made is that each laser spot will be visible in the SC images and each PA signal will be visible in the US volume. This assumption is valid for our experiment, but may not hold in the surgical setting due to SC or transducer movement.
  • the coherent point drift registration algorithm allows us to acquire a registration as long as there are enough corresponding points in the SC images and the US volume.
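For illustration, the open-source pycpd package provides a rigid coherent point drift implementation that tolerates unmatched points; the call pattern below reflects that package's documented interface and is an assumption rather than part of this disclosure.

```python
# Hedged sketch: rigid coherent point drift between the stereo-camera (SC) and
# ultrasound (US) spot sets. Check the installed pycpd version's API before use.
import numpy as np
from pycpd import RigidRegistration

def register_sc_to_us(sc_points: np.ndarray, us_points: np.ndarray):
    """Align SC points (Nx3) to US points (Mx3); returns scale, rotation, translation."""
    reg = RigidRegistration(X=us_points, Y=sc_points)   # X: target set, Y: moving set
    transformed_sc, (s, R, t) = reg.register()
    return s, R, t, transformed_sc
```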
  • the transformation from the SC frame to the US frame is used to transform the 3D SC points to the US frame for validation.
  • the inverse transformation is used to display a representation of an US image into the SC frame as shown in Figure 10A.
  • 2D array US transducers can be used for a real-time implementation. These transducers can provide an acquisition rate on the order of twenty volumes per second. The 2D array transducer can also be miniaturized and placed closer to the region of interest.
  • a third issue deals with the laser delivery system. As shown in our experimental setup, a laser would have to be fired at the organ in free space. This occurrence is unlikely in practical situations.
  • a fiber delivery tool can be used that will allow us to safely guide the laser beam into the patient's body. This tool can also project concurrent laser spots, greatly enhancing our registration acquisition rate.
  • Task 1: 3DUS B-mode and PA-mode reconstruction
  • Rationale: Volumetric intraoperative ultrasound is used to fuse the preoperative MRI model to the surgical scene.
  • 3DUS data can be acquired using two different approaches.
  • One approach is to utilize a 2D ultrasound array to directly provide 3DUS B-mode data.
  • these 2D arrays are not widely available and to the best of our knowledge, there is no 2D TRUS array.
  • there are a number of mechanical probes that provide 3DUS data by wobbling a 1D array, but these are relatively slow and need customization to synchronize with a PA imaging system.
  • the second approach is to track a conventional 1D TRUS probe using mechanical, optical or electromagnetic tracking devices.
  • Methodology: For any given image from the convex array, the first step is to find a few "leading points" or feature points inside the image. Given two images, a cost function is defined for specific in-plane degrees of freedom (lateral translation, axial translation, and elevational rotation). These are all global motion parameters defined as a scalar for the whole image. To compute the cost function, a simple block matching using NCC can be performed. The key is that the block matching only happens for the selected leading point and not for the whole image, which makes it fast and robust. The incoming images are matched with a reference image that does not change until the rotation/translation in the image reaches a certain threshold. At this point, the reference is switched to a new image.
  • the algorithm goes to a recovery mode which down-samples the images and searches the whole image to match the images. Because of down-sampling it is not accurate, but accuracy is restored when the algorithm switches back to normal tracking mode.
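A minimal sketch of NCC block matching for a single leading point is shown below; the patch size, search range, and the reference-switching and recovery logic around it are assumptions for illustration.

```python
# Hedged sketch: normalized cross-correlation (NCC) block matching of one
# "leading point" between a reference image and an incoming image, restricted to
# a small search window so the per-frame cost stays low.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def track_leading_point(ref, cur, pt, half=8, search=12):
    """Return the best-matching (row, col) of `pt` from `ref` inside `cur`.

    Assumes `pt` lies at least `half + search` pixels from the image border.
    """
    r0, c0 = pt
    template = ref[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1]
    best, best_rc = -2.0, (r0, c0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            patch = cur[r - half:r + half + 1, c - half:c + half + 1]
            if patch.shape != template.shape:
                continue                      # skip windows falling off the image
            score = ncc(template, patch)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best
```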
  • the JHU group has recently demonstrated a similar method to stitch US images to compose a panoramic view for an entire forearm. In this application, we do not expect this extreme motion.
  • Task 2: Multi-modality fusion utilizing photoacoustic imaging data
  • TRUS: transrectal ultrasound
  • MR scans provide clear delineation of intraprostatic and surrounding anatomies, but become deregistered immediately after the patient is removed from the MR scan table.
  • Task 1 describes the ability to acquire 3DUS data using an available bi-plane TRUS probe.
  • These data include conventional B-mode imaging, which is essential to reveal prostate anatomy and boundary, and PA-mode imaging, which can reveal small vascular structures that cannot be recovered using conventional Doppler imaging. Both prostate anatomy and vascular structures are essential to perform reliable deformable registration with pre-operative MRI.
  • PA imaging is developed based on the photoacoustic effect, originally described by Alexander Graham Bell who showed that thin discs produced sound when exposed to an interrupted beam of sunlight. In PA imaging an object is usually irradiated by a short-pulsed, non-ionizing laser beam.
  • Some of the delivered energy is absorbed, according to optical absorption properties of biological tissue, and converted into heat, leading to transient thermoelastic expansion and thus wideband ultrasonic emission.
  • the generated ultrasonic waves are detected by ultrasonic transducers to form images. It is known that optical absorption is closely associated with physiological properties, such as hemoglobin concentration and oxygen saturation. As a result, the magnitude of the ultrasonic emission (i.e. photoacoustic signal), which is proportional to the local energy deposition, reveals physiologically specific optical absorption contrast.
  • the system includes: 1) a laser illumination system; 2) a dual-array TRUS probe; and 3) a da Vinci endoscope.
  • the illumination system will direct a "Z" or "N" pattern of laser pulses onto the surface of prostate organ within the field of view of the endoscope.
  • the pattern can be produced through the following design alternatives: a) multiple optical fibers to shape the pattern; b) a relatively large fiber (1-2 mm in diameter) with a lens and aperture to produce the pattern; and c) a single small fiber (200 µm diameter) that can be actuated by small motors to produce the pattern. Energy absorbed from the pulsed laser spots will generate PA signals that may be imaged by the ultrasound system, thus creating a "virtual" surface "Z" fiducial marker pattern.
  • the registration between video and ultrasound coordinate systems is a straightforward process [Cheng-2012]. Since the ultrasound images can concurrently locate prostate anatomy, registration and tracking of preoperative MRI images relative to the video coordinate system may likewise be accomplished.
  • the laser illumination system can target a relatively larger field-of-view with wavelengths (700-800 nm) to generate PA images of the neuro-vascular bundle which will be utilized, with the B-mode information, to perform multimodality registration.
  • Our current Q-switched laser has two main wavelength outputs (1064 nm and the frequency doubled 532 nm output) and tunable output from the OPO unit (690 - 950 nm).
  • the 532 nm laser light is green in color and can produce a strong photoacoustic signal since it is strongly absorbed by the soft tissues and blood, typical of most surgeries.
  • 1064 nm light has greater penetration depth than the 532 nm light. This is mainly because both Hb and HbO2 have a lower optical absorption coefficient at 1064 nm compared to 532 nm. In fact, Hb has a relatively higher optical absorption coefficient than HbO2 in the range of 650 nm - 750 nm.
  • a single "one dimensional" ultrasound sensor may be deployed on the end of a catheter or probe inserted into an organ and used to determine the distances to multiple spots on the surface of the organ. This information may be used to determine the 3D position of the sensor relative to the spots and hence to the optical imaging system.
  • optical illumination and the generated pattern can be two separate events.
  • photoacoustic tracking according to some embodiments of the current invention is to generate several fiducial landmarks (spots or patterns) that can be observed by the camera (single, stereo, or more than two) and also can be detected by the ultrasound sensing system.
  • a spatial relationship between the ultrasound sensor and the camera system can be calculated from matching these features in both spaces.
  • These features or patterns can be generated by the same illumination system. For example, one can use multiple fibers to generate a random pattern of several non- collinear spots as shown in [Cheng-2012] and Figure 2, for example.
  • This pattern can also be of "Z" or "N" shape, where the ultrasound image plane can intersect it in a few points, typically three across this "Z" shape.
  • the pattern can be generated physically. This means that the surgeon can paint a pattern directly on top of the surgical site or suture a small piece of arbitrary material to the surface to be cut.
  • the illumination system can be simplified to generate enough flux to cover the area of interest.
  • the physical paint or sutured fiducial marks can be made of materials that are sensitive to a specific wavelength and can also provide a better transduction coefficient (i.e., conversion ratio from light to sound energy). These materials can be black paint, or polydimethylsiloxane (PDMS) mixed with carbon black; PDMS has a thermal expansion coefficient of 310×10⁻⁶/K, more than ten times that of common metals.
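The range-only localization described in the single "one dimensional" ultrasound sensor item above can be illustrated with a short numerical sketch. The following Python example (illustrative only; the spot coordinates and sensor position are hypothetical values, not data from this disclosure) recovers the 3D position of a single-element ultrasound sensor from its measured distances to several laser spots whose positions are known in the optical (camera) frame.

```python
import numpy as np

# Hypothetical laser-spot positions on the organ surface, expressed in the
# optical (camera) coordinate frame, in millimetres.
spots = np.array([
    [10.0,  0.0, 50.0],
    [-5.0, 12.0, 48.0],
    [ 0.0, -8.0, 55.0],
    [ 7.0,  9.0, 60.0],
])

# Ground-truth sensor position, used here only to simulate the measurements.
sensor_true = np.array([2.0, 3.0, 95.0])

# Ranges the 1D sensor would measure: photoacoustic time of flight times the
# speed of sound, i.e. range_i = ||sensor - spot_i||.
ranges = np.linalg.norm(spots - sensor_true, axis=1)

def trilaterate(spots, ranges):
    """Least-squares sensor position from >= 4 non-coplanar spots and ranges.

    Linearizes ||x - p_i||^2 = d_i^2 by subtracting the first equation from
    the others, which yields a linear system in the unknown position x.
    """
    p0, d0 = spots[0], ranges[0]
    A = 2.0 * (spots[1:] - p0)
    b = (np.sum(spots[1:] ** 2, axis=1) - np.dot(p0, p0)
         - ranges[1:] ** 2 + d0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

print(np.round(trilaterate(spots, ranges), 3))   # approximately [2, 3, 95]
```

The recovered sensor position is expressed in the same frame as the spots, so it directly ties the interstitial sensor to the optical imaging system's coordinates.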

Abstract

An intraoperative registration and tracking system includes an optical source configured to illuminate tissue intraoperatively with electromagnetic radiation at a substantially localized spot so as to provide a photoacoustic source at the substantially localized spot, an optical imaging system configured to form an optical image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the optical image, an ultrasound imaging system configured to form an ultrasound image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the ultrasound image, and a registration system configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.

Description

PHOTOACOUSTIC TRACKING AND REGISTRATION
IN INTERVENTIONAL ULTRASOUND
CROSS-REFERENCE OF RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No.
61/608,910, filed March 9, 2012, the entire contents of which are hereby incorporated by reference.
BACKGROUND
1. Field of Invention
[0002] The field of the currently claimed embodiments of this invention relates to intraoperative registration and tracking systems, and more particularly to intraoperative registration and tracking systems that use photoacoustic tracking and registration in interventional ultrasound.
2. Discussion of Related Art
[0003] Intraoperative ultrasound (IOUS) imaging can be extremely useful in surgery and other interventional procedures. Ultrasound (US) systems provide real time 2D and 3D views into the patient's anatomy with relatively unobtrusive hardware and without the use of ionizing radiation. An increasing variety of transcutaneous, laparoscopic, and catheter-based probes including both 2D and 3D arrays are available from multiple vendors; image quality is rapidly improving; and advanced image processing techniques are increasing the variety of information that can be provided to the surgeon. Consequently, there is increasing interest in using IOUS in both open and laparoscopic surgery [8-10], in providing guidance for biopsies [11], tumor ablation therapy [12, 13], brachytherapy [14], and recently robotic surgery [15, 16].
[0004] One significant factor limiting the use of IOUS is the necessity for the surgeon to relate the information in the ultrasound images to preoperative information such as CT or MRI images and to what he or she is seeing in a laparoscopic video monitor or in direct viewing. Computer-based information fusion and imaging systems have significant potential to overcome this limitation, but they should address three main challenges to be useful in minimally invasive surgery. First, intraoperative ultrasound must be reliably and accurately registered to the surgical scene as observed by endoscopic video cameras. Second, it can also be important to accurately register and fuse pre-operative models continuously to the surgical scene. This feature can be especially important when IOUS cannot provide the needed information to guide the intervention. Third, after guiding the intervention tool using IOUS data and/or pre-operative models, there is often a need to identify a tool in the ultrasound images and to recover its position relative to patient anatomy. This requirement can be crucial in ablative therapy treatment, biopsy and needle steering scenarios, where the tool may be especially difficult to see and accurate placement on anatomic targets is required.
[0005] There are important limitations with the conventional approaches. Typically, systems for integrating IOUS for information support or intraoperative guidance use an electromagnetic or optical navigational tracker to provide real-time information about the position of the ultrasound probe relative to the patient, endoscopic cameras, and other equipment in the surgical environment (e.g., [17-23]). However, these approaches have serious limitations. Navigational trackers typically track sensors or markers relative to a separate base station placed somewhere close to the surgical environment, thus adding complexity. Optical systems require that the markers be visible to the optical sensors or cameras in the base. Electromagnetic systems require wires between the sensors and base unit, thus complicating sterility, are subject to field distortions, and may not work well in the presence of metal. Accuracy in estimating tool tip position is limited by tool shaft bending and the effects of angle estimation error if markers are placed at a distance from the tip, and it is often impractical to embed sensors near the tips of small tools such as needles or the tines of ablation probes. In addition, the estimation of IOUS-to-camera or IOUS-to-tool transformations necessarily requires an indirect calculation based on multiple tracking targets and is subject to error buildup. Furthermore, calibration of US imaging probes to tracking devices is tedious. Therefore, there remains a need for improved intraoperative registration and tracking systems.
SUMMARY
[0006] An intraoperative registration and tracking system according to some embodiments of the current invention includes an optical source configured to illuminate tissue intraoperatively with electromagnetic radiation at a substantially localized spot so as to provide a photoacoustic source at the substantially localized spot, an optical imaging system configured to form an optical image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the optical image, an ultrasound imaging system configured to form an ultrasound image of at least a portion of the tissue and to detect and determine a position of the substantially localized spot in the ultrasound image, and a registration system configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Further objectives and advantages will become apparent from a consideration of the description, drawings, and examples.
[0008] FIG. 1 is a schematic illustration of a typical EM-based laparoscopic IOUS guidance and visualization system [3].
[0009] FIG. 2 is a schematic illustration of an intraoperative registration and tracking system according to an embodiment of the current invention.
[0010] FIGS. 3-5 are schematic illustrations of portions of an intraoperative registration and tracking system according to another embodiment of the current invention.
[0011] FIG. 6 is a schematic illustration of a phantom design to test some embodiments of the current invention.
[0012] FIG. 7 shows tests of an example for an embodiment of the current invention:
(left) Photograph of a scanned laser, US and stereo camera setup, in a typical trial, (middle, right) Photoacoustic image reconstructed from pseudo-spectra analysis and k-space Green's function method for one spot.
[0013] FIG. 8 shows an example of video overlay of preoperative model based on
CT, registered from 3D IOUS, and tracked with EM [3] according to conventional approaches.
[0014] FIGS. 9A and 9B are schematic illustrations for identification of tines using a
PA approach according to two embodiments of the current invention: FIG. 9A) flood illumination; FIG. 9B) local illumination from fibers in tool.
[0015] FIGS. 10A and 10B show A) Experimental Setup and Video Overlay, B) PA
Signal within a US image.
[0016] FIGS. 11A-11C show workflow for A) Data Collection, B) SC Segmentation, and C) US Segmentation.
[0017] FIG. 12 shows a typical imaging configuration of the dual-array TRUS probe.
[0018] FIG. 13 shows a photoacoustic tracking feasibility study: (left) Photograph of a scanned laser, US and stereo camera setup, in a typical trial, (middle, right) PA image reconstructed from pseudo-spectra analysis and k-space Green's function method for one spot.
[0019] FIG. 14 shows visualization of brachytherapy seeds in ex-vivo dog prostate using the system.
[0020] FIG. 15 is a schematic illustration of an intraoperative registration and tracking system according to an embodiment of the current invention.
DETAILED DESCRIPTION
[0021] Some embodiments of the current invention are discussed in detail below. In describing embodiments, specific terminology is employed for the sake of clarity. However, the invention is not intended to be limited to the specific terminology so selected. A person skilled in the relevant art will recognize that other equivalent components can be employed and other methods developed without departing from the broad concepts of the current invention. All references cited anywhere in this specification, including the Background and Detailed Description sections, are incorporated by reference as if each had been individually incorporated.
[0022] The terms "light" and "optical" are intended to have a broad meaning. They can include, but are not limited to, the visible regions of the electromagnetic spectrum. They can include nonvisible regions of the electromagnetic spectrum such as infrared and ultraviolet light, and even x-ray and microwave regions. As long as the electromagnetic radiation can deposit a localized spot of energy that generates ultrasound, and the spot can be detected along with a corresponding image, it can be included in some embodiments.
[0023] The term "photoacoustic" is intended to have a broad definition which can be photons at any energy suitable for the particular application that deposit energy that generates an acoustic signal in a body of interest. This is intended to be sufficiently broad to include photons of microwave energy. The term "thermoacoustic" effect is often used with reference to microwave energies. The term photoacoustic as used herein is intended to include thermoacoustic in the broad definition.
[0024] The term "body" refers generally to a mass, and not specifically to a human or animal body. In some applications, the body of interest can be a human or animal organ, or a portion thereof.
[0025] The term "spot" is intended to have a broad meaning. It can be point-like or a small circular or oval shape. However, it can also be a pattern, such as, but not limited to, an X shape, a V shape, a Z shape, an N shape, etc.
[0026] The term "substantially localized spot" means a spot of a size and of defined boundaries sufficient for the particular application. (In the case of a pattern, the localization can be with respect to one sub-feature of the pattern.) For example, most surgeries may require spot sizes from 0.5 to 2 mm. However, some surgeries may require more precision than other surgeries and the imaging geometries may vary. Consequently, the general concepts of the current invention are not limited to particular sizes and location precision of the spots.
[0027] The term "interstitial" means to be inserted into tissue, such as, but not limited to, a needle inserted into tissue with the inserted tip being surrounded by the tissue.
[0028] The term "real-time" is intended to mean that the images can be provided to the user during use of the system. In other words, any noticeable time delay between detection and image display to a user is sufficiently short for the particular application at hand. In some cases, the time delay can be so short as to be unnoticeable by a user.
[0029] We use "3DPA" to mean 3D photoacoustic images and "3DUS" to mean conventional 3D ultrasound images or the overall system. The same transducer can be used for both and both have the same coordinate system, and we can use "3DUS coordinates" and "3DPA coordinates" interchangeably.
[0030] Intraoperative ultrasound (IOUS) imaging is being used increasingly as a guidance modality in interventional procedures - including open and laparoscopic surgery [8, 24], local ablations [13, 25-27], brachytherapy [14], intracardiac procedures [28, 29], etc. Using IOUS images requires that the surgeon relate these images to the physical reality of the patient's anatomy and the motions of surgical tools. Similarly, IOUS images are increasingly "fused" or registered to preoperative data such as CT or MRI images or surgical planning (e.g., [17, 30]), and the results are used to help guide an intervention. Again, the surgeon must relate the registered images to the physical coordinate system associated with the patient's anatomy and other interventional tools. Although this process was traditionally performed in the surgeon's head, surgical navigation and guidance systems are becoming more prevalent and have become "standard of care" in some areas. Typically, these systems rely on a combination of navigational tracking of surgical instruments, "registration" of preoperative and intraoperative data, and some form of information display to make the information available to the surgeon. Robotic systems such as the DaVinci surgical robot [31] typically relate the motions of tools to a "visualization" coordinate system associated with a video endoscope and provide means for displaying other images, but rely on the surgeon to relate this data to the video images.
[0031] Real time registration of ultrasound images and preoperative data to intraoperative video display has been a significant challenge. This could become increasingly important as advanced ultrasound imaging methods are increasingly developed and deployed in minimally invasive surgery (MIS). For example, in recent work, we have demonstrated the use of ultrasound elastography (USE) for monitoring ablation of liver tumors [22, 32], image-guided prostatectomies [33], and breast cancer radiotherapy targeting [34], and as a "bridge" for co-registering preoperative CT with intraoperative video in laparoscopic partial nephrectomy (LPN) [3]. Generally, the conventional approaches (e.g., [35-37]) have been to place tracking devices on IOUS probes, the endoscope, and surgical instruments. However, this indirect approach is subject to error buildup from multiple tracking and calibration errors, thus limiting the accuracy of the IOUS to intraoperative image registration.
[0032] Because they have no line-of-sight restrictions, electromagnetic (EM) systems are the most commonly used tracking systems for laparoscopic surgery, flexible endoscopy, and other MIS applications. In these environments EM sensors may be attached to tools, IOUS probes, and cameras, and may be implanted into moving or deforming organs for use as fiducial landmarks (Fig. 1). For example, in prior work targeting laparoscopic partial nephrectomies (LPN), we have used EM tracked US probes to obtain 3D US/USE images of lesions in kidneys [3]. These images have been registered with preoperative CT and used to create a real time video overlay display of the tumor within the kidney, which was tracked using a small EM sensor implanted into the tumor, thus providing continuing feedback on intact tumor location and resection margins during resection.
[0033] However, EM sensors have a number of drawbacks that make them less than ideal for use in laparoscopic surgery, especially if a robot is involved. In addition to the obvious problems associated with field distortion from close proximity of the robot, EM systems have problems that apply more broadly to any laparoscopic or MIS environment. These include: intrusive integration of a large EM "base station" and other equipment into the surgical field; field distortion from other metallic objects or during electrocautery and ablation treatment; interference between tools working in close proximity to each other; dealing with EM sensor wire leads; increased cost in sterilizing tools with associated sensors; and overall cost-effectiveness, especially when we include the needed pre-operative calibration and preparation time to ultrasound probes, cameras and the tools.
[0034] Optical tracking systems such as [38, 39] avoid the field distortion problems associated with EM devices and may be slightly more accurate, but line of sight restrictions often make them impractical for use inside the body, and placing them outside the body can result in degraded tracking accuracy for long or flexible tools or laparoscopic imaging probes, even if the various tracking targets are all visible to the tracker.
[0035] Accurate tracking of surgical tools and devices in ultrasound images is very difficult, due to several factors including image quality, tool size, orientation and depth, and limitations of current tracking technologies, although there has been some work to address these issues. One straightforward approach (e.g., [18]) embeds small EM sensors into needles, catheters, and other tools. Others seek to find tools directly from IOUS images. For example, Stoll et al. [40] have attached a set of passive markers by which the position and orientation of a surgical instrument can be computed from its ultrasound image. The identification of these passive markers, however, still relies on the quality of ultrasound images and the type of the surrounding tissues. Similarly, Rohling et al. have applied extensive image processing methods to detect needle shafts [41] and also have investigated a beam forming approach to steer the beam to maximize the tool visibility [42]. All of these approaches need to have an optimal orientation (i.e. needle in US plane) and considerable size for the tool to be identified and recovered accurately.
[0036] Some embodiments of the current invention use photoacoustic (PA) imaging to overcome many of these limitations [1, 2, 4, 43, 44]. PA imaging [45, 46] is based on the photoacoustic effect, originally discovered by Alexander Graham Bell. In PA imaging, an object is usually irradiated by a short-pulsed, non-ionizing laser beam. Some of the delivered energy is absorbed, according to optical absorption properties of the target tissue or other material, and converted into heat, leading to transient thermoelastic expansion and thus wideband ultrasonic emission, which can be detected by conventional IOUS probes and processed to produce high contrast images. Since the effect is sensitive to tissue density, composition and properties such as hemoglobin oxygen saturation, it is useful both for anatomic and functional imaging. There is current interest in using PA to locate small objects such as needles or brachytherapy seeds within the body [5, 43, 44, 47, 48]. Some embodiments of the current invention are directed to such systems. For example, we have shown repeatable average registration accuracy of 0.56±0.28 mm in artificial phantoms [1] and 0.42±0.15 mm in ex vivo liver [1], 0.38±0.27 mm for kidney and 0.85±0.45 mm for fat, compared to ~1.7-3 mm for artificial phantoms and ~3-5 mm for tissue obtained with other methods (e.g., [3, 7, 49, 50]).
[0037] Some embodiments of the current invention use PA methods to replace navigational trackers in ultrasound-guided surgery and other minimally invasive interventions, together with the systems and techniques for doing this. Some embodiments use PA imaging to perform real-time registration between the ultrasound and video image spaces. This approach does not require a base station and does not require implanted fiducial markers to complete the video-to-ultrasound registration needed for providing "augmented reality" guidance and information support in endoscopic surgery. Further, since the image-to-image registration is more direct than tracker-based methods, it can be less likely to be affected by extraneous errors and can provide significant accuracy advantages [1].
[0038] In some embodiments, small implanted PA fiducial markers can also be used to track tissue and organ motion after an initial registration. This can eliminate any dependence on implanted EM fiducials for tracking. This also can provide other advantages such as simplicity, directness, and accuracy in some embodiments. We also note that the PA markers may themselves be used as registration fiducials in cases where they may have been implanted for preoperative imaging. Clinical investigators often perform tumor biopsy before operation. Small FDA approved particles can be injected to facilitate several follow up treatment steps including precise repeated biopsy, post-operative imaging and accurate guidance during surgery. Small particles within the resection margin need to be resected with the tumor and hence another use of PA imaging is to detect these markers after resection to assure negative margins.
[0039] Three projected PA spots are sufficient for registration to stereo cameras, since the 3D locations of the PA spots relative to the cameras may be found by triangulation and the 3D locations relative to ultrasound come directly from localization in 3DUS. Registration to monoscopic cameras, such as conventional endoscopes, may be accomplished by the use of more points. For example, the well-known method of Bopp and Krauss (H. Bopp and H. Krauss, "An Orientation and Calibration Method for Non-Topographic Applications", Photogrammetric Engineering and Remote Sensing, vol. 44- 9, pp. 1191-1196, September, 1978) may be used if five or more spots are available. However, many other methods known in the art may also be used instead. Another example is the method of Tsai (R. Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of Robotics and Automation, vol. RA-3- 4, pp. 323-358, 1987.).
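For the stereo case described in paragraph [0039], once three or more PA spots have been localized both in camera coordinates (by triangulation) and in 3DUS coordinates, the remaining step is a standard 3D-3D point-based rigid registration. The sketch below uses the well-known SVD-based closed-form least-squares fit; it is a generic illustration, not necessarily the specific registration method used in the cited work, and the point coordinates are hypothetical.

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (R, t) such that Q ~= R @ P + t.

    P, Q: (N, 3) arrays of corresponding points (N >= 3, non-collinear),
    e.g. PA spot centers in 3DUS coordinates and the same spots
    triangulated in camera coordinates. Closed-form SVD solution.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)                    # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                   # guard against reflection
    t = q_mean - R @ p_mean
    return R, t

# Hypothetical PA spot positions in ultrasound coordinates (mm).
pts_us = np.array([[20.0, 5.0, 40.0], [25.0, -3.0, 42.0], [18.0, 0.0, 35.0]])

# The same spots in camera coordinates, simulated with a known rotation and
# translation plus a little localization noise.
a = np.deg2rad(12.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([4.0, -2.0, 60.0])
pts_cam = (R_true @ pts_us.T).T + t_true + 0.05 * np.random.randn(3, 3)

R_est, t_est = rigid_register(pts_us, pts_cam)
residual = np.linalg.norm((R_est @ pts_us.T).T + t_est - pts_cam, axis=1)
print("mean registration residual (mm):", residual.mean())
```

The same routine applies unchanged whether the camera-side points come from stereo triangulation or from a monoscopic pose-estimation method such as those cited above.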
[0040] Some embodiments of the current invention can include a safety system. In some embodiments, laser light is delivered through optical fibers into the patient's body for laparoscopic surgery. It is desirable to ensure that no laser light escapes from the patient's body, or at least that any escaping light remains below a safe threshold. In an embodiment, additional light can be added to the optical path, such as the optical fiber. The optical fiber can be a multimode optical fiber, for example, to be able to transmit a plurality of beams of light at relatively spaced wavelengths. (However, this does not preclude the use of single mode optical fibers, other types of waveguides, and/or free space illumination, depending on the particular application.) For example, a low energy infrared light source from an LED or laser may be used as a sort of guide channel, or safety signal. Detectors placed outside the body can detect this monitoring light even in cases when the higher power laser used to create the PA effect is turned off. Safety circuits or other safety monitoring devices can prevent the higher power laser from being turned on if the monitoring light is detected and/or a suitable warning signal such as an audible alarm can be triggered. The monitoring light may be modulated at a known frequency, or with other suitable modulation, and matched to suitable detection circuits to increase sensitivity. The use of infrared light is suitable for some applications because its presence will not distract the surgeon. However, visible or other wavelengths of light can be used in other embodiments of the current invention. A similar system may also be deployed within the laser system enclosure itself to ensure that stray laser light does not escape from the enclosure.
[0041] Figure 2 provides a schematic illustration of an intraoperative registration and tracking system 100 according to an embodiment of the current invention. The intraoperative registration and tracking system 100 includes an optical source 102 configured to illuminate tissue 104 intraoperatively with electromagnetic radiation at a substantially localized spot (e.g., 106) so as to provide a photoacoustic source at the substantially localized spot, and an optical imaging system 108 configured to form an optical image of at least a portion of the tissue 104 and to detect and determine a position of the substantially localized spot (e.g., corresponding to 106) in the optical image. The intraoperative registration and tracking system 100 also includes an ultrasound imaging system 110 configured to form an ultrasound image of at least a portion of the tissue 104 and to detect and determine a position of the substantially localized spot (e.g., corresponding to 106) in the ultrasound image; and a registration system 112 configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image.
[0042] The optical source 102 can be configured to provide pulsed radiation and/or continuous wave radiation in some embodiments. The pulsed radiation can be provided by a pulsed laser, as is illustrated in the example of Figure 2. In some embodiments, the optical source 102 can include one or more pulsed lasers. However, the general concepts of the current invention are not limited to only the use of lasers. In some embodiments, a pulsed laser that is selectively frequency doubled can be used. This can permit the selective use of two different illuminating wavelengths. Some embodiments can use two or more pulsed lasers, each of which could be selectively frequency doubled. In some embodiments, the optical source 102 can include a tunable pulsed laser to permit greater flexibility in selecting illumination wavelengths. The pulsed radiation can be electromagnetic radiation at any suitable wavelength for the particular application. In many applications, pulsed radiation within the visible, infrared and/or ultraviolet wavelength ranges is suitable. However, the broad concepts of the current invention are not limited to pulsed sources only in these ranges. As long as the radiation provides photons that can be absorbed to provide an acoustic response with a sufficiently small focus for the application, it could be used in some embodiments.
[0043] The optical source 102 can include additional LEDs and/or lasers in some embodiments. For example, one or more continuous wave (CW) lasers can be included to provide illumination for the optical imaging system 108 to form images of the tissue. The CW laser, or lasers, can be within the visible, infrared and/or ultraviolet wavelength ranges. In some embodiments, the optical source 102 can also have a source of light for a leakage detection system. This can be a dual use of one of the CW lasers for both illuminating for imaging and light leakage detection, or a separate source. In some embodiments, an infrared laser is suitable for leakage detection.
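The leakage-monitoring scheme described in paragraphs [0040] and [0043] modulates the low-power monitoring light at a known frequency and matches it to suitable detection circuits. The sketch below is a software analogue of one such matched (lock-in style) detector together with an interlock decision; the sampling rate, modulation frequency, and threshold are assumed values chosen only for illustration.

```python
import numpy as np

FS = 50_000.0      # photodetector sampling rate (Hz); assumed value
F_MOD = 1_000.0    # known modulation frequency of the monitoring light (Hz); assumed
THRESHOLD = 0.01   # leak-detection amplitude threshold (arbitrary units); assumed

def monitoring_light_amplitude(samples, fs=FS, f_mod=F_MOD):
    """Lock-in style estimate of the amplitude at the modulation frequency."""
    t = np.arange(len(samples)) / fs
    i = np.mean(samples * np.cos(2 * np.pi * f_mod * t))
    q = np.mean(samples * np.sin(2 * np.pi * f_mod * t))
    return 2.0 * np.hypot(i, q)

def interlock_allows_pulsed_laser(samples):
    """Permit firing the high-power pulsed laser only when no monitoring light
    is detected escaping the body (amplitude below the safety threshold)."""
    return monitoring_light_amplitude(samples) < THRESHOLD

# Simulated detector traces: pure noise versus noise plus leaked monitoring light.
t = np.arange(int(0.1 * FS)) / FS
noise_only = 0.002 * np.random.randn(t.size)
with_leak = noise_only + 0.05 * np.sin(2 * np.pi * F_MOD * t)

print("no leak -> firing allowed:", interlock_allows_pulsed_laser(noise_only))
print("leak    -> firing allowed:", interlock_allows_pulsed_laser(with_leak))
```

An audible alarm or hardware interlock, as described above, would be driven by the same decision.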
[0044] The optical source 102 can further include an optical fiber 114. Although one optical fiber is illustrated in Figure 2, two, three or more optical fibers can be used. The optical source 102 can include optical components to selectively couple light from the one or more lasers into the same optical fiber and/or to selectable optical fibers from a plurality of optical fibers. In some embodiments, the optical fiber 114, and any additional optical fibers, can be a multimode optical fiber to be able to transmit different wavelengths of light either sequentially, or concurrently. In other words, multiplexing different wavelengths at the same time can be provided, but is not required in all embodiments. However, the general concepts of the current invention do not preclude the use of single mode optical fibers in some embodiments. In the embodiment of Figure 2, the optical fiber 114 is attached to a structure containing pick-up elements of a video system. This can be, but is not limited to, an endoscope, for example. Alternatively, the optical fiber 114 could be attached to a surgical tool, for example. In some embodiments, there can be a plurality of optical fibers with one or more groups attached to different objects, such as, but not limited to, a plurality of surgical tools including an endoscope. In some embodiments, the optical source 102 can illuminate the tissue at two or more spots, either in sequence, or at the same time. Optical components can be included in the optical source 102 to scan the illumination beam according to some embodiments, such as, but not limited to, a scanning mirror arrangement.
[0045] The registration system 112 is configured to determine a coordinate transformation that registers the optical image with the ultrasound image based at least partially on a correspondence of the spot in the optical image with the spot in the ultrasound image. In general, the images and coordinate transformations can be of any dimensions. For example, they can be one-dimensional, two-dimensional and/or three-dimensional images and coordinate transformations.
[0046] The registration system 112, as well as some or all of the signal processing for the optical imaging system 108 and ultrasound imaging system, can be implemented on a work station as is illustrated in Figure 2. These components can be implemented through software by programming the workstation, for example. However, in other embodiments, one or more of these components could be implemented on dedicated hardware, such as, but not limited to, ASICs and/or FPGAs. In addition, the workstation can have one or more CPUs and/or GPUs. Preoperative data can also be registered with the optical and ultrasound images in some embodiments. This can be implemented on workstation 112 through software, or could be implemented through dedicated hardware. A tracking system to track surgical tools and other objects can also be implemented on workstation 112, either through software, or hardware. In some embodiments, the registering of optical images, ultrasound images and/or preoperative images and/or tracking of objects can be done in real time during an operation.
[0047] The intraoperative registration and tracking system 100 can include one or more displays 116, 118, according to an embodiment of the current invention. The displays 116 and/or 118 can be display screens as illustrated in Figure 2, or any other suitable display. For example, in embodiments in which the intraoperative registration and tracking system 100 is integrated into a robotic system, the displays can be the stereo console displays.
[0048] The ultrasound imaging system 110 includes an ultrasound probe 120 to at least operate as an ultrasound receiver to receive signals from the PA sources. In some embodiments, the ultrasound probe 120 may operate as both an ultrasound receiver to receive signals from the PA sources, and as a transmitter and receiver to supplement ultrasound images obtained by the PA sources. In general, the ultrasound imaging system 110 can be an imaging system of any dimension. For example, it can be a one-dimensional, two-dimensional and/or three-dimensional ultrasound imaging system. In the case of a one-dimensional system, it may be viewed as a system that provides range information along a line.
[0049] Figures 3-5 provide a schematic illustration of a portion of an intraoperative registration and tracking system 200 according to another embodiment of the current invention. In this embodiment, a stereo camera 202 and a plurality of optical fibers 204 for the optical source 206 are attached to a turret 108. The turret can be part of a robotic system, or a stand-alone system. It can also be configured for use in open surgery, for example. Figures 4 and 5 also show the ultrasound probe 210 and the tissue 212.
[0050] The following examples will describe some more details of some embodiments of the current invention. However, the broad concepts of the current invention are not limited only to these particular examples.
EXAMPLE 1
[0051] Although the interventional PA techniques according to some embodiments of the current invention can be broadly applicable, in the current example, we use laparoscopic partial nephrectomies (LPN) and RF ablation of tumors in solid organs such as the liver and kidney. For the LPN application, a typical workflow would be as follows: Preoperative CT would be used to plan the surgical resection. Intraoperatively, the surgeon would position the kidney so that the tumor is close to the surface facing the surgeon and a 3DUS probe would be placed on the opposite side of the kidney in a position where the tumor, surrounding tissue, and organ surface are visible in the ultrasound. PA-to-video registration would be performed continuously using a system according to an embodiment of the current invention. 3DUS-to-CT registration would be performed and overlay images would be generated on ultrasound and video images, showing the segmented tumor and resection plan. Using this information, the surgeon may use electrocautery to mark the borders of the resection region. Using combined US and video overlay guidance, the surgeon will place several small fiducial markers within the planned resection volume, and another CT/IOUS registration will be done. The markers can be located and tracked with PA imaging concurrently with PA-video registration. The tracked markers can be used to maintain registration to the registered preoperative information and to generate visualization overlays to assist the surgeon in performing the resection. We note that many variations on this workflow are possible. For example, if preoperative CT is not available, PA-to-video registration may still be used to generate overlay images for guidance, so long as the fiducials and organ surface remain visible to the US probe. After resection, PA imaging may be used to detect any markers left behind, thus indicating possible inadequate margins. Similarly, if preoperative biopsy is performed, then the surgeon may choose to leave behind markers that may be used for intraoperative targeting and tracking. A workflow for resection of liver tumors would be similar.
[0052] For RF ablation of tumors, PA-to-video registration may be used to enable image overlays to assist in placing the ablation probe. The targeted tumors may be located either in 3DUS or through 3DUS-to-CT registration. In the latter case, small implanted fiducials may be used to assist in real time tracking of tumor targets. Photoacoustic imaging would be used to permit the surgeon to accurately visualize the small tines of the ablation probe relative to the chosen ablation target sites.
[0053] System Overview: In this example, the system (Fig. 2) includes 1) a laser illumination system; 2) a 3D ultrasound (3DUS) system; and 3) a video camera system, all interfaced to a PC/GPU workstation. Depending on the application, either stereo or monoscopic video may be used. For the current example, we use both stereo and monoscopic conventional camera setups available in our lab. Alternatively, we could also use conventional laparoscopes and the stereo endoscope from the DaVinci robot system in our mock operating room lab, for example. However, the general concepts of the current invention are not limited to these examples. The illumination system can direct a pattern of high-energy laser pulses along with co-located lower energy visible continuous wave (CW) laser light onto the surface of an organ within the field of view of the video system. The illumination system may be mounted with the camera, as shown in the figure, or may be placed independently, so long as it shines spots where they are visible from the camera. Energy absorbed from the pulsed laser spots will generate PA signals that may be imaged by the ultrasound system, thus creating a pattern of "virtual" surface fiducial markers. Since the co-located CW beams are visible at all times in the video camera, the registration between video and ultrasound coordinate systems can be performed. Since the ultrasound images can concurrently locate implanted fiducials or other structures within the target organ, registration and tracking of preoperative images or other data relative to the video coordinate system may likewise be accomplished. Similarly, surgical tools or catheters can be illuminated to generate PA signals that may be used to locate them relative to ultrasound images and, hence, to the video coordinate system as well.
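Because the co-located CW beams are visible at all times in the video camera, the first processing step on the video side is simply to localize those bright spots in each frame. The following sketch (illustrative only; it assumes a green visible CW marker beam and uses a synthetic frame rather than real endoscopic video) thresholds the frame and reports a centroid for each connected bright region.

```python
import numpy as np
from scipy import ndimage

def detect_spot_centroids(frame_rgb, rel_threshold=0.8, min_pixels=4):
    """Locate bright green CW laser spots in a video frame.

    frame_rgb: (H, W, 3) float array with values in [0, 1].
    Returns a list of (row, col) centroids, one per detected spot.
    """
    # Emphasize the green channel relative to the others (visible marker beam).
    green = frame_rgb[..., 1] - 0.5 * (frame_rgb[..., 0] + frame_rgb[..., 2])
    mask = green > rel_threshold * green.max()
    labels, n = ndimage.label(mask)
    centroids = []
    for lab in range(1, n + 1):
        if np.sum(labels == lab) >= min_pixels:          # reject isolated speckle
            centroids.append(ndimage.center_of_mass(labels == lab))
    return centroids

# Synthetic 100x100 frame containing two Gaussian green spots.
yy, xx = np.mgrid[0:100, 0:100]
frame = np.zeros((100, 100, 3))
for (cy, cx) in [(30, 40), (70, 65)]:
    frame[..., 1] += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

print(detect_spot_centroids(frame))   # approximately [(30, 40), (70, 65)]
```

The detected pixel coordinates are then matched to the corresponding PA spots found in the ultrasound/photoacoustic images before registration.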
[0054] Phantoms: For these examples, we will adapt previous work [3] to create an artificial kidney phantom shown schematically in Fig. 6. The phantom body can be composed of PLASTISOL™ (MF Manufacturing Co., Inc., Fort Worth, TX) with ultrasonic scatterer materials. Into this, we can implant gel-based fiducial spheres simulating tumors with diameters ranging from 5 mm to 10 mm, doped to be readily visible in CT, x-ray, and conventional 3DUS images. Small, brightly colored plastic spheres can be embedded on the surface of the phantom so as to be easily located in 3DUS and video images. Small photoacoustic marker objects can also be embedded in the phantom near the tumors. To compare the contrast of PA signals in a realistic model, we can produce similar phantoms in which ex vivo pig kidneys or livers are substituted for the PLASTISOL body material and suitably doped alginate is injected to provide the implanted tumors [52]. Additionally, we can also introduce a small tube (INTRAMEDIC™, BD, Franklin Lakes, New Jersey) filled with animal blood, placed near to the surface and near to the PA markers to verify different aspects of this embodiment of the current invention. These phantoms can be CT scanned to provide an accurate "ground truth" for validation experiments.
[0055] For accurate, real-time registration of video to 3D ultrasound coordinate systems, there are three principal components, as discussed below.
[0056] Laser Illumination System: In this example, a fiber-based light delivery system can be used, as illustrated schematically in Fig. 2. To facilitate localization of the laser spots in video images, the pulsed laser light can be combined with low power continuous wave (CW) light from a separate source and transmitted via optical fiber to a beam splitter using standard optics techniques. Our current Q-switched laser source has two main wavelength outputs: 1064 nm and the frequency doubled 532 nm output. The 532 nm laser light is green in color and can produce a strong photoacoustic signal since it is strongly absorbed by the soft tissues and blood [53, 54], typical of most surgeries [47, 53]. The 1064 nm light has greater penetration depth than the 532 nm light. This is mainly because both Hb and HbO2 have a lower optical absorption coefficient at 1064 nm than at 532 nm [47, 53, 54]. For surface laser spots, we can investigate the use of both 532 nm and 1064 nm light. With our recent PA experiments utilizing kidney, liver [1] and fat, we have shown that 532 nm generates PA signals at the tissue-air interface. It is known that fat tissue has a relatively higher optical absorption coefficient in the NIR range compared to 532 nm [55]. However, intraoperative fat is often covered with a thin layer of blood or other material that is highly responsive at 532 nm, and fat is not always present. We can also use alternate strategies, such as using 1064 nm or successive pulses of 532 and 1064 nm if fat is present. Alternatively, one can also make use of tunable wavelengths and low-cost pulsed laser diodes (PLD). Some data indicate that one can get usable PA surface images in phantoms with 2-16 μJ pulses, comparable to energy from available PLDs, according to some embodiments of the current invention.
[0057] 3DUS and Photoacoustic Imaging: We can draw upon our extensive experience and systems infrastructure for interventional ultrasound research (see, e.g., our lab web sites [56, 57] and papers [1, 2, 4, 20, 22, 48, 49, 58, 59]). For this example, we have chosen to use 3D IOUS, both because it is increasingly used in interventional applications, and because some of the registration and tracking problems are more straightforward. We can use an existing Ultrasonix SonixRP (Ultrasonix, Vancouver, CA) system, for example, along with two mechanical 3D ultrasound probes: 1) a linear array with rotational actuation from Vermon Inc. (5-10 MHz), and 2) a linear array with precise translational actuation from NDK Inc. (5-10 MHz). We can also use a compact phased 3D array from NDK (5-10 MHz with 64*32 elements) that can permit rapid volume acquisition, for example. We have demonstrated our ability to form 3D PA images of laser spots projected onto the surface of a liver phantom (Fig. 7) [1, 2].
[0058] Video System and Video-US Registration: We can work with both calibrated stereo and monoscopic camera setups and software libraries [60-63]. The bright spots projected onto the target surface can be located in video images and matched to corresponding bright spots in 3DPA images, whose positions relative to the 3DUS probe have been determined as described above. In the case of stereo video, standard methods can be used to determine the 3D positions of the points relative to camera coordinates, and standard 3D-3D registration methods (e.g., [64-67]) will be used to compute the transformation F_UC between ultrasound and camera coordinates. For monoscopic cameras, standard 2D-3D photogrammetric methods (e.g., [68-70]) can be used. The achievable bandwidth for measuring F_UC can be limited initially by the 10 Hz pulse rate permitted by our laser system and by the volume acquisition rate of our mechanically scanned 3DUS system probes, which can acquire 60 2D images per second. However, the laser pulses can easily be time-multiplexed to 20 and 40 Hz to provide a greater bandwidth. In alternative embodiments, one can make use of electronically scanned 2D array probes to provide 20-30 volumes per second.
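In the stereo configuration of paragraph [0058], each detected spot must first be triangulated from the two camera views before the 3D-3D registration can be computed. The sketch below shows a standard linear (DLT) triangulation of one point from two calibrated projection matrices; the intrinsics, baseline, and spot location are hypothetical values used only to exercise the computation.

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one point from two calibrated views.

    P_left, P_right: 3x4 camera projection matrices.
    uv_left, uv_right: (u, v) pixel coordinates of the same spot.
    Returns the 3D point in the common camera/world frame.
    """
    def rows(P, uv):
        u, v = uv
        return np.vstack([u * P[2] - P[0],
                          v * P[2] - P[1]])
    A = np.vstack([rows(P_left, uv_left), rows(P_right, uv_right)])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical calibrated stereo rig: identical intrinsics, 10 mm baseline.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-10.0], [0.0], [0.0]])])

spot = np.array([5.0, -3.0, 120.0])                      # ground-truth spot (mm)
def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

print(triangulate(P_left, P_right, project(P_left, spot), project(P_right, spot)))
```

The triangulated points, paired with the corresponding 3DPA localizations, feed directly into the rigid registration that produces F_UC.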
[0059] Parameter optimization: We can evaluate our method on both artificial and ex vivo phantoms and in a small in vivo study. We can assess the accuracy of our registration by comparing the IOUS results to those obtained using the physical surface fiducials described earlier. We can locate these fiducials in conventional 3DUS images and in the same video images as for the laser spots. We can use the video-US registration methods to compute the US-to-camera transformations F_UC^PA and F_UC^fid using the photoacoustic laser spots and conventional surface fiducials, respectively, along with the registration difference ΔF = (F_UC^fid)^-1 · F_UC^PA. The translational and rotational components of ΔF may be represented by vectors ε and α, respectively, and (for small errors) ΔF · v ≈ α × v + ε + v for any v. We can systematically vary the positions of the phantom, cameras, and ultrasound probe, compute ΔF for each position, and perform statistical analysis on the corresponding 6-vectors Δ_k = [ε_k, α_k] in order to predict target registration error (TRE) [71, 72] statistics for points within the IOUS volume. We can also perform a leave-one-out analysis similar to [1] to estimate error statistics for points on the organ surface and can use these statistics to compute a separate estimate of volumetric TRE. We can repeat these procedures on our artificial phantom, on ex vivo phantoms with kidney, liver, and fat tissue, and on a limited in vivo study on two pigs. In these examples, we can systematically vary the speed/resolution of the 3DUS probe; the intensity, wavelength, and aperture size of the laser pulses; and the imaging geometry in order to determine the optimal values and performance sensitivity of these parameters for system design. For the in vivo study, we can also determine the sensitivity to breathing artifacts. We can optimize parameters in the first pig and use these parameters in the second pig to estimate a 90% confidence "non-inferiority" bound δ such that TRE_in-vivo < TRE_ex-vivo + δ, where TRE_in-vivo and TRE_ex-vivo are the TREs for in vivo and ex vivo results.
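One way to carry out the error analysis of paragraph [0059] numerically is sketched below: the registration difference ΔF between two US-to-camera transforms is computed, its translational and rotational components (ε, α) are extracted, and the small-error approximation is used to predict the displacement of a target point inside the IOUS volume. The transforms and target location are hypothetical; this is an illustration of the bookkeeping, not a report of measured data.

```python
import numpy as np

def rotation_vector(R):
    """Axis-angle vector of a rotation matrix (robust for small angles)."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle < 1e-9:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return angle * axis / (2.0 * np.sin(angle))

def registration_difference(F_pa, F_fid):
    """Delta F = inv(F_fid) @ F_pa for 4x4 homogeneous transforms.

    Returns the difference transform plus its translational component epsilon
    and rotational component alpha."""
    dF = np.linalg.inv(F_fid) @ F_pa
    return dF, dF[:3, 3], rotation_vector(dF[:3, :3])

def homogeneous(R, t):
    F = np.eye(4)
    F[:3, :3], F[:3, 3] = R, t
    return F

Rz = lambda a: np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a),  np.cos(a), 0.0],
                         [0.0,        0.0,       1.0]])

# Hypothetical US-to-camera transforms from PA spots and from surface fiducials.
F_pa = homogeneous(Rz(np.deg2rad(30.5)), np.array([10.2, -4.9, 80.1]))
F_fid = homogeneous(Rz(np.deg2rad(30.0)), np.array([10.0, -5.0, 80.0]))

dF, eps, alpha = registration_difference(F_pa, F_fid)
v = np.array([0.0, 20.0, 60.0])                 # target point in the IOUS volume (mm)
tre_exact = np.linalg.norm(dF[:3, :3] @ v + dF[:3, 3] - v)
tre_small = np.linalg.norm(np.cross(alpha, v) + eps)   # small-error approximation
print("TRE exact vs small-angle (mm):", round(tre_exact, 3), round(tre_small, 3))
```

Repeating this over many poses yields the 6-vectors Δ_k = [ε_k, α_k] whose statistics drive the TRE prediction described above.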
Real-time tracking of registered preoperative data
[0060] Intraoperative ultrasound has significant potential as a "bridge" between preoperative data such as surgical plans or CT and MRI images and intraoperative navigation and visualization systems. In this example, real-time, continuous tracking of registered preoperative models in 3DUS and video camera coordinates can be provided. Since the example above can provide accurate registration of 3DUS to video coordinates, the remaining barrier is finding a way to determine the 3DUS coordinates of targets that may not be directly visible in subsequent 3DUS images, once they have been determined by an initial registration step. As is often the case with tumors, we will assume that these targets and the immediately surrounding tissue are not highly deformable, so that their deformation may be modeled adequately by a rigid, affine, or similar low degree-of-freedom transformation.
[0061] In prior work [3, 52], we demonstrated registration of ultrasound elastography and conventional IOUS images to preoperative CT images and resection plans for laparoscopic partial nephrectomies. One significant challenge for this application is maintaining registration of pertinent parts of the preoperative model in the presence of large anatomic changes during resection. Since the tumor and resection margin must remain intact, our approach in [3] was to implant small EM markers into or near the tumor and to track those to provide real time overlay for guiding the surgeon (Fig. 8). Although successful, this approach introduces complications in dealing with the wires, sterilization, calibration, unavoidable EM interference, and the size of the EM sensors.
[0062] According to an embodiment of the current invention, we replace the EM tracker sensors with small biocompatible metal fiducial objects that may be located readily in PA images (much more easily than in conventional US). These markers would be implanted in or near the target anatomy and localized in 3DPA images taken concurrently with the 3DUS images used for registration. For this tracking embodiment, we can use 1064 nm wavelength pulses, for example, to avoid high absorption by blood and tissue scattering and to achieve deep penetration into the target. We can use the phantom model to determine the point spread function of the PA system due to the impulse excitation, wide angle detection, and other system parameters. We can use standard Radon transform and back projection methods to reconstruct PA images and to determine the most effective system for this particular application. However, the broad concepts of the current invention are not limited to this example. In prior work [43, 44] we have demonstrated accurate, high-contrast PA imaging and localization of brachytherapy seeds in ex vivo dog prostate using 1064 nm laser pulses [4], and other groups have reported similar results (e.g., [5, 47, 73]). Based on our experience [4], we can construct 1 mm markers from the same brachytherapy material.
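Paragraph [0062] refers to standard back-projection reconstruction of the PA images used to localize the implanted markers. As a generic illustration (a simple time-domain delay-and-sum back-projection, not necessarily the specific reconstruction used in the referenced work), the sketch below reconstructs one point-like PA source from simulated linear-array channel data; the array geometry, sampling rate, and sound speed are assumed values.

```python
import numpy as np

C = 1540.0    # speed of sound in soft tissue (m/s); assumed
FS = 40e6     # RF sampling rate (Hz); assumed

def delay_and_sum(channel_data, element_x, grid_x, grid_z, c=C, fs=FS):
    """Delay-and-sum back-projection of photoacoustic channel data.

    channel_data: (n_elements, n_samples) RF data; time zero is the laser
    pulse, so the delay from a pixel to an element is the one-way distance / c.
    element_x: (n_elements,) lateral element positions (m), array at z = 0.
    grid_x, grid_z: 1D pixel coordinates (m) of the reconstruction grid.
    """
    n_elem, n_samp = channel_data.shape
    image = np.zeros((grid_z.size, grid_x.size))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            r = np.hypot(element_x - x, z)            # pixel-to-element distances
            idx = np.round(r / c * fs).astype(int)    # one-way delays in samples
            valid = idx < n_samp
            image[iz, ix] = channel_data[valid, idx[valid]].sum()
    return image

# Simulate a point-like PA source (e.g. a 1 mm marker) at x = 0, z = 20 mm.
element_x = (np.arange(64) - 31.5) * 0.3e-3           # 64 elements, 0.3 mm pitch
data = np.zeros((64, 2048))
delays = np.round(np.hypot(element_x - 0.0, 20e-3) / C * FS).astype(int)
data[np.arange(64), delays] = 1.0

grid_x = np.linspace(-5e-3, 5e-3, 41)
grid_z = np.linspace(15e-3, 25e-3, 41)
img = delay_and_sum(data, element_x, grid_x, grid_z)
iz, ix = np.unravel_index(img.argmax(), img.shape)
print("reconstructed marker at z, x (mm):", grid_z[iz] * 1e3, grid_x[ix] * 1e3)
```

The peak of the reconstructed image gives the marker position in ultrasound/photoacoustic coordinates, which is what the tracking step consumes.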
[0063] Once an initial registration is performed, the tracked positions of the markers can be used to continuously update the coordinate transformation between the preoperative model and 3DUS coordinates. The results from above can be used to enable graphic overlays on the video images similar to Fig. 9. For this example, we can implement rigid and simple affine transformations, but we note that higher order deformable transformations are also possible, depending on the application and number of markers.
[0064] As a test of an embodiment of the current invention, we can implant US-visible "tumors" and PA fiducials in phantoms and obtain initial 3DUS and 3DPA images. The 3DUS images can be segmented to produce models M^0 of the tumors relative to 3DUS coordinates. (Alternative: we can CT scan the phantom, segment to produce models M^CT at poses F^CT in CT coordinates, register CT to 3DUS coordinates, and compute M^0 = F_UCT(M^CT), where F_UCT is the registration transformation.) The positions f^0 of the PA fiducials near each tumor can be determined in 3DUS coordinates from the 3DPA images. We can then systematically modify the imaging arrangement by moving the 3DUS probe and/or distorting the phantom by cutting into it or stretching it, and obtain new 3DUS and 3DPA images. The tumors can be re-segmented to produce models M^(t). The positions f^(t) of the PA fiducials can be determined and transformation parameters η^(t) computed such that f^(t) = T(f^0; η^(t)). We can then compare M^(t) to T(M^0; η^(t)), using standard measures of 3D-3D registration accuracy, such as TREs, average surface distances, and DICE coefficients. We can perform the same procedure in vivo on pig kidneys and livers. In addition, we can create at least one phantom similar to that in Fig. 8, generate graphic overlays, and compare the overlay with the visible tumor.
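The test just described fits transformation parameters η from the fiducial correspondences f^0 → f^(t) and then checks how well T(M^0; η) predicts the re-segmented model M^(t). A minimal sketch of the affine variant (one of the rigid/affine/low degree-of-freedom options mentioned above) is given below; the fiducial and model coordinates are hypothetical.

```python
import numpy as np

def fit_affine(f0, ft):
    """Least-squares affine transform T(x) = A @ x + b such that ft ~= T(f0).

    f0, ft: (N, 3) corresponding fiducial positions (N >= 4, non-coplanar).
    """
    N = f0.shape[0]
    X = np.hstack([f0, np.ones((N, 1))])              # homogeneous coordinates
    params, *_ = np.linalg.lstsq(X, ft, rcond=None)   # (4, 3) parameter block
    return params[:3].T, params[3]                    # A (3x3), b (3,)

def apply_affine(A, b, pts):
    return pts @ A.T + b

# Hypothetical fiducial positions before (f0) and after (ft) probe motion or
# tissue deformation, in 3DUS coordinates (mm).
f0 = np.array([[10., 0., 40.], [14., 5., 42.], [8., -4., 45.], [12., 2., 50.]])
A_true = np.array([[1.02, 0.03, 0.0], [-0.02, 0.98, 0.01], [0.0, 0.0, 1.05]])
b_true = np.array([1.5, -0.8, 2.0])
ft = f0 @ A_true.T + b_true

A_est, b_est = fit_affine(f0, ft)

# Predict where the segmented tumor model moved and compare with the model
# obtained by applying the true deformation to the same surface points.
model_0 = np.array([[11., 1., 43.], [13., -1., 44.], [9., 2., 46.]])
error = np.linalg.norm(apply_affine(A_est, b_est, model_0)
                       - (model_0 @ A_true.T + b_true), axis=1)
print("per-point prediction error (mm):", np.round(error, 6))
```

With real data the comparison would be made against the independently re-segmented model using TREs, surface distances, and DICE coefficients as described above.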
[0065] References
[1] A. Cheng, J. U. Kang, R. H. Taylor, and E. M. Boctor, "Direct 3D Ultrasound to Video Registration Using the Photoacoustic Effect", in Medical Image Computing and Computer-Assisted Intervention (MICCAI), Nice, October, 2012. p. (accepted).
[2] S. Vyas, S. Su, R. Kim, N. Kuo, R. Taylor, J. Kang, and E. Boctor, " Intraoperative Ultrasound to Stereocamera Registration using Interventional Photoacoustic
Imaging", in SPIE Medical Imaging, San Diego, February, 2012.
[3] P. J. Stolka, M. Keil, G. Sakas, E. McVeigh, M. E. Allaf, R. H. Taylor, and E. M.
Boctor, "A 3D-elastography-guided system for laparoscopic partial nephrectomies", in Medical Imaging 2010: Visualization, Image-Guided Procedures, and Modeling, San Diego, Feb 13-18, 2010. pp. 762511-762511-12.
[4] N. Kuo, H. J. Kang, D. Y. Song, J. U. Kang, and E. M. Boctor, "Real-time
Photoacoustic Imaging of Prostate Brachytherapy Seeds Using a Clinical Ultrasound System", Journal of Biomedical Optics, vol. 17- 6, pp. 066005-1 to 066005-10, June, 2012.
[5] T. Harrison and R. J. Zemp, "Coregistered photoacoustic-ultrasound imaging applied to brachytherapy", J. Biomedical Optics, vol. 16- 8, August, 2011.
[6] G. Ku, B. D. Fornage, X. Jin, M. Xu, K. K. Hunt, and L. V. Wang, "Thermoacoustic and Photoacoustic Tomography of Thick Biological Tissues Toward Breast Imaging", Technology in Cancer Research & Treatment, vol. 4- 5, pp. 559-565, 2005.
[7] C. L. Cheung, C. Wedlake, J. Moore, S. E. Pautler, and T. M. Peters, "Fused Video and Ultrasound Images for Minimally Invasive Partial Nephrectomy: A Phantom Study", in Medical Image Computing and Computer-Assisted Interventions
(MICCAI), Beijing, Sept. 20-24, 2010. pp. 408-415.
[8] O. Ukimura, C. Magi-Galluzzi, and I. S. Gill, "Real-Time Transrectal Ultrasound
Guidance During Laparoscopic Radical Prostatectomy: Impact on Surgical Margins", The Journal of Urology, vol. 175- 4, pp. 1304-1310, 2006.
[9] O. Ukimura and I. S. Gill, "Imaging- Assisted Endoscopic Surgery: Cleveland Clinic Experience", Journal of Endourology, vol. 22- 4, p. in press, April, 2008.
[10] M. Menack, J. Spitz, and M. Arregui, "Staging of pancreatic and ampullary cancers for resectability using laparoscopy with laparoscopic ultrasound.", Surg Endosc, vol. 15- 10, pp. 1129-34, Oct, 2001.
[11] S. W. Kwan, M. Bhargavan, and R. K. K. Jr., "Effect of Advanced Imaging Technology on How Biopsies Are Done and Who Does Them", Radiology, vol. 256- 3, September 2010.
[12] M. Choti, "Surgical Management of Hepatocellular Carcinoma: Resection and Ablation", JVIR, vol. 33- 9, pp. 871-6, 2002.
[13] M. Choti, M. E. Bohlman, and S. B. Solomon, "Robotically Assisted Radiofrequency Ablation of Liver Cancer", IHPBA, 2002.
[14] D. R. Reed, K. E. Wallner, S. Narayanan, S. G. Sutlief, E. C. Ford, and P. S. Cho, "Intraoperative Fluoroscopic Dose Assessment in Prostate Brachytherapy Patients", International Journal of Radiation Oncology, Biology, Physics, vol. 63- 1, pp. 301-307, 2005.
[15] S. D. Wexner, R. Bergamaschi, A. Lacy, J. Udo, H. Brolmanm, R. H. Kennedy, and H. John, "The current status of robotic pelvic surgery: results of a multinational interdisciplinary consensus conference", Surgical Endoscopy, vol. 23-, pp. 438-443, 2009.
[16] S. Bhayani and N. Das, "Robotic assisted laparoscopic partial nephrectomy for suspected renal cell carcinoma: retrospective review of surgical outcomes of 35 cases", BMC Surgery, vol. 8- 16, 2008.
[17] J. Stefansic, A. Herline, Y. Shyr, W. Chapman, J. Fitzpatrick, B. Dawant, and R. J. Galloway, "Registration of Physical Space to Laparoscopic image space for use in minimally invasive hepatic surgery", IEEE Trans Med Imaging, vol. 19- 10, pp. 1012-1023, October, 2000.
[18] —, "SonixGPS - a revolutionary needle Guidance Positioning System," http://ultrasonix.com/products/gps, 2011.
[19] E. M. Boctor, Enabling Technologies For Ultrasound Imaging In Computer-Assisted Intervention, thesis in Computer Science Department, Johns Hopkins University, 2006.
[20] E. M. Boctor, M. A. Choti, E. C. Burdette, and R. J. W. III, "Three-dimensional ultrasound-guided robotic needle placement: an experimental evaluation", Int. J. of Medical Robotics and Computer Assisted Surgery, vol. 4- 2, pp. 180-191, 2008.
[21] H. Rivaz, E. Boctor, P. Foroughi, R. Zellars, G. Fichtinger, and G. Hager, "Ultrasound Elastography: A Dynamic Programming Approach", IEEE Transactions on Medical Imaging, vol. 27- 10, pp. 1373-1377, October, 2008.
[22] H. Rivaz, I. Fleming, L. Assumpcao, G. Fichtinger, U. M. Hamper, M. A. Choti, G. D. Hager, and E. Boctor, "Ablation Monitoring with Elastography: 2D In-vivo and 3D Ex-vivo Studies", in Med Image Comput Comput Assist Interv (MICCAI), New York, Sept 6-10, 2008. pp. 458-466.
[23] P. Foroughi, H. Rivaz, I. Fleming, G. D. Hager, and E. M. Boctor, "Tracked Ultrasound Elastography (TrUE)", Medical Image Computing and Computer Assisted Intervention, vol. 13- Pt 2, pp. 9-16, 2010.
[24] J. Machi, S. Uchida, K. Sumida, W. M. L. Limm, S. A. Hundahl, A. J. Oishi, N. L.
Furumoto, and R. H. Oishi, "Ultrasound-Guided Radiofrequency Thermal Ablation of Liver Tumors: Percutaneous, Laparoscopic, and Open Surgical Approaches", Journal of Gastrointestinal Surgery, vol. 5- 5, pp. 477-489, October 2001.
[25] M. Choti, "Radiofrequency Ablation", Cancer Jounal, vol. 6- 4, pp. S291-2, 2000.
[26] M. A. Choti, "Surgical Management of Hepatocellular Carcinoma: Resection and Ablation", Journal of Vascular and Interventional Radiology, vol. 13- 9 Pt 2, pp. S197-203, September 2002.
[27] E. Berber, M. Tsinberg, G. Tellioglu, C. H. Simpfendorfer, and A. E. Siperstein, "Resection versus Laparoscopic Radiofrequency Thermal Ablation of Solitary Colorectal Liver Metastasis", Journal of Gastrointestinal Surgery, vol. 12- 11, pp. 1967-1972, November 2008.
[28] J. E. Olgin, J. M. Kalman, M. Chin, C. Stillson, M. Maguire, P. Ursel, and M. D.
Lesh, "Electrophysiological Effects of Long, Linear Atrial Lesions Placed Under Intracardiac Ultrasound Guidance", Circulation, vol. 96-, pp. 2715-2721, 1997.
[29] J. W. Cannon, J. A. Stall, I. S. Salgo, H. B. Knowles, R. D. Howe, P. E. Dupont, G. R. Marx, and P. J. d. Nido, "Real-Time Three-Dimensional Ultrasound for Guiding Surgical Tasks", Computer Aided Surgery, vol. 8- 2, pp. 82-90, 2003.
[30] W. Wein, B. Roeper, and N. Navab, "Automatic Registration and Fusion of Ultrasound with CT for Radiotherapy", in Medical Image Computing and Computer Assisted Intervention, 2005, pp. 303-311.
[31] G. S. Guthart and J. K. Salisbury, "The Intuitive Telesurgery System: Overview and Application", in Proc. of the IEEE International Conference on Robotics and Automation (ICRA2000), San Francisco, 2000, pp. 618-621.
[32] H. Rivaz, E. Boctor, M. A. Choti, and G. D. Hager, "Real-Time Regularized Ultrasound Elastography.", IEEE Trans. Med. Imaging, vol. 30- 4, pp. 928-945, 2011.
[33] I. N. Fleming, H. Rivaz, K. Macura, L.-M. Su, U. Hamper, T. Lotan, G. Lagoda, A. Burnett, R. H. Taylor, G. D. Hager, and E. M. Boctor, "Ultrasound elastography: enabling technology for image guided laparoscopic prostatectomy", in SPIE Medical Imaging 2009: Visualization, Image-guided Procedures and Modeling, Orlando, Florida, January, 2009. pp. 7261-7273. doi: 10.1117/12.806507.
[34] H. Rivaz, P. Foroughi, I. Fleming, R. Zellars, E. Boctor, and G. Hager, "Tracked Regularized Ultrasound Elastography for Targeting Breast Radiotherapy", in Med Image Comput Comput Assist Interv. (MICCAI), London, Sept. 20-24, 2009. pp. 507-15.
[35] H. J. Kang, P. J. Stolka, and E. M. Boctor, "OpenIGTLinkMUSiiC: A Standard Communications Protocol for Advanced Ultrasound Research", The MIDAS Journal, 2011.
[36] R. S. J. Estepar, N. Stylopoulos, R. E. Ellis, E. Samset, C. Westin, C. Thompson, and K. Vosburgh, "Towards Scarless Surgery: An Endoscopic-Ultrasound Navigation System for Transgastric Access Procedures", in Medical Image Computing and Computer Assisted Intervention, 2006. pp. 445-453.
[37] R. S. J. Estepar, C. Westin, and K. G. Vosburgh, "Towards Real Time 2D to 3D
Registration for Ultrasound-Guided Endoscopic and Laparoscopic Procedures", InternationalJournal of Computer Assisted Radiology and Surgery, vol. 4- 6, pp. 549-560, 2009.
[38] — 'Northern Digital Polaris Family of Optical Tracking Systems, "http: //www
.ndigital.com /medical/polarisfamily.php,2011.
[39] — ,"Claron Technologies MicronTracker,"http: //
clarontech.com/microntracker_technology .php,201 1.
[40] J. Stoll and P. Dupont, " Passive Markers for Ultrasound Tracking of Surgical
Instruments", in Medical Image Computing and Computer-Assisted Interventions, 2005. pp. 41-48.
[41] S. H. Okazawa, R. Ebrahimi, J. Chuang, R. N. Rohling, and S. E. Salcudean,
"Methods for segmenting curved needles in ultrasound images", Medical Image Analysis, vol. 10-, pp. 330-342, 2006.
[42] R. Rohling, W. Fung, and P. Lajevardi, "PUPIL: Programmable Ultrasound Platform and Interface Library", in Medical Image Computing and Computer Assisted
Interventions, 2003. pp. 424-431.
[43] E. Boctor, S. Verma, T. DeJournett, J. Kang, and J. Spicer, "Brachytherapy Seed Localization Using Combined Photoacoustic and Ultrasound Imaging (Abstract 7629-49)", in SPIE Medical Imaging Conference San Diego, 2010, p. 203
[44] N. Kuo, H. J. Kang, T. DeJournett, J. Spicer, and E. Boctor, "Photoacoustic Imaging of Prostate Brachytherapy Seeds in Ex Vivo Prostate", in SPIE Medical Imaging, Lake Buena Vista, Florida, 2011, pp. 796409-01-796409-07
[45] L. V. Wang and H. Wu, Biomedical Optics: Principles and Imaging: Wiley, 2007.
[46] — ,"//en.wikipedia.org/wiki/Photoacoustic_imaging_in_biomedicine,"http:
//en. wikipedia.org/wiki/Photoacoustic_imaging_in_biomedicine,
[47] J. L. Su, R. R. Bouchard, A. B. Karpiouk, J. D. Hazle, and S. Y. Emelianov,
"Photoacoustic imaging of prostate brachytherapy seeds", Biomedical Optics Express, vol. 2- 8, pp. 2243-54, 2011.
[48] H.-J. Kang, N. Kuo, and E. M. Boctor, "Software framework of Real-time
photoacoustic imaging system for Prostate Brachytherapy Seeds", in SPIE Medical Imaging, San Diego, February, 2012.
[49] J. Leven, D. Burschka, R. Kumar, G. Zhang, S. Blumenkranz, X. Dai, M. Awad, G.
Hager, M. Marohn, M. Choti, C. Hasser, and R. H. Taylor, "DaVinci Canvas: A Telerobotic Surgical System with Integrated, Robot- Assisted, Laparoscopic
Ultrasound Capability", in Medical Image Computing and Computer- Assisted Interventions, Palm Springs, CA, 2005. pp. 811-818. PMID: 16685945 [50] M. C. Yip, T. K. Adebar, R. N. Rohling, S. E. Salcudean, and C. Y. Nguan, "3D
Ultrasound to Stereoscopic Camera Registration through an Air-Tissue Boundary", in Medical Image Computing and Computer- Assisted Interventions (MICCAI), Beijing, Sept. 20-24, 2010. pp. 626-634.
[51] J. Su, A. Karpiouk, B. Wang, and S. Emelianov, "Photoacoustic imaging of clinical metal needles in tissue", JBiomed Opt., vol. 15- 2, pp. 021309.1-6, 2010.
[52] M. Keil, P. J. Stolka, M. Wiebel, G. Sakas, E. R. McVeigh, R. H. Taylor, and E.
Boctor, "Ultrasound and CT Registration Quality: Elastography vs. Classical B- Mode", in ISBI, 2009. pp. 967-970.
[53] S. Prahl," Optical properties spectra compiled by Scott Prahl ",http:
//omlc.ogi.edu/spectra/,
[54] C. Tsai, J. Chen, and W. Wang, "Near-infrared absorption property of biological soft tissue constituents", J. Med. Biol. Eng., vol. 21-, pp. 7-14, 2001.
[55] J. Yao and L. V. Wang, "Photoacoustic tomography: fundamentals, advances and prospects", Contrast Media Mol. Imaging, vol. 6-, pp. 332-345, 2011.
[56] E. M. Boctor, "Medical UltraSound Imaging and Intervention Collaboration (MUSiiC) Lab Research, "https: //musiic. lcsr.jhu.edu/Research,
[57] R. H. Taylor, "Computer Integrated Interventional Systems (CiiS) Lab Research,"http:
//ciis . lcsr .jhu. edu/dokuwiki/ doku.php?id=research,
[58] H. J. Kang, P. J. Stolka, and E. M. Boctor, "OpenlTGLinkMUSiiC: A Standard
Communications Protocol for Advanced Ultrasound Research", The MIDAS Journal, 2011.
[59] H.-J. Kang, N. P. Deshmukh, P. Stolka, E. C. Burdette, and E. M. Boctor,
"Ultrasound Imaging Software Framework for Real-Time Monitoring of Acoustic Ablation Therapy", in SP IE Medical Imaging, San Diego, 2012
[60] B. Vagvolgyi, S. Dimaio, A. Deguet, P. Kazanzides, R. Kumar, C. Hasser, and R.
Taylor, "The Surgical Assistant Workstation", in 2008 MICCAI Workshop - Systems and Architectures for Computer Assisted Interventions, New York, September 6, 2008. p. in electronic proceedings at http:
//midasjournal.org/browse/publication/295.
[61] B. Vagvolgyi, Li-Ming Su, R. Taylor, and G. D. Hager, "Video to CT Registration for Image Overlay on Solid Organs", in 4th Workshop on Aubmented Environments for Medical Imaging and Computer- Aided Surgery, September 10, 2008.
[62] P. Kazanzides, S. DiMaio, A. Deguet, B. Vagvolgyi, M. Balicki, C. Schneider, R.
Kumar, A. Jog, B. Itkowitz, C. Hasser, and R. Taylor, "The Surgical Assistant Workstation (SAW) in Minimally- Invasive Surgery and Microsurgery", in
International Workshop on Systems and Architectures for Computer Assisted
Interventions, Beijing, September 24, 2010. [63] G. Hager and K. Toyama., "The XVision System: A General-Purpose Substrate for Portable Real-Time Vision Applications", Computer Vision and Image
Understanding, vol. 69- 1, pp. 23 - 37, January, 1998.
[64] K. S. Arun, T. S. Huang, and S. D. Blostein, "Least-Squares Fitting of Two 3-D Point Sets", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 9- 5, pp. 698-700, 1987.
[65] B. K. P. Horn, H. M. Hilden, and S. Negahdaripour, "Closed-form solution of
absolute orientation using orthonormal matrices", J Optical Cos Am, vol. 5-, pp. 1127-1135, 1988.
[66] P. J. Besl and N. D. McKay, "A Method for Registration of 3-D Shapes", IEEE
Transactions on Pattern Analysis and Machine Intelligence, vol. 14- 2, pp. pp239- 256, 1992.
[67] A. Myronenko and X. Song, "Point-Set Registration: Coherent Point Drift", IEEE Trans, on Pattern Analysis and Machine Intelligence, vol. 32- 12, pp. 2262-2275, 2010.
[68] H. Bopp and H. Krauss, "An Orientation and Calibration Method for Non- Topographic Applications", Photogrammetric Engineering and Remote Sensing, vol. 44- 9, pp. 1 191-1196, September, 1978.
[69] R. M. J. H. Haralick, "2D-3D pose estimation", in Int Conf on Pattern Recognition (ICPR), Nov 14-17, 1988. pp. 385-391.
[70] R. Y. Tsai, "A Versatile Camera Calibration Technique for High- Accuracy 3D
Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of Robotics and Automation, vol. 3- 4, pp. 323-358, 1987.
[71] United States Patent 5,551,429, J. M. Fitzpatrick and J. J. McCrory, "Method for relating the data of an image space to physical space", Filed September 3, Issue date Dec. 8, 1993 (CIP from Feb 12, 1993).
[72] C. Maurer, M. Fitzpatrick, R. Galloway, M. Wang, R. Maciunas, and G. Allen, "The Accuracy of Image Guided Neurosurgery Using Implantable Fiducial Markers", in Computer Assisted Radiology, Berlin, 1995, pp. 1 197-1202
[73] K. Vallum, B. Chinni, and N. Rao, "Photoacoustic Imaging: Opening New Frontiers in Medical Imaging", Journal of Clinical Imaging Science, vol. 1- 2, pp. 1-7, 2011.
[74] J. L. Su, B. Wang, and S. Y. Emelianov, "Photoacoustic imaging of coronary artery stents", Optics Express, vol. 17- 22, pp. 19894-19901, 2009.
[75] C. Kim, T. N. Erpelding, K. Maslov, L. Jankovic, W. J. Akers, L. Song, S. Achilefu, J. A. Margenthaler, M. D. Pashley, and L. V. Wang, "Handheld array-based photoacoustic probe for guiding needle biopsy of sentinel lymph nodes", Journal of Biomedical Optics, vol. 154-, p. 046010, July/August, 2010.
[76] M. G. v. Vledder, E. M. Boctor, L. R. Assumpcao, H. Rivaz, P. Foroughi, G. D.
Hager, U. M. Hamper, T. M. Pawlik, and M. A. Choti, "Intra-operative ultrasound elasticity imaging for monitoring of hepatic tumour thermal ablation", HPB (Oxford), vol. 12- 10, pp. 717-723, December, 2010.
EXAMPLE 2
[0066] In this example, we present direct 3D US to video registration and
demonstrate its feasibility on ex vivo tissue. We use a 3D US transducer instead of a 2D US transducer to detect the PA signal. Using a 3D transducer allows this registration method to function for a non-planar set of 3D points, which can be a significant advantage in a laparoscopic environment because organ surfaces rarely form a planar surface. In addition to using a synthetic phantom with excellent light absorption characteristics, we also use a piece of resected ex vivo porcine liver tissue embedded in a gelatin phantom to demonstrate this method in a practical environment, for example for applications to laparoscopic tumor resection.
[0067] This example will detail the experimental procedure and algorithms to validate this method on a synthetic phantom and an ex vivo liver phantom using a 3D US transducer. We will present target registration error (TRE) results.
Methods
[0068] To perform our experiment, we use a Q-switched neodymium-doped yttrium aluminum garnet (Nd:YAG) Brilliant laser (Quantel Laser, France), frequency doubled to a 532 nm wavelength, at approximately 6 mJ/cm2 to generate a PA effect on the synthetic phantom and approximately 19 mJ/cm2 on the ex vivo tissue phantom. At this wavelength, most of the laser energy is absorbed at the superficial surface of the tissue. However, there is slight penetration into the tissue, creating a source of error that will be discussed. Our stated energy is lower than the maximum permissible exposure of 19.5 mJ/cm2 as calculated from the IEC 60825-1 laser safety standard [14] based on a 0.25 s exposure time, a 4 ns pulse width, and a 10 Hz repetition rate. Alternate tests showed that a lower energy was also able to generate a PA effect on ex vivo tissue. We use a SonixCEP US system along with a 4DL14-5/38 US transducer developed by Ultrasonix Medical Corporation (Richmond, Canada) to scan the volume of interest. The motor actuation of this transducer induces angular movement around an internal pivot point. The Sonix DAQ device, developed in collaboration between the University of Hong Kong and Ultrasonix, and the MUSiiC toolkit [15] are used to acquire pre-beamformed radiofrequency (RF) data directly from the US machine. We use the k-Wave toolbox [16] in MATLAB (MathWorks Inc., Natick, MA), designed for reconstructing PA images from RF data. A custom-built SC system containing two CMLN-13S2C cameras (Point Grey Research, Richmond, Canada) is used to capture images for 3D triangulation. The synthetic phantom is made using plastisol and black dye. The ex vivo liver phantom is made using a gelatin solution and a freshly resected porcine liver. The surface of the liver is partially exposed and not covered by gelatin. Alternate tests with other surfaces such as porcine kidney tissue and fat were also successful in generating a PA signal.
[0069] Our experiment can be split into a data collection phase and a data processing phase. The data collection phase outputs SC image pairs, five frames for each camera, and a 3D RF US volume for each projected laser spot. The number of frames is arbitrary. The data processing phase uses the data and generates a coordinate transformation from the SC frame to the US frame. Figure 10A shows the experimental setup and an overlay of a US image representation using the inverse of the transformation.
[0070] Figure 11A shows the workflow of the data collection phase. First, a laser spot is projected onto the exposed surface of the ex vivo liver phantom. There will be inaccuracies in SC spot triangulation if the laser spot is projected at or near the liver-gelatin interface, because laser spots become irregularly shaped when projected onto clear materials. Second, several images are taken with each camera. The laser spot projected onto the phantom must be visible in at least one image per camera for triangulation to be possible. Our cameras have a faster capture rate than our laser's repetition rate, so some of the frames will be devoid of the laser signal; we will exploit this during data processing. Steps 3 and 4 show that the 3D US transducer motor is actuated and RF data are intermittently collected from the DAQ device to scan and acquire the RF data of the volume of interest. The volume's field of view is 14.7° for the ex vivo tissue phantom and 19.6° for the synthetic phantom, and the motor step size is 0.49°. This iterative process is manual, but an automatic process is in development. This workflow is repeated for each of the thirty laser spots.
[0071] The data processing phase involves the segmentation of the SC images into
3D SC points, the segmentation of the 3D RF US volume data into 3D US points, and the computation of the transformation from the SC frame to the US frame.
[0072] Figure 11B shows the workflow for SC segmentation. For each camera, we pick one SC image with the laser spot and one without it. Next, the background images without the laser spot are subtracted from the images with the laser spot. This step makes it significantly easier to segment the laser spot. We then apply appropriate intensity and pixel-size thresholds such that the laser spot is segmented out. These thresholds are selected based on the laser beam diameter and the phantom's reflectance. Next, we fit an ellipse to the segmented region and compute the intensity-weighted centroid. Calibration files for our specific SC allow us to triangulate the segmented point from each camera and obtain a single 3D point in the SC frame. This workflow is repeated for each laser spot projection. We use thirty sets of SC images.
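For illustration only, a minimal sketch of the background-subtraction and intensity-weighted-centroid steps described above is given below (the ellipse fit is omitted for brevity); the function name, threshold values, and the assumption of grayscale NumPy frames are ours and are not part of the original workflow description. The returned 2D centroid from each camera would then be passed to the stereo triangulation defined by the SC calibration files.

    import numpy as np
    from scipy import ndimage

    def segment_laser_spot(frame_with_spot, frame_without_spot,
                           intensity_thresh=40.0, min_pixels=20):
        # Background subtraction isolates the laser spot.
        diff = frame_with_spot.astype(np.float32) - frame_without_spot.astype(np.float32)
        diff[diff < 0] = 0
        # Intensity threshold followed by a pixel-size (blob area) threshold.
        labels, n_regions = ndimage.label(diff > intensity_thresh)
        for region_id in range(1, n_regions + 1):
            region = labels == region_id
            if region.sum() < min_pixels:
                continue
            ys, xs = np.nonzero(region)
            w = diff[ys, xs]                        # intensity weights
            cy = np.sum(ys * w) / np.sum(w)
            cx = np.sum(xs * w) / np.sum(w)
            return np.array([cx, cy])               # intensity-weighted centroid (pixels)
        return None                                 # no spot found in this image pair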
[0073] The workflow for the segmentation of the 3D RF US volume is shown in
Figure 11C. First, for each slice of a 3D RF US volume, the RF data are beamformed using the k-Wave toolbox [16] in MATLAB. The dynamic range of the image is normalized with respect to the volume to decrease the size of the PA signal seen in each volume. Figure 10B shows the k-Wave beamformed PA signal image. Next, we project the volume onto the lateral-elevational plane by taking the mean along each axial ray. Appropriate intensity and pixel-size thresholds are then applied to this image. An ellipse is fitted to the segmented region and an intensity-weighted centroid is computed, resulting in lateral and elevational coordinates. As described earlier, the PA signal originates from the surface and from any penetration into the surface. Since air cannot generate a PA signal in our setup, we can exploit the fact that the high-intensity pixels farthest away in the axial direction are from the surface. Thus, we obtain the axial coordinate corresponding to a lateral-elevational coordinate as the axial-most high-intensity pixel. This step is particularly important because the penetration of the laser pulse is much deeper for the ex vivo tissue phantom than for the synthetic phantom. We use bilinear interpolation to obtain axial coordinates between sampled points. These three coordinates are converted to 3D US coordinates based on the transducer specifications. This workflow is repeated for each of our thirty 3D RF US volumes.
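A minimal sketch of the projection and axial-surface steps is shown below, assuming the volume has already been beamformed into a non-negative amplitude array indexed (axial, lateral, elevational); the relative threshold and the index-to-millimetre spacings are placeholder values of ours, and the bilinear interpolation between rays and the conversion through the wobbler geometry are omitted.

    import numpy as np

    def localize_pa_spot(volume, rel_thresh=0.5,
                         axial_mm=0.1, lateral_mm=0.3, elev_mm=0.5):
        # volume: beamformed, non-negative PA amplitude, indexed (axial, lateral, elevational).
        vol = volume / volume.max()
        # Project onto the lateral-elevational plane by averaging along each axial ray.
        proj = vol.mean(axis=0)
        mask = proj > rel_thresh * proj.max()
        lat_idx, elev_idx = np.nonzero(mask)
        w = proj[lat_idx, elev_idx]
        lat_c = np.sum(lat_idx * w) / np.sum(w)
        elev_c = np.sum(elev_idx * w) / np.sum(w)
        # Axial coordinate: deepest (axial-most) high-intensity sample along the centroid ray.
        ray = vol[:, int(round(lat_c)), int(round(elev_c))]
        above = np.nonzero(ray > rel_thresh * ray.max())[0]
        ax_c = above[-1]
        # Convert indices to millimetres with placeholder spacings.
        return np.array([ax_c * axial_mm, lat_c * lateral_mm, elev_c * elev_mm])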
[0074] The transformation from the SC frame to the US frame can be computed with the 3D SC and 3D US point sets. Any registration method for computing the transformation between two 3D point sets can be used. We use the coherent point drift algorithm [17] in our experiment. One of the main reasons for using coherent point drift is that it allows for data points to be missing from either dataset. An assumption that we have made is that each laser spot will be visible in the SC images and each PA signal will be visible in the US volume. This assumption is valid for our experiment, but may not hold in the surgical setting due to SC or transducer movement. The coherent point drift registration algorithm allows us to acquire a registration as long as there are enough corresponding points in the SC images and the US volume.
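Although the experiment uses coherent point drift, which tolerates missing points, the quantity being sought can be illustrated with the simpler fully corresponded case. The sketch below is the classical SVD-based least-squares rigid fit between corresponded point sets; it is shown only to make the transformation concrete and is not the coherent point drift algorithm itself.

    import numpy as np

    def rigid_fit(sc_pts, us_pts):
        # Least-squares rotation R and translation t mapping sc_pts onto us_pts,
        # assuming the N x 3 arrays are already in one-to-one correspondence.
        sc_c = sc_pts.mean(axis=0)
        us_c = us_pts.mean(axis=0)
        H = (sc_pts - sc_c).T @ (us_pts - us_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection guard
        R = Vt.T @ D @ U.T
        t = us_c - R @ sc_c
        return R, t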
[0075] The transformation from the SC frame to the US frame is used to transform the 3D SC points to the US frame for validation. The inverse transformation is used to display a representation of an US image into the SC frame as shown in Figure 10A.
Results
[0076] The results of our experiment on the synthetic phantom and on the ex vivo tissue phantom are validated using the target registration error (TRE) metric defined in equation (1). F_SC_US is the transformation from the SC frame to the US frame and is computed with all of the SC and US points except one. The TRE is the resulting difference between the actual US test point and the transformed SC test point in the US frame.
TRE = F_SC_US * SC_test - US_test    (1)
[0077] Twenty-nine of the thirty points are used to compute the transformation from the SC frame to the US frame. The remaining point is used as a test point to compute the TRE. This computation is repeated with each of the thirty points as the test point. Table 1 shows the average and standard deviation of the TRE results for the thirty cases in the synthetic phantom and the ex vivo tissue phantom experiments, respectively.
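A sketch of this leave-one-out evaluation is shown below, using the SVD-based rigid fit from the earlier sketch as a stand-in for the coherent point drift registration and assuming the thirty SC and US points are stored as corresponded N x 3 arrays in millimetres.

    import numpy as np

    def rigid_fit(sc_pts, us_pts):
        # Same SVD-based rigid fit sketched after paragraph [0074].
        sc_c, us_c = sc_pts.mean(axis=0), us_pts.mean(axis=0)
        U, _, Vt = np.linalg.svd((sc_pts - sc_c).T @ (us_pts - us_c))
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, us_c - R @ sc_c

    def leave_one_out_tre(sc_pts, us_pts):
        # sc_pts, us_pts: corresponded N x 3 arrays (mm) of stereo-camera and ultrasound points.
        errors = []
        for i in range(len(sc_pts)):
            keep = np.arange(len(sc_pts)) != i
            R, t = rigid_fit(sc_pts[keep], us_pts[keep])   # register on the other N-1 points
            errors.append(np.linalg.norm(R @ sc_pts[i] + t - us_pts[i]))  # equation (1)
        return np.mean(errors), np.std(errors)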
[0078] Table 1. Average TRE Results for Leave-One-Out Registration Experiments [table values provided as an image in the original document; see paragraph [0082] for the resulting sub-millimeter errors].
Discussion
[0079] There are several considerations when discussing this system's deployment in applications of laparoscopic tumor resections. The first is the placement of the transducer. In our experiments, we use a relatively large 3D US transducer that would be nearly impossible to put inside the body during a laparoscopic procedure. However, the transducer is often placed externally [3], [8] in these procedures, so the size of the probe is not an issue.
Naturally, there are disadvantages to placing the transducer externally and farther from the region or organ of interest. The quality of ultrasound images degrades as the depth increases, which would likely lead to errors in localizing fiducials or, in our case, the PA signal.
However, since the PA signal only has to travel in one direction, as opposed to traditional US, our PA images will have better quality than US images of equivalent depth.
[0080] Another issue with our 3D US transducer is the acquisition speed. There are certain applications where an acquisition speed of a volume per several seconds is sufficient, but a real-time implementation would require a higher acquisition rate. In some
embodiments, 2D array US transducers can be used for a real-time implementation. These transducers can provide an acquisition rate on the order of twenty volumes per second. The 2D array transducer can also be miniaturized and placed closer to the region of interest.
[0081] A third issue deals with the laser delivery system. As shown in our experimental setup, a laser would have to be fired at the organ in free space. This scenario is unlikely in practical situations. A fiber delivery tool can be used to safely guide the laser beam into the patient's body. This tool may also be able to project concurrent laser spots, greatly enhancing our registration acquisition rate.
[0082] At the level of error shown in Table 1, it is likely that the calibration of the SC system is a significant contributor. SC systems are able to locate point sources with sub-millimeter accuracy [6], [7]; this error is usually negligible in comparison with the approximately 3 mm errors of conventional calibration-based methods. Since our results are 0.56 mm and 0.42 mm respectively, the SC system's error becomes significant. We use a custom SC system, so its errors are likely greater than those of a finely tuned commercial SC system.
[0083] The experimental results in Table 1 show that our system achieves sub-millimeter TRE measurements for both the synthetic phantom and the ex vivo tissue phantom. There is a slight difference in the results, and it is entirely due to the elevational error. This is likely due to the larger field of view in the synthetic phantom experiment as well as normal variation across experiments.
[0084] There are a couple of factors that affect these errors as we move from a bench-top setup to an in vivo setting. When our SC system is replaced with a stereo endoscopic camera, the errors may increase, because our SC system has a larger disparity than standard stereo endoscopic cameras. Also, the errors are reported based on surface points. Since the region of interest is often subsurface, our reported TRE will be a biased estimate of subsurface target errors. We believe that the bias will be fairly small since the PA spots are being detected in the same modality as any subsurface regions.
References for Example 2
1. Wang Y., Butner S., and Darzi A.: The developing market for medical robotics.
Proceedings of the IEEE 94(9), 1763-1771, September (2006).
2. Taylor, R., Lavallee, S., Burdea, G., and Mosges, R.: Computer integrated
surgery. MIT Press Cambridge, MA (1996).
3. Stolka P. J., Keil M., Sakas G., McVeigh E., Allaf M. E., Taylor R. H., and Boctor E. M.: A 3D-elastography-guided system for laparoscopic partial nephrectomies, in Medical Imaging 2010: Visualization, Image-Guided Procedures, and Modeling, San Diego, February 13-18, 762511, 762511-12 (2010)
4. Boctor E., Viswanathan A., Choti M., Taylor R., Fichtinger G., and Hager G.: A Novel Closed Form Solution for Ultrasound Calibration, in International Symposium on Biomedical Imaging, Arlington, 527-530 (2004)
5. Poon T. and Rohling R.: Comparison of calibration methods for spatial tracking of a 3-D ultrasound probe. Ultrasound in Medicine and Biology 31(8), 1095-1108, August (2005)
6. Navab N., Mitschke M., and Schutz O.: Camera-Augmented Mobile C-Arm (CAMC) Application: 3D Reconstruction using Low Cost Mobile C-Arm, in MICCAI 1999, 688-697 (1999)
7. Wiles A., Thompson D., and Frantz D.: Accuracy assessment and interpretation for optical tracking systems. Proceedings of SPIE, 5367, 421-432 (2004)
8. Yip M. C., Adebar T., Rohling R., Salcudean S. E., and Nguan C. Y.: 3D Ultrasound to Stereoscopic Camera Registration through an Air-Tissue Boundary, in MICCAI 2010, Part 2, 626-634 (2010)
9. Vyas S., Su S., Kim R., Kuo N., Taylor R. H., Kang J. U., and Boctor E. M.: Interoperative Ultrasound to Stereocamera Registration using Interventional Photoacoustic Imaging, in Medical Imaging 2012: Visualization, Image-Guided Procedures, and Modeling, San Diego, February 4-9, 8316, 83160S (2012)
10. Kolkman R., Steenbergen W., and van Leeuwen T.: In vivo photoacoustic imaging of blood vessels with a pulsed laser diode. Lasers in Medical Science 21(3), 134-139 (2006)
11. Kuo N., Kang H. J., DeJournett T., Spicer J., and Boctor E. M.: Photoacoustic imaging of prostate brachytherapy seeds in ex vivo prostate, in Medical Imaging 2011: Visualization, Image-Guided Procedures, and Modeling, Lake Buena Vista, February 12-17, 7964, 796409 (2011)
12. Xu M. and Wang L.: Photoacoustic imaging in biomedicine. Review of scientific instruments 77, 041101 (2006)
13. Hoelen C., De Mul F., Pongers R., and Dekker A.: Three-dimensional photoacoustic imaging of blood vessels in tissue. Optics Letters 23(8), 648-650 (1998)
14. IEC 60825-1:1993+A1:1997+A2:2001: Safety of Laser Products - Part 1: Equipment Classification and Requirements. International Electrotechnical Commission, Geneva (2001)
15. Kang H. J., Kuo N., Guo X., Song D., Kang J. U., and Boctor E. M.: Software framework of a real-time pre-beamformed RF data acquisition of an ultrasound research scanner, in Medical Imaging 2012: Visualization, Image-Guided Procedures, and Modeling, San Diego, February 4-9, 8320, 83201F (2012)
16. Treeby B., and Cox B.: k-Wave: MATLAB toolbox for the simulation and
reconstruction of photoacoustic wave-fields. Journal of Biomedical Optics 15(2), 021314 (2010)
17. Myronenko A., and Song X.: Point-Set Registration: Coherent Point Drift. IEEE Trans. on Pattern Analysis and Machine Intelligence 32(12), 2262-2275 (2010)
EXAMPLE 3
[0085] In this example we refine and evaluate the registration required to bring the preoperative prostate MRI model into the da Vinci visualization system. To achieve this goal we perform the following three tasks, executed in the following order:
[0086] Task 1: 3DUS B-mode and PA-mode reconstruction
[0087] Rationale: Volumetric intraoperative ultrasound is used to fuse the preoperative MRI model to the surgical scene. In general, 3DUS data can be acquired using two different approaches. One approach is to utilize a 2D ultrasound array to directly provide 3DUS B-mode data. Unfortunately, these 2D arrays are not widely available and, to the best of our knowledge, there is no 2D TRUS array. Alternatively, there are a number of mechanical probes that provide 3DUS data by wobbling a 1D array, but these are relatively slow and need customization to synchronize with a PA imaging system. The second approach is to track a conventional 1D TRUS probe using mechanical, optical or electromagnetic tracking devices. In our previous work, we integrated an EM tracker into a laparoscopic and robotic environment to guide partial nephrectomy procedures [Stolka-2009, -2010] and faced the following challenges: 1) the need for calibration between the US image reference frame and the tracking sensor reference frame; 2) interference with EM trackers (or line-of-sight issues with optical trackers), especially in the robotic environment; 3) the intrusiveness and bulkiness of these trackers, as we need to integrate a base station (field generator for EM or cameras for optical); and 4) an overall navigation accuracy of 3-5 mm, which is not sufficient to navigate critical structures in prostatectomy procedures.
[0088] Our approach is to utilize a readily available dual-array TRUS probe
[Ultrasonix BPC8-4/10 or BCPL9-5/55]. These arrays provide two orthogonal views in real time: a linear array provides a longitudinal section and a convex array provides a transverse section, as shown in Fig. 12. We reconstruct a 3DUS volume by rotating the probe around its main axis and incrementally collecting images from the linear (longitudinal) array. In this motion scenario, the convex (transverse) array images will exhibit in-plane motion (3 degrees of freedom) that tracks the rotational sweep. It is important to note that both arrays can be utilized for recovering tracking information. If a small translational motion is introduced while rotating the probe about its axis, the convex array will not detect the out-of-plane motion, but the linear array image sequence will easily detect it.
[0089] Methodology: For any given image from the convex array, the first step is to find a few "leading points" or feature points inside the image. Given two images, a cost function is defined for specific in-plane degrees of freedom (lateral translation, axial translation, and elevational rotation). These are all global motion parameters, each defined as a single scalar for the whole image. To compute the cost function, simple block matching using normalized cross-correlation (NCC) can be performed. The key is that the block matching only happens around the selected leading points and not over the whole image, which makes it fast and robust. The incoming images are matched against a reference image that does not change until the rotation/translation in the image reaches a certain threshold; at that point, the reference is switched to a new image. If tracking is lost, the algorithm goes into a recovery mode, which down-samples the images and searches the whole image for a match. Because of the down-sampling this mode is not accurate, but accuracy is restored when the algorithm switches back to the normal tracking mode. The JHU group has recently demonstrated a similar method to stitch US images into a panoramic view of an entire forearm. In this application, we do not expect such extreme motion.
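A minimal sketch of the NCC block matching around a single leading point is given below; the block half-size and search range are placeholder values of ours, the in-plane rotation term is omitted, and the leading point is assumed to lie well inside both images.

    import numpy as np

    def ncc(a, b):
        # Normalized cross-correlation of two equally sized blocks.
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom > 0 else 0.0

    def track_leading_point(ref_img, new_img, point, half_block=15, search=10):
        # Exhaustive NCC search for the in-plane (axial, lateral) shift of one leading point.
        y, x = point
        ref_blk = ref_img[y - half_block:y + half_block + 1,
                          x - half_block:x + half_block + 1]
        best_score, best_shift = -1.0, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = new_img[y + dy - half_block:y + dy + half_block + 1,
                               x + dx - half_block:x + dx + half_block + 1]
                if cand.shape != ref_blk.shape:
                    continue                      # shift runs off the image; skip it
                score = ncc(ref_blk, cand)
                if score > best_score:
                    best_score, best_shift = score, (dy, dx)
        return best_shift, best_score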
[0090] With freehand acquisition, the probe motion will involve more than one rotational degree of freedom. The proposed image-based approach can still handle this more complex motion by extending our tracking algorithm to utilize imaging data from both orthogonal arrays. Our team has extensive experience in probe tracking using a single probe/array and relying on fully-developed speckle features, and we do not expect this to be a problem here because the freehand acquisition will be guided by a stage similar to the one shown in Figure 12. Alternatively, we can easily track 6-DoF motion by utilizing the PA-based tracking approach described in Task 2.
[0091] Task 2: Multi-modality fusion utilizing photoacoustic imaging data
[0092] Rationale: Transrectal ultrasound (TRUS) has emerged as the intra-operative imaging modality of choice for radical prostatectomy. Studies have demonstrated that it can help identify prostate margins. An AR navigation system that directly fuses stereo video with a tracked 3D TRUS volume has been proposed for prostatectomy. By directly utilizing the intra-operative imaging modality, the system avoids the complication of deformable registration. However, due to its relatively low resolution and signal-to-noise ratio, TRUS is not ideal for detecting critical detailed surrounding anatomies such as the neurovascular bundles. On the other hand, MR scans provide clear delineation of intraprostatic and surrounding anatomies, but become deregistered immediately after the patient is removed from the MR scan table. By extracting detailed MR models and aligning them with intra-operative TRUS that is tracked within the da Vinci system, we can overlay the models on top of the live stereo video and provide surgeons with "X-ray vision" of critical anatomies.
[0093] Methodology: Before we describe our registration approach, we will detail our intraoperative data acquisition. Task 1 describes the ability to acquire 3DUS data using an available bi-plane TRUS probe. These data include conventional B-mode imaging, which is essential to reveal prostate anatomy and boundary, and PA-mode imaging, which can reveal small vascular structures that cannot be recovered using conventional Doppler imaging. Both prostate anatomy and vascular structures are essential to perform reliable deformable registration with pre-operative MRI. PA imaging is developed based on the photoacoustic effect, originally described by Alexander Graham Bell who showed that thin discs produced sound when exposed to an interrupted beam of sunlight. In PA imaging an object is usually irradiated by a short-pulsed, non-ionizing laser beam. Some of the delivered energy is absorbed, according to optical absorption properties of biological tissue, and converted into heat, leading to transient thermoelastic expansion and thus wideband ultrasonic emission. The generated ultrasonic waves are detected by ultrasonic transducers to form images. It is known that optical absorption is closely associated with physiological properties, such as hemoglobin concentration and oxygen saturation. As a result, the magnitude of the ultrasonic emission (i.e. photoacoustic signal), which is proportional to the local energy deposition, reveals physiologically specific optical absorption contrast.
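For reference, the proportionality between the generated photoacoustic pressure and the local energy deposition described above is commonly written in the photoacoustics literature as

    p_0 = \Gamma \, \eta_{th} \, \mu_a \, F

where p_0 is the initial acoustic pressure, \Gamma the dimensionless Grueneisen parameter, \eta_{th} the fraction of absorbed optical energy converted to heat, \mu_a the optical absorption coefficient, and F the local optical fluence. This standard relation is supplied here only to make the statement above concrete; it is not quoted from the original text.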
[0094] We have demonstrated the use of the photoacoustic (PA) effect to register 3D ultrasound images to 3D video images (see Fig. 13). Dr. Boctor has also conducted experiments to detect brachytherapy seeds based on PA imaging. This has demonstrated accurate, high-contrast PA imaging and localization of brachytherapy seeds in ex vivo dog prostate using 1064 nm laser pulses (see Fig. 14). The system (Fig. 15) includes: 1) a laser illumination system; 2) a dual-array TRUS probe; and 3) a da Vinci endoscope. The illumination system will direct a "Z" or "N" pattern of laser pulses onto the surface of the prostate within the field of view of the endoscope. It may be mounted with the camera or may be held by one of the da Vinci instruments, so long as it shines these patterns where they are visible from the endoscope. The pattern can be produced through the following design alternatives: a) multiple optical fibers to shape the pattern; b) a relatively large fiber (1-2 mm in diameter) with a lens and aperture to produce the pattern; and c) a single small fiber (200 μm diameter) that can be actuated by small motors to produce the pattern. Energy absorbed from the pulsed laser spots will generate PA signals that may be imaged by the ultrasound system, thus creating a pattern of "virtual" surface "Z" fiducial markers. Because the selected wavelength (532 nm) of these laser patterns is visible at all times in the endoscope, the registration between video and ultrasound coordinate systems is a straightforward process [Cheng-2012]. Since the ultrasound images can concurrently locate prostate anatomy, registration and tracking of preoperative MRI images relative to the video coordinate system may likewise be accomplished. The laser illumination system can target a relatively larger field-of-view with wavelengths (700-800 nm) to generate PA images of the neuro-vascular bundle, which will be utilized, together with the B-mode information, to perform multi-modality registration.
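One standard way such a "Z" pattern yields point correspondences (the exact procedure of [Cheng-2012] may differ) is the classical N-fiducial construction: the ultrasound plane crosses the three legs of the "Z" at three nearly collinear points, and the ratio of in-plane distances between them locates the crossing along the diagonal leg, whose endpoints are known in the camera frame from the video segmentation of the visible laser pattern. A minimal sketch, assuming an approximately planar pattern with parallel outer legs; the function name and argument conventions are hypothetical:

    import numpy as np

    def z_fiducial_point(p_leg1, p_diag, p_leg2, diag_start_cam, diag_end_cam):
        # p_leg1, p_diag, p_leg2: in-image coordinates where the US plane crosses the
        # first leg, the diagonal, and the last leg of the "Z" (three nearly collinear points).
        # diag_start_cam, diag_end_cam: 3D endpoints of the diagonal leg in the camera frame,
        # with diag_start_cam on the same side as p_leg1.
        p_leg1, p_diag, p_leg2 = map(np.asarray, (p_leg1, p_diag, p_leg2))
        r = np.linalg.norm(p_diag - p_leg1) / np.linalg.norm(p_leg2 - p_leg1)
        # The same fractional position applies along the 3D diagonal, giving one
        # camera-frame point that corresponds to the diagonal crossing seen in the US image.
        return np.asarray(diag_start_cam) + r * (np.asarray(diag_end_cam) - np.asarray(diag_start_cam))

Each such correspondence pairs a 3D camera-frame point with the 3D ultrasound-frame location of the diagonal crossing; three or more such pairs allow the rigid video-to-ultrasound transformation to be estimated as in Example 2.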
[0095] Our current Q-switched laser has two main wavelength outputs (1064 nm and the frequency-doubled 532 nm output) and tunable output from the OPO unit (690-950 nm). The 532 nm laser light is green in color and can produce a strong photoacoustic signal since it is strongly absorbed by the soft tissues and blood typical of most surgeries. 1064 nm light has greater penetration depth than the 532 nm light, mainly because both Hb and HbO2 have a lower optical absorption coefficient at 1064 nm than at 532 nm. In fact, Hb has a relatively higher optical absorption coefficient than HbO2 in the range of 650-750 nm. For surface laser spots and for tracking tasks we will use 532 nm. With our recent PA experiments utilizing kidney, liver [Cheng-2012] and fat, we have shown that 532 nm generates PA signals at the tissue-air interface. For PA vascular imaging and for initial multi-modality fusion of US/PA with preoperative MRI, we will investigate the use of the following wavelengths: 532 nm and the range of 690-750 nm. We will also explore and investigate the use of low-cost pulsed laser diodes (PLD). Some results indicate that we can get usable PA surface images in phantoms with 2-16 μJ pulses, comparable to the energy available from PLDs.
[0096] Other embodiments are not limited to the examples and embodiments described above. For example, in another embodiment of this invention, a single "one dimensional" ultrasound sensor may be deployed on the end of a catheter or probe inserted into an organ and used to determine the distances to multiple spots on the surface of the organ. This information may be used to determine the 3D position of the sensor relative to the spots and hence to the optical imaging system.
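As an illustration of how such distance measurements could be turned into a 3D position, the sketch below solves the standard linearized multilateration problem; the function name, the assumption of photoacoustic time-of-flight ranges converted with an assumed speed of sound, and the requirement of at least four non-coplanar spots are details of this illustration rather than of the embodiment described above.

    import numpy as np

    def locate_sensor(spots, dists):
        # spots: N x 3 positions of the illuminated surface spots (e.g. in the optical frame).
        # dists: N range measurements from the single-element sensor to those spots.
        # At least four non-coplanar spots are needed for a unique 3D solution.
        spots = np.asarray(spots, dtype=float)
        dists = np.asarray(dists, dtype=float)
        # Subtract the first range equation from the others to linearize the problem.
        A = 2.0 * (spots[1:] - spots[0])
        b = (np.sum(spots[1:] ** 2, axis=1) - np.sum(spots[0] ** 2)
             - dists[1:] ** 2 + dists[0] ** 2)
        position, *_ = np.linalg.lstsq(A, b, rcond=None)
        return position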
[0097] In another embodiment, optical illumination and the generated pattern can be two separate events. As described above, the purpose of photoacoustic tracking according to some embodiments of the current invention is to generate several fiducial landmarks (spots or patterns) that can be observed by the camera system (single, stereo, or more than two cameras) and also detected by the ultrasound sensing system. Hence, a spatial relationship between the ultrasound sensor and the camera system can be calculated by matching these features in both spaces. These features or patterns can be generated by the same illumination system. For example, one can use multiple fibers to generate a random pattern of several non-collinear spots as shown in [Cheng-2012] and Figure 2, for example. This pattern can also be of "Z" or "N" shape, which the ultrasound image can intersect in a few points, typically three across the "Z" shape. In addition, the pattern can be generated physically. This means that the surgeon can paint a pattern directly on top of the surgical site or suture a random "small" material to the surface to be cut. In this case, the illumination system can be simplified to generate enough flux to cover the area of interest. The physical paint or sutured fiducial marks can be made of materials that are sensitive to a specific wavelength and can also provide a better transduction coefficient (i.e., conversion ratio from light to sound energy). These materials can be made of black paint, or of polydimethylsiloxane (PDMS) mixed with carbon black, which has a thermal expansion coefficient of 310x10^-6 /K, more than ten times higher than that of common metals.
[0098] The embodiments illustrated and discussed in this specification are intended only to teach those skilled in the art how to make and use the invention. In describing embodiments of the invention, specific terminology is employed for the sake of clarity. However, the invention is not intended to be limited to the specific terminology so selected. The above-described embodiments of the invention may be modified or varied, without departing from the invention, as appreciated by those skilled in the art in light of the above teachings. It is therefore to be understood that, within the scope of the claims and their equivalents, the invention may be practiced otherwise than as specifically described.

Claims

WE CLAIM:
1. An intraoperative registration and tracking system, comprising: an optical source configured to illuminate tissue intraoperatively with electromagnetic radiation at a substantially localized spot so as to provide a photoacoustic source at said substantially localized spot; an optical imaging system configured to form an optical image of at least a portion of said tissue and to detect and determine a position of said substantially localized spot in said optical image; an ultrasound imaging system configured to form an ultrasound image of at least a portion of said tissue and to detect and determine a position of said substantially localized spot in said ultrasound image; and a registration system configured to determine a coordinate transformation that registers said optical image with said ultrasound image based at least partially on a correspondence of said spot in said optical image with said spot in said ultrasound image.
2. An intraoperative registration and tracking system according to claim 1, wherein said optical source is configured to illuminate said tissue intraoperatively with electromagnetic radiation at a plurality of substantially localized spots so as to provide a plurality of photoacoustic sources corresponding to said plurality of substantially localized spots, wherein said optical imaging system is further configured to detect and determine a position of each of said plurality of substantially localized spots in said optical image, wherein said ultrasound imaging system is further configured to detect and determine a position of each of said plurality of substantially localized spots in said ultrasound image, and wherein said registration system is further configured to determine said coordinate transformation that registers said optical image with said ultrasound image based at least partially on a correspondence of at least some of said plurality of spots in said optical image with corresponding spots in said ultrasound image.
3. An intraoperative registration and tracking system according to claim 2, wherein said registration system is configured to perform a coherent point drift algorithm for determining said coordinate transformation that registers said optical image with said ultrasound image.
4. An intraoperative registration and tracking system according to claim 2, wherein said optical source comprises a plurality of optical waveguides to selectively provide said plurality of substantially localized spots.
5. An intraoperative registration and tracking system according to claim 2, wherein said optical source comprises steerable optical components to selectively provide said plurality of substantially localized spots.
6. An intraoperative registration and tracking system according to claim 1, wherein said optical imaging system comprises a stereo video camera.
7. An intraoperative registration and tracking system according to claim 1, wherein said optical source comprises a pulsed laser.
8. An intraoperative registration and tracking system according to claim 7, wherein said pulsed laser is configured to selectively emit at a primary wavelength and at a frequency doubled wavelength.
9. An intraoperative registration and tracking system according to claim 7, wherein said optical source comprises an optical fiber rigidly attached relative to said optical imaging system, and wherein at least a portion of said optical imaging system and said optical fiber are configured to be inserted at least one of interstitially or laparoscopically.
10. An intraoperative registration and tracking system according to claim 9, wherein said optical source further comprises at least a second optical fiber rigidly attached relative to a surgical tool to be at least one of registered or tracked by said intraoperative registration and tracking system.
11. An intraoperative registration and tracking system according to claim 1, further comprising a turret upon which at least a portion of said optical source and said optical imaging system are attached, wherein at least one of a position and orientation of at least a portion of said turret can be adjusted during a surgical procedure.
12. An intraoperative registration and tracking system according to claim 1, wherein said ultrasound imaging system comprises an ultrasound transducer that is configured to receive ultrasound signals from said photoacoustic source, and wherein said ultrasound imaging system is configured to form said ultrasound image based on said ultrasound signals from said photoacoustic source.
13. An intraoperative registration and tracking system according to claim 1, wherein said ultrasound imaging system comprises an ultrasound transducer that is configured to receive ultrasound signals from said photoacoustic source and to transmit ultrasound signals, and wherein said ultrasound imaging system is configured to form said ultrasound image based on at least one of said ultrasound signals from said photoacoustic source and ultrasound signals that return to said ultrasound transducer after being emitted from said ultrasound transducer.
14. An intraoperative registration and tracking system according to claim 1, further comprising a tracking system configured to identify an object in a plurality of registered optical and ultrasound images at a corresponding plurality of times to thereby track positions of said object at each of said plurality of times.
15. An intraoperative registration and tracking system according to any one of claims 1-14, wherein said optical source is further configured to provide continuous illumination to illuminate at least a portion of said tissue intraoperatively with light to be detected by said optical imaging system.
16. An intraoperative registration and tracking system according to claim 15, wherein said optical source comprises a continuous wave laser.
17. An intraoperative registration and tracking system according to any one of claims 1-14, wherein said registration system is further configured to register preoperative images with said optical and ultrasound images based on fiducial markers in said tissue that are locatable in at least one of said optical image and said ultrasound image.
18. An intraoperative registration and tracking system according to claim 15, wherein said registration system is further configured to register preoperative images with said optical and ultrasound images based on fiducial markers in said tissue that are locatable in at least one of said optical image and said ultrasound image.
19. An intraoperative registration and tracking system according to any one of claims 1-14, wherein said optical source is further configured to provide a safety beam of light.
20. An intraoperative registration and tracking system according to claim 19, further comprising a light leakage detector arranged to detect levels of said safety light outside of selected illumination zones.
21. An intraoperative registration and tracking system according to claim 20, wherein said safety beam of light is infrared light.
22. An intraoperative registration and tracking system according to claim 20, wherein said safety beam of light is modulated to provide a detection signature.
23. An intraoperative registration and tracking system according to claim 20, wherein said optical source is further configured to disable illumination of said tissue intraoperatively with said electromagnetic radiation at said substantially localized spot responsive to signals from said light leakage detector.
24. An intraoperative registration and tracking system according to any one of claims 2-5 and 7-14, wherein said optical imaging system comprises a stereo video camera.
25. An intraoperative registration and tracking system according to any one of claims 2-5 and 8-14, wherein said optical source comprises a pulsed laser.
PCT/US2013/030273 2012-03-09 2013-03-11 Photoacoustic tracking and registration in interventional ultrasound WO2013134782A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/381,374 US10758209B2 (en) 2012-03-09 2013-03-11 Photoacoustic tracking and registration in interventional ultrasound

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261608910P 2012-03-09 2012-03-09
US61/608,910 2012-03-09

Publications (1)

Publication Number Publication Date
WO2013134782A1 true WO2013134782A1 (en) 2013-09-12

Family

ID=49117433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/030273 WO2013134782A1 (en) 2012-03-09 2013-03-11 Photoacoustic tracking and registration in interventional ultrasound

Country Status (2)

Country Link
US (1) US10758209B2 (en)
WO (1) WO2013134782A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015085109A1 (en) * 2013-12-04 2015-06-11 The Johns Hopkins University Systems and methods for real-time tracking of photoacoustic sensing
WO2015112817A1 (en) * 2014-01-24 2015-07-30 Actuated Medical, Inc. Photoacoustic needle insertion platform
WO2015092664A3 (en) * 2013-12-18 2015-09-24 Koninklijke Philips N.V. Electromagnetic tracker based ultrasound probe calibration
EP2974771A1 (en) * 2014-07-18 2016-01-20 Universität der Bundeswehr München A method and apparatus for determining an energy deposition of an ion beam
JP2016101416A (en) * 2014-11-28 2016-06-02 キヤノン株式会社 Subject information acquisition device
WO2016119840A1 (en) * 2015-01-28 2016-08-04 Brainlab Ag Light point identification method
WO2016136206A1 (en) * 2015-02-26 2016-09-01 Canon Kabushiki Kaisha Phantom
CN106687049A (en) * 2014-09-19 2017-05-17 富士胶片株式会社 Photoacoustic image generation method and device
JP2017513662A (en) * 2014-03-28 2017-06-01 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Alignment of Q3D image with 3D image
US9877697B2 (en) 2014-04-30 2018-01-30 Emory University Systems, methods and computer readable storage media storing instructions for generating planning images based on HDR applicators
CN107913102A (en) * 2016-10-06 2018-04-17 韦伯斯特生物官能(以色列)有限公司 Anatomic image and the positioning control system carried out using ultrasound it is preoperative registering
CN108778113A (en) * 2015-09-18 2018-11-09 奥瑞斯健康公司 The navigation of tubulose network
US10334227B2 (en) 2014-03-28 2019-06-25 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10350009B2 (en) 2014-03-28 2019-07-16 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging and printing of surgical implants
US10368054B2 (en) 2014-03-28 2019-07-30 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
CN111110347A (en) * 2019-11-29 2020-05-08 中奕智创医疗科技有限公司 Ultrasonic positioning system, device and storage medium based on biplane image
EP3939496A1 (en) * 2020-07-15 2022-01-19 Deutsches Krebsforschungszentrum - Stiftung des öffentlichen Rechts / Universität Heidelberg Method and system for context-aware photoacoustic imaging
US11266465B2 (en) 2014-03-28 2022-03-08 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
CN115177217A (en) * 2022-09-09 2022-10-14 之江实验室 Photoacoustic signal simulation method and device based on spherical particle light pulse excitation effect

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6106259B2 (en) * 2012-03-21 2017-03-29 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Clinical workstation integrating medical imaging and biopsy data and method of using the same
US9186053B2 (en) * 2012-05-03 2015-11-17 Covidien Lp Methods of using light to repair hernia defects
CN103544688B (en) * 2012-07-11 2018-06-29 东芝医疗系统株式会社 Medical imaging fusing device and method
JP6309240B2 (en) * 2012-10-26 2018-04-11 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic equipment
JP6238539B2 (en) * 2013-03-21 2017-11-29 キヤノン株式会社 Processing apparatus, subject information acquisition apparatus, and processing method
JP6223129B2 (en) * 2013-10-31 2017-11-01 キヤノン株式会社 Subject information acquisition apparatus, display method, subject information acquisition method, and program
JP6621819B2 (en) * 2014-10-30 2019-12-18 セノ メディカル インストルメンツ,インク. Photoacoustic imaging system with detection of relative orientation of light source and acoustic receiver using acoustic waves
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10806346B2 (en) * 2015-02-09 2020-10-20 The Johns Hopkins University Photoacoustic tracking and registration in interventional ultrasound
US9436993B1 (en) * 2015-04-17 2016-09-06 Clear Guide Medical, Inc System and method for fused image based navigation with late marker placement
US10028662B2 (en) * 2015-05-14 2018-07-24 Endra Life Sciences Inc. Systems and methods for imaging biological tissue structures
US10898166B2 (en) * 2015-05-14 2021-01-26 Endra Life Sciences Inc. Systems and methods for imaging biological tissue structures
JP2017012485A (en) * 2015-07-01 2017-01-19 キヤノン株式会社 Biomarker and subject information acquisition apparatus
US10849650B2 (en) * 2015-07-07 2020-12-01 Eigen Health Services, Llc Transperineal needle guidance
US20170055844A1 (en) * 2015-08-27 2017-03-02 Canon Kabushiki Kaisha Apparatus and method for acquiring object information
US20170084036A1 (en) * 2015-09-21 2017-03-23 Siemens Aktiengesellschaft Registration of video camera with medical imaging
US11083437B2 (en) * 2015-11-02 2021-08-10 Purdue Research Foundation Method and device for in situ cancer margin detection
CN109310363B (en) * 2015-11-07 2022-08-02 普渡研究基金会 Intraoperative photoacoustic navigation device and method
WO2017096241A1 (en) 2015-12-02 2017-06-08 Augmenteum, Inc. System for and method of projecting augmentation imagery in a head-mounted display
US10932751B2 (en) * 2016-09-30 2021-03-02 The Johns Hopkins University Catheter ultrasound transmission element (CUTE) catheter
US10918445B2 (en) * 2016-12-19 2021-02-16 Ethicon Llc Surgical system with augmented reality display
JP2018126389A (en) * 2017-02-09 2018-08-16 キヤノン株式会社 Information processing apparatus, information processing method, and program
US20200143534A1 (en) * 2017-06-29 2020-05-07 Sony Corporation Medical imaging system, method and computer program product
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
JP6469295B1 (en) * 2018-09-21 2019-02-13 株式会社A−Traction Surgery support apparatus, control method therefor, and surgery support system
US10799090B1 (en) * 2019-06-13 2020-10-13 Verb Surgical Inc. Method and system for automatically turning on/off a light source for an endoscope during a surgery
JP7362354B2 (en) * 2019-08-26 2023-10-17 キヤノン株式会社 Information processing device, inspection system and information processing method
DE102019214303B4 (en) * 2019-09-19 2022-07-28 Siemens Healthcare Gmbh Method and system for assisting medical personnel in resection and computer program product
JP2023505309A (en) * 2019-12-03 2023-02-08 ティシュー ディファレンシエーション インテリジェンス,エルエルシー Intraoperative ultrasound probe system and related methods
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11607277B2 (en) 2020-04-29 2023-03-21 Globus Medical, Inc. Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11786128B2 (en) 2020-06-18 2023-10-17 Illumisonics Inc. PARS imaging methods
US11122978B1 (en) * 2020-06-18 2021-09-21 Illumisonics Inc. PARS imaging methods
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11289196B1 (en) 2021-01-12 2022-03-29 Emed Labs, Llc Health testing and diagnostics platform
US11929168B2 (en) 2021-05-24 2024-03-12 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11615888B2 (en) 2021-03-23 2023-03-28 Emed Labs, Llc Remote diagnostic testing and treatment
US20220354380A1 (en) * 2021-05-06 2022-11-10 Covidien Lp Endoscope navigation system with updating anatomy model
JP2022180177A (en) * 2021-05-24 2022-12-06 富士フイルム株式会社 Endoscope system, medical image processing device, and operation method thereof
US11369454B1 (en) 2021-05-24 2022-06-28 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
CN113349980B (en) * 2021-06-11 2022-12-09 北京德铭联众科技有限公司 Ultrasonic guiding, positioning and adjusting device for accurately injuring tissues in animal body
US11610682B2 (en) 2021-06-22 2023-03-21 Emed Labs, Llc Systems, methods, and devices for non-human readable diagnostic tests
JP2023005896A (en) * 2021-06-29 2023-01-18 富士フイルム株式会社 Endoscope system, medical image processing apparatus and operation method of the same
WO2023031688A1 (en) * 2021-09-01 2023-03-09 Rsip Neph Ltd. Combined multi-imaging modalities in surgical procedures

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4896673A (en) 1988-07-15 1990-01-30 Medstone International, Inc. Method and apparatus for stone localization using ultrasound imaging
US5285788A (en) 1992-10-16 1994-02-15 Acuson Corporation Ultrasonic tissue imaging method and apparatus with doppler velocity and acceleration processing
US5730130A (en) 1993-02-12 1998-03-24 Johnson & Johnson Professional, Inc. Localization cap for fiducial markers
US5765561A (en) 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
US5868673A (en) 1995-03-28 1999-02-09 Sonometrics Corporation System for carrying out surgery, biopsy and ablation of a tumor or other physical anomaly
US5817022A (en) 1995-03-28 1998-10-06 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
US6160835A (en) * 1998-03-20 2000-12-12 Rocky Mountain Instrument Co. Hand-held marker with dual output laser
US6178340B1 (en) 1998-08-24 2001-01-23 Eduardo Svetliza Three-dimensional infrared imager for subcutaneous puncture and study of vascular network
WO2002000093A2 (en) 2000-06-27 2002-01-03 Insightec-Image Guided Treatment Ltd. Registration of target object images to stored image data
WO2002024094A2 (en) 2000-09-25 2002-03-28 Insightec-Image Guided Treatment Ltd. Non-invasive system and device for locating a surface of an object in a body
US7914453B2 (en) 2000-12-28 2011-03-29 Ardent Sound, Inc. Visual imaging system for ultrasonic probe
US7103212B2 (en) * 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
JP2004222870A (en) * 2003-01-21 2004-08-12 Pentax Corp Probe for endoscope
WO2005043319A2 (en) 2003-10-21 2005-05-12 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for intraoperative targeting
IL166408A0 (en) 2005-01-20 2006-01-15 Ultraview Ltd Combined 2d pulse-echo ultrasound and optoacoustic signal for glaucoma treatment
EP1795142B1 (en) 2005-11-24 2008-06-11 BrainLAB AG Medical tracking system using a gamma camera
US20090054763A1 (en) * 2006-01-19 2009-02-26 The Regents Of The University Of Michigan System and method for spectroscopic photoacoustic tomography
WO2007115825A1 (en) 2006-04-12 2007-10-18 Nassir Navab Registration-free augmentation device and method
JP5432708B2 (en) 2006-06-23 2014-03-05 コーニンクレッカ フィリップス エヌ ヴェ Timing control device for photoacoustic and ultrasonic composite imager
WO2008004222A2 (en) 2006-07-03 2008-01-10 Yissum Research Development Company Of The Hebrew University Of Jerusalem Computer image-aided method and system for guiding instruments through hollow cavities
US20080123083A1 (en) * 2006-11-29 2008-05-29 The Regents Of The University Of Michigan System and Method for Photoacoustic Guided Diffuse Optical Imaging
WO2008103383A1 (en) 2007-02-20 2008-08-28 Gildenberg Philip L Videotactic and audiotactic assisted surgical methods and procedures
US8771188B2 (en) * 2007-06-20 2014-07-08 Perception Raisonnement Action En Medecine Ultrasonic bone motion tracking system
JP2009018427A (en) 2007-07-10 2009-01-29 Seiko Epson Corp Liquid jet apparatus, and cleaning method of liquid ejection head in liquid jet apparatus
FR2920084B1 (en) 2007-08-24 2010-08-20 Endocontrol Imaging system for monitoring a surgical tool in an operative field
US8063822B2 (en) 2008-06-25 2011-11-22 Rockstar Bidco L.P. Antenna system
WO2010107930A1 (en) * 2009-03-17 2010-09-23 The Uwm Research Foundation, Inc. Ultrasonic imaging device
JP2011005042A (en) 2009-06-26 2011-01-13 Canon Inc Photoacoustic imaging apparatus and photoacoustic imaging method
WO2011063266A2 (en) * 2009-11-19 2011-05-26 The Johns Hopkins University Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
CN103209656B (en) 2010-09-10 2015-11-25 The Johns Hopkins University Visualization of registered subsurface anatomy
US8686335B2 (en) * 2011-12-31 2014-04-01 Seno Medical Instruments, Inc. System and method for adjusting the light output of an optoacoustic imaging system
EP2725984B1 (en) 2011-07-01 2018-10-17 Koninklijke Philips N.V. Object-pose-based initialization of an ultrasound beamformer
US20140378796A1 (en) 2011-12-30 2014-12-25 Koninklijke Philips N.V. System and method for needle navigation using PA effect in US imaging
GB201307551D0 (en) 2013-04-26 2013-06-12 Ucl Business Plc A method and apparatus for determining the location of a medical instrument with respect to ultrasound imaging and a medical instrument
US10806346B2 (en) 2015-02-09 2020-10-20 The Johns Hopkins University Photoacoustic tracking and registration in interventional ultrasound

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050049486A1 (en) * 2003-08-28 2005-03-03 Urquhart Steven J. Method and apparatus for performing stereotactic surgery
US20090015826A1 (en) * 2006-03-30 2009-01-15 Duke University Optical assay system for intraoperative assessment of tumor margins
US20100168561A1 (en) * 2006-12-18 2010-07-01 Trillium Precision Surgical, Inc. Intraoperative Tissue Mapping and Dissection Systems, Devices, Methods, and Kits
WO2011100753A2 (en) * 2010-02-15 2011-08-18 The Johns Hopkins University Interventional photoacoustic imaging system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VYAS, S. ET AL.: "Intraoperative ultrasound to stereocamera registration using interventional photoacoustic imaging", MEDICAL IMAGING, vol. 8316, 23 February 2012 (2012-02-23), pages 83160S-1 - 83160S-8 *

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015085109A1 (en) * 2013-12-04 2015-06-11 The Johns Hopkins University Systems and methods for real-time tracking of photoacoustic sensing
US9723995B2 (en) 2013-12-04 2017-08-08 The Johns Hopkins University Systems and methods for real-time tracking of photoacoustic sensing
CN105828722A (en) * 2013-12-18 2016-08-03 皇家飞利浦有限公司 Electromagnetic tracker based ultrasound probe calibration
WO2015092664A3 (en) * 2013-12-18 2015-09-24 Koninklijke Philips N.V. Electromagnetic tracker based ultrasound probe calibration
CN105828722B (en) * 2013-12-18 2019-10-01 皇家飞利浦有限公司 Ultrasonic probe calibration based on electromagnetic tracker
WO2015112817A1 (en) * 2014-01-24 2015-07-30 Actuated Medical, Inc. Photoacoustic needle insertion platform
GB2538013A (en) * 2014-01-24 2016-11-02 Actuated Medical Inc Photoacoustic needle insertion platform
US10334227B2 (en) 2014-03-28 2019-06-25 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10368054B2 (en) 2014-03-28 2019-07-30 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US10350009B2 (en) 2014-03-28 2019-07-16 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging and printing of surgical implants
US11266465B2 (en) 2014-03-28 2022-03-08 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
US11304771B2 (en) 2014-03-28 2022-04-19 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
JP2017513662A (en) * 2014-03-28 2017-06-01 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Alignment of Q3D image with 3D image
US9877697B2 (en) 2014-04-30 2018-01-30 Emory University Systems, methods and computer readable storage media storing instructions for generating planning images based on HDR applicators
WO2016009042A1 (en) * 2014-07-18 2016-01-21 Ludwig-Maximilians-Universität München A method and apparatus for determining an energy deposition of an ion beam
EP2974771A1 (en) * 2014-07-18 2016-01-20 Universität der Bundeswehr München A method and apparatus for determining an energy deposition of an ion beam
US10441817B2 (en) 2014-07-18 2019-10-15 Ludwig-Maximilians-Universität München Method and apparatus for determining an energy deposition of an ion beam
CN106687049A (en) * 2014-09-19 2017-05-17 富士胶片株式会社 Photoacoustic image generation method and device
CN106687049B (en) * 2014-09-19 2019-09-17 富士胶片株式会社 Photoacoustic image generation method and device
EP3195809A4 (en) * 2014-09-19 2017-10-04 Fujifilm Corporation Photoacoustic image generation method and device
JP2016101416A (en) * 2014-11-28 2016-06-02 キヤノン株式会社 Subject information acquisition device
US10456214B2 (en) 2015-01-28 2019-10-29 Brainlab Ag Light point identification method
EP3461451A1 (en) * 2015-01-28 2019-04-03 Brainlab AG Laser pointer system for radiotherapy
WO2016119840A1 (en) * 2015-01-28 2016-08-04 Brainlab Ag Light point identification method
WO2016136206A1 (en) * 2015-02-26 2016-09-01 Canon Kabushiki Kaisha Phantom
CN108778113A (en) * 2015-09-18 2018-11-09 Auris Health, Inc. Navigation of tubular networks
CN108778113B (en) * 2015-09-18 2022-04-15 奥瑞斯健康公司 Navigation of tubular networks
US11403759B2 (en) 2015-09-18 2022-08-02 Auris Health, Inc. Navigation of tubular networks
CN107913102A (en) * 2016-10-06 2018-04-17 Biosense Webster (Israel) Ltd. Preoperative registration of anatomical images with position tracking system using ultrasound
CN107913102B (en) * 2016-10-06 2022-08-16 韦伯斯特生物官能(以色列)有限公司 Preoperative registration of anatomical images with position tracking system using ultrasound
CN111110347A (en) * 2019-11-29 2020-05-08 中奕智创医疗科技有限公司 Ultrasonic positioning system, device and storage medium based on biplane image
EP3939496A1 (en) * 2020-07-15 2022-01-19 Deutsches Krebsforschungszentrum - Stiftung des öffentlichen Rechts / Universität Heidelberg Method and system for context-aware photoacoustic imaging
WO2022013397A1 (en) * 2020-07-15 2022-01-20 Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts Method and system for context-aware photoacoustic imaging
CN115177217A (en) * 2022-09-09 2022-10-14 之江实验室 Photoacoustic signal simulation method and device based on spherical particle light pulse excitation effect

Also Published As

Publication number Publication date
US10758209B2 (en) 2020-09-01
US20150031990A1 (en) 2015-01-29

Similar Documents

Publication Publication Date Title
US10758209B2 (en) Photoacoustic tracking and registration in interventional ultrasound
EP3614928B1 (en) Tissue imaging system
US11871913B2 (en) Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US11730562B2 (en) Systems and methods for imaging a patient
Boctor et al. Three‐dimensional ultrasound‐guided robotic needle placement: an experimental evaluation
Hughes-Hallett et al. Augmented reality partial nephrectomy: examining the current status and future perspectives
Lediju Bell et al. Photoacoustic-based visual servoing of a needle tip
US9119669B2 (en) Medical tracking system using a gamma camera
US9782147B2 (en) Apparatus and methods for localization and relative positioning of a surgical instrument
US20120253200A1 (en) Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
US20130218024A1 (en) Interventional In-Situ Image-Guidance by Fusing Ultrasound and Video
Schneider et al. Tracked “pick-up” ultrasound for robot-assisted minimally invasive surgery
Najmaei et al. Image‐guided techniques in renal and hepatic interventions
JP6745998B2 (en) System that provides images to guide surgery
WO2015087203A1 (en) Imaging systems and methods for monitoring treatment of tissue lesions
Shahin et al. Ultrasound-based tumor movement compensation during navigated laparoscopic liver interventions
Cheng et al. Direct 3D ultrasound to video registration using photoacoustic effect
Vyas et al. Intraoperative ultrasound to stereocamera registration using interventional photoacoustic imaging
Shen Framework for ultrasonography-based augmented reality in robotic surgery: application to transoral surgery and gastrointestinal surgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 13758479
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 13758479
Country of ref document: EP
Kind code of ref document: A1