WO2019157294A1 - System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target - Google Patents


Info

Publication number
WO2019157294A1
Authority
WO
WIPO (PCT)
Prior art keywords
target area
target
medical device
fluoroscopic
images
Application number
PCT/US2019/017231
Other languages
French (fr)
Inventor
Ron Barak
Ariel Birenbaum
Guy Alexandroni
Oren P. Weingarten
Original Assignee
Covidien Lp
Priority claimed from US 16/022,222 (US 10699448 B2)
Application filed by Covidien LP
Priority to CA 3088277 A1
Priority to JP 2020-542153 A (JP 7322039 B2)
Priority to AU 2019217999 A1
Priority to EP 19751690.9 (EP 3750134 A4)
Priority to CN 201980012379.9 (CN 111699515 B)
Publication of WO 2019157294 A1


Classifications

    • G06T7/0012 Biomedical image inspection
    • G06T7/248 Analysis of motion using feature-based methods, involving reference images or patches
    • G06T7/251 Analysis of motion using feature-based methods, involving models
    • G06T7/269 Analysis of motion using gradient-based methods
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T7/77 Determining position or orientation of objects or cameras using statistical methods
    • A61B1/00158 Holding or positioning arrangements using magnetic field
    • A61B1/0005 Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B1/042 Endoscopes combined with photographic or television appliances, characterised by a proximal camera, e.g. a CCD camera
    • A61B1/043 Endoscopes combined with photographic or television appliances, for fluorescence imaging
    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2051 Electromagnetic tracking systems
    • A61B2034/2055 Optical tracking systems
    • A61B5/055 Detecting, measuring or recording for diagnosis involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B5/064 Determining position of a probe within the body, using markers
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/487 Diagnostic techniques involving generating temporal series of image data, involving fluoroscopy
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/39 Markers, e.g. radio-opaque or breast lesion markers
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery
    • G06T2207/10016 Video; image sequence
    • G06T2207/10068 Endoscopic image
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10121 Fluoroscopy
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30021 Catheter; guide wire
    • G06T2207/30061 Lung
    • G06T2207/30096 Tumor; lesion
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix
    • G06T2207/30244 Camera pose

Definitions

  • The disclosure relates to the field of imaging, and particularly to the estimation of a pose of an imaging device and to three-dimensional imaging of body organs.
  • Pose estimation of an imaging device may be required or used for a variety of applications, including registration between different imaging modalities or the generation of augmented reality.
  • One known use of pose estimation of an imaging device is the construction of a three-dimensional volume from a set of two-dimensional images captured by the imaging device while in different poses. Such three-dimensional construction is commonly used in the medical field, where it has a significant impact.
  • Imaging modalities such as magnetic resonance imaging, ultrasound imaging, computed tomography (CT), and fluoroscopy, among others, are employed by clinicians to identify and navigate to areas of interest within a patient and, ultimately, targets for treatment.
  • pre-operative scans may be utilized for target identification and intraoperative guidance.
  • Real-time imaging may often be required in order to obtain a more accurate and current image of the target area.
  • Real-time image data displaying the current location of a medical device with respect to the target and its surroundings may be required in order to navigate the medical device to the target in a safer and more accurate manner (e.g., without causing unnecessary damage to other tissues and organs).
  • A system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device includes a structure of markers and a computing device.
  • a sequence of images of the target area and of the structure of markers is acquired via the fluoroscopic imaging device.
  • The computing device is configured to estimate a pose of the fluoroscopic imaging device for a plurality of images of the sequence of images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, and construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
  • The computing device is further configured to facilitate an approach of a medical device to the target area, wherein the medical device is positioned in the target area prior to acquiring the sequence of images, and determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
  • the system further comprises a locating system indicating a location of the medical device within the patient.
  • the computing device may be further configured to display the target area and the location of the medical device with respect to the target, facilitate navigation of the medical device to the target area via the locating system and the display, and correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
  • the computing device is further configured to display a 3D rendering of the target area on the display, and register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
  • the locating system is an electromagnetic locating system.
  • the target area comprises at least a portion of lungs and the medical device is navigable to the target area through airways of a luminal network.
  • the structure of markers is at least one of a periodic pattern or a two-dimensional pattern.
  • the target area may include at least a portion of lungs and the target may be a soft tissue target.
  • A method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from a sequence of two-dimensional (2D) fluoroscopic images of the target area and of a structure of markers acquired via a fluoroscopic imaging device is provided.
  • the structure of markers is positioned between the patient and the fluoroscopic imaging device.
  • The method includes using at least one hardware processor for estimating a pose of the fluoroscopic imaging device for at least a plurality of images of the sequence of 2D fluoroscopic images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, and constructing fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
  • A medical device is positioned in the target area prior to acquiring the sequence of images, and the method further comprises using the at least one hardware processor for determining an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
  • the method further includes facilitating navigation of the medical device to the target area via a locating system indicating a location of the medical device and via a display, and correcting a display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
  • the method further includes displaying a 3D rendering of the target area on the display, and registering the locating system to the 3D rendering, where the correcting of the location of the medical device with respect to the target comprises updating the registration of the locating system to the 3D rendering.
  • the method further includes using the at least one hardware processor for generating the 3D rendering of the target area based on previously acquired CT volumetric data of the target area.
  • the target area includes at least a portion of lungs and the medical device is navigable to the target area through airways of a luminal network.
  • the structure of markers is at least one of a periodic pattern or a two-dimensional pattern.
  • the target area may include at least a portion of lungs and the target may be a soft-tissue target.
  • A system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device includes a computing device configured to estimate a pose of the fluoroscopic imaging device for a plurality of images of a sequence of images based on detection of a possible and most probable projection of a structure of markers as a whole on each image of the plurality of images, and construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
  • The computing device is further configured to facilitate an approach of a medical device to the target area, wherein the medical device is positioned in the target area prior to acquisition of the sequence of images, and determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
  • the computing device is further configured to display the target area and the location of the medical device with respect to the target, facilitate navigation of the medical device to the target area via the locating system and the display, and correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
  • the computing device is further configured to display a 3D rendering of the target area on the display, and register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
  • FIG. 1 is a flow chart of a method for estimating the pose of an imaging device by utilizing a structure of markers in accordance with one aspect of the disclosure;
  • FIG. 2A is a schematic diagram of a system configured for use with the method of FIG. 1 in accordance with one aspect of the disclosure;
  • FIG. 2B is a schematic illustration of a two-dimensional grid structure of sphere markers in accordance with one aspect of the disclosure;
  • FIG. 3 shows an exemplary image captured by a fluoroscopic device of an artificial chest volume of a Multipurpose Chest Phantom N1 "LUNGMAN", by Kyoto Kagaku, placed over the grid structure of radio-opaque markers of FIG. 2B;
  • FIG. 4 is a probability map generated for the image of FIG. 3 in accordance with one aspect of the disclosure;
  • FIGS. 5A-5C show different exemplary candidates for the projection of the 2D grid structure of sphere markers of FIG. 2B on the image of FIG. 3, overlaid on the probability map of FIG. 4;
  • FIG. 6A shows a selected candidate for the projection of the 2D grid structure of sphere markers of FIG. 2B on the image of FIG. 3, overlaid on the probability map of FIG. 4 in accordance with one aspect of the disclosure;
  • FIG. 6B shows an improved candidate for the projection of the 2D grid structure of sphere markers of FIG. 2B on the image of FIG. 3, overlaid on the probability map of FIG. 4 in accordance with one aspect of the disclosure;
  • FIG. 6C shows a further improved candidate for the projection of the 2D grid structure of sphere markers of FIG. 2B on the image of FIG. 3, overlaid on the probability map of FIG. 4 in accordance with one aspect of the disclosure;
  • FIG. 7 is a flow chart of an exemplary method for constructing fluoroscopic three-dimensional volumetric data in accordance with one aspect of the disclosure; and
  • FIG. 8 is a view of one illustrative embodiment of an exemplary system for constructing fluoroscopic-based three-dimensional volumetric data in accordance with the disclosure.
  • both the medical device and the target should be visible in some sort of a three-dimensional guidance system.
  • When the target is a small soft-tissue object, such as a tumor or a lesion, an X-ray volumetric reconstruction is needed in order to identify it.
  • Several solutions provide such reconstruction, such as CT and cone-beam CT, which are extensively used in the medical world. These machines algorithmically combine multiple X-ray projections from known, calibrated X-ray source positions into a three-dimensional volume in which, inter alia, soft tissues are visible.
  • A CT machine can be used with iterative scans during a procedure to provide guidance through the body until the tools reach the target. This is a tedious procedure, as it requires several full CT scans, a dedicated CT room, and blind navigation between scans. In addition, each scan requires the staff to leave the room due to high levels of ionizing radiation, and exposes the patient to such radiation.
  • A cone-beam CT machine is available in some operating rooms and is somewhat easier to operate, but it is expensive and, like CT, only provides blind navigation between scans, requires multiple iterations for navigation, and requires the staff to leave the room.
  • A CT-based imaging system is extremely costly and, in many cases, not available in the same location where a procedure is carried out.
  • a fluoroscopic imaging device is commonly located in the operating room during navigation procedures.
  • the standard fluoroscopic imaging device may be used by a clinician, for example, to visualize and confirm the placement of a medical device after it has been navigated to a desired location.
  • While standard fluoroscopic images display highly dense objects, such as metal tools and bones, as well as large soft-tissue objects such as the heart, fluoroscopic images have difficulty resolving small soft-tissue objects of interest, such as lesions.
  • Moreover, the fluoroscope image is only a two-dimensional projection, while volumetric or three-dimensional imaging is required in order to accurately and safely navigate within the body.
  • An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs.
  • Endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional rendering or volume of the particular body part, such as the lungs.
  • the resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable medical device) through a bronchoscope and a branch of the bronchus of a patient to an area of interest.
  • a locating system such as an electromagnetic tracking system, may be utilized in conjunction with the CT data to facilitate guidance of the navigation catheter through the branch of the bronchus to the area of interest.
  • the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical instruments.
  • Minimally invasive procedures, such as laparoscopy procedures, including robotic-assisted surgery, may employ intraoperative fluoroscopy in order to increase visualization, e.g., for guidance and lesion locating, or in order to prevent injury and complications.
  • Fig. 1 illustrates a flow chart of a method for estimating the pose of an imaging device by utilizing a structure of markers in accordance with an aspect of the disclosure.
  • a probability map may be generated for an image captured by an imaging device.
  • the image includes a projection of a structure of markers.
  • The probability map may indicate the probability that each pixel of the image belongs to the projection of a marker of the structure of markers.
  • the structure of markers may be of a two-dimensional pattern.
  • the structure of markers may be of a periodic pattern, such as a grid.
  • the image may include a projection of at least a portion of the structure of markers.
  • Fig. 2B is a schematic illustration of a two-dimensional (2D) grid structure of sphere markers 220 in accordance with the disclosure.
  • Fig. 3 is an exemplary image 300 captured by a fluoroscopic device of an artificial chest volume of a Multipurpose Chest Phantom N1 "LUNGMAN", by Kyoto Kagaku, placed over the 2D grid structure of sphere markers 220 of Fig. 2B.
  • 2D grid structure of sphere markers 220 includes a plurality of sphere shaped markers, such as sphere markers 230a and 230b, arranged in a two-dimensional grid pattern.
  • Image 300 includes a projection of a portion of 2D grid structure of sphere markers 220 and a projection of a catheter 320.
  • The projection of 2D grid structure of sphere markers 220 on image 300 includes projections of the sphere markers, such as sphere marker projections 310a, 310b and 310c.
  • The probability map may be generated, for example, by feeding the image into a simple marker (blob) detector, such as a Harris corner detector, which outputs a new image of smooth densities corresponding to the probability that each pixel belongs to a marker.
  • Fig. 4 illustrates a probability map 400 generated for image 300 of Fig. 3.
  • Probability map 400 includes pixels or densities, such as densities 410a, 410b and 410c, which correspond to marker projections 310a, 310b and 310c, respectively.
  • The probability map may be downscaled (e.g., reduced in size) in order to make the required computations simpler and more efficient. It should be noted that probability map 400, as shown in Figs. 5A-6B, is downscaled by four, and probability map 400 as shown in Fig. 6C is downscaled by two. A minimal sketch of such a probability-map computation appears below.
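  • The following Python sketch (illustrative only, not the disclosure's implementation) shows one way such a probability map could be computed and downscaled; the function name, the choice of a Laplacian-of-Gaussian blob response, and the radius parameter are assumptions:

```python
import numpy as np
from scipy import ndimage

def probability_map(image, sphere_radius_px, downscale=4):
    """Score each pixel by its likelihood of belonging to a sphere-marker
    projection. A (negated) Laplacian-of-Gaussian blob response stands in for
    the blob detector named above; the filter choice and the radius parameter
    are assumptions for illustration."""
    img = image.astype(np.float64)
    img = img.max() - img                           # invert: markers are dark in fluoroscopy
    resp = -ndimage.gaussian_laplace(img, sigma=sphere_radius_px / np.sqrt(2.0))
    prob = np.clip(resp, 0.0, None)
    prob /= prob.max() + 1e-12                      # normalize to [0, 1]
    # downscale by block-averaging to make candidate scoring cheaper
    h = (prob.shape[0] // downscale) * downscale
    w = (prob.shape[1] // downscale) * downscale
    return prob[:h, :w].reshape(h // downscale, downscale,
                                w // downscale, downscale).mean(axis=(1, 3))
```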
  • In a step 110, different candidates may be generated for the projection of the structure of markers on the image.
  • the different candidates may be generated by virtually positioning the imaging device in a range of different possible poses.
  • By possible poses of the imaging device is meant three-dimensional positions and orientations of the imaging device. In some embodiments, such a range may be limited according to the geometrical structure and/or degrees of freedom of the imaging device.
  • For each such virtual pose, a virtual projection of at least a portion of the structure of markers is generated, as if the imaging device had actually captured an image of the structure of markers while positioned at that pose.
  • the candidate having the highest probability of being the projection of the structure of markers on the image may be identified based on the image probability map.
  • Each candidate, e.g., a virtual projection of the structure of markers, may be overlaid on or associated with the probability map.
  • A probability score may then be determined for, or associated with, each marker projection of the candidate.
  • The probability score may be positive or negative, e.g., there may be a cost in case a virtual marker projection falls within pixels of low probability.
  • The probability scores of all of the marker projections of a candidate may then be summed, and a total probability score may be determined for each candidate. For example, if the structure of markers is a two-dimensional grid, then the projection will have a grid form.
  • Each point of the projection grid would lie on at least one pixel of the probability map.
  • A 2D grid candidate will receive the highest probability score if its points lie on the highest-density pixels, that is, if its points lie on the projections of the centers of the markers on the image.
  • the candidate having the highest probability score may be determined as the candidate which has the highest probability of being the projection of the structure of markers on the image.
  • The pose of the imaging device for the image may then be estimated based on the virtual pose of the imaging device used to generate the identified candidate. A minimal sketch of the candidate scoring follows.
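  • A minimal sketch of the candidate-scoring idea, assuming a hypothetical `project` helper (not defined in the disclosure) that maps the 3D marker centers into image coordinates for a hypothesized pose:

```python
import numpy as np

def score_candidate(prob_map, projected_pts, low_prob_cost=0.1):
    """Sum probability-map values at a candidate's projected marker centers.

    `projected_pts` is an (N, 2) array of (row, col) image coordinates obtained
    by virtually projecting the marker grid from one hypothesized device pose.
    The cost constant is an illustrative choice."""
    h, w = prob_map.shape
    score = 0.0
    for r, c in np.round(projected_pts).astype(int):
        if 0 <= r < h and 0 <= c < w:
            p = prob_map[r, c]
            # reward high-probability pixels, penalize low-probability ones
            score += p if p >= low_prob_cost else p - low_prob_cost
        else:
            score -= low_prob_cost  # projections falling off the image incur a cost
    return score

# The winning candidate is the pose whose virtual projection maximizes the score:
# best_pose = max(poses, key=lambda T: score_candidate(prob_map, project(markers_3d, T)))
```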
  • Figs. 5A-5C illustrate different exemplary candidates 500a-c for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3 overlaid on probability map 400 of Fig. 4.
  • Candidates 500a, 500b and 500c are indicated as a grid of plus signs (“+”), while each such sign indicates the center of a projection of a marker.
  • Candidates 500a, 500b and 500c are virtual projections of 2D grid structure of sphere markers 220, as if the fluoroscope used to capture image 300 were located at three different poses associated correspondingly with these projections.
  • candidate 500a was generated as if the fluoroscope is located at: position [0, -50, 0], angle: -20 degrees.
  • Candidate 500b was generated as if the fluoroscope is located at: position [0, -10, 0], angle: -20 degrees.
  • Candidate 500c was generated as if the fluoroscope is located at: position [7.5, -40, 11.25], angle: -25 degrees.
  • the above-mentioned coordinates are with respect to 2D grid structure of sphere markers 220.
  • Densities 410a of probability map 400 are indicated in Figs. 5A-5C. Plus signs 510a, 510b and 510c are the centers of the marker projections of candidates 500a, 500b and 500c, respectively, which are the ones closest to densities 410a.
  • Plus sign 510c is the sign which best fits densities 410a and therefore would receive the highest probability score among signs 510a, 510b and 510c of candidates 500a, 500b and 500c, respectively.
  • Candidate 500c would receive the highest probability score since its marker projections best fit probability map 400.
  • candidate 500c would be identified as the candidate with the highest probability of being the projection of 2D grid structure of sphere markers 220 on image 300.
  • a locally deformed version of the candidate may be generated in order to maximize its probability of being the projection of the structure of markers on the image.
  • the locally deformed version may be generated based on the image probability map.
  • A local search algorithm may be utilized to deform the candidate so as to maximize its score. For example, in case the structure of markers is a 2D grid, each 2D grid point may be treated individually. Each point may be moved towards the neighboring local maxima on the probability map using a gradient-ascent method.
  • an improved candidate for the projection of the structure of markers on the image may be detected based on the locally deformed version of the candidate.
  • the improved candidate is determined such that it fits (exactly or approximately) the locally deformed version of the candidate.
  • Such an improved candidate may be determined by identifying a transformation that will fit a new candidate to the locally deformed version, e.g., by using homography estimation methods.
  • The virtual pose of the imaging device associated with the improved candidate may then be determined as the estimated pose of the imaging device for the image.
  • the generation of a locally deformed version of the candidate and the determination of an improved candidate may be iteratively repeated.
  • The camera pose may be estimated by solving a homography that transforms a 2D fiducial structure in 3D space into image coordinates that match the fiducial probability map generated from the imaging device output. A sketch of this refinement loop appears below.
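  • A compact sketch of the deform-then-refit loop, using OpenCV's homography estimation; the gradient-ascent step count and learning rate are illustrative assumptions, not parameters from the disclosure:

```python
import numpy as np
import cv2

def refine_candidate(prob_map, grid_pts, steps=20, lr=0.5):
    """Locally deform a candidate grid by moving each projected point uphill on
    the probability map (gradient ascent), then fit a homography mapping the
    original candidate onto the deformed version, yielding the improved
    candidate."""
    gy, gx = np.gradient(prob_map)                    # map gradients (rows, cols)
    pts = grid_pts.astype(np.float64).copy()          # (N, 2) points as (x, y)
    for _ in range(steps):
        xi = np.clip(np.round(pts[:, 0]).astype(int), 0, prob_map.shape[1] - 1)
        yi = np.clip(np.round(pts[:, 1]).astype(int), 0, prob_map.shape[0] - 1)
        pts[:, 0] += lr * gx[yi, xi]                  # ascend toward local maxima
        pts[:, 1] += lr * gy[yi, xi]
    src = grid_pts.reshape(-1, 1, 2).astype(np.float32)
    dst = pts.reshape(-1, 1, 2).astype(np.float32)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC)   # fit the improved candidate
    improved = cv2.perspectiveTransform(src, H)
    return improved.reshape(-1, 2), H
```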
  • Fig. 6A shows a selected candidate 600a, for projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4.
  • Fig. 6B shows an improved candidate 600b, for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4.
  • Fig. 6C shows a further improved candidate 600c, for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4.
  • the identified or selected candidate is candidate 500c, which is now indicated 600a.
  • Candidate 600b is the improved candidate which was generated based on a locally deformed version of candidate 600a according to the method disclosed above.
  • Candidate 600c is a further improved candidate with respect to candidate 600b, generated by iteratively repeating the process of locally deforming the resulting candidate and determining an approximation to maximize the candidate probability.
  • Fig. 6C illustrates the results of refined candidates based on a higher-resolution probability map. In an aspect, this is done after completing a refinement step using the down-sampled version of the probability map.
  • Plus signs 610a, 610b and 610c are the centers of the marker projections of candidates 600a, 600b and 600c, respectively, which are the ones closest to densities 410a of probability map 400.
  • the candidates for the projection of 2D grid structure of sphere markers 220 on image 300 converge to the candidate of the highest probability according to probability map 400.
  • the imaging device may be configured to capture a sequence of images.
  • a sequence of images may be captured, automatically or manually, by continuously sweeping the imaging device at a certain angle.
  • When pose estimation of a sequence of images is required, the estimation process may be made more efficient by reducing the range or area of possible virtual poses of the imaging device.
  • A plurality of non-sequential images of the sequence of images may then be determined, for example, the first image in the sequence, the last image, and one or more images in-between. The one or more images in-between may be determined such that the sequence is divided into equal image portions.
  • the pose of the imaging device may be estimated only for the determined non-sequential images.
  • the area or range of possible different poses for virtually positioning the imaging device may be reduced.
  • the reduction may be performed based on the estimated poses of the imaging device for the determined non-sequential images.
  • The pose of the imaging device for the rest of the images may then be estimated according to the reduced area or range. For example, the poses of the imaging device for the first and tenth images of the sequence are determined at the first stage.
  • The pose of the imaging device for the second to ninth images must then lie along a feasible and continuous path between its pose for the first image and its pose for the tenth image, and so on. A sketch of this two-stage scheme follows.
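  • A sketch of the two-stage scheme under stated assumptions: `estimate_pose` is a hypothetical callable wrapping the candidate search of Fig. 1, and poses are represented as NumPy vectors so that they can be interpolated:

```python
import numpy as np

def estimate_sequence_poses(frames, estimate_pose, keyframe_step=9, radius=None):
    """Estimate poses for sparse keyframes over the full search range, then
    restrict the search for in-between frames to a band around the linear
    interpolation of the neighboring keyframe poses."""
    keys = list(range(0, len(frames), keyframe_step))
    if keys[-1] != len(frames) - 1:
        keys.append(len(frames) - 1)
    poses = {k: estimate_pose(frames[k], search_range="full") for k in keys}
    for a, b in zip(keys[:-1], keys[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            guess = (1 - t) * poses[a] + t * poses[b]  # pose lies on a continuous path
            poses[i] = estimate_pose(frames[i], search_range=("around", guess, radius))
    return [poses[i] for i in range(len(frames))]
```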
  • Geometrical parameters of the imaging device may be pre-known or pre-determined, such as the field of view of the source, height range, rotation angle range and the like, including the device degrees of freedom (e.g., independent motions allowed).
  • Such geometrical parameters of the imaging device may be determined in real-time while estimating the pose of the imaging device for the captured images. Such information may also be used to reduce the area or range of possible poses.
  • A user practicing the disclosed methods may be instructed to limit the motion of the imaging device to certain degrees of freedom or to certain ranges of motion for the sequence of images. Such limitations may also be considered when determining the imaging device's possible poses and thus may be used to make the imaging device pose estimation faster.
  • Image pre-processing methods may first be applied to the one or more images in order to correct distortions and/or enhance the visualization of the projection of the structure of markers on the image.
  • When the imaging device is a fluoroscope, correction of "pincushion" distortion, which slightly warps the image, may be performed. This distortion may be automatically addressed by modelling the warp with a polynomial surface and applying a compatible warp which cancels out the pincushion effect.
  • The image may be inverted in order to enhance the projections of the markers.
  • The image may be blurred using a Gaussian filter with a sigma value equal, for example, to one half of the sphere diameter, in order to facilitate the search and evaluation of candidates as disclosed above. These pre-processing steps are sketched below.
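  • These pre-processing steps might look as follows; the `undistort` hook standing in for the polynomial pincushion correction is an assumed interface, not defined by the disclosure:

```python
import numpy as np
from scipy import ndimage

def preprocess_fluoro_frame(image, sphere_diameter_px, undistort=None):
    """Invert intensities so the radio-opaque markers become bright, then blur
    with a Gaussian whose sigma is half the sphere diameter, as suggested
    above."""
    img = image.astype(np.float64)
    if undistort is not None:
        img = undistort(img)                      # cancel out the pincushion warp
    img = img.max() - img                         # invert to enhance marker projections
    return ndimage.gaussian_filter(img, sigma=sphere_diameter_px / 2.0)
```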
  • one or more models of the imaging device may be calibrated to generate calibration data, such as a data file, which may be used to automatically calibrate the specific imaging device.
  • the calibration data may include data referring to the geometric calibration and/or distortion calibration, as disclosed above.
  • the geometric calibration may be based on data provided by the imaging device manufacturer.
  • a manual distortion calibration may be performed once for a specific imaging device.
  • The imaging device distortion correction can be calibrated as a pre-processing step during every procedure, as the pincushion distortion may change as a result of imaging device maintenance or even simply over time.
  • Fig. 2A illustrates a schematic diagram of a system 200 configured for use with the method of Fig. 1 in accordance with one aspect of the disclosure.
  • System 200 may include a workstation 80, an imaging device 215 and a markers structure 218.
  • workstation 80 may be coupled with imaging device 215, directly or indirectly, e.g., by wireless communication.
  • Workstation 80 may include a memory 202, a processor 204, a display 206 and an input device 210.
  • Processor or hardware processor 204 may include one or more hardware processors.
  • Workstation 80 may optionally include an output module 212 and a network interface 208.
  • Memory 202 may store an application 81 and image data 214.
  • Application 81 may include instructions executable by processor 204, inter alia, for executing the method of Fig. 1 and a user interface 216.
  • Workstation 80 may be a stationary computing device, such as a personal computer, or a portable computing device such as a tablet computer. Workstation 80 may comprise a plurality of computing devices.
  • Memory 202 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by processor 204 and which control the operation of workstation 80 and in some embodiments, may also control the operation of imaging device 215.
  • memory 202 may include one or more solid-state storage devices such as flash memory chips.
  • memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown).
  • computer-readable media can be any available media that can be accessed by the processor 204. That is, computer readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by workstation 80.
  • Application 81 may, when executed by processor 204, cause display 206 to present user interface 216.
  • Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet.
  • Network interface 208 may be used to connect between workstation 80 and imaging device 215.
  • Network interface 208 may be also used to receive image data 214.
  • Input device 210 may be any device by means of which a user may interact with workstation 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
  • Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
  • Imaging device 215 may be any imaging device, which captures 2D images, such as a standard fluoroscopic imaging device or a camera.
  • Markers structure 218 may be a structure of markers having a two-dimensional pattern, such as a grid having two dimensions of width and length (e.g., a 2D grid), as shown in Fig. 2B.
  • A 2D pattern, as opposed to a 3D pattern, may facilitate the pose estimation process.
  • a 2D pattern would be more convenient for the patient.
  • the markers should be formed such that they will be visible in the imaging modality used.
  • the markers should be made of a material which is at least partially radio-opaque.
  • the shape of the markers may be symmetric and such that the projection of the markers on the image would be the same at any pose the imaging device may be placed. Such configuration may simplify and enhance the pose estimation process and/or make it more efficient.
  • markers having a rotation symmetry may be preferred, such as spheres.
  • the size of the markers structure and/or the number of markers in the structure may be determined according to the specific use of the disclosed systems and methods.
  • the markers structure may be of a size similar or larger than the size of the area of interest.
  • The pattern of markers structure 218 may be two-dimensional and/or periodic, such as a 2D grid. Using a structure of markers having a periodic and/or two-dimensional pattern may further enhance and facilitate the pose estimation process and make it more efficient.
  • 2D grid structure of sphere markers 220 has a 2D periodic pattern of a grid and includes symmetric markers in the shape of a sphere. Such a configuration simplifies and enhances the pose estimation process, as described in Fig. 1, specifically when generating the virtual candidates for the markers structure projection and when determining the optimal one.
  • The structure of markers, as a fiducial, should be positioned in a stationary manner during the capturing of the one or more images.
  • The sphere marker diameter may be 2±0.2 mm and the distance between the spheres may be about 15±0.15 mm, isotropic, as in the illustrative model below.
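  • For concreteness, a tiny sketch of how the marker structure's 3D model might be represented; the 10x10 extent is an assumed example, since the disclosure does not fix the number of markers:

```python
import numpy as np

def build_marker_grid(rows=10, cols=10, spacing_mm=15.0):
    """Return (N, 3) centers of a planar 2D grid of sphere markers at z = 0,
    matching the ~15 mm isotropic spacing stated above."""
    ys, xs = np.mgrid[0:rows, 0:cols].astype(np.float64)
    pts = np.stack([xs.ravel() * spacing_mm,
                    ys.ravel() * spacing_mm,
                    np.zeros(rows * cols)], axis=1)
    return pts - pts.mean(axis=0)  # center the grid at the origin for convenience
```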
  • Imaging device 215 may capture one or more images (e.g., a sequence of images) such that at least a projection of a portion of markers structure 218 is shown in each image.
  • The image or sequence of images captured by imaging device 215 may then be stored in memory 202 as image data 214.
  • The image data may then be processed by processor 204, according to the method of Fig. 1, to determine the pose of imaging device 215.
  • The pose estimation data may then be output via output module 212, display 206 and/or network interface 208.
  • Markers structure 218 may be positioned with respect to an area of interest, such as under an area of interest within the body of a patient going through a fluoroscopic scan.
  • Markers structure 218 and the patient will then be positioned such that the one or more images captured by imaging device 215 would capture the area of interest and a portion of markers structure 218. If required, once the pose estimation process is complete, the projection of markers structure 218 on the images may be removed by using well-known methods.
  • One such method is described in commonly-owned U.S. Patent Application No. 16/259,612, entitled: “IMAGE RECONSTRUCTION SYSTEM AND METHOD", filed on January 28, 2019, by Alexandroni et al., the entire content of which is hereby incorporated by reference.
  • Fig. 7 is a flow chart of an exemplary method for constructing fluoroscopic three-dimensional volumetric data in accordance with the disclosure.
  • A method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images is hereby disclosed.
  • a sequence of images of the target area and of a structure of markers is acquired via a fluoroscopic imaging device.
  • the structure of markers may be the two-dimensional structure of markers described with respect to Figs. 1, 2A and 2B.
  • the structure of markers may be positioned between the patient and the fluoroscopic imaging device.
  • The target area may include, for example, at least a portion of the lungs, as exemplified with respect to the system of Fig. 8.
  • The target is a soft-tissue target, such as one within a lung, kidney, liver and the like.
  • a pose of the fluoroscopic imaging device for at least a plurality of images of the sequence of images may be estimated.
  • the pose estimation may be performed based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, and as described with respect to Fig. 1.
  • Fluoroscopic-based three-dimensional volumetric data of the target area may be constructed based on the estimated poses of the fluoroscopic imaging device. Exemplary systems and methods for constructing such fluoroscopic-based three-dimensional volumetric data are disclosed in commonly-owned U.S. Patent Publication No. 2017/0035379, which is incorporated by reference.
  • a medical device may be positioned in the target area prior to the acquiring of the sequence of images.
  • the sequence of images and consequently the fluoroscopic-based three-dimensional volumetric data may also include a projection of the medical device in addition to the target.
  • The offset (i.e., Δx, Δy and Δz) between the medical device and the target may then be determined based on the fluoroscopic-based three-dimensional volumetric data.
  • the target may be visible or better exhibited in the generated three-dimensional volumetric data. Therefore, the target may be detected, automatically, or manually by the user, in the three-dimensional volumetric data.
  • the medical device may be detected, automatically or manually by a user, in the sequence of images, as captured, or in the generated three-dimensional volumetric data.
  • the automatic detection of the target and/or the medical device may be performed based on systems and methods as known in the art and such as described, for example, in commonly-owned U.S. Patent Application No. 62/627,911, titled: "SYSTEM AND METHOD FOR CATHETER DETECTION IN FLUOROSCOPIC IMAGES AND UPDATING DISPLAYED POSITION OF CATHETER", filed on February 8, 2018, by Birenbaum et al.
  • The manual detection may be performed by displaying the three-dimensional volumetric data and/or captured images to the user and requesting the user's input. Once the target and the medical device are detected in the three-dimensional volumetric data and/or the captured images, their locations in the fluoroscopic coordinate system of reference may be obtained and the offset between them may be determined, as sketched below.
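  • In code, the offset determination reduces to a subtraction in the fluoroscopic coordinate system; the variable names are illustrative:

```python
import numpy as np

def device_target_offset(device_xyz, target_xyz):
    """Offset (dx, dy, dz) between the detected medical-device tip and the
    detected target, both in the fluoroscopic 3D coordinate system. Inputs are
    illustrative (x, y, z) positions recovered from the reconstructed volume."""
    return np.asarray(target_xyz, dtype=float) - np.asarray(device_xyz, dtype=float)

# e.g., offset = device_target_offset(catheter_tip_xyz, lesion_center_xyz)
```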
  • the offset between the target and the medical device may be utilized for various medical purposes, including facilitating approach of the medical device to the target area and treatment.
  • the navigation of a medical device to the target area may be facilitated via a locating system and a display.
  • the locating system locates or tracks the motion of the medical device through the patient’s body.
  • the display may display the medical device location to the user with respect to the surroundings of the medical device within the patient’s body and the target.
  • the locating system may be, for example, an electromagnetic or optic locating system, or any other such system as known in the art.
  • The medical device may be navigated to the target area through the airways of a luminal network, as described with respect to Fig. 8.
  • a display of the location of the medical device with respect to the target may be corrected based on the determined offset between the medical device and the target.
  • a 3D rendering of the target area may be displayed on the display.
  • the 3D rendering of the target area may be generated based on CT volumetric data of the target area which was acquired previously, e.g., prior to the current procedure or operation (e.g., preoperative CT).
  • the locating system may be registered to the 3D rendering of the target, such as described, for example, with respect to Fig. 8 below.
  • The correction of the offset between the medical device and the target may then be performed by updating the registration of the locating system to the 3D rendering.
  • A transformation between the coordinate system of reference of the fluoroscopic images and the coordinate system of reference of the locating system should be known.
  • the geometrical positioning of the structure of markers with respect to the locating system may determine such a transformation.
  • The structure of markers and the locating system are positioned such that the same coordinate system of reference would apply to both, or such that one would be only a translated version of the other.
  • the updating of the registration of the locating system to the 3D rendering may be performed in a local manner and/or in a gradual manner.
  • The registration may be updated only in the surroundings of the target, e.g., only within a certain distance from the target. This is because the update may be less accurate when not performed around the target.
  • The updating may be performed in a gradual manner, e.g., by applying weights according to distance from the target. In addition to accuracy considerations, such gradual updating may be more convenient or easier for the user to look at, process, and act on during the procedure than an abrupt change in the medical device location on the display. A sketch of such a weighted update follows.
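  • A sketch of such a local, distance-weighted correction; the exponential falloff and its 30 mm scale are illustrative choices, not specified by the disclosure:

```python
import numpy as np

def gradual_registration_update(points, target_xyz, offset, falloff_mm=30.0):
    """Apply the device-to-target offset as a local, gradual correction: points
    near the target move by the full offset and the correction decays with
    distance, so the displayed device location never jumps abruptly."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points - np.asarray(target_xyz, dtype=float), axis=1)
    w = np.exp(-d / falloff_mm)            # weight ~1 at the target, ~0 far away
    return points + w[:, None] * np.asarray(offset, dtype=float)
```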
  • the patient may be instructed to stop breathing (or caused to stop breathing) during the capture of the images in order to prevent movements of the target area due to breathing.
  • methods for compensating breathing movements during the capture of the images may be performed.
  • the estimated poses of the fluoroscopic device may be corrected according to the movements of a fiducial marker placed in the target area.
  • a fiducial may be a medical device, e.g., a catheter, placed in the target area.
  • the movement of the catheter for example, may be determined based on the locating system.
  • a breathing pattern of the patient may be determined according to the movements of a fiducial marker, such as a catheter, located in the target area.
  • The movements may be determined via a locating system. Based on that pattern, only images of inhale or only images of exhale may be considered when determining the pose of the imaging device, as sketched below.
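  • A sketch of such breathing-phase gating, assuming the fiducial displacement signal comes from the locating system; peak-based phase detection is an assumed approach, since the disclosure only states that images of one phase may be considered:

```python
import numpy as np
from scipy.signal import find_peaks

def select_exhale_frames(fiducial_z, tolerance_mm=1.0):
    """Gate the image sequence by breathing phase: track the fiducial's (e.g.,
    catheter tip's) displacement, estimate the end-exhale level from the local
    minima, and keep only frames near it."""
    z = np.asarray(fiducial_z, dtype=float)
    exhale_idx, _ = find_peaks(-z)             # exhale = local minima of displacement
    if len(exhale_idx) == 0:
        return np.arange(len(z))               # no detectable breathing cycle
    exhale_level = np.median(z[exhale_idx])
    return np.flatnonzero(np.abs(z - exhale_level) < tolerance_mm)
```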
  • the imaging device’s three-dimensional position and orientation are estimated based on a set of static markers positioned on the patient bed. This process requires knowledge of the markers’ 3D positions in the volume, as well as the corresponding 2D coordinates of their projections in the image plane. Adding one or more markers from different planes in the volume of interest may lead to more robust and accurate pose estimation.
  • One possible marker that can be utilized in such a process is the catheter tip (or other medical device tip positioned through the catheter).
  • the tip is visible throughout the video captured by fluoroscopic imaging, and the corresponding 3D positions may be provided by a navigation or tracking system (e.g., an electromagnetic navigation tracking system) as the tool is navigated to the target (e.g., through the electromagnetic field). Therefore, the only remaining task is to deduce the exact 2D coordinates from the video frames.
  • the tip detection step may include fully automated detection and tracking of the tip throughout the video.
  • Another embodiment may implement semi-supervised tracking in which the user manually marks the tip in one or more frames and the detection process computes the tip coordinates for the rest of the frames.
  • the semi-supervised tracking process may be implemented by solving one frame at a time via template matching between the current frame and previous ones, using optical flow to estimate the tip movement along the video, and/or using model-based trackers.
  • Model-based trackers train a detector to estimate the probability of each pixel to belong to the catheter tip, which is followed by a step of combining the detections into a single most probable list of coordinates along the video.
  • One possible embodiment of the model-based trackers involves dynamic programming. Such an optimization approach enables finding a seam (a connected list of coordinates in the 3D space of the video frames, where the first two dimensions belong to the image plane and the third axis is time) with maximal probability (a minimal sketch of such a seam search appears after this list).
  • Another possible way to achieve a seam of two-dimensional coordinates is training a detector to estimate the tip coordinate in each frame while incorporating into the loss function a regularization term that encourages proximity between detections in adjacent frames.
  • Fig. 8 illustrates an exemplary system 800 for constructing fluoroscopic- based three-dimensional volumetric data in accordance with the disclosure.
  • System 800 may be configured to construct fluoroscopic-based three-dimensional volumetric data of a target area including at least a portion of the lungs of a patient from 2D fluoroscopic images.
  • System 800 may be further configured to facilitate approach of a medical device to the target area by using Electromagnetic Navigation Bronchoscopy (ENB) and for determining the location of a medical device with respect to the target.
  • System 800 may be configured for reviewing CT image data to identify one or more targets, planning a pathway to an identified target (planning phase), navigating an extended working channel (EWC) 812 of a catheter assembly to a target (navigation phase) via a user interface, and confirming placement of EWC 812 relative to the target.
  • One such electromagnetic navigation (EMN) system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY ® system currently sold by Medtronic PLC.
  • the target may be tissue of interest identified by review of the CT image data during the planning phase.
  • a medical device, such as a biopsy tool or other tool, may be inserted into EWC 812 to obtain a tissue sample from the tissue located at, or proximate to, the target.
  • FIG. 8 illustrates EWC 812 which is part of a catheter guide assembly 840.
  • EWC 812 is inserted into a bronchoscope 830 for access to a luminal network of the patient “P.”
  • EWC 812 of catheter guide assembly 840 may be inserted into a working channel of bronchoscope 830 for navigation through a patient’s luminal network.
  • a locatable guide (LG) 832, including a sensor 844, is inserted into EWC 812 and locked into position such that sensor 844 extends a desired distance beyond the distal tip of EWC 812. The position and orientation of sensor 844 relative to the reference coordinate system, and thus of the distal portion of EWC 812, within an electromagnetic field can be derived.
  • LG locatable guide
  • Catheter guide assemblies 840 are currently marketed and sold by Medtronic PLC under the brand names SUPERDIMENSION ® Procedure Kits, or EDGE™ Procedure Kits, and are contemplated as useable with the disclosure.
  • For a more detailed description of catheter guide assemblies 840, reference is made to commonly-owned U.S. Patent Publication No. 2014/0046315, filed on March 15, 2013, by Ladtkow et al., U.S. Patent No. 7,233,820, and U.S. Patent No. 9,044,254, the entire contents of each of which are hereby incorporated by reference.
  • System 800 generally includes an operating table 820 configured to support a patient “P,” a bronchoscope 830 configured for insertion through the patient’s “P’s” mouth into the patient’s “P’s” airways; monitoring equipment 835 coupled to bronchoscope 830 (e.g., a video display, for displaying the video images received from the video imaging system of bronchoscope 830); a locating system 850 including a locating module 852, a plurality of reference sensors 854 and a transmitter mat coupled to a structure of markers 856; and a computing device 825 including software and/or hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical device to the target, and confirmation of placement of EWC 812, or a suitable device therethrough, relative to the target.
  • Computing device 825 may be similar to workstation 80 of Fig. 2A and may be configured, inter alia, to execute the method of Fig. 1.
  • a fluoroscopic imaging device 810 capable of acquiring fluoroscopic or x-ray images or video of the patient “P” is also included in this particular aspect of system 800.
  • the images, sequence of images, or video captured by fluoroscopic imaging device 810 may be stored within fluoroscopic imaging device 810 or transmitted to computing device 825 for storage, processing, and display, as described with respect to Fig. 2A.
  • fluoroscopic imaging device 810 may move relative to the patient “P” so that images may be acquired from different angles or perspectives relative to patient “P” to create a sequence of fluoroscopic images, such as a fluoroscopic video.
  • the pose of fluoroscopic imaging device 810 relative to patient “P” for each of the images may be estimated via the structure of markers and according to the method of Fig. 1.
  • the structure of markers is positioned under patient “P,” between patient “P” and operating table 820, and between patient “P” and a radiation source of fluoroscopic imaging device 810.
  • The structure of markers is coupled to the transmitter mat (both indicated 856) and positioned under patient “P” on operating table 820.
  • The structure of markers and transmitter mat 856 are positioned under the target area within the patient in a stationary manner.
  • The structure of markers and transmitter mat 856 may be two separate elements which may be coupled in a fixed manner or alternatively may be manufactured as one unit.
  • Fluoroscopic imaging device 810 may include a single imaging device or more than one imaging device. In embodiments including multiple imaging devices, each imaging device may be a different type of imaging device or the same type. Further details regarding the imaging device 810 are described in U.S. Patent No. 8,565,858, which is incorporated by reference in its entirety herein.
  • Computing device 825 may be any suitable computing device including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium.
  • Computing device 825 may further include a database configured to store patient data, CT data sets including CT images, fluoroscopic data sets including fluoroscopic images and video, navigation plans, and any other such data.
  • computing device 825 may include inputs for, or may otherwise be configured to receive, CT data sets, fluoroscopic images/video and other data described herein.
  • computing device 825 includes a display configured to display graphical user interfaces. Computing device 825 may be connected to one or more networks through which one or more databases may be accessed.
  • computing device 825 utilizes previously acquired CT image data for generating and viewing a three-dimensional model of the patient’s “P’s” airways, enables the identification of a target on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through the patient’s “P’s” airways to tissue located at and around the target. More specifically, CT images acquired from previous CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of the patient’s “P’s” airways. The three-dimensional model may be displayed on a display associated with computing device 825, or in any other suitable fashion.
  • the enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data.
  • the three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through the patient’s “P’s” airways to access tissue located at the target can be made. Once selected, the pathway plan, three-dimensional model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s).
  • One such planning software is the ILOGIC ® planning suite currently sold by Medtronic PLC.
  • a six degrees-of-freedom electromagnetic locating or tracking system 850, e.g., similar to those disclosed in U.S. Patent Nos. 8,467,589 and 6,188,355, and published PCT Application Nos. WO 00/10456 and WO 01/67035, the entire contents of each of which are incorporated herein by reference, or other suitable positioning measuring system, is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated.
  • Tracking system 850 includes a locating or tracking module 852, a plurality of reference sensors 854, and a transmitter mat 856. Tracking system 850 is configured for use with a locatable guide 832 and particularly sensor 844.
  • locatable guide 832 and sensor 844 are configured for insertion through EWC 812 into a patient’s “P’s” airways (either with or without bronchoscope 830) and are selectively lockable relative to one another via a locking mechanism.
  • Transmitter mat 856 is positioned beneath patient “P.” Transmitter mat 856 generates an electromagnetic field around at least a portion of the patient “P” within which the position of a plurality of reference sensors 854 and the sensor 844 can be determined with use of a tracking module 852. One or more of reference sensors 854 are attached to the chest of the patient “P.” The six degrees of freedom coordinates of reference sensors 854 are sent to computing device 825 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference.
  • Registration is generally performed to coordinate locations of the three-dimensional model and two-dimensional images from the planning phase with the patient’s “P’s” airways as observed through the bronchoscope 830, and to allow for the navigation phase to be undertaken with precise knowledge of the location of the sensor 844, even in portions of the airway where the bronchoscope 830 cannot reach. Further details of such a registration technique and their implementation in luminal navigation can be found in U.S. Patent Application Pub. No. 2011/0085720, the entire content of which is incorporated herein by reference, although other suitable techniques are also contemplated.
  • Registration of the patient’s “P’s” location on the transmitter mat 856 is performed by moving LG 832 through the airways of the patient “P.” More specifically, data pertaining to locations of sensor 844, while locatable guide 832 is moving through the airways, is recorded using transmitter mat 856, reference sensors 854, and tracking module 852. A shape resulting from this location data is compared to an interior geometry of passages of the three-dimensional model generated in the planning phase, and a location correlation between the shape and the three-dimensional model based on the comparison is determined, e.g., utilizing the software on computing device 825. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three-dimensional model.
  • the software aligns, or registers, an image representing a location of sensor 844 with the three-dimensional model and two-dimensional images generated from the three-dimensional model, which are based on the recorded location data and an assumption that locatable guide 832 remains located in non-tissue space in the patient’s “P’s” airways.
  • a manual registration technique may be employed by navigating the bronchoscope 830 with the sensor 844 to pre-specified locations in the lungs of the patient “P,” and manually correlating the images from the bronchoscope to the model data of the three-dimensional model.
  • a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target.
  • One such navigation software is the ILOGIC ® navigation suite currently sold by Medtronic PLC.
  • the locatable guide 832 may be unlocked from EWC 812 and removed, leaving EWC 812 in place as a guide channel for guiding medical devices including without limitation, optical systems, ultrasound probes, marker placement tools, biopsy tools, ablation tools (i.e., microwave ablation devices), laser probes, cryogenic probes, sensor probes, and aspirating needles to the target.
  • the disclosed exemplary system 800 may be employed by the method of Fig. 7 to construct fluoroscopic-based three-dimensional volumetric data of a target located in the lungs area and to correct the location of a medical device navigated to the target area with respect to the target.
  • System 800, or a similar version of it, in conjunction with the method of Fig. 7, may be used in various procedures other than ENB procedures, with the required modifications, such as laparoscopy or robotic-assisted surgery.
  • Systems and methods in accordance with the disclosure may be usable for facilitating the navigation of a medical device to a target and/or its area using real-time two-dimensional fluoroscopic images of the target area.
  • the navigation is facilitated by using local three-dimensional volumetric data, in which small soft-tissue objects are visible, constructed from a sequence of fluoroscopic images captured by a standard fluoroscopic imaging device available in most procedure rooms.
  • the fluoroscopic-based constructed local three-dimensional volumetric data may be used to correct a location of a medical device with respect to a target or may be locally registered with previously acquired volumetric data (e.g., CT data).
  • the location of the medical device may be determined by a tracking system, for example, an electromagnetic tracking system.
  • the tracking system may be registered with the previously acquired volumetric data.
  • a local registration of the real-time three-dimensional fluoroscopic data to the previously acquired volumetric data may then be performed via the tracking system.
  • Such real-time data may be used, for example, for guidance, navigation planning, improved navigation accuracy, navigation confirmation, and treatment confirmation.
  • the methods disclosed may further include a step for generating a 3D rendering of the target area based on a pre-operative CT scan.
  • a display of the target area may then include a display of the 3D rendering.
  • the tracking system may be registered with the 3D rendering.
  • a correction of the location of the medical device with respect to the target may then include the local updating of the registration between the tracking system and the 3D rendering in the target area.
  • the methods disclosed may further include a step for registering the fluoroscopic 3D reconstruction to the tracking system.
  • a local registration between the fluoroscopic 3D reconstruction and the 3D rendering may be performed in the target area.
  • Systems and methods in accordance with the disclosure may be used with a medical instrument, such as a biopsy tool or an energy device (e.g., a microwave ablation catheter), that is positionable through one or more branched luminal networks of a patient to treat tissue.
  • Access to luminal networks may be percutaneous or through natural orifice using navigation techniques.
  • navigation through a luminal network may be accomplished using image-guidance.
  • image-guidance systems may be separate or integrated with the energy device or a separate access tool and may include MRI, CT, fluoroscopy, ultrasound, electrical impedance tomography, optical, and/or device tracking systems.
  • Methodologies for locating the access tool include EM, IR, echolocation, optical, and others.
  • Tracking systems may be integrated to an imaging device, where tracking is done in virtual space or fused with preoperative or live images.
  • the treatment target may be directly accessed from within the lumen, such as for the treatment of the endobronchial wall for COPD, asthma, lung cancer, etc.
  • the energy device and/or an additional access tool may be required to pierce the lumen and extend into other tissues to reach the target, such as for the treatment of disease within the parenchyma.
  • Final localization and confirmation of energy device or tool placement may be performed with imaging and/or navigational guidance using a standard fluoroscopic imaging device incorporated with methods and systems described above.
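
The distance-weighted, gradual registration update mentioned in the list above could look like the following minimal sketch (Python; the function name, the Gaussian falloff, and the parameter values are illustrative assumptions, not the disclosure's implementation):

```python
import numpy as np

def apply_local_offset(points, target, offset, falloff_mm=30.0):
    """Gradually correct locating-system positions near the target.

    points:  (N, 3) positions in the locating-system frame of reference.
    target:  (3,) target position in the same frame.
    offset:  (3,) correction vector derived from the fluoroscopic 3D data.
    falloff_mm: assumed scale over which the correction decays, so the
                registration update stays local to the target area.
    """
    points = np.asarray(points, dtype=float)
    target = np.asarray(target, dtype=float)
    d = np.linalg.norm(points - target, axis=1)   # distance of each point to the target
    w = np.exp(-0.5 * (d / falloff_mm) ** 2)      # weights: 1 at the target, ~0 far away
    return points + w[:, None] * np.asarray(offset)  # full correction near the target only
```

With such weights, points far from the target are left essentially untouched, which reflects both the accuracy consideration and the preference for avoiding abrupt jumps on the display.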
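The dynamic-programming seam search for the catheter tip, also mentioned above, could be sketched as follows; the per-frame candidate count and the jump penalty are illustrative assumptions rather than the specific model-based tracker of the disclosure:

```python
import numpy as np

def tip_seam(prob, top_k=50, penalty=0.01):
    """prob: (T, H, W) per-frame probability maps for the catheter tip.
    Returns one (row, col) coordinate per frame: the connected list of
    coordinates through time with maximal summed probability, where large
    jumps between adjacent frames are penalized."""
    T, H, W = prob.shape
    flat = prob.reshape(T, -1)
    cand = np.argsort(flat, axis=1)[:, -top_k:]                 # top-K pixels per frame
    coords = np.stack(np.unravel_index(cand, (H, W)), axis=-1)  # (T, K, 2) candidate coordinates
    score = flat[np.arange(T)[:, None], cand].copy()            # running DP scores per candidate
    back = np.zeros((T, top_k), dtype=int)                      # backpointers
    for t in range(1, T):
        jump = np.linalg.norm(
            coords[t][:, None, :] - coords[t - 1][None, :, :], axis=-1
        )                                                       # (K, K) inter-frame jump distances
        total = score[t - 1][None, :] - penalty * jump          # score via each predecessor
        back[t] = total.argmax(axis=1)                          # best predecessor per candidate
        score[t] += total.max(axis=1)
    k = int(score[-1].argmax())                                 # best endpoint in the last frame
    seam = []
    for t in range(T - 1, -1, -1):                              # backtrack the seam
        seam.append(tuple(int(c) for c in coords[t, k]))
        k = back[t, k]
    return seam[::-1]
```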

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Human Computer Interaction (AREA)
  • Probability & Statistics with Applications (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Robotics (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A system and method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device.

Description

SYSTEM AND METHOD FOR POSE ESTIMATION OF AN IMAGING DEVICE AND FOR DETERMINING THE LOCATION OF A MEDICAL DEVICE WITH
RESPECT TO A TARGET
BACKGROUND
[0001] The disclosure relates to the field of imaging, and particularly to the estimation of a pose of an imaging device and to three-dimensional imaging of body organs.
[0002] Pose estimation of an imaging device, such as a camera or a fluoroscopic device, may be required or used for a variety of applications, including registration between different imaging modalities or the generation of augmented reality. One of the known uses of pose estimation of an imaging device is the construction of a three-dimensional volume from a set of two-dimensional images captured by the imaging device while in different poses. Such three-dimensional construction is commonly used in the medical field and has a significant impact.
[0003] There are several commonly applied medical methods, such as endoscopic procedures or minimally invasive procedures, for treating various maladies affecting organs including the liver, brain, heart, lung, gall bladder, kidney and bones. Often, one or more imaging modalities, such as magnetic resonance imaging, ultrasound imaging, computed tomography (CT), fluoroscopy as well as others, are employed by clinicians to identify and navigate to areas of interest within a patient and ultimately targets for treatment. In some procedures, pre-operative scans may be utilized for target identification and intraoperative guidance. However, real-time imaging may often be required in order to obtain a more accurate and current image of the target area. Furthermore, real-time image data displaying the current location of a medical device with respect to the target and its surroundings may be required in order to navigate the medical device to the target in a safer and more accurate manner (e.g., with no unnecessary damage caused to other tissues and organs).
SUMMARY
[0004] According to one aspect of the disclosure, a system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device is provided. The system includes a structure of markers and a computing device. A sequence of images of the target area and of the structure of markers is acquired via the fluoroscopic imaging device. The computing device is configured to estimate a pose of the fluoroscopic imaging device for a plurality of images of the sequence of images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, and construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
[0005] In an aspect, the computing device is further configured to facilitate an approach of a medical device to the target area, wherein a medical device is positioned in the target area prior to acquiring the sequence of images, and determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
[0006] In an aspect, the system further comprises a locating system indicating a location of the medical device within the patient. Additionally, the computing device may be further configured to display the target area and the location of the medical device with respect to the target, facilitate navigation of the medical device to the target area via the locating system and the display, and correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
[0007] In an aspect, the computing device is further configured to display a 3D rendering of the target area on the display, and register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
[0008] In an aspect, the locating system is an electromagnetic locating system.
[0009] In an aspect, the target area comprises at least a portion of lungs and the medical device is navigable to the target area through airways of a luminal network.
[0010] In an aspect, the structure of markers is at least one of a periodic pattern or a two-dimensional pattern. The target area may include at least a portion of lungs and the target may be a soft tissue target.
[0011] In yet another aspect of the disclosure, a method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from a sequence of two-dimensional (2D) fluoroscopic images of a target area and of a structure of markers acquired via a fluoroscopic imaging device is provided. The structure of markers is positioned between the patient and the fluoroscopic imaging device. The method includes using at least one hardware processor for estimating a pose of the fluoroscopic imaging device for at least a plurality of images of the sequence of 2D fluoroscopic images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, and constructing fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
[0012] In an aspect, a medical device is positioned in the target area prior to acquiring the sequence of images, and the method further comprises using the at least one hardware processor for determining an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
[0013] In an aspect, the method further includes facilitating navigation of the medical device to the target area via a locating system indicating a location of the medical device and via a display, and correcting a display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
[0014] In an aspect, the method further includes displaying a 3D rendering of the target area on the display, and registering the locating system to the 3D rendering, where the correcting of the location of the medical device with respect to the target comprises updating the registration of the locating system to the 3D rendering.
[0015] In an aspect, the method further includes using the at least one hardware processor for generating the 3D rendering of the target area based on previously acquired CT volumetric data of the target area.
[0016] In an aspect, the target area includes at least a portion of lungs and the medical device is navigable to the target area through airways of a luminal network.
[0017] In an aspect, the structure of markers is at least one of a periodic pattern or a two-dimensional pattern. The target area may include at least a portion of lungs and the target may be a soft-tissue target.
[0018] In yet another aspect of the disclosure, a system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device is provided. The system includes a computing device configured to estimate a pose of the fluoroscopic imaging device for a plurality of images of a sequence of images based on detection of a possible and most probable projection of a structure of markers as a whole on each image of the plurality of images, and construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
[0019] In an aspect, the computing device is further configured to facilitate an approach of a medical device to the target area, wherein a medical device is positioned in the target area prior to acquisition of the sequence of images, and determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
[0020] In an aspect, the computing device is further configured to display the target area and the location of the medical device with respect to the target, facilitate navigation of the medical device to the target area via the locating system and the display, and correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
[0021] In an aspect, the computing device is further configured to display a 3D rendering of the target area on the display, and register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] Various exemplary embodiments are illustrated in the accompanying figures with the intent that these examples not be restrictive. It will be appreciated that for simplicity and clarity of the illustration, elements shown in the figures referenced below are not necessarily drawn to scale. Also, where considered appropriate, reference numerals may be repeated among the figures to indicate like, corresponding or analogous elements. The figures are listed below.
[0023] Fig. 1 is a flow chart of a method for estimating the pose of an imaging device by utilizing a structure of markers in accordance with one aspect of the disclosure;
[0024] Fig. 2A is a schematic diagram of a system configured for use with the method of Fig. 1 in accordance with one aspect of the disclosure;
[0025] Fig. 2B is a schematic illustration of a two-dimensional grid structure of sphere markers in accordance with one aspect of the disclosure;
[0026] Fig. 3 shows an exemplary image captured by a fluoroscopic device of an artificial chest volume of a Multipurpose Chest Phantom N1 "LUNGMAN", by Kyoto Kagaku, placed over the grid structure of radio-opaque markers of Fig. 2B;
[0027] Fig. 4 is a probability map generated for the image of Fig. 3 in accordance with one aspect of the disclosure;
[0028] Figs. 5A-5C show different exemplary candidates for the projection of the 2D grid structure of sphere markers of Fig. 2B on the image of Fig. 3 overlaid on the probability map of Fig. 4;
[0029] Fig. 6A shows a selected candidate for the projection of the 2D grid structure of sphere markers of Fig. 2B on the image of Fig. 3, overlaid on the probability map of Fig. 4 in accordance with one aspect of the disclosure;
[0030] Fig. 6B shows an improved candidate for the projection of the 2D grid structure of sphere markers of Fig. 2B on the image of Fig. 3, overlaid on the probability map of Fig. 4 in accordance with one aspect of the disclosure;
[0031] Fig. 6C shows a further improved candidate for the projection of the 2D grid structure of sphere markers of Fig. 2B on image 300 of Fig. 3, overlaid on the probability map of Fig. 4 in accordance with one aspect of the disclosure;
[0032] Fig. 7 is a flow chart of an exemplary method for constructing fluoroscopic three-dimensional volumetric data in accordance with one aspect of the disclosure; and
[0033] Fig. 8 is a view of one illustrative embodiment of an exemplary system for constructing fluoroscopic-based three-dimensional volumetric data in accordance with the disclosure.
DETAILED DESCRIPTION
[0034] Prior art methods and systems for pose estimation may be inappropriate for real-time use, inaccurate, or non-robust. Therefore, there is a need for a method and system which provide a relatively fast, accurate and robust pose estimation, particularly in the field of medical imaging.
[0035] In order to navigate medical devices to a remote target, for example, for biopsy or treatment, both the medical device and the target should be visible in some sort of a three-dimensional guidance system. When the target is a small soft-tissue object, such as a tumor or a lesion, an X-ray volumetric reconstruction is needed in order to be able to identify it. Several solutions exist that provide three-dimensional volume reconstruction, such as CT and Cone-beam CT, which are extensively used in the medical world. These machines algorithmically combine multiple X-ray projections from known, calibrated X-ray source positions into a three-dimensional volume in which, inter alia, soft tissues are visible. For example, a CT machine can be used with iterative scans during a procedure to provide guidance through the body until the tools reach the target. This is a tedious procedure, as it requires several full CT scans, a dedicated CT room and blind navigation between scans. In addition, each scan requires the staff to leave the room due to high levels of ionizing radiation and exposes the patient to such radiation. Another option is a Cone-beam CT machine, which is available in some operation rooms and is somewhat easier to operate, but is expensive and, like the CT, only provides blind navigation between scans, requires multiple iterations for navigation and requires the staff to leave the room. In addition, a CT-based imaging system is extremely costly, and in many cases not available in the same location as the location where a procedure is carried out.
[0036] A fluoroscopic imaging device is commonly located in the operating room during navigation procedures. The standard fluoroscopic imaging device may be used by a clinician, for example, to visualize and confirm the placement of a medical device after it has been navigated to a desired location. However, although standard fluoroscopic images display highly dense objects such as metal tools and bones, as well as large soft-tissue objects such as the heart, the fluoroscopic images have difficulty resolving small soft-tissue objects of interest such as lesions. Furthermore, the fluoroscope image is only a two-dimensional projection, while in order to accurately and safely navigate within the body, volumetric or three-dimensional imaging is required.
[0037] An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs. To enable the endoscopic, and more particularly the bronchoscopic, approach in the lungs, endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three dimensional rendering or volume of the particular body part such as the lungs.
[0038] The resulting volume generated from the MRI scan or CT scan is then utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable medical device) through a bronchoscope and a branch of the bronchus of a patient to an area of interest. A locating system, such as an electromagnetic tracking system, may be utilized in conjunction with the CT data to facilitate guidance of the navigation catheter through the branch of the bronchus to the area of interest. In certain instances, the navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical instruments.
[0039] As another example, minimally invasive procedures, such as laparoscopic procedures, including robotic-assisted surgery, may employ intraoperative fluoroscopy in order to increase visualization, e.g., for guidance and lesion locating, or in order to prevent injury and complications.
[0040] Therefore, a fast, accurate and robust three-dimensional reconstruction of images is required, which is generated based on a standard fluoroscopic imaging performed during medical procedures.
[0041] Fig. 1 illustrates a flow chart of a method for estimating the pose of an imaging device by utilizing a structure of markers in accordance with an aspect of the disclosure. In a step 100, a probability map may be generated for an image captured by an imaging device. The image includes a projection of a structure of markers. The probability map may indicate the probability of each pixel of the image to belong to the projection of a marker of the structure of markers. In some embodiments, the structure of markers may be of a two-dimensional pattern. In some embodiments, the structure of markers may be of a periodic pattern, such as a grid. The image may include a projection of at least a portion of the structure of markers.
[0042] Reference is now made to Figs. 2B and 3. Fig. 2B is a schematic illustration of a two-dimensional (2D) grid structure of sphere markers 220 in accordance with the disclosure. Fig. 3 is an exemplary image 300 captured by a fluoroscopic device of an artificial chest volume of a Multipurpose Chest Phantom N1 "LUNGMAN", by Kyoto Kagaku, placed over the 2D grid structure of sphere markers 220 of Fig. 2B. 2D grid structure of sphere markers 220 includes a plurality of sphere-shaped markers, such as sphere markers 230a and 230b, arranged in a two-dimensional grid pattern. Image 300 includes a projection of a portion of 2D grid structure of sphere markers 220 and a projection of a catheter 320. The projection of 2D grid structure of sphere markers 220 on image 300 includes projections of the sphere markers, such as sphere marker projections 310a, 310b and 310c.
[0043] The probability map may be generated, for example, by feeding the image into a simple marker (blob) detector, such as a Harris corner detector, which outputs a new image of smooth densities, corresponding to the probability of each pixel to belong to a marker. Fig. 4 illustrates a probability map 400 generated for image 300 of Fig. 3. Probability map 400 includes pixels or densities, such as densities 410a, 410b and 410c, which correspond to markers 310a, 310b and 310c, respectively. In some embodiments, the probability map may be downscaled (e.g., reduced in size) in order to make the required computations simpler and more efficient. It should be noted that probability map 400, as shown in Figs. 5A-6B, is downscaled by four, and probability map 400 as shown in Fig. 6C is downscaled by two.
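
By way of illustration only, the following sketch builds such a probability map with a difference-of-Gaussians blob response matched to the expected marker radius, followed by normalization and downscaling; the function name, the filter choice and the sizes are assumptions rather than the specific detector of the disclosure:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def marker_probability_map(image, marker_radius_px=6, downscale=4):
    """image: 2D float array (fluoroscopic frame; radio-opaque markers appear dark).
    Returns a downscaled map in [0, 1] of per-pixel marker likelihood."""
    img = image.astype(np.float32)
    img = img.max() - img                                # invert so markers become bright
    # Difference of Gaussians acts as a crude blob detector at the marker scale
    response = (gaussian_filter(img, marker_radius_px)
                - gaussian_filter(img, 2 * marker_radius_px))
    response = np.clip(response, 0.0, None)
    response /= response.max() + 1e-8                    # normalize to [0, 1]
    return response[::downscale, ::downscale]            # downscale to cut computation
```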
[0044] In a step 110, different candidates may be generated for the projection of the structure of markers on the image. The different candidates may be generated by virtually positioning the imaging device in a range of different possible poses. By “possible poses” of the imaging device, it is meant three-dimensional positions and orientations of the imaging device. In some embodiments, such a range may be limited according to the geometrical structure and/or degrees of freedom of the imaging device. For each such possible pose, a virtual projection of at least a portion of the structure of markers is generated, as if the imaging device actually captured an image of the structure of markers while positioned at that pose.
[0045] In a step 120, the candidate having the highest probability of being the projection of the structure of markers on the image may be identified based on the image probability map. Each candidate, e.g., a virtual projection of the structure of markers, may be overlaid on or associated with the probability map. A probability score may then be determined for or associated with each marker projection of the candidate. In some embodiments, the probability score may be positive or negative, e.g., there may be a cost in case a virtual marker projection falls within pixels of low probability. The probability scores of all of the marker projections of a candidate may then be summed and a total probability score may be determined for each candidate. For example, if the structure of markers is a two-dimensional grid, then the projection will have a grid form. Each point of the projection grid would lie on at least one pixel of the probability map. A 2D grid candidate will receive the highest probability score if its points lie on the highest-density pixels, that is, if its points lie on projections of the centers of the markers on the image. The candidate having the highest probability score may be determined as the candidate which has the highest probability of being the projection of the structure of markers on the image. The pose of the imaging device for the image may then be estimated based on the virtual pose of the imaging device used to generate the identified candidate.
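
A hedged sketch of this scoring follows; the projection routine producing each candidate's 2D points is assumed to exist elsewhere, and the miss cost is an illustrative choice:

```python
import numpy as np

def score_candidate(projected_pts, prob_map, miss_cost=0.1):
    """projected_pts: (N, 2) virtual (x, y) projections of the marker centers
    for one candidate pose. Sums per-marker probability scores, with a cost
    for markers that land on low-probability pixels or outside the image."""
    h, w = prob_map.shape
    score = 0.0
    for x, y in projected_pts:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < h and 0 <= xi < w:
            score += prob_map[yi, xi] - miss_cost        # negative when probability is low
        else:
            score -= miss_cost                           # projecting off-image is penalized
    return score

def best_candidate(candidates, prob_map):
    """candidates: iterable of (virtual_pose, projected_pts) pairs.
    Returns the virtual pose whose projection of the structure of markers
    as a whole best fits the probability map."""
    return max(candidates, key=lambda c: score_candidate(c[1], prob_map))[0]
```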
[0046] Figs. 5A-5C illustrate different exemplary candidates 500a-c for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3 overlaid on probability map 400 of Fig. 4. Candidates 500a, 500b and 500c are indicated as a grid of plus signs (“+”), while each such sign indicates the center of a projection of a marker. Candidates 500a, 500b and 500c are virtual projections of 2D grid structure of sphere markers 220, as if the fluoroscope used to capture image 300 is located at three different poses associated correspondingly with these projections. In this example, candidate 500a was generated as if the fluoroscope is located at: position [0, -50, 0], angle: -20 degrees. Candidate 500b was generated as if the fluoroscope is located at: position [0, -10, 0], angle: -20 degrees. Candidate 500c was generated as if the fluoroscope is located at: position [7.5, -40, 11.25], angle: -25 degrees. The above-mentioned coordinates are with respect to 2D grid structure of sphere markers 220. Densities 410a of probability map 400 are indicated in Figs. 5A-5C. Plus signs 510a, 510b and 510c are the centers of the markers projections of candidates 500a, 500b and 500c correspondingly, which are the ones closest to densities 410a. One can see that plus sign 510c is the sign which best fits densities 410a and therefore would receive the highest probability score among signs 510a, 510b and 510c of candidates 500a, 500b and 500c correspondingly. One can further see that accordingly, candidate 500c would receive the highest probability score since its markers projections best fit probability map 400. Thus, among these three exemplary candidates, 500a, 500b and 500c, candidate 500c would be identified as the candidate with the highest probability of being the projection of 2D grid structure of sphere markers 220 on image 300.
[0047] Further steps may be performed in order to refine the above-described pose estimation. In an optional step 130, a locally deformed version of the candidate may be generated in order to maximize its probability of being the projection of the structure of markers on the image. The locally deformed version may be generated based on the image probability map. A local search algorithm may be utilized to deform the candidate so that its score is maximized. For example, in case the structure of markers is a 2D grid, each 2D grid point may be treated individually. Each point may be moved towards the neighboring local maxima on the probability map using a gradient ascent method.
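
As a sketch of this local search (the step size and iteration count are assumptions):

```python
import numpy as np

def locally_deform(points, prob_map, steps=10, lr=0.5):
    """Move each projected marker point toward its neighboring local maximum
    of the probability map via gradient ascent.
    points: (N, 2) array of (x, y) image coordinates (float)."""
    gy, gx = np.gradient(prob_map)                       # map gradients along rows, cols
    pts = np.asarray(points, dtype=np.float32).copy()
    h, w = prob_map.shape
    for _ in range(steps):
        xi = np.clip(pts[:, 0].round().astype(int), 0, w - 1)
        yi = np.clip(pts[:, 1].round().astype(int), 0, h - 1)
        pts[:, 0] += lr * gx[yi, xi]                     # ascend along x
        pts[:, 1] += lr * gy[yi, xi]                     # ascend along y
    return pts
```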
[0048] In an optional step 140, an improved candidate for the projection of the structure of markers on the image may be detected based on the locally deformed version of the candidate. The improved candidate is determined such that it fits (exactly or approximately) the locally deformed version of the candidate. Such an improved candidate may be determined by identifying a transformation that will fit a new candidate to the locally deformed version, e.g., by using homography estimation methods. The virtual pose of the imaging device associated with the improved candidate may then be determined as the estimated pose of the imaging device for the image.
[0049] In some embodiments, the generation of a locally deformed version of the candidate and the determination of an improved candidate may be iteratively repeated. These steps may be iteratively repeated until the process converges to a specific virtual projection of the structure of markers on the image, which may be determined as the improved candidate. Thus, since the structure of markers converges as a whole, false local maxima are avoided. In an aspect, as an alternative to using a list of candidates and finding an optimal candidate for estimating the camera pose, the camera pose may be estimated by solving a homography that transforms a 2D fiducial structure in 3D space into image coordinates that match the fiducial probability map generated from the imaging device output.
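
For a planar 2D grid, fitting a new candidate to the locally deformed version amounts to estimating a homography. A hedged sketch using OpenCV follows; the RANSAC threshold is an assumption, and the decomposition of the homography into an imaging device pose is omitted:

```python
import cv2
import numpy as np

def refine_candidate(grid_pts, deformed_pts):
    """grid_pts: (N, 2) marker positions on the planar 2D grid (e.g., in mm).
    deformed_pts: (N, 2) locally deformed image coordinates from step 130.
    Returns the improved candidate: the grid points re-projected through the
    homography that best fits the deformed version (RANSAC rejects outliers)."""
    H, _inliers = cv2.findHomography(
        grid_pts.astype(np.float32),
        deformed_pts.astype(np.float32),
        cv2.RANSAC,
        3.0,
    )
    improved = cv2.perspectiveTransform(
        grid_pts.reshape(-1, 1, 2).astype(np.float32), H
    )
    return improved.reshape(-1, 2), H
```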
[0050] Fig. 6A shows a selected candidate 600a, for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4. Fig. 6B shows an improved candidate 600b, for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4. Fig. 6C shows a further improved candidate 600c, for the projection of 2D grid structure of sphere markers 220 of Fig. 2B on image 300 of Fig. 3, overlaid on probability map 400 of Fig. 4. As described above, the identified or selected candidate is candidate 500c, which is now indicated 600a. Candidate 600b is the improved candidate which was generated based on a locally deformed version of candidate 600a according to the method disclosed above. Candidate 600c is a further improved candidate with respect to candidate 600b, generated by iteratively repeating the process of locally deforming the resulting candidate and determining an approximation to maximize the candidate probability. Fig. 6C illustrates the results of refined candidates based on a higher resolution probability map. In an aspect, this is done after completing a refinement step using the down-sampled version of the probability map. Plus signs 610a, 610b and 610c are the centers of the markers projections of candidates 600a, 600b and 600c correspondingly, which are the ones closest to densities 410a of probability map 400. One can see how the candidates for the projection of 2D grid structure of sphere markers 220 on image 300 converge to the candidate of the highest probability according to probability map 400.
[0051] In some embodiments, the imaging device may be configured to capture a sequence of images. A sequence of images may be captured, automatically or manually, by continuously sweeping the imaging device at a certain angle. When pose estimation of a sequence of images is required, the estimation process may become more efficient by reducing the range or area of possible virtual poses for the imaging device. A plurality of non-sequential images of the sequence of images may then be determined, for example, the first image in the sequence, the last image, and one or more images in-between. The one or more images in-between may be determined such that the sequence is divided into equal image portions. At a first stage, the pose of the imaging device may be estimated only for the determined non-sequential images. At a second stage, the area or range of possible different poses for virtually positioning the imaging device may be reduced. The reduction may be performed based on the estimated poses of the imaging device for the determined non-sequential images. The pose of the imaging device for the rest of the images may then be estimated according to the reduced area or range. For example, the poses of the imaging device for the first and tenth images of the sequence are determined at the first stage. The pose of the imaging device for the second to ninth images must then lie along a feasible and continuous path between its pose for the first image and its pose for the tenth image, and so on.
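
A sketch of this two-stage scheme, assuming (purely for illustration) that a single sweep angle parameterizes the pose and that an estimate_pose(frame, search_range) routine exists:

```python
import numpy as np

def estimate_sequence_poses(frames, keyframe_idx, estimate_pose, margin_deg=5.0):
    """frames: list of images; keyframe_idx: e.g., [0, len(frames)//2, len(frames)-1].
    estimate_pose(frame, angle_range) searches only within angle_range and
    returns the estimated sweep angle for that frame (assumed interface)."""
    full_range = (-90.0, 90.0)                           # full sweep, stage 1
    key_angles = {i: estimate_pose(frames[i], full_range) for i in keyframe_idx}
    poses = dict(key_angles)
    keys = sorted(keyframe_idx)
    for lo, hi in zip(keys[:-1], keys[1:]):
        for i in range(lo + 1, hi):
            # Stage 2: the pose must lie on a continuous path between keyframes,
            # so interpolate and search only a narrow window around the estimate.
            guess = np.interp(i, [lo, hi], [key_angles[lo], key_angles[hi]])
            poses[i] = estimate_pose(frames[i], (guess - margin_deg, guess + margin_deg))
    return [poses[i] for i in range(len(frames))]
```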
[0052] In some embodiments, geometrical parameters of the imaging device may be pre-known, or pre-determined, such as the field of view of the source, height range, rotation angle range and the like, including the device degrees of freedom (e.g., independent motions allowed). In some embodiments, such geometrical parameters of the imaging device may be determined in real-time while estimating the pose of the imaging device for the captured images. Such information may also be used to reduce the area or range of possible poses. In some embodiments, a user practicing the disclosure may be instructed to limit the motion of the imaging device to certain degrees of freedom or to certain ranges of motion for the sequence of images. Such limitations may also be considered when determining the imaging device's possible poses and thus may be used to make the imaging device pose estimation faster.
[0053] In some embodiments, image pre-processing methods may first be applied to the one or more images in order to correct distortions and/or enhance the visualization of the projection of the structure of markers on the image. For example, in case the imaging device is a fluoroscope, correction of "pincushion" distortion, which slightly warps the image, may be performed. This distortion may be automatically addressed by modelling the warp with a polynomial surface and applying a compatible warp which cancels out the pincushion effect. In case a grid of metal spheres is used, the image may be inverted in order to enhance the projections of the markers. In addition, the image may be blurred using a Gaussian filter with a sigma value equal, for example, to one half of the spheres' diameter, in order to facilitate the search and evaluation of candidates as disclosed above.
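
A hedged sketch of this pre-processing chain, assuming the pincushion correction has already been reduced to precomputed remap tables from a one-time calibration (the function name and defaults are illustrative):

```python
import cv2

def preprocess(frame, map_x, map_y, sphere_diam_px=12):
    """frame: raw fluoroscopic image (uint8). map_x/map_y: remap tables
    derived from a one-time pincushion calibration (e.g., a fitted
    polynomial surface). Returns an image ready for marker detection."""
    undistorted = cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)  # cancel pincushion warp
    inverted = cv2.bitwise_not(undistorted)       # metal spheres become bright
    sigma = sphere_diam_px / 2.0                  # per the text: half the sphere diameter
    return cv2.GaussianBlur(inverted, (0, 0), sigma)  # kernel size derived from sigma
```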
[0054] In some embodiments, one or more models of the imaging device may be calibrated to generate calibration data, such as a data file, which may be used to automatically calibrate the specific imaging device. The calibration data may include data referring to the geometric calibration and/or distortion calibration, as disclosed above. In some embodiments, the geometric calibration may be based on data provided by the imaging device manufacturer. In some embodiments, a manual distortion calibration may be performed once for a specific imaging device. In an aspect, the imaging device distortion correction can be calibrated as a preprocessing step during every procedure, as the pincushion distortion may change as a result of imaging device maintenance or simply over time.
[0055] Fig. 2A illustrates a schematic diagram of a system 200 configured for use with the method of Fig. 1 in accordance with one aspect of the disclosure. System 200 may include a workstation 80, an imaging device 215 and a markers structure 218. In some embodiments, workstation 80 may be coupled with imaging device 215, directly or indirectly, e.g., by wireless communication. Workstation 80 may include a memory 202, a processor 204, a display 206 and an input device 210. Processor or hardware processor 204 may include one or more hardware processors. Workstation 80 may optionally include an output module 212 and a network interface 208. Memory 202 may store an application 81 and image data 214. Application 81 may include instructions executable by processor 204, inter alia, for executing the method of Fig. 1, and a user interface 216. Workstation 80 may be a stationary computing device, such as a personal computer, or a portable computing device such as a tablet computer. Workstation 80 may embed a plurality of computer devices.
[0056] Memory 202 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by processor 204 and which control the operation of workstation 80 and, in some embodiments, may also control the operation of imaging device 215. In an embodiment, memory 202 may include one or more solid-state storage devices such as flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown). Although the description of computer-readable media contained herein refers to a solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 204. That is, computer-readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by workstation 80.
[0057] Application 81 may, when executed by processor 204, cause display 206 to present user interface 216. Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Network interface 208 may be used to connect between workstation 80 and imaging device 215. Network interface 208 may be also used to receive image data 214. Input device 210 may be any device by means of which a user may interact with workstation 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
[0058] Imaging device 215 may be any imaging device which captures 2D images, such as a standard fluoroscopic imaging device or a camera. In some embodiments, markers structure 218 may be a structure of markers having a two-dimensional pattern, such as a grid having two dimensions of width and length (e.g., a 2D grid), as shown in Fig. 2B. Using a 2D pattern, as opposed to a 3D pattern, may facilitate the pose estimation process. Furthermore, when, for example, a patient is required to lie on markers structure 218 in order to estimate the pose of a medical imaging device while scanning the patient, a 2D pattern would be more convenient for the patient. The markers should be formed such that they will be visible in the imaging modality used. For example, if the imaging device is a fluoroscopic device, then the markers should be made of a material which is at least partially radio-opaque. In some embodiments, the shape of the markers may be symmetric and such that the projection of the markers on the image would be the same at any pose the imaging device may be placed in. Such a configuration may simplify and enhance the pose estimation process and/or make it more efficient. For example, when the imaging device is rotated around the markers structure, markers having a rotation symmetry may be preferred, such as spheres. The size of the markers structure and/or the number of markers in the structure may be determined according to the specific use of the disclosed systems and methods. For example, if the pose estimation is used to construct a 3D volume of an area of interest within a patient, then the markers structure may be of a size similar to or larger than the size of the area of interest. In some embodiments, the pattern of markers structure 218 may be two-dimensional and/or periodic, such as a 2D grid. Using a structure of markers with a periodic and/or two-dimensional pattern may further enhance and facilitate the pose estimation process and make it more efficient.
[0059] Referring now to Fig. 2B, 2D grid structure of sphere markers 220 has a 2D periodic pattern of a grid and includes symmetric markers in the shape of a sphere. Such a configuration simplifies and enhances the pose estimation process, as described in Fig. 1, specifically when generating the virtual candidates for the markers structure projection and when determining the optimal one. The structure of markers, as a fiducial, should be positioned in a stationary manner during the capturing of the one or more images. In an exemplary 2D grid structure of sphere markers such as described above, used in medical imaging of the lungs area, the sphere markers' diameter may be 2±0.2 mm and the distance between the spheres may be about 15±0.15 mm isotropic.
[0060] Referring now back to Fig. 2A, imaging device 215 may capture one or more images (i.e., a sequence of images) such that at least a projection of a portion of markers structure 218 is shown in each image. The image or sequence of images captured by imaging device 215 may then be stored in memory 202 as image data 214. The image data may then be processed by processor 204, according to the method of Fig. 1, to determine the pose of imaging device 215. The pose estimation data may then be output via output module 212, display 206 and/or network interface 208. Markers structure 218 may be positioned with respect to an area of interest, such as under an area of interest within the body of a patient going through a fluoroscopic scan. Markers structure 218 and the patient will then be positioned such that the one or more images captured by imaging device 215 would capture the area of interest and a portion of markers structure 218. If required, once the pose estimation process is complete, the projection of markers structure 218 on the images may be removed by using well known methods. One such method is described in commonly-owned U.S. Patent Application No. 16/259,612, entitled: "IMAGE RECONSTRUCTION SYSTEM AND METHOD", filed on January 28, 2019, by Alexandroni et al., the entire content of which is hereby incorporated by reference.
[0061] Fig. 7 is a flow chart of an exemplary method for constructing fluoroscopic three-dimensional volumetric data in accordance with the disclosure. A method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images is hereby disclosed. In step 700, a sequence of images of the target area and of a structure of markers is acquired via a fluoroscopic imaging device. The structure of markers may be the two-dimensional structure of markers described with respect to Figs. 1, 2A and 2B. The structure of markers may be positioned between the patient and the fluoroscopic imaging device. In some embodiments, the target area may include, for example, at least a portion of the lungs, as exemplified with respect to the system of Fig. 8. In some embodiments, the target is a soft-tissue target, such as within a lung, kidney, liver and the like.
[0062] In step 710, a pose of the fluoroscopic imaging device for at least a plurality of images of the sequence of images may be estimated. The pose estimation may be performed based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images, as described with respect to Fig. 1.
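As a rough, non-authoritative sketch of this candidate-scoring idea, assume a pinhole camera model with intrinsic matrix K and a set of candidate poses (R, t) that has already been generated; detected_2d holds the marker blob centers found in one frame (these names are assumptions for illustration, not terms of the disclosure). Each candidate is scored by how closely the projection of the whole marker grid lands on the detected blobs, and the best-scoring candidate is kept:

import numpy as np
from scipy.spatial import cKDTree

def project(points_3d, K, R, t):
    # Pinhole projection of 3D marker centers into the image plane.
    cam = points_3d @ R.T + t            # markers' frame -> camera frame
    uv = cam[:, :2] / cam[:, 2:3]        # perspective divide
    return uv @ K[:2, :2].T + K[:2, 2]   # apply focal lengths and principal point

def score_pose(detected_2d, points_3d, K, R, t):
    # Mean distance from each projected marker to its nearest detected blob.
    dist, _ = cKDTree(detected_2d).query(project(points_3d, K, R, t))
    return dist.mean()

def best_candidate(detected_2d, points_3d, K, candidates):
    # Pick the candidate pose whose whole-grid projection fits best.
    return min(candidates, key=lambda rt: score_pose(detected_2d, points_3d, K, *rt))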
[0063] In some embodiments, other methods for estimating the pose of the fluoroscopic device may be used. There are various known methods for determining the poses of imaging devices, such as using an external angle-measuring device or based on image analysis. Some such devices and methods are particularly described in commonly-owned U.S. Patent Publication No. 2017/0035379, filed on August 1, 2016, by Weingarten et al., the entire content of which is hereby incorporated by reference.
[0064] In step 720, fluoroscopic-based three-dimensional volumetric data of the target area may be constructed based on the estimated poses of the fluoroscopic imaging device. Exemplary systems and methods for constructing such fluoroscopic-based three-dimensional volumetric data are disclosed in the above commonly-owned U.S. Patent Publication No. 2017/0035379, which is incorporated by reference.
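The construction of step 720 can be pictured with a deliberately naive, unfiltered backprojection, in which every frame is smeared back through the volume along the rays implied by its estimated pose. The sketch below reuses the pinhole model and (R, t) convention from the earlier sketch and assumes isotropic voxels; a practical reconstruction, such as in the incorporated reference, would use a filtered or iterative algorithm rather than this minimal illustration:

import numpy as np

def backproject(frames, poses, K, vol_shape, vol_origin, voxel_mm):
    # frames: 2D arrays; poses: (R, t) pairs mapping volume -> camera frame.
    vol = np.zeros(vol_shape, dtype=np.float32).ravel()
    zz, yy, xx = np.indices(vol_shape)
    pts = np.stack([xx, yy, zz], axis=-1).reshape(-1, 3) * voxel_mm
    pts = pts + np.asarray(vol_origin, dtype=float)
    for img, (R, t) in zip(frames, poses):
        cam = pts @ R.T + t
        uv = (cam[:, :2] / cam[:, 2:3]) @ K[:2, :2].T + K[:2, 2]
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        ok = (cam[:, 2] > 0) & (u >= 0) & (u < img.shape[1]) \
             & (v >= 0) & (v < img.shape[0])
        vol[ok] += img[v[ok], u[ok]]     # accumulate each frame's contribution
    return (vol / len(frames)).reshape(vol_shape)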
[0065] In an optional step 730, a medical device may be positioned in the target area prior to the acquiring of the sequence of images. Thus, the sequence of images, and consequently the fluoroscopic-based three-dimensional volumetric data, may also include a projection of the medical device in addition to the target. The offset (i.e., Δx, Δy and Δz) between the medical device and the target may then be determined based on the fluoroscopic-based three-dimensional volumetric data. The target may be visible, or better exhibited, in the generated three-dimensional volumetric data. Therefore, the target may be detected, automatically or manually by the user, in the three-dimensional volumetric data. The medical device may be detected, automatically or manually by a user, in the sequence of images as captured or in the generated three-dimensional volumetric data. The automatic detection of the target and/or the medical device may be performed based on systems and methods as known in the art, such as described, for example, in commonly-owned U.S. Patent Application No. 62/627,911, titled: "SYSTEM AND METHOD FOR CATHETER DETECTION IN FLUOROSCOPIC IMAGES AND UPDATING DISPLAYED POSITION OF CATHETER", filed on February 8, 2018, by Birenbaum et al. The manual detection may be performed by displaying the three-dimensional volumetric data and/or the captured images to the user and requesting the user's input. Once the target and the medical device are detected in the three-dimensional volumetric data and/or the captured images, their locations in the fluoroscopic coordinate system of reference may be obtained and the offset between them may be determined.
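Once both locations are expressed in the fluoroscopic frame of reference, the offset of step 730 reduces to a vector difference. A trivial sketch, with invented coordinate values:

import numpy as np

def device_target_offset(device_xyz, target_xyz):
    # Offset (dx, dy, dz) from the medical device to the target, both
    # expressed in the fluoroscopic reconstruction's coordinate frame.
    return np.asarray(target_xyz, float) - np.asarray(device_xyz, float)

print(device_target_offset([12.0, -4.5, 30.0], [15.5, -2.0, 27.0]))  # [ 3.5  2.5 -3. ]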
[0066] The offset between the target and the medical device may be utilized for various medical purposes, including facilitating the approach of the medical device to the target area and treatment. The navigation of a medical device to the target area may be facilitated via a locating system and a display. The locating system locates or tracks the motion of the medical device through the patient's body. The display may display to the user the medical device location with respect to the surroundings of the medical device within the patient's body and with respect to the target. The locating system may be, for example, an electromagnetic or optical locating system, or any other such system as known in the art. When, for example, the target area includes a portion of the lungs, the medical device may be navigated to the target area through the airways' luminal network, as described with respect to Fig. 8.
[0067] In an optional step 740, a display of the location of the medical device with respect to the target may be corrected based on the determined offset between the medical device and the target. In some embodiments, a 3D rendering of the target area may be displayed on the display. The 3D rendering of the target area may be generated based on CT volumetric data of the target area which was acquired previously, e.g., prior to the current procedure or operation (e.g., a preoperative CT). In some embodiments, the locating system may be registered to the 3D rendering of the target, such as described, for example, with respect to Fig. 8 below. The correction of the offset between the medical device and the target may then be performed by updating the registration of the locating system to the 3D rendering. Generally, to perform such updating, a transformation between the coordinate system of reference of the fluoroscopic images and the coordinate system of reference of the locating system should be known. The geometrical positioning of the structure of markers with respect to the locating system may determine such a transformation. In some embodiments, and as shown in the embodiment of Fig. 8, the structure of markers and the locating system are positioned such that the same coordinate system of reference applies to both, or such that one is only a translated version of the other.
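The transformation chain involved in such an update can be pictured with 4x4 homogeneous matrices. In the sketch below, the frame names ('fluoro' for the reconstruction, 'em' for the locating system, 'ct' for the pre-operative rendering) and the pure-translation relation between the marker structure and the locating system are assumptions consistent with, but not dictated by, the embodiment of Fig. 8:

import numpy as np

def to_hom(R, t):
    # Build a 4x4 homogeneous transform from a rotation R and translation t.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# T_ab maps coordinates expressed in frame b into frame a (hypothetical frames).
T_em_fluoro = to_hom(np.eye(3), [0.0, 0.0, -20.0])  # assumed pure translation from marker/mat geometry
T_ct_em = np.eye(4)                                 # placeholder initial registration

def fluoro_point_in_ct(p_fluoro):
    # Map a point detected in the fluoroscopic volume into CT coordinates.
    p = np.append(np.asarray(p_fluoro, float), 1.0)
    return (T_ct_em @ T_em_fluoro @ p)[:3]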
[0068] In some embodiments, the updating of the registration of the locating system to the 3D rendering (e.g., CT-based) may be performed in a local manner and/or in a gradual manner. For example, the registration may be updated only in the surroundings of the target, e.g., only within a certain distance from the target, since the update may be less accurate when not performed around the target. In some embodiments, the updating may be performed in a gradual manner, e.g., by applying weights according to distance from the target. In addition to accuracy considerations, such gradual updating may be more convenient for the user to look at, process, and act on during the procedure than an abrupt change in the medical device location on the display.
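A minimal sketch of such a local, gradual update, assuming a simple linear falloff of the correction with distance from the target (the falloff law and the 50 mm influence radius are illustrative choices, not prescribed by the disclosure):

import numpy as np

def corrected_position(p_ct, target_ct, offset, radius_mm=50.0):
    # p_ct: device position already mapped into the 3D rendering's frame.
    # offset: (dx, dy, dz) correction derived from the fluoroscopic volume.
    # The correction is applied in full at the target and fades to zero
    # beyond radius_mm, so the registration update stays local and gradual.
    d = np.linalg.norm(np.asarray(p_ct, float) - np.asarray(target_ct, float))
    w = max(0.0, 1.0 - d / radius_mm)
    return np.asarray(p_ct, float) + w * np.asarray(offset, float)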
[0069] In some embodiments, the patient may be instructed to stop breathing (or caused to stop breathing) during the capture of the images in order to prevent movements of the target area due to breathing. In other embodiments, methods for compensating for breathing movements during the capture of the images may be performed. For example, the estimated poses of the fluoroscopic device may be corrected according to the movements of a fiducial marker placed in the target area. Such a fiducial may be a medical device, e.g., a catheter, placed in the target area. The movement of the catheter, for example, may be determined based on the locating system. In some embodiments, a breathing pattern of the patient may be determined according to the movements of a fiducial marker, such as a catheter, located in the target area. The movements may be determined via a locating system. Based on that pattern, only inhale images or only exhale images may be considered when determining the pose of the imaging device (a minimal sketch of such gating is given after the next paragraph).

[0070] In embodiments, as described above, for each captured frame, the three-dimensional position and orientation of the imaging device are estimated based on a set of static markers positioned on the patient bed. This process requires knowledge of the markers' 3D positions in the volume, as well as the compatible 2D coordinates of their projections in the image plane. Adding one or more markers from different planes in the volume of interest may lead to more robust and accurate pose estimation. One possible marker that can be utilized in such a process is the catheter tip (or the tip of another medical device positioned through the catheter). The tip is visible throughout the video captured by fluoroscopic imaging, and the compatible 3D positions may be provided by a navigation or tracking system (e.g., an electromagnetic navigation tracking system) as the tool is navigated to the target (e.g., through the electromagnetic field). Therefore, the only remaining task is to deduce the exact 2D coordinates from the video frames. As described above, one embodiment of the tip detection step may include fully automated detection and tracking of the tip throughout the video. Another embodiment may implement semi-supervised tracking, in which the user manually marks the tip in one or more frames and the detection process computes the tip coordinates for the rest of the frames.
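As one illustration of the breathing-pattern gating described in paragraph [0069], the per-frame position of a tracked fiducial, such as the catheter tip reported by the locating system, can be used to keep only the frames near one respiratory extremum. In this sketch, which extremum corresponds to exhale and the tolerance value are assumptions:

import numpy as np

def gate_frames(fiducial_pos, use_min_extremum=True, tol_mm=1.5):
    # fiducial_pos: per-frame position (mm) of the tracked fiducial along
    # the dominant breathing axis, as reported by the locating system.
    z = np.asarray(fiducial_pos, float)
    ref = z.min() if use_min_extremum else z.max()
    return np.flatnonzero(np.abs(z - ref) <= tol_mm)  # indices of frames to keep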
[0071] In embodiments, the semi-supervised tracking process may be implemented by solving one frame at a time using template matching between the current frame and previous ones, by using optical flow to estimate the tip movement along the video, and/or by using model-based trackers. Model-based trackers train a detector to estimate the probability of each pixel belonging to the catheter tip, followed by a step of combining the detections into a single most probable list of coordinates along the video. One possible embodiment of the model-based trackers involves dynamic programming. Such an optimization approach enables finding a seam (a connected list of coordinates in the 3D space of the video frames, where the first two dimensions belong to the image plane and the third axis is time) with maximal probability. Another possible way to achieve a seam of two-dimensional coordinates is training a detector to estimate the tip coordinate in each frame while incorporating into the loss function a regularization term penalizing distance between detections in adjacent frames.
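As a sketch of the optical-flow option named above (and not of the patented detector itself), a tip marked by the user in the first frame can be propagated frame to frame with pyramidal Lucas-Kanade flow. OpenCV's calcOpticalFlowPyrLK is used here, and the frames are assumed to be 8-bit grayscale images:

import cv2
import numpy as np

def track_tip(frames, tip_xy):
    # Propagate a user-marked catheter tip through a fluoroscopic video.
    pt = np.array([[tip_xy]], dtype=np.float32)   # shape (1, 1, 2)
    track = [tuple(tip_xy)]
    for prev, nxt in zip(frames, frames[1:]):
        pt, status, _ = cv2.calcOpticalFlowPyrLK(prev, nxt, pt, None)
        if status[0, 0] == 0:
            break  # tip lost; a real system would fall back to re-detection
        track.append(tuple(pt[0, 0]))
    return track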
[0072] Fig. 8 illustrates an exemplary system 800 for constructing fluoroscopic-based three-dimensional volumetric data in accordance with the disclosure. System 800 may be configured to construct fluoroscopic-based three-dimensional volumetric data of a target area, including at least a portion of the lungs of a patient, from 2D fluoroscopic images. System 800 may be further configured to facilitate the approach of a medical device to the target area by using Electromagnetic Navigation Bronchoscopy (ENB) and to determine the location of the medical device with respect to the target.
[0073] System 800 may be configured for reviewing CT image data to identify one or more targets, planning a pathway to an identified target (planning phase), navigating an extended working channel (EWC) 812 of a catheter assembly to a target (navigation phase) via a user interface, and confirming placement of EWC 812 relative to the target. One such EMN system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY® system currently sold by Medtronic PLC. The target may be tissue of interest identified by review of the CT image data during the planning phase. Following navigation, a medical device, such as a biopsy tool or other tool, may be inserted into EWC 812 to obtain a tissue sample from the tissue located at, or proximate to, the target.
[0074] Fig. 8 illustrates EWC 812, which is part of a catheter guide assembly 840. In practice, EWC 812 is inserted into a bronchoscope 830 for access to a luminal network of the patient "P." Specifically, EWC 812 of catheter guide assembly 840 may be inserted into a working channel of bronchoscope 830 for navigation through a patient's luminal network. A locatable guide (LG) 832, including a sensor 844, is inserted into EWC 812 and locked into position such that sensor 844 extends a desired distance beyond the distal tip of EWC 812. The position and orientation of sensor 844, and thus of the distal portion of EWC 812, relative to the reference coordinate system within an electromagnetic field can be derived. Catheter guide assemblies 840 are currently marketed and sold by Medtronic PLC under the brand names SUPERDIMENSION® Procedure Kits or EDGE™ Procedure Kits, and are contemplated as usable with the disclosure. For a more detailed description of catheter guide assemblies 840, reference is made to commonly-owned U.S. Patent Publication No. 2014/0046315, filed on March 15, 2013, by Ladtkow et al., U.S. Patent No. 7,233,820, and U.S. Patent No. 9,044,254, the entire contents of each of which are hereby incorporated by reference.
[0075] System 800 generally includes an operating table 820 configured to support a patient "P," a bronchoscope 830 configured for insertion through the patient "P's" mouth into the patient "P's" airways; monitoring equipment 835 coupled to bronchoscope 830 (e.g., a video display for displaying the video images received from the video imaging system of bronchoscope 830); a locating system 850 including a locating module 852, a plurality of reference sensors 854 and a transmitter mat coupled to a structure of markers 856; and a computing device 825 including software and/or hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical device to the target, and confirmation of placement of EWC 812, or a suitable device therethrough, relative to the target. Computing device 825 may be similar to workstation 80 of Fig. 2A and may be configured, inter alia, to execute the method of Fig. 1.
[0076] A fluoroscopic imaging device 810 capable of acquiring fluoroscopic or x-ray images or video of the patient "P" is also included in this particular aspect of system 800. The images, sequence of images, or video captured by fluoroscopic imaging device 810 may be stored within fluoroscopic imaging device 810 or transmitted to computing device 825 for storage, processing, and display, as described with respect to Fig. 2A. Additionally, fluoroscopic imaging device 810 may move relative to the patient "P" so that images may be acquired from different angles or perspectives relative to patient "P" to create a sequence of fluoroscopic images, such as a fluoroscopic video. The pose of fluoroscopic imaging device 810 relative to patient "P" for each of the images may be estimated via the structure of markers and according to the method of Fig. 1. The structure of markers is positioned under patient "P," between patient "P" and operating table 820 and between patient "P" and a radiation source of fluoroscopic imaging device 810. The structure of markers is coupled to the transmitter mat (both indicated 856) and positioned under patient "P" on operating table 820. The structure of markers and transmitter mat 856 are positioned under the target area within the patient in a stationary manner. The structure of markers and transmitter mat 856 may be two separate elements coupled in a fixed manner, or alternatively may be manufactured as one unit. Fluoroscopic imaging device 810 may include a single imaging device or more than one imaging device. In embodiments including multiple imaging devices, each imaging device may be a different type of imaging device or the same type. Further details regarding imaging device 810 are described in U.S. Patent No. 8,565,858, which is incorporated by reference in its entirety herein.
[0077] Computing device 825 may be any suitable computing device including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium. Computing device 825 may further include a database configured to store patient data, CT data sets including CT images, fluoroscopic data sets including fluoroscopic images and video, navigation plans, and any other such data. Although not explicitly illustrated, computing device 825 may include inputs, or may otherwise be configured to receive, CT data sets, fluoroscopic images/video, and other data described herein. Additionally, computing device 825 includes a display configured to display graphical user interfaces. Computing device 825 may be connected to one or more networks through which one or more databases may be accessed.
[0078] With respect to the planning phase, computing device 825 utilizes previously acquired CT image data for generating and viewing a three-dimensional model of the patient "P's" airways, enables the identification of a target on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through the patient "P's" airways to tissue located at and around the target. More specifically, CT images acquired from previous CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of the patient "P's" airways. The three-dimensional model may be displayed on a display associated with computing device 825, or in any other suitable fashion. Using computing device 825, various views of the three-dimensional model, or enhanced two-dimensional images generated from the three-dimensional model, are presented. The enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data. The three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and a suitable pathway through the patient "P's" airways to access tissue located at the target can be selected. Once selected, the pathway plan, three-dimensional model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s). One such planning software is the ILOGIC® planning suite currently sold by Medtronic PLC.
[0079] With respect to the navigation phase, a six degrees-of-freedom electromagnetic locating or tracking system 850, e.g., similar to those disclosed in U.S. Patent Nos. 8,467,589 and 6,188,355, and published PCT Application Nos. WO 00/10456 and WO 01/67035, the entire contents of each of which are incorporated herein by reference, or another suitable positioning measuring system, is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated. Tracking system 850 includes a locating or tracking module 852, a plurality of reference sensors 854, and a transmitter mat 856. Tracking system 850 is configured for use with a locatable guide 832 and particularly sensor 844. As described above, locatable guide 832 and sensor 844 are configured for insertion through EWC 812 into a patient "P's" airways (either with or without bronchoscope 830) and are selectively lockable relative to one another via a locking mechanism.
[0080] Transmitter mat 856 is positioned beneath patient "P." Transmitter mat 856 generates an electromagnetic field around at least a portion of the patient "P" within which the positions of a plurality of reference sensors 854 and of sensor 844 can be determined with the use of tracking module 852. One or more of reference sensors 854 are attached to the chest of the patient "P." The six degrees-of-freedom coordinates of reference sensors 854 are sent to computing device 825 (which includes the appropriate software), where they are used to calculate a patient coordinate frame of reference. Registration is generally performed to coordinate locations of the three-dimensional model and two-dimensional images from the planning phase with the patient "P's" airways as observed through the bronchoscope 830, and to allow for the navigation phase to be undertaken with precise knowledge of the location of sensor 844, even in portions of the airway where the bronchoscope 830 cannot reach. Further details of such a registration technique and its implementation in luminal navigation can be found in U.S. Patent Application Pub. No. 2011/0085720, the entire content of which is incorporated herein by reference, although other suitable techniques are also contemplated.
[0081] Registration of the patient "P's" location on the transmitter mat 856 is performed by moving LG 832 through the airways of the patient "P." More specifically, data pertaining to locations of sensor 844, while locatable guide 832 is moving through the airways, is recorded using transmitter mat 856, reference sensors 854, and tracking module 852. A shape resulting from this location data is compared to an interior geometry of passages of the three-dimensional model generated in the planning phase, and a location correlation between the shape and the three-dimensional model based on the comparison is determined, e.g., utilizing the software on computing device 825. In addition, the software identifies non-tissue space (e.g., air-filled cavities) in the three-dimensional model. The software aligns, or registers, an image representing a location of sensor 844 with the three-dimensional model and the two-dimensional images generated from the three-dimensional model, based on the recorded location data and an assumption that locatable guide 832 remains located in non-tissue space in the patient "P's" airways. Alternatively, a manual registration technique may be employed by navigating the bronchoscope 830 with the sensor 844 to pre-specified locations in the lungs of the patient "P," and manually correlating the images from the bronchoscope to the model data of the three-dimensional model.
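The shape-matching idea of paragraph [0081] can be pictured, under the simplifying assumption that point correspondences between the recorded sensor locations and the airway-model points have already been established (e.g., by nearest-neighbor iteration), as a least-squares rigid fit, here the classic Kabsch/SVD solution. This is an illustrative sketch, not the registration algorithm of the incorporated reference:

import numpy as np

def rigid_fit(src, dst):
    # Least-squares rotation R and translation t mapping src points onto dst.
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t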
[0082] Following registration of the patient "P" to the image data and pathway plan, a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target. One such navigation software is the ILOGIC® navigation suite currently sold by Medtronic PLC.
[0083] Once EWC 812 has been successfully navigated proximate the target as depicted on the user interface, the locatable guide 832 may be unlocked from EWC 812 and removed, leaving EWC 812 in place as a guide channel for guiding medical devices to the target, including, without limitation, optical systems, ultrasound probes, marker placement tools, biopsy tools, ablation tools (e.g., microwave ablation devices), laser probes, cryogenic probes, sensor probes, and aspirating needles.
[0084] The disclosed exemplary system 800 may be employed with the method of Fig. 7 to construct fluoroscopic-based three-dimensional volumetric data of a target located in the lungs area and to correct the location of a medical device navigated to the target area with respect to the target.
[0085] System 800, or a similar version of it, in conjunction with the method of Fig. 7, may be used in various procedures other than ENB procedures, with the required modifications, such as laparoscopy or robotic-assisted surgery.
[0086] Systems and methods in accordance with the disclosure may be usable for facilitating the navigation of a medical device to a target and/or its area using real-time two-dimensional fluoroscopic images of the target area. The navigation is facilitated by using local three-dimensional volumetric data, in which small soft-tissue objects are visible, constructed from a sequence of fluoroscopic images captured by a standard fluoroscopic imaging device available in most procedure rooms. The fluoroscopic-based constructed local three-dimensional volumetric data may be used to correct a location of a medical device with respect to a target, or may be locally registered with previously acquired volumetric data (e.g., CT data). In general, the location of the medical device may be determined by a tracking system, for example, an electromagnetic tracking system. The tracking system may be registered with the previously acquired volumetric data. A local registration of the real-time three-dimensional fluoroscopic data to the previously acquired volumetric data may then be performed via the tracking system. Such real-time data may be used, for example, for guidance, navigation planning, improved navigation accuracy, navigation confirmation, and treatment confirmation.

[0087] In some embodiments, the methods disclosed may further include a step of generating a 3D rendering of the target area based on a pre-operative CT scan. A display of the target area may then include a display of the 3D rendering. In another step, the tracking system may be registered with the 3D rendering. As described above, a correction of the location of the medical device with respect to the target, based on the determined offset, may then include the local updating of the registration between the tracking system and the 3D rendering in the target area. In some embodiments, the methods disclosed may further include a step of registering the fluoroscopic 3D reconstruction to the tracking system. In another step, and based on the above, a local registration between the fluoroscopic 3D reconstruction and the 3D rendering may be performed in the target area.
[0088] From the foregoing and with reference to the various figure drawings, those skilled in the art will appreciate that certain modifications can also be made to the disclosure without departing from the scope of the same. For example, although the systems and methods are described as usable with an EMN system for navigation through a luminal network such as the lungs, the systems and methods described herein may be utilized with systems that utilize other navigation and treatment devices, such as percutaneous devices. Additionally, although the above-described systems and methods are described as used within a patient's luminal network, it is appreciated that they may be utilized in other target regions such as the liver. Further, the above-described systems and methods are also usable for transthoracic needle aspiration procedures.
[0089] Detailed embodiments of the disclosure are disclosed herein. However, the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms and aspects. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the disclosure in virtually any appropriately detailed structure.
[0090] As can be appreciated, a medical instrument such as a biopsy tool, or an energy device such as a microwave ablation catheter, that is positionable through one or more branched luminal networks of a patient to treat tissue may prove useful in the surgical arena, and the disclosure is directed to systems and methods that are usable with such instruments and tools. Access to luminal networks may be percutaneous or through a natural orifice using navigation techniques. Additionally, navigation through a luminal network may be accomplished using image guidance. These image-guidance systems may be separate from or integrated with the energy device or a separate access tool and may include MRI, CT, fluoroscopy, ultrasound, electrical impedance tomography, optical, and/or device tracking systems. Methodologies for locating the access tool include EM, IR, echolocation, optical, and others. Tracking systems may be integrated with an imaging device, where tracking is done in virtual space or fused with preoperative or live images. In some cases, the treatment target may be directly accessed from within the lumen, such as for the treatment of the endobronchial wall for COPD, asthma, lung cancer, etc. In other cases, the energy device and/or an additional access tool may be required to pierce the lumen and extend into other tissues to reach the target, such as for the treatment of disease within the parenchyma. Final localization and confirmation of energy device or tool placement may be performed with imaging and/or navigational guidance using a standard fluoroscopic imaging device incorporated with the methods and systems described above.
[0091] While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims

WHAT IS CLAIMED IS:
1. A system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device, comprising:
a structure of markers, wherein a sequence of images of the target area and of the structure of markers is acquired via the fluoroscopic imaging device; and
a computing device configured to:
estimate a pose of the fluoroscopic imaging device for a plurality of images of the sequence of images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images; and
construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
2. The system of claim 1, wherein the computing device is further configured to:
facilitate an approach of a medical device to the target area, wherein a medical device is positioned in the target area prior to acquiring the sequence of images; and
determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
3. The system of claim 2, further comprising a locating system indicating a location of the medical device within the patient, wherein the computing device comprises a display and is configured to:
display the target area and the location of the medical device with respect to the target;
facilitate navigation of the medical device to the target area via the locating system and the display; and
correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
4. The system of claim 3, wherein the computing device is further configured to:
display a 3D rendering of the target area on the display; and
register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
5. The system of claim 3, wherein the locating system is an electromagnetic locating system.
6. The system of claim 3, wherein the target area comprises at least a portion of lungs and the medical device is navigable to the target area through airways of a luminal network.
7. The system of claim 1, wherein the structure of markers is at least one of a periodic pattern or a two-dimensional pattern.
8. The system of claim 1, wherein the target area comprises at least a portion of lungs and the target is a soft-tissue target.
9. A method for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from a sequence of two-dimensional (2D) fluoroscopic images of a target area and of a structure of markers acquired via a fluoroscopic imaging device, wherein the structure of markers is positioned between the patient and the fluoroscopic imaging device, the method comprising using at least one hardware processor for:
estimating a pose of the fluoroscopic imaging device for at least a plurality of images of the sequence of 2D fluoroscopic images based on detection of a possible and most probable projection of the structure of markers as a whole on each image of the plurality of images; and
constructing fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
10. The method of claim 9, wherein a medical device is positioned in the target area prior to acquiring the sequence of images, and wherein the method further comprises using the at least one hardware processor for determining an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
11. The method of claim 10, further comprising using the at least one hardware processor for:
facilitating navigation of the medical device to the target area via a locating system indicating a location of the medical device and via a display; and
correcting a display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
12. The method of claim 11, further comprising using the at least one hardware processor for:
displaying a 3D rendering of the target area on the display; and
registering the locating system to the 3D rendering,
wherein the correcting of the location of the medical device with respect to the target comprises updating the registration of the locating system to the 3D rendering.
13. The method of claim 12, further comprising using the at least one hardware processor for generating the 3D rendering of the target area based on previously acquired CT volumetric data of the target area.
14. The method of claim 10, wherein the target area comprises at least a portion of lungs and wherein the medical device is navigable to the target area through airways of a luminal network.
15. The method of claim 11, wherein the structure of markers is at least one of a periodic pattern or a two-dimensional pattern.
16. The method of claim 11, wherein the target area comprises at least a portion of lungs and the target is a soft-tissue target.
17. A system for constructing fluoroscopic-based three-dimensional volumetric data of a target area within a patient from two-dimensional fluoroscopic images acquired via a fluoroscopic imaging device, comprising:
a computing device configured to:
estimate a pose of the fluoroscopic imaging device for a plurality of images of a sequence of images based on detection of a possible and most probable projection of a structure of markers as a whole on each image of the plurality of images; and
construct fluoroscopic-based three-dimensional volumetric data of the target area based on the estimated poses of the fluoroscopic imaging device.
18. The system of claim 17, wherein the computing device is further configured to:
facilitate an approach of a medical device to the target area, wherein a medical device is positioned in the target area prior to acquisition of the sequence of images; and
determine an offset between the medical device and the target based on the fluoroscopic-based three-dimensional volumetric data.
19. The system of claim 18, further comprising a locating system indicating a location of the medical device within the patient, wherein the computing device comprises a display and is configured to:
display the target area and the location of the medical device with respect to the target;
facilitate navigation of the medical device to the target area via the locating system and the display; and
correct the display of the location of the medical device with respect to the target based on the determined offset between the medical device and the target.
20. The system of claim 19, wherein the computing device is further configured to:
display a 3D rendering of the target area on the display; and
register the locating system to the 3D rendering, wherein correcting the display of the location of the medical device with respect to the target comprises updating the registration between the locating system and the 3D rendering.
PCT/US2019/017231 2018-02-08 2019-02-08 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target WO2019157294A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CA3088277A CA3088277A1 (en) 2018-02-08 2019-02-08 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
JP2020542153A JP7322039B2 (en) 2018-02-08 2019-02-08 Imaging Device Pose Estimation and Systems and Methods for Determining the Position of a Medical Device Relative to a Target
AU2019217999A AU2019217999A1 (en) 2018-02-08 2019-02-08 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
EP19751690.9A EP3750134A4 (en) 2018-02-08 2019-02-08 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
CN201980012379.9A CN111699515B (en) 2018-02-08 2019-02-08 System and method for pose estimation of imaging device and for determining position of medical device relative to target

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201862628017P 2018-02-08 2018-02-08
US62/628,017 2018-02-08
US201862641777P 2018-03-12 2018-03-12
US62/641,777 2018-03-12
US16/022,222 2018-06-28
US16/022,222 US10699448B2 (en) 2017-06-29 2018-06-28 System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US16/270,414 2019-02-07
US16/270,246 US10893842B2 (en) 2018-02-08 2019-02-07 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US16/270,246 2019-02-07
US16/270,414 US11364004B2 (en) 2018-02-08 2019-02-07 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target

Publications (1)

Publication Number Publication Date
WO2019157294A1 true WO2019157294A1 (en) 2019-08-15

Family

ID=67475240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/017231 WO2019157294A1 (en) 2018-02-08 2019-02-08 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target

Country Status (7)

Country Link
US (4) US10893842B2 (en)
EP (1) EP3750134A4 (en)
JP (1) JP7322039B2 (en)
CN (1) CN111699515B (en)
AU (1) AU2019217999A1 (en)
CA (1) CA3088277A1 (en)
WO (1) WO2019157294A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3524157A1 (en) * 2018-02-08 2019-08-14 Covidien LP System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11992349B2 (en) 2015-08-06 2024-05-28 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10893842B2 (en) * 2018-02-08 2021-01-19 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
WO2020061648A1 (en) * 2018-09-26 2020-04-02 Sitesee Pty Ltd Apparatus and method for three-dimensional object recognition
US11931198B2 (en) 2019-02-15 2024-03-19 Koninklijke Philips N.V. X-ray calibration for display overlays onto X-ray images
WO2020165422A1 (en) * 2019-02-15 2020-08-20 Koninklijke Philips N.V. X-ray ripple markers for x-ray calibration
US11918406B2 (en) 2019-02-15 2024-03-05 Koninklijke Philips N.V. Marker registration correction by virtual model manipulation
JP2020156825A (en) * 2019-03-27 2020-10-01 富士フイルム株式会社 Position information display device, method, and program, and radiography apparatus
JP7440534B2 (en) * 2019-04-04 2024-02-28 センターライン バイオメディカル,インコーポレイテッド Spatial registration of tracking system and images using 2D image projection
WO2021046455A1 (en) * 2019-09-05 2021-03-11 The Johns Hopkins University Fast and automatic pose estimation using intraoperatively located fiducials and single-view fluoroscopy
US11864935B2 (en) * 2019-09-09 2024-01-09 Covidien Lp Systems and methods for pose estimation of a fluoroscopic imaging device and for three-dimensional imaging of body structures
US11847730B2 (en) * 2020-01-24 2023-12-19 Covidien Lp Orientation detection in fluoroscopic images
US11950950B2 (en) * 2020-07-24 2024-04-09 Covidien Lp Zoom detection and fluoroscope movement detection for target overlay
CN113069206B (en) * 2021-03-23 2022-08-05 江西麦帝施科技有限公司 Image guiding method and system based on electromagnetic navigation
WO2023196184A1 (en) * 2022-04-04 2023-10-12 Intuitive Surgical Operations, Inc. Pose-based three-dimensional structure reconstruction systems and methods
WO2024079627A1 (en) * 2022-10-14 2024-04-18 Covidien Lp Systems and methods of detecting and correcting for patient and/or imaging system movement for target overlay

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030130576A1 (en) * 2000-04-28 2003-07-10 Teresa Seeley Fluoroscopic tracking and visualization system
US20110152676A1 (en) * 2009-12-21 2011-06-23 General Electric Company Intra-operative registration for navigated surgical procedures
US20120046521A1 (en) * 2010-08-20 2012-02-23 Mark Hunter Systems, instruments, and methods for four dimensional soft tissue navigation
US20120289825A1 (en) * 2011-05-11 2012-11-15 Broncus, Technologies, Inc. Fluoroscopy-based surgical device tracking method and system
US20160005168A1 (en) * 2014-07-02 2016-01-07 Covidien Lp Fluoroscopic pose estimation

Family Cites Families (318)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5488952A (en) 1982-02-24 1996-02-06 Schoolman Scientific Corp. Stereoscopically display three dimensional ultrasound imaging
US5493595A (en) 1982-02-24 1996-02-20 Schoolman Scientific Corp. Stereoscopically displayed three dimensional medical imaging
US5057494A (en) 1988-08-03 1991-10-15 Ethicon, Inc. Method for preventing tissue damage after an ischemic episode
CA1309142C (en) 1988-12-02 1992-10-20 Daniel K. Nichols Two-way radio with voice storage
US5023895A (en) 1989-03-02 1991-06-11 Innovative Imaging Systems, Inc. Three dimensional tomographic system
GB8918135D0 (en) 1989-08-09 1989-09-20 Jacobs Barry Improvements in or relating to portable coating apparatus
WO1992006636A1 (en) 1990-10-22 1992-04-30 Innovative Imaging Systems, Inc. Three-dimensional tomographic system
US5321113A (en) 1993-05-14 1994-06-14 Ethicon, Inc. Copolymers of an aromatic anhydride and aliphatic ester
GB9517491D0 (en) 1995-08-25 1995-10-25 British Tech Group Imaging apparatus
US5668846A (en) 1996-10-18 1997-09-16 General Electric Company Methods and apparatus for scanning an object and displaying an image in a computed tomography system
US6101234A (en) 1997-11-26 2000-08-08 General Electric Company Apparatus and method for displaying computed tomography fluoroscopy images
US6442288B1 (en) 1997-12-17 2002-08-27 Siemens Aktiengesellschaft Method for reconstructing a three-dimensional image of an object scanned in the context of a tomosynthesis, and apparatus for tomosynthesis
US6289235B1 (en) 1998-03-05 2001-09-11 Wake Forest University Method and system for creating three-dimensional images using tomosynthetic computed tomography
JP3743594B2 (en) * 1998-03-11 2006-02-08 株式会社モリタ製作所 CT imaging device
US6003517A (en) 1998-04-30 1999-12-21 Ethicon Endo-Surgery, Inc. Method for using an electrosurgical device on lung tissue
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US6206566B1 (en) 1998-11-02 2001-03-27 Siemens Aktiengesellschaft X-ray apparatus for producing a 3D image from a set of 2D projections
US6379041B1 (en) 1998-11-02 2002-04-30 Siemens Aktiengesellschaft X-ray apparatus for producing a 3D image from a set of 2D projections
JP4473358B2 (en) 1999-01-21 2010-06-02 株式会社東芝 Diagnostic equipment
US6285902B1 (en) 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
DE19919907C2 (en) 1999-04-30 2003-10-16 Siemens Ag Method and device for catheter navigation in three-dimensional vascular tree images
US6373916B1 (en) 1999-05-10 2002-04-16 Shimadzu Corporation X-ray CT apparatus
US9833167B2 (en) 1999-05-18 2017-12-05 Mediguide Ltd. Method and system for superimposing virtual anatomical landmarks on an image
US6201849B1 (en) 1999-08-16 2001-03-13 Analogic Corporation Apparatus and method for reconstruction of volumetric images in a helical scanning cone-beam computed tomography system
DE19936364A1 (en) 1999-08-03 2001-02-15 Siemens Ag Identification and localisation of marks in a 3D medical scanning process
US6608081B2 (en) 1999-08-12 2003-08-19 Ortho-Mcneil Pharmaceutical, Inc. Bicyclic heterocyclic substituted phenyl oxazolidinone antibacterials, and related compositions and methods
US6413981B1 (en) 1999-08-12 2002-07-02 Ortho-Mcneil Pharamceutical, Inc. Bicyclic heterocyclic substituted phenyl oxazolidinone antibacterials, and related compositions and methods
FR2799028B1 (en) 1999-09-27 2002-05-03 Ge Medical Syst Sa METHOD FOR RECONSTRUCTING A THREE-DIMENSIONAL IMAGE OF ELEMENTS OF STRONG CONTRAST
US6522712B1 (en) 1999-11-19 2003-02-18 General Electric Company Reconstruction of computed tomographic images using interpolation between projection views
US20010024796A1 (en) 1999-12-17 2001-09-27 Selifonov Sergey A. Methods for parallel detection of compositions having desired characteristics
DE10003524B4 (en) 2000-01-27 2006-07-13 Siemens Ag Mobile X-ray device and method for the determination of projection geometries
US6466638B1 (en) 2000-02-11 2002-10-15 Kabushiki Kaisha Toshiba Image mapping method and system
JP2001299744A (en) 2000-04-18 2001-10-30 Hitachi Medical Corp Medical radiotomographic instrument
US6484049B1 (en) 2000-04-28 2002-11-19 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6856827B2 (en) 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6782287B2 (en) 2000-06-27 2004-08-24 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for tracking a medical instrument based on image registration
US6787750B1 (en) * 2000-06-29 2004-09-07 Siemens Corporate Research, Inc. Method and apparatus for robust optical tracking with beacon markers
US6750034B1 (en) 2000-06-30 2004-06-15 Ortho-Mcneil Pharmaceutical, Inc. DNA encoding human serine protease D-G
US6837892B2 (en) 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
AU2001272727A1 (en) 2000-08-21 2002-03-04 V-Target Technologies Ltd. Radioactive emission detector
US6823207B1 (en) 2000-08-26 2004-11-23 Ge Medical Systems Global Technology Company, Llc Integrated fluoroscopic surgical navigation and imaging workstation with command protocol
AU2001292609A1 (en) 2000-09-11 2002-03-26 Closure Medical Corporation Bronchial occlusion method and apparatus
US6504892B1 (en) 2000-10-13 2003-01-07 University Of Rochester System and method for cone beam volume computed tomography using circle-plus-multiple-arc orbit
IL139259A0 (en) 2000-10-25 2001-11-25 Geus Inc Method and system for remote image reconstitution and processing and imaging data collectors communicating with the system
US6473634B1 (en) 2000-11-22 2002-10-29 Koninklijke Philips Electronics N.V. Medical imaging at two temporal resolutions for tumor treatment planning
US7072501B2 (en) 2000-11-22 2006-07-04 R2 Technology, Inc. Graphical user interface for display of anatomical information
US6909794B2 (en) 2000-11-22 2005-06-21 R2 Technology, Inc. Automated registration of 3-D medical scans of similar anatomical structures
US6472372B1 (en) 2000-12-06 2002-10-29 Ortho-Mcneil Pharmaceuticals, Inc. 6-O-Carbamoyl ketolide antibacterials
US6666579B2 (en) 2000-12-28 2003-12-23 Ge Medical Systems Global Technology Company, Llc Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system
WO2002070980A1 (en) 2001-03-06 2002-09-12 The Johns Hopkins University School Of Medicine Simulation system for image-guided medical procedures
US6643351B2 (en) 2001-03-12 2003-11-04 Shimadzu Corporation Radiographic apparatus
JP3518520B2 (en) 2001-03-13 2004-04-12 株式会社島津製作所 Tomography equipment
US6587539B2 (en) 2001-03-13 2003-07-01 Shimadzu Corporation Radiographic apparatus
US6519355B2 (en) 2001-03-28 2003-02-11 Alan C. Nelson Optical projection imaging system and method for automatically detecting cells having nuclear and cytoplasmic densitometric features associated with disease
FR2823345B1 (en) 2001-04-09 2003-08-22 Ge Med Sys Global Tech Co Llc METHOD FOR IMPROVING THE QUALITY OF A THREE-DIMENSIONAL RADIOGRAPHIC IMAGE OF AN OBJECT AND CORRESPONDING RADIOGRAPHIC DEVICE
JP2005516581A (en) 2001-05-15 2005-06-09 オーソ−マクニール・フアーマシユーチカル・インコーポレーテツド Ex vivo priming to produce cytotoxic lymphocytes specific for non-tumor antigens to treat autoimmune and allergic diseases
US20030032898A1 (en) 2001-05-29 2003-02-13 Inder Raj S. Makin Method for aiming ultrasound for medical treatment
US7607440B2 (en) 2001-06-07 2009-10-27 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
US6636623B2 (en) 2001-08-10 2003-10-21 Visiongate, Inc. Optical projection imaging system and method for automatically detecting cells with molecular marker compartmentalization associated with malignancy and disease
US20030128801A1 (en) 2002-01-07 2003-07-10 Multi-Dimensional Imaging, Inc. Multi-modality apparatus for dynamic anatomical, physiological and molecular imaging
DE10210646A1 (en) 2002-03-11 2003-10-09 Siemens Ag Method for displaying a medical instrument brought into an examination area of a patient
US7499743B2 (en) 2002-03-15 2009-03-03 General Electric Company Method and system for registration of 3D images within an interventional system
US20030190065A1 (en) 2002-03-26 2003-10-09 Cti Pet Systems, Inc. Fast iterative image reconstruction from linograms
DE10215808B4 (en) 2002-04-10 2005-02-24 Siemens Ag Registration procedure for navigational procedures
US6707878B2 (en) 2002-04-15 2004-03-16 General Electric Company Generalized filtered back-projection reconstruction in digital tomosynthesis
US6735914B2 (en) 2002-07-03 2004-05-18 Peter J. Konopka Load bearing wall
MXPA03006874 (en) 2002-07-31 2004-09-03 Johnson & Johnson Long term oxygen therapy system
US7251522B2 (en) 2002-09-12 2007-07-31 Brainlab Ag X-ray image-assisted navigation using original, two-dimensional x-ray images
DE10245669B4 (en) 2002-09-30 2006-08-17 Siemens Ag A method for intraoperatively generating an updated volume data set
US7620444B2 (en) 2002-10-05 2009-11-17 General Electric Company Systems and methods for improving usability of images for medical applications
US7577282B2 (en) 2002-11-27 2009-08-18 Hologic, Inc. Image handling and display in X-ray mammography and tomosynthesis
FR2847798B1 (en) 2002-11-28 2006-02-10 Ge Med Sys Global Tech Co Llc Method for determining functional parameters in a fluoroscopic device
US6751284B1 (en) 2002-12-03 2004-06-15 General Electric Company Method and system for tomosynthesis image enhancement using transverse filtering
US20040120981A1 (en) 2002-12-20 2004-06-24 Aruna Nathan Crosslinked alkyd polyesters for medical applications
EP1593093B1 (en) 2003-01-31 2006-08-16 Philips Intellectual Property & Standards GmbH Method for the reconstruction of three-dimensional objects
US7048440B2 (en) 2003-03-12 2006-05-23 Siemens Aktiengesellschaft C-arm x-ray device
JP4163991B2 (en) 2003-04-30 2008-10-08 J. Morita Mfg. Corp. X-ray CT imaging apparatus and imaging method
MXPA05011725A (en) 2003-04-30 2006-05-17 Johnson & Johnson CNGH0010 specific polynucleotides, polypeptides, antibodies, compositions, methods and uses
DE10322137A1 (en) 2003-05-16 2004-12-16 Siemens Ag X-ray machine with improved efficiency
DE10325003A1 (en) 2003-06-03 2004-12-30 Siemens Ag Visualization of 2D / 3D-merged image data for catheter angiography
US6904121B2 (en) 2003-06-25 2005-06-07 General Electric Company Fourier based method, apparatus, and medium for optimal reconstruction in digital tomosynthesis
US7482376B2 (en) 2003-07-03 2009-01-27 3-Dimensional Pharmaceuticals, Inc. Conjugated complement cascade inhibitors
WO2005003381A1 (en) 2003-07-04 2005-01-13 Johnson & Johnson Research Pty. Limited Method for detection of alkylated cytosine in dna
US7398116B2 (en) 2003-08-11 2008-07-08 Veran Medical Technologies, Inc. Methods, apparatuses, and systems useful in conducting image guided interventions
US7204640B2 (en) 2003-08-29 2007-04-17 Accuray, Inc. Apparatus and method for registering 2D radiographic images with images reconstructed from 3D scan data
US20050059879A1 (en) 2003-09-16 2005-03-17 Robert Sutherland Localization of a sensor device in a body
US6987829B2 (en) 2003-09-16 2006-01-17 General Electric Company Non-iterative algebraic reconstruction technique for tomosynthesis
JP3961468B2 (en) 2003-09-19 2007-08-22 GE Medical Systems Global Technology Company, LLC Radiation computed tomography apparatus and radiation detector used therefor
JP3863873B2 (en) 2003-09-30 2006-12-27 Hitachi, Ltd. Radiation inspection equipment
JP3863872B2 (en) 2003-09-30 2006-12-27 Hitachi, Ltd. Positron emission tomography system
US6980624B2 (en) 2003-11-26 2005-12-27 Ge Medical Systems Global Technology Company, Llc Non-uniform view weighting tomosynthesis method and apparatus
DE10360025B4 (en) 2003-12-19 2006-07-06 Siemens Ag Method for image support of a surgical procedure performed with a medical instrument
US7103136B2 (en) 2003-12-22 2006-09-05 General Electric Company Fluoroscopic tomosynthesis system and method
US7693318B1 (en) 2004-01-12 2010-04-06 Pme Ip Australia Pty Ltd Method and apparatus for reconstruction of 3D image volumes from projection images
US7120283B2 (en) 2004-01-12 2006-10-10 Mercury Computer Systems, Inc. Methods and apparatus for back-projection and forward-projection
US8126224B2 (en) 2004-02-03 2012-02-28 Ge Medical Systems Global Technology Company, Llc Method and apparatus for instrument tracking on a scrolling series of 2D fluoroscopic images
CA2562969A1 (en) 2004-03-29 2005-10-20 Janssen Pharmaceutica N.V. Prokineticin 2beta peptide and its use
US7142633B2 (en) 2004-03-31 2006-11-28 General Electric Company Enhanced X-ray imaging system and method
JP4260060B2 (en) 2004-05-12 2009-04-30 GE Medical Systems Global Technology Company, LLC X-ray CT apparatus and image reconstruction apparatus
US7310436B2 (en) 2004-06-01 2007-12-18 General Electric Co. Systems, methods and apparatus for specialized filtered back-projection reconstruction for digital tomosynthesis
US7097357B2 (en) * 2004-06-02 2006-08-29 General Electric Company Method and system for improved correction of registration error in a fluoroscopic image
US8774355B2 (en) 2004-06-30 2014-07-08 General Electric Company Method and apparatus for direct reconstruction in tomosynthesis imaging
US7522779B2 (en) 2004-06-30 2009-04-21 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7813469B2 (en) 2004-07-01 2010-10-12 Ge Healthcare Finland Oy Method for producing a three-dimensional digital x-ray image
WO2006017172A1 (en) 2004-07-09 2006-02-16 Fischer Imaging Corporation Diagnostic system for multimodality mammography
US7369695B2 (en) 2004-08-20 2008-05-06 General Electric Company Method and apparatus for metal artifact reduction in 3D X-ray image reconstruction using artifact spatial information
US7327872B2 (en) 2004-10-13 2008-02-05 General Electric Company Method and system for registering 3D models of anatomical regions with projection images of the same
US7778392B1 (en) 2004-11-02 2010-08-17 Pme Ip Australia Pty Ltd Method of reconstructing computed tomography (CT) volumes suitable for execution on commodity central processing units (CPUs) and graphics processors, and apparatus operating in accord with those methods (rotational X-ray on GPUs)
US7913698B2 (en) 2004-11-16 2011-03-29 Uptake Medical Corp. Device and method for lung treatment
JP2008521412A (en) 2004-11-30 2008-06-26 Veridex LLC Means for judging lung cancer prognosis
US7551759B2 (en) 2004-12-07 2009-06-23 Siemens Medical Solutions Usa, Inc. Target identification using time-based data sets
FR2882245B1 (en) 2005-02-21 2007-05-18 Gen Electric Method for determining the 3D displacement of a patient positioned on a table of an imaging device
US7522755B2 (en) 2005-03-01 2009-04-21 General Electric Company Systems, methods and apparatus for filtered back-projection reconstruction in digital tomosynthesis
WO2006114721A2 (en) 2005-04-26 2006-11-02 Koninklijke Philips Electronics N.V. Medical viewing system and method for detecting and enhancing static structures in noisy images using motion of the image acquisition means
US7844094B2 (en) 2005-04-29 2010-11-30 Varian Medical Systems, Inc. Systems and methods for determining geometric parameters of imaging devices
US7603155B2 (en) 2005-05-24 2009-10-13 General Electric Company Method and system of acquiring images with a medical imaging device
DE102005038892A1 (en) 2005-08-17 2007-03-01 Siemens Ag Method for generating three-dimensional X-ray images of an object, e.g. a patient's heart, during a computed tomography (CT) scan, involving filtering noise from a two-dimensional X-ray image of the object taken from an arbitrary direction
US20070066881A1 (en) 2005-09-13 2007-03-22 Edwards Jerome R Apparatus and method for image guided accuracy verification
EP3492008B1 (en) * 2005-09-13 2021-06-02 Veran Medical Technologies, Inc. Apparatus and method for image guided accuracy verification
US7978886B2 (en) 2005-09-30 2011-07-12 General Electric Company System and method for anatomy based reconstruction
DE102005050917A1 (en) 2005-10-24 2007-04-26 Siemens Ag Reconstruction method for the tomographic representation of the internal structures of a patient, using determined projection data and a determined filter to reconstruct the tomographic representation of the object
EP1782734B1 (en) 2005-11-05 2018-10-24 Ziehm Imaging GmbH Device for improving volume reconstruction
US7570732B2 (en) 2005-11-09 2009-08-04 Dexela Limited Methods and apparatus for obtaining low-dose imaging
US20070242868A1 (en) 2005-11-09 2007-10-18 Dexela Limited Methods and apparatus for displaying images
JP5270365B2 (en) 2005-12-15 2013-08-21 Koninklijke Philips Electronics N.V. System and method for cardiac morphology visualization during electrophysiological mapping and treatment
US20070189455A1 (en) 2006-02-14 2007-08-16 Accuray Incorporated Adaptive x-ray control
FR2897461A1 (en) 2006-02-16 2007-08-17 Gen Electric X-ray device and image processing method
US7376213B1 (en) 2006-02-23 2008-05-20 General Electric Company CT image reconstruction through employment of function that describes interpreted movement of source around particular point of object
US8526688B2 (en) 2006-03-09 2013-09-03 General Electric Company Methods and systems for registration of surgical navigation data and image data
DE102006012407A1 (en) 2006-03-17 2007-09-20 Siemens Ag Tomosynthetic image reconstruction method and diagnostic device using this method
US9460512B2 (en) 2006-05-12 2016-10-04 Toshiba Medical Systems Corporation Three-dimensional image processing apparatus and reconstruction region specification method
US8233962B2 (en) 2006-05-16 2012-07-31 Siemens Medical Solutions Usa, Inc. Rotational stereo roadmapping
DE102006024425A1 (en) 2006-05-24 2007-11-29 Siemens Ag Method for localizing a medical instrument, e.g. a catheter, during an electrophysiological procedure, involving obtaining position information of the instrument using an electromagnetic localization system and recording two-dimensional X-ray images
US9055906B2 (en) 2006-06-14 2015-06-16 Intuitive Surgical Operations, Inc. In-vivo visualization systems
US7756244B2 (en) 2006-06-22 2010-07-13 Varian Medical Systems, Inc. Systems and methods for determining object position
US11389235B2 (en) 2006-07-14 2022-07-19 Neuwave Medical, Inc. Energy delivery systems and uses thereof
US7826884B2 (en) 2006-07-31 2010-11-02 Siemens Medical Solutions Usa, Inc. Live fluoroscopic roadmapping including targeted automatic pixel shift for misregistration correction
WO2008017051A2 (en) 2006-08-02 2008-02-07 Inneroptic Technology Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
WO2008038283A2 (en) 2006-09-25 2008-04-03 Mazor Surgical Technologies Ltd. C-arm computerized tomography system
DE102006046735A1 (en) 2006-09-29 2008-04-10 Siemens Ag Device for fusing images, e.g. a two-dimensional radioscopy image and a virtual endoscopy image, for a C-arm device, having an image fusion unit for generating an image representation from a segmented surface with projected image points
US7711409B2 (en) 2006-10-04 2010-05-04 Hampton University Opposed view and dual head detector apparatus for diagnosis and biopsy with image processing methods
US7995819B2 (en) 2006-10-30 2011-08-09 General Electric Company Methods for displaying a location of a point of interest on a 3-D model of an anatomical region
US7744279B2 (en) 2006-11-02 2010-06-29 Carestream Health, Inc. Orientation sensing apparatus for radiation imaging system
US7655004B2 (en) 2007-02-15 2010-02-02 Ethicon Endo-Surgery, Inc. Electroporation ablation apparatus, system, and method
US8781193B2 (en) 2007-03-08 2014-07-15 Sync-Rx, Ltd. Automatic quantitative vessel analysis
DE102007013322B4 (en) 2007-03-20 2009-07-09 Siemens Ag Method for driving an X-ray image recording system and X-ray image recording system
US7899226B2 (en) 2007-04-03 2011-03-01 General Electric Company System and method of navigating an object in an imaged subject
DE102007019827A1 (en) 2007-04-26 2008-11-06 Siemens Ag System and method for determining the position of an instrument
US7853061B2 (en) 2007-04-26 2010-12-14 General Electric Company System and method to improve visibility of an object in an imaged subject
DE102007026115B4 (en) 2007-06-05 2017-10-12 Siemens Healthcare Gmbh Method for generating a 3D reconstruction of a body
IL184151A0 (en) 2007-06-21 2007-10-31 Diagnostica Imaging Software Ltd X-ray measurement method
FR2919096A1 (en) 2007-07-19 2009-01-23 Gen Electric Method of correcting radiographic image recovery
US8335359B2 (en) 2007-07-20 2012-12-18 General Electric Company Systems, apparatus and processes for automated medical image segmentation
US7873236B2 (en) 2007-08-28 2011-01-18 General Electric Company Systems, methods and apparatus for consistency-constrained filtered backprojection for out-of-focus artifacts in digital tomosynthesis
US8548215B2 (en) 2007-11-23 2013-10-01 Pme Ip Australia Pty Ltd Automatic image segmentation of a volume by comparing and correlating slice histograms with an anatomic atlas of average histograms
DE102008003173B4 (en) 2008-01-04 2016-05-19 Siemens Aktiengesellschaft Method and device for computed tomography
DE102008009128B4 (en) 2008-02-14 2014-11-06 Siemens Aktiengesellschaft Tomosynthetic image reconstruction method and diagnostic device using this method
FR2927719B1 (en) 2008-02-19 2010-03-26 Gen Electric Method for processing images obtained by tomosynthesis and apparatus therefor
DE102008020948A1 (en) 2008-04-25 2009-11-26 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. X-ray computer tomograph and method for examining a component by means of X-ray computer tomography
EP2321002B1 (en) 2008-05-15 2014-04-23 Intelect Medical Inc. Clinician programmer system and method for calculating volumes of activation
DE102008028387B4 (en) 2008-06-13 2018-05-17 Siemens Healthcare Gmbh A tomographic image reconstruction method for generating an image of an examination object and an imaging device operating according to this method
SE532644C2 (en) 2008-07-03 2010-03-09 Aamic Ab Procedure for analyzing circulating antibodies
US8361066B2 (en) 2009-01-12 2013-01-29 Ethicon Endo-Surgery, Inc. Electrical ablation devices
US7831013B2 (en) 2009-01-16 2010-11-09 Varian Medical Systems, Inc. Real-time motion tracking using tomosynthesis
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US7912180B2 (en) 2009-02-19 2011-03-22 Kabushiki Kaisha Toshiba Scattered radiation correction method and scattered radiation correction apparatus
US20180009767A9 (en) 2009-03-19 2018-01-11 The Johns Hopkins University Psma targeted fluorescent agents for image guided surgery
US10004387B2 (en) 2009-03-26 2018-06-26 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
JP2011019633A (en) 2009-07-14 2011-02-03 Toshiba Corp X-ray diagnostic apparatus and control program for reducing exposure dose
US8675996B2 (en) 2009-07-29 2014-03-18 Siemens Aktiengesellschaft Catheter RF ablation using segmentation-based 2D-3D registration
US9001963B2 (en) 2009-08-06 2015-04-07 Koninklijke Philips N.V. Method and apparatus for generating computed tomography images with offset detector geometries
WO2011021116A1 (en) 2009-08-20 2011-02-24 Koninklijke Philips Electronics N.V. Reconstruction of a region-of-interest image
DE102009038787A1 (en) 2009-08-25 2011-03-10 Siemens Aktiengesellschaft Method for imaging an examination object
US8798353B2 (en) 2009-09-08 2014-08-05 General Electric Company Apparatus and method for two-view tomosynthesis imaging
DE102009042922B4 (en) 2009-09-24 2019-01-24 Siemens Healthcare Gmbh Method and apparatus for image determination from x-ray projections taken when traversing a trajectory
US8254518B2 (en) 2009-10-05 2012-08-28 Siemens Medical Solutions Usa, Inc. Acquisition of projection images for tomosynthesis
US8706184B2 (en) 2009-10-07 2014-04-22 Intuitive Surgical Operations, Inc. Methods and apparatus for displaying enhanced imaging data on a clinical image
WO2011128797A1 (en) 2010-04-15 2011-10-20 Koninklijke Philips Electronics N.V. Instrument-based image registration for fusing images with tubular structures
US9401047B2 (en) 2010-04-15 2016-07-26 Siemens Medical Solutions Usa, Inc. Enhanced visualization of medical image data
WO2011140087A2 (en) 2010-05-03 2011-11-10 Neuwave Medical, Inc. Energy delivery systems and uses thereof
DE102010019632A1 (en) 2010-05-06 2011-11-10 Siemens Aktiengesellschaft Method for recording and reconstructing a three-dimensional image data set and x-ray device
US8625869B2 (en) 2010-05-21 2014-01-07 Siemens Medical Solutions Usa, Inc. Visualization of medical image data with localized enhancement
FR2960332B1 (en) 2010-05-21 2013-07-05 Gen Electric Method of processing radiological images to determine a 3D position of a needle
DE102010027227B4 (en) 2010-07-15 2016-10-20 Siemens Healthcare Gmbh Method and computed tomography apparatus for performing an angiographic examination
JP5600272B2 (en) 2010-07-16 2014-10-01 FUJIFILM Corporation Radiation imaging apparatus and method, and program
DE102011007794B4 (en) 2011-04-20 2019-05-23 Siemens Healthcare Gmbh Method for geometrically correct assignment of 3D image data of a patient
US8827934B2 (en) 2011-05-13 2014-09-09 Intuitive Surgical Operations, Inc. Method and system for determining information of extrema during expansion and contraction cycles of an object
US8900131B2 (en) 2011-05-13 2014-12-02 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
DE102011076547A1 (en) 2011-05-26 2012-11-29 Siemens Aktiengesellschaft Method for obtaining a 3D image data set of an imaged object
US8565502B2 (en) 2011-05-31 2013-10-22 General Electric Company Method and system for reconstruction of tomographic images
EP2730226A4 (en) 2011-07-06 2015-04-08 Fujifilm Corp X-ray imaging device and calibration method therefor
US20130303944A1 (en) 2012-05-14 2013-11-14 Intuitive Surgical Operations, Inc. Off-axis electromagnetic sensor
US9173626B2 (en) * 2012-01-04 2015-11-03 Siemens Aktiengesellschaft Method for performing dynamic registration, overlays, and 3D views with fluoroscopic images
EP3488803B1 (en) 2012-02-03 2023-09-27 Intuitive Surgical Operations, Inc. Steerable flexible needle with embedded shape sensing
WO2013126659A1 (en) * 2012-02-22 2013-08-29 Veran Medical Technologies, Inc. Systems, methods, and devices for four dimensional soft tissue navigation
DE102012204019B4 (en) 2012-03-14 2018-02-08 Siemens Healthcare Gmbh Method for reducing motion artifacts
US8989342B2 (en) 2012-04-18 2015-03-24 The Boeing Company Methods and systems for volumetric reconstruction using radiography
US20130303945A1 (en) 2012-05-14 2013-11-14 Intuitive Surgical Operations, Inc. Electromagnetic tip sensor
WO2013173234A1 (en) 2012-05-14 2013-11-21 Intuitive Surgical Operations Systems and methods for registration of a medical device using rapid pose search
US10039473B2 (en) 2012-05-14 2018-08-07 Intuitive Surgical Operations, Inc. Systems and methods for navigation based on ordered sensor records
WO2013173229A1 (en) 2012-05-14 2013-11-21 Intuitive Surgical Operations Systems and methods for deformation compensation using shape sensing
EP3524184B1 (en) 2012-05-14 2021-02-24 Intuitive Surgical Operations Inc. Systems for registration of a medical device using a reduced search space
US9429696B2 (en) 2012-06-25 2016-08-30 Intuitive Surgical Operations, Inc. Systems and methods for reducing measurement error in optical fiber shape sensors
US9801551B2 (en) 2012-07-20 2017-10-31 Intuitive Surgical Operations, Inc. Annular vision system
JP6074587B2 (en) 2012-08-06 2017-02-08 JOLED Inc. Display panel, display device and electronic device
WO2014028394A1 (en) 2012-08-14 2014-02-20 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems
JP6219396B2 (en) 2012-10-12 2017-10-25 Intuitive Surgical Operations, Inc. Positioning of medical devices in bifurcated anatomical structures
US10588597B2 (en) 2012-12-31 2020-03-17 Intuitive Surgical Operations, Inc. Systems and methods for interventional procedure planning
US20140221819A1 (en) * 2013-02-01 2014-08-07 David SARMENT Apparatus, system and method for surgical navigation
EP2968857B1 (en) 2013-03-15 2022-05-04 Intuitive Surgical Operations, Inc. Shape sensor systems for tracking interventional instruments
US9349198B2 (en) 2013-07-26 2016-05-24 General Electric Company Robust artifact reduction in image reconstruction
WO2015017270A1 (en) 2013-07-29 2015-02-05 Intuitive Surgical Operations, Inc. Shape sensor systems with redundant sensing
JP6562919B2 (en) 2013-08-15 2019-08-21 Intuitive Surgical Operations, Inc. System and method for medical treatment confirmation
KR102356881B1 (en) 2013-08-15 2022-02-03 Intuitive Surgical Operations, Inc. Graphical user interface for catheter positioning and insertion
CN105939647B (en) 2013-10-24 2020-01-21 Auris Health, Inc. Robotically-assisted endoluminal surgical systems and related methods
JP2017502728A (en) 2013-12-09 2017-01-26 Intuitive Surgical Operations, Inc. System and method for flexible tool alignment for device recognition
WO2015101948A2 (en) 2014-01-06 2015-07-09 Body Vision Medical Ltd. Surgical devices and methods of use thereof
JP6688557B2 (en) 2014-01-07 2020-04-28 Canon Medical Systems Corporation X-ray CT system
EP3979210A1 (en) 2014-02-04 2022-04-06 Intuitive Surgical Operations, Inc. Systems and methods for non-rigid deformation of tissue for virtual navigation of interventional tools
US20150223765A1 (en) 2014-02-07 2015-08-13 Intuitive Surgical Operations, Inc. Systems and methods for using x-ray field emission to determine instrument position and orientation
JP6237326B2 (en) * 2014-02-25 2017-11-29 Fujitsu Limited Posture estimation apparatus, posture estimation method, and computer program for posture estimation
EP4282370A3 (en) 2014-03-17 2024-02-21 Intuitive Surgical Operations, Inc. Surgical system including a non-white light general illuminator
US10912523B2 (en) 2014-03-24 2021-02-09 Intuitive Surgical Operations, Inc. Systems and methods for anatomic motion compensation
JP6359312B2 (en) 2014-03-27 2018-07-18 Canon Medical Systems Corporation X-ray diagnostic equipment
EP3125809B1 (en) 2014-03-28 2020-09-09 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US9641830B2 (en) * 2014-04-08 2017-05-02 Lucasfilm Entertainment Company Ltd. Automated camera calibration methods and systems
EP3169247B1 (en) 2014-07-18 2020-05-13 Ethicon, Inc. Mechanical retraction via tethering for lung volume reduction
US20160015394A1 (en) 2014-07-18 2016-01-21 Ethicon, Inc. Methods and Devices for Controlling the Size of Emphysematous Bullae
EP3174490B1 (en) 2014-07-28 2020-04-15 Intuitive Surgical Operations, Inc. Systems and methods for planning multiple interventional procedures
KR20170038012A (en) 2014-07-28 2017-04-05 Intuitive Surgical Operations, Inc. Systems and methods for intraoperative segmentation
JP6548730B2 (en) 2014-08-23 2019-07-24 Intuitive Surgical Operations, Inc. System and method for display of pathological data in image guided procedures
US10373719B2 (en) 2014-09-10 2019-08-06 Intuitive Surgical Operations, Inc. Systems and methods for pre-operative modeling
US10314513B2 (en) 2014-10-10 2019-06-11 Intuitive Surgical Operations, Inc. Systems and methods for reducing measurement error using optical fiber shape sensors
CN110811488B (en) 2014-10-17 2023-07-14 Intuitive Surgical Operations, Inc. System and method for reducing measurement errors using fiber optic shape sensors
US20180263706A1 (en) 2014-10-20 2018-09-20 Body Vision Medical Ltd. Surgical devices and methods of use thereof
US9986983B2 (en) * 2014-10-31 2018-06-05 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
EP4140432A1 (en) 2014-11-13 2023-03-01 Intuitive Surgical Operations, Inc. Systems and methods for filtering localization data
WO2016106114A1 (en) 2014-12-22 2016-06-30 Intuitive Surgical Operations, Inc. Flexible electromagnetic sensor
WO2016164311A1 (en) 2015-04-06 2016-10-13 Intuitive Surgical Operations, Inc. Systems and methods of registration compensation in image guided surgery
US11116581B2 (en) 2015-05-22 2021-09-14 Intuitive Surgical Operations, Inc. Systems and methods of registration for image guided surgery
US10282638B2 (en) * 2015-07-29 2019-05-07 Siemens Healthcare Gmbh Tracking for detection of TEE probe in fluoroscopy medical imaging
US10702226B2 (en) * 2015-08-06 2020-07-07 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
EP3334325A4 (en) 2015-08-14 2019-05-01 Intuitive Surgical Operations Inc. Systems and methods of registration for image-guided surgery
EP3795061A1 (en) 2015-08-14 2021-03-24 Intuitive Surgical Operations, Inc. Systems and methods of registration for image-guided surgery
US10245034B2 (en) 2015-08-31 2019-04-02 Ethicon Llc Inducing tissue adhesions using surgical adjuncts and medicants
US10569071B2 (en) 2015-08-31 2020-02-25 Ethicon Llc Medicant eluting adjuncts and methods of using medicant eluting adjuncts
WO2017044874A1 (en) 2015-09-10 2017-03-16 Intuitive Surgical Operations, Inc. Systems and methods for using tracking in image-guided medical procedure
AU2016323982A1 (en) 2015-09-18 2018-04-12 Auris Health, Inc. Navigation of tubular networks
CN113367788B (en) 2015-10-26 2024-09-06 Neuwave Medical, Inc. Energy delivery system and use thereof
US10405753B2 (en) 2015-11-10 2019-09-10 Intuitive Surgical Operations, Inc. Pharmaceutical compositions of near IR closed chain, sulfo-cyanine dyes
WO2017106003A1 (en) 2015-12-14 2017-06-22 Intuitive Surgical Operations, Inc. Apparatus and method for generating 3-d data for an anatomical target using optical fiber shape sensing
US9996361B2 (en) 2015-12-23 2018-06-12 Intel Corporation Byte and nibble sort instructions that produce sorted destination register and destination index mapping
US9881378B2 (en) * 2016-02-12 2018-01-30 Vortex Intellectual Property Holding LLC Position determining techniques using image analysis of marks with encoded or associated position data
US10779055B2 (en) 2016-02-12 2020-09-15 Rovi Guides, Inc. Systems and methods for recording broadcast programs that will be missed due to travel delays
EP3413829B1 (en) 2016-02-12 2024-05-22 Intuitive Surgical Operations, Inc. Systems of pose estimation and calibration of perspective imaging system in image guided surgery
JP7118890B2 (en) 2016-02-12 2022-08-16 Intuitive Surgical Operations, Inc. Systems and methods for using registered fluoroscopic images in image-guided surgery
WO2017153839A1 (en) 2016-03-10 2017-09-14 Body Vision Medical Ltd. Methods and systems for using multi view pose estimation
US10702137B2 (en) 2016-03-14 2020-07-07 Intuitive Surgical Operations, Inc. Endoscopic instrument with compliant thermal interface
US20170296679A1 (en) 2016-04-18 2017-10-19 Intuitive Surgical Operations, Inc. Compositions of Near IR Closed Chain, Sulfo-Cyanine Dyes and Prostate Specific Membrane Antigen Ligands
WO2017218552A1 (en) 2016-06-15 2017-12-21 Intuitive Surgical Operations, Inc. Systems and methods of integrated real-time visualization
WO2018005861A1 (en) 2016-06-30 2018-01-04 Intuitive Surgical Operations, Inc. Graphical user interface for displaying guidance information during an image-guided procedure
WO2018005842A1 (en) 2016-06-30 2018-01-04 Intuitive Surgical Operations, Inc. Graphical user interface for displaying guidance information in a plurality of modes during an image-guided procedure
WO2018035122A1 (en) 2016-08-16 2018-02-22 Intuitive Surgical Operations, Inc. Augmented accuracy using large diameter shape fiber
EP3503834B1 (en) 2016-08-23 2024-06-12 Intuitive Surgical Operations, Inc. Systems for monitoring patient motion during a medical procedure
WO2018057633A1 (en) 2016-09-21 2018-03-29 Intuitive Surgical Operations, Inc. Systems and methods for instrument buckling detection
WO2018064566A1 (en) 2016-09-30 2018-04-05 Intuitive Surgical Operations, Inc. Systems and methods for entry point localization
EP3529579B1 (en) 2016-10-21 2021-08-25 Intuitive Surgical Operations, Inc. Shape sensing with multi-core fiber sensor
CN115631843A (en) 2016-11-02 2023-01-20 Intuitive Surgical Operations, Inc. System and method for continuous registration for image-guided surgery
US20180144092A1 (en) 2016-11-21 2018-05-24 Johnson & Johnson Vision Care, Inc. Biomedical sensing methods and apparatus for the detection and prevention of lung cancer states
CN109922753B (en) 2016-12-08 2023-04-14 Intuitive Surgical Operations, Inc. System and method for navigation in image-guided medical procedures
CN109843143B (en) 2016-12-09 2022-09-20 Intuitive Surgical Operations, Inc. System and method for distributed heat flux sensing of body tissue
US11779396B2 (en) 2017-01-09 2023-10-10 Intuitive Surgical Operations, Inc. Systems and methods for registering elongate devices to three dimensional images in image-guided procedures
CN110167477B (en) 2017-02-01 2023-12-29 Intuitive Surgical Operations, Inc. Registration system and method for image guided surgery
KR102503033B1 (en) 2017-02-01 2023-02-23 Intuitive Surgical Operations, Inc. Fixation systems and methods for image-guided procedures
WO2018144726A1 (en) 2017-02-01 2018-08-09 Intuitive Surgical Operations, Inc. Systems and methods for data filtering of passageway sensor data
AU2018243364B2 (en) 2017-03-31 2023-10-05 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
EP3613057A4 (en) 2017-04-18 2021-04-21 Intuitive Surgical Operations, Inc. Graphical user interface for planning a procedure
JP2020518326A (en) 2017-04-18 2020-06-25 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Graphical user interface for monitoring image-guided procedures
JP7195279B2 (en) 2017-05-24 2022-12-23 Body Vision Medical Ltd. Method for using a radial endobronchial ultrasound probe for three-dimensional reconstruction of images and improved object localization
US10022192B1 (en) 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
EP3641686A4 (en) 2017-06-23 2021-03-24 Intuitive Surgical Operations, Inc. Systems and methods for navigating to a target location during a medical procedure
CN110809452B (en) 2017-06-28 2023-05-23 Auris Health, Inc. Electromagnetic field generator alignment
EP3644886A4 (en) 2017-06-28 2021-03-24 Auris Health, Inc. Electromagnetic distortion detection
US10699448B2 (en) * 2017-06-29 2020-06-30 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11202652B2 (en) * 2017-08-11 2021-12-21 Canon U.S.A., Inc. Registration and motion compensation for patient-mounted needle guide
JP7213867B2 (en) 2017-08-16 2023-01-27 Intuitive Surgical Operations, Inc. Systems and methods for monitoring patient movement during medical procedures
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
DE112018005836T5 (en) 2017-11-14 2020-08-06 Intuitive Surgical Operations Inc. SYSTEMS AND METHODS FOR CLEANING ENDOSCOPIC INSTRUMENTS
WO2019113391A1 (en) 2017-12-08 2019-06-13 Auris Health, Inc. System and method for medical instrument navigation and targeting
US10850013B2 (en) 2017-12-08 2020-12-01 Auris Health, Inc. Directed fluidics
CN110869173B (en) 2017-12-14 2023-11-17 Auris Health, Inc. System and method for estimating instrument positioning
US11160615B2 (en) 2017-12-18 2021-11-02 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
EP3749239B1 (en) 2018-02-05 2024-08-07 Broncus Medical Inc. Image-guided lung tumor planning and ablation system
US10893842B2 (en) * 2018-02-08 2021-01-19 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US10885630B2 (en) 2018-03-01 2021-01-05 Intuitive Surgical Operations, Inc. Systems and methods for segmentation of anatomical structures for image-guided surgery
US10980913B2 (en) 2018-03-05 2021-04-20 Ethicon Llc Sealant foam compositions for lung applications
US20190269819A1 (en) 2018-03-05 2019-09-05 Ethicon Llc Sealant foam compositions for lung applications
US20190298451A1 (en) 2018-03-27 2019-10-03 Intuitive Surgical Operations, Inc. Systems and methods for delivering targeted therapy
KR102489198B1 (en) 2018-03-28 2023-01-18 Auris Health, Inc. Systems and methods for matching position sensors
CN110913791B (en) 2018-03-28 2021-10-08 Auris Health, Inc. System and method for displaying estimated instrument positioning
JP7250824B2 (en) 2018-05-30 2023-04-03 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
JP7146949B2 (en) 2018-05-31 2022-10-04 Auris Health, Inc. Image-based airway analysis and mapping
EP3801189B1 (en) 2018-05-31 2024-09-11 Auris Health, Inc. Path-based navigation of tubular networks
WO2020028603A2 (en) 2018-08-01 2020-02-06 Sony Interactive Entertainment LLC Player induced counter-balancing of loads on a character in a virtual environment
US11080902B2 (en) 2018-08-03 2021-08-03 Intuitive Surgical Operations, Inc. Systems and methods for generating anatomical tree structures
WO2020035730A2 (en) 2018-08-13 2020-02-20 Body Vision Medical Ltd. Methods and systems for multi view pose estimation using digital computational tomography
WO2020036685A1 (en) 2018-08-15 2020-02-20 Auris Health, Inc. Medical instruments for tissue cauterization
EP3806758A4 (en) 2018-08-17 2022-04-06 Auris Health, Inc. Bipolar medical instrument
US11896316B2 (en) 2018-08-23 2024-02-13 Intuitive Surgical Operations, Inc. Systems and methods for generating anatomic tree structures using backward pathway growth
US11737823B2 (en) 2018-10-31 2023-08-29 Intuitive Surgical Operations, Inc. Antenna systems and methods of use
US20200138514A1 (en) 2018-11-02 2020-05-07 Intuitive Surgical Operations, Inc. Tissue penetrating device tips
US11637378B2 (en) 2018-11-02 2023-04-25 Intuitive Surgical Operations, Inc. Coiled dipole antenna
US11280863B2 (en) 2018-11-02 2022-03-22 Intuitive Surgical Operations, Inc. Coiled antenna with fluid cooling
US11730537B2 (en) 2018-11-13 2023-08-22 Intuitive Surgical Operations, Inc. Cooled chokes for ablation systems and methods of use
US11633623B2 (en) 2019-04-19 2023-04-25 University Of Maryland, Baltimore System and method for radiation therapy using spatial-functional mapping and dose sensitivity of branching structures and functional sub-volumes

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030130576A1 (en) * 2000-04-28 2003-07-10 Teresa Seeley Fluoroscopic tracking and visualization system
US20110152676A1 (en) * 2009-12-21 2011-06-23 General Electric Company Intra-operative registration for navigated surgical procedures
US20120046521A1 (en) * 2010-08-20 2012-02-23 Mark Hunter Systems, instruments, and methods for four dimensional soft tissue navigation
US20120289825A1 (en) * 2011-05-11 2012-11-15 Broncus Technologies, Inc. Fluoroscopy-based surgical device tracking method and system
US20160005168A1 (en) * 2014-07-02 2016-01-07 Covidien Lp Fluoroscopic pose estimation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3750134A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11992349B2 (en) 2015-08-06 2024-05-28 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
EP3524157A1 (en) * 2018-02-08 2019-08-14 Covidien LP System and method for local three dimensional volume reconstruction using a standard fluoroscope

Also Published As

Publication number Publication date
EP3750134A4 (en) 2021-12-01
US11364004B2 (en) 2022-06-21
CA3088277A1 (en) 2019-08-15
US20190239837A1 (en) 2019-08-08
US11712213B2 (en) 2023-08-01
US20190239838A1 (en) 2019-08-08
JP7322039B2 (en) 2023-08-07
US20210145387A1 (en) 2021-05-20
CN111699515A (en) 2020-09-22
EP3750134A1 (en) 2020-12-16
US11896414B2 (en) 2024-02-13
JP2021512692A (en) 2021-05-20
CN111699515B (en) 2023-11-28
US20220313190A1 (en) 2022-10-06
US10893842B2 (en) 2021-01-19
AU2019217999A1 (en) 2020-07-30

Similar Documents

Publication Title
US11896414B2 (en) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11547377B2 (en) System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US11992349B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US10699448B2 (en) System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
AU2020210140B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US10905498B2 (en) System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US20240138783A1 (en) Systems and methods for pose estimation of a fluoroscopic imaging device and for three-dimensional imaging of body structures
US20240206980A1 (en) Volumetric filter of fluoroscopic sweep video
WO2024079639A1 (en) Systems and methods for confirming position or orientation of medical device relative to target

Legal Events

Date Code Title Description

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 19751690
Country of ref document: EP
Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 3088277
Country of ref document: CA

ENP Entry into the national phase
Ref document number: 2019217999
Country of ref document: AU
Date of ref document: 20190208
Kind code of ref document: A

ENP Entry into the national phase
Ref document number: 2020542153
Country of ref document: JP
Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2019751690
Country of ref document: EP
Effective date: 20200908