WO2012052929A2 - System and method for facilitating navigation of a tool using a fluoroscope - Google Patents

System and method for facilitating navigation of a tool using a fluoroscope

Info

Publication number
WO2012052929A2
Authority
WO
WIPO (PCT)
Prior art keywords
fluoroscope
tool
image
relative
coordinates
Prior art date
Application number
PCT/IB2011/054644
Other languages
French (fr)
Other versions
WO2012052929A3 (en)
Inventor
Pinhas Gilboa
Original Assignee
Activiews Ltd.
Priority date
Filing date
Publication date
Application filed by Activiews Ltd. filed Critical Activiews Ltd.
Publication of WO2012052929A2 publication Critical patent/WO2012052929A2/en
Publication of WO2012052929A3 publication Critical patent/WO2012052929A3/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/48 Diagnostic techniques
    • A61B6/486 Diagnostic techniques involving generating temporal series of image data
    • A61B6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 Arrangements for detecting or locating foreign bodies
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/547 Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm


Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

System and method for facilitating navigation of a tool to an inter-body target using a fluoroscope employs a tracking system to determine the position of a fluoroscope and of a tool relative to a frame of coordinates in which there is also defined a reference surface passing through the inter-body target. An indication is then generated on a display in the context of an image from the fluoroscope designating a location at which a current direction of the tool intersects the reference plane. This facilitates correct angular alignment of the tool towards the inter-body target. The reference surface is preferably defined as a plane at a given depth below the local skin surface as derived from CT image data, but without requiring registration between the fluoroscope image and the CT image data.

Description

System and Method for Facilitating Navigation of a Tool Using a
Fluoroscope
FIELD AND BACKGROUND OF THE INVENTION
The present invention relates to minimally invasive surgical procedures and, in particular, it concerns a system and method for facilitating navigation of a tool to an inter-body target using a fluoroscope.
Medical needle procedures are typically performed in the operating room using fluoroscopy imaging devices. A fluoroscopy imaging device (or "fluoroscope") is defined herein functionally as any two-dimensional (2D) volumetric imaging tool which generates a two-dimensional image representing information about the volumetric properties of tissue. Typically, the fluoroscope generates a 2D image in which each pixel brightness value is inversely related to the total X-ray absorption for that image region along a corresponding path generally along the optical axis of the fluoroscope. Such two-dimensional images taken from a single viewing direction do not provide sufficient information for guiding a tool to a desired target with regard to the depth along the optical axis of the fluoroscope.
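By way of illustration only (this is not part of the patent disclosure), the following Python sketch mimics that projection model: it integrates an assumed attenuation volume along the imaging axis and maps higher cumulative absorption to darker pixels. The volume shape, attenuation values and axis choice are all hypothetical.

```python
import numpy as np

# Hypothetical attenuation volume, indexed (z, y, x) with z as the imaging axis.
volume = np.zeros((64, 128, 128))
volume[20:40, 50:80, 50:80] = 0.02   # assumed denser ("bone-like") block inside soft tissue

# Line integral of attenuation along the imaging axis for every detector pixel.
path_integral = volume.sum(axis=0)

# Pixel brightness is inversely related to the total absorption along each ray
# (Beer-Lambert style): unattenuated rays stay bright, absorbing paths go dark.
fluoro_image = np.exp(-path_integral)

print("brightness range:", fluoro_image.min(), "to", fluoro_image.max())
```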
In order to guide the needle in three-dimensional (3D) space, two mutually perpendicular images are typically used for assessing the direction of the needle with respect to the target. The fluoroscope is not large enough to image the body along the head-to-toe direction (an axial view). Because of this lack of axial images, fluoroscope images are generally not directly helpful for assessing the angle of the needle in an axial plane. In many medical procedures in which a fluoroscope is used to guide the needle, an accurate axial orientation is needed, but it can only be achieved indirectly.
An example of such a procedure is vertebroplasty, in which a needle is guided into a vertebra for injecting cement. FIGS. 1a-1c show axial, lateral and anterior-posterior (A-P) views, respectively, of a vertebra 100 to be treated together with a desired angle of insertion for a needle 110. During the guidance of the needle, care should be taken to prevent damage to the nervous system adjacent to the vertebrae. The treatment is performed by guiding needle 110 through the pedicle 104 (the neck region interconnecting the transverse process to the vertebral body) and into the vertebral body 102, in order to inject and fill it with cement. The guidance is performed with the assistance of a 2D fluoroscopy imaging device, such as a C-arm fluoroscope. As already mentioned, the fluoroscopy imaging device cannot generate an axial view like FIG. 1a. Viewing in an anterior-posterior (A-P) direction, represented by arrow 120 in FIG. 1a, generates the view of FIG. 2, which does not allow determination of the orientation of needle 110 from its projection on the image. The same difficulty arises in any direction that can be taken by the fluoroscope, such as the lateral direction (arrow 130 in FIG. 1a), which generates the view of FIG. 3. Instead of determining the needle angle directly from the image, guiding of the needle necessarily relies on clues from the image, together with trial and error.
A typical trial-and-error approach to performing such a procedure under fluoroscopic imaging will now be described with reference to FIGS. 4a to 4c. In a planning stage, based on a computed tomography (CT) image 400 of the axial cross section of the vertebra, the entry point 405 of the needle path 410 is chosen. The distance 420 from the spinous process (lying on the vertebra's center line) to the chosen entry point is measured in the image and then located on the body. Assisted by fluoroscopy, the needle is guided to point 430 located at one end of the pedicle, where it appears in the A-P fluoroscopy view (FIG. 4b) on the edge of the pedicle at point 450 and in the lateral image (FIG. 4c) at point 460. If the needle is inserted at the correct angle, it will reach the other end of the pedicle at point 530 of FIG. 5a; its A-P view (FIG. 5b) is at point 550, and its lateral view (FIG. 5c) at point 560. If the angle of the needle is too shallow or too steep, it will reach the far edge of the pedicle in either the A-P view or the lateral view but not in both at the same time, forcing the physician to reposition the needle until it reaches the correct locations at both ends as described above. This trial-and-error method exposes the patient as well as the physician to unnecessary harmful X-ray radiation, as the physician repeatedly checks both the A-P and lateral views with the fluoroscope, and also causes unnecessary mechanical damage to the pedicle during repeated withdrawal and reinsertion of the needle. The trial-and-error approach also carries with it considerable risk of error, with possibly severe consequences to the patient.
There is therefore a need for a system and method for facilitating navigation of a tool to an inter-body target using a fluoroscope, and particularly, which would provide graphic cues to facilitate guiding a tool in the depth dimension of a two-dimensional volumetric image.
SUMMARY OF THE INVENTION

The present invention is a system and method for facilitating navigation of a tool to an inter-body target using a fluoroscope.
According to the teachings of the present invention there is provided, a method for facilitating navigation of a tool to an inter-body target using a fluoroscope, the method comprising the steps of: (a) determining a position of the fluoroscope relative to a frame of coordinates, the position defining an imaging axis of the fluoroscope; (b) defining relative to the frame of coordinates a reference surface passing through the inter-body target, the reference surface being significantly non-parallel to the imaging axis of the fluoroscope; (c) tracking a position of the tool relative to the frame of coordinates; and (d) generating an indication on a display in the context of an image from the fluoroscope, the indication designating a location at which a current direction of the tool intersects the reference plane, thereby facilitating angular alignment of the tool towards the inter-body target.
There is also provided according to an embodiment of the present invention, a system for facilitating navigation of a tool to an inter-body target using a fluoroscope, the system comprising: (a) a processing system including at least one processor; (b) a fluoroscopic imaging device associated with the processing system, the fluoroscopic imaging device defining an imaging axis; (c) a display associated with the processing system for displaying images sampled by the fluoroscopic imaging device; and (d) a tracking sensor arrangement associated with the processing system so as to form a tracking system configured to determine the position of the fluoroscopic imaging device and of the tool relative to a frame of coordinates, wherein the processing system is configured to: (i) derive from outputs of the tracking sensor arrangement a position of the fluoroscopic imaging device relative to the frame of coordinates in which a fluoroscope image was generated; (ii) track a position of the tool relative to the frame of coordinates; and (iii) generate an indication on the display in the context of the fluoroscope image, the indication designating a location at which a current direction of the tool intersects a reference surface passing through the inter-body target, the reference surface being significantly non-parallel to the imaging axis of the fluoroscope, thereby facilitating angular alignment of the tool towards the inter-body target.
According to a further feature of an embodiment of the present invention, the image from the fluoroscope is a frozen image, the method further comprising generating on the display a representation of the current position of the tool in the context of the image as derived from the tracking.
According to a further feature of an embodiment of the present invention, the reference surface is defined as a plane at a defined depth below a surface of the body, the defined depth being derived from previously sampled three-dimensional imaging data of the body without registration between the three-dimensional imaging data and the frame of coordinates.
According to a further feature of an embodiment of the present invention, the frame of coordinates is defined relative to a set of at least four optical fiducial markers applied to an external surface of the body.
According to a further feature of an embodiment of the present invention, the step of tracking includes: sampling images of the optical fiducial markers by use of a camera mounted on the tool, and processing the images to derive a position of the camera relative to the fiducial markers.
According to a further feature of an embodiment of the present invention, the optical fiducial markers include at least four radio-opaque markers, and wherein the determining a position of the fluoroscope includes: identifying a location of the radio-opaque markers in an image derived by the fluoroscope; and processing the image to derive a position of the fluoroscope relative to the radio-opaque markers.

BRIEF DESCRIPTION OF THE DRAWINGS
The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
FIGS. 1a-1c are schematic axial, lateral and anterior-posterior views, respectively, discussed above, illustrating a desired path of needle insertion for intra-vertebral-body needle access;
FIGS. 2 and 3 are A-P and lateral fluoroscopic images of the spine, respectively, described above, illustrating a desired needle insertion direction as viewed in those images;
FIGS. 4a-4c and 5a-5c are views referred to above describing a prior art trial-and-error technique for navigating to an inter-body target based upon A-P and lateral fluoroscope images;
FIG. 6 is a schematic representation showing a geometrical representation of the principles of the present invention;
FIG. 7a is an axial slice taken from pre-operative CT data defining parameters used in an embodiment of the present invention;
FIG. 7b is a display according to the teachings of an embodiment of the present invention supplementing an A-P fluoroscope image with angular depth cue indications to facilitate navigation of a tool to the inter-body targets;
FIG. 8 is a schematic isometric illustration of a C-arm fluoroscope with added video imaging sensors and a fiducial marker sticker according to a preferred embodiment of the present invention;
FIG. 9 is a schematic representation of an axial CT image displayed according to the teachings of an embodiment of the present invention for designation of reference surfaces containing the target locations; and

FIG. 10 is a schematic representation of various components used to implement a system constructed and operative according to an embodiment of the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention is a system and method for facilitating navigation of a tool to an inter-body target using a fluoroscope. Certain embodiments of the invention facilitate aligning a needle with one or more intra-body targets using only a single fluoroscopy view of the body.
The principles and operation of systems and methods according to the present invention may be better understood with reference to the drawings and the accompanying description.
Before addressing specific drawings, by way of introduction, certain embodiments of the present invention provide a system and method for facilitating navigation of a tool to an inter-body target using a fluoroscope in which a tracking system is used to determine the position of a fluoroscope and of a tool relative to a frame of coordinates in which there is also defined a reference surface passing through the inter-body target.
An indication is then generated on a display in the context of an image from the fluoroscope designating a location at which a current direction of the tool intersects the reference plane. This facilitates correct angular alignment of the tool towards the inter-body target.
Most preferably, the image from the fluoroscope is a frozen image, thereby facilitating performance of the procedure without continuous exposure of the patient and the medical practitioner to ionizing radiation. In the case of a frozen image, a representation of the current position of the tool itself as derived from the tracking system is preferably generated on the display in the context of the frozen image.
The reference surface is chosen to be significantly non-parallel to the imaging axis of the fluoroscope, defined as being inclined to the imaging axis by at least 30 degrees, more preferably at least 60 degrees, and typically roughly perpendicular thereto, defined as being at an angle of at least 75 degrees to the imaging axis. The reference surface is not necessarily planar, but in certain preferred implementations is chosen to be a plane at a given depth below the local skin surface. This depth may advantageously be derived from previously sampled three-dimensional imaging data (e.g., a CT image) of the body without requiring registration between the three-dimensional imaging data and the frame of coordinates, as will be detailed below.
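As a hedged illustration of this inclination criterion (the function, vectors and example values below are assumptions for the sketch, not taken from the patent), the inclination of a candidate reference plane to the imaging axis can be computed as 90 degrees minus the angle between the axis and the plane normal, and then tested against the 30/60/75 degree thresholds:

```python
import numpy as np

def plane_inclination_deg(imaging_axis, plane_normal):
    """Inclination (degrees) of a plane to the imaging axis.

    0 deg  -> the plane contains the axis (parallel, unusable as a reference surface)
    90 deg -> the plane is perpendicular to the axis (ideal reference surface)
    """
    a = np.asarray(imaging_axis, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    cos_axis_normal = abs(a @ n) / (np.linalg.norm(a) * np.linalg.norm(n))
    angle_axis_normal = np.degrees(np.arccos(np.clip(cos_axis_normal, 0.0, 1.0)))
    return 90.0 - angle_axis_normal

axis = [0.0, 0.0, 1.0]       # assumed A-P imaging axis of the fluoroscope
normal = [0.0, 0.2, 1.0]     # assumed normal of a nearly axial reference plane

incl = plane_inclination_deg(axis, normal)
print(f"inclination: {incl:.1f} deg;",
      "significantly non-parallel (>= 30 deg)" if incl >= 30 else "too close to parallel",
      "| roughly perpendicular (>= 75 deg)" if incl >= 75 else "")
```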
As further described below, certain preferred implementations have the frame of coordinates defined relative to a set of at least four optical fiducial markers applied to an external surface of the body. In this case, tracking is preferably performed by sampling images of the optical fiducial markers by use of a camera mounted on the tool, and processing the images to derive a position of the camera relative to the fiducial markers. In certain embodiments, the optical fiducial markers also include radio-opaque markers so that the position of the fluoroscope can be derived by identifying locations of the radio-opaque markers in the fluoroscope image and processing the image to derive a position of the fluoroscope relative to the radio-opaque markers.
Referring now to the drawings, reference is made to Figure 6. A body 610 is placed in a C-arm fluoroscopy device 600. The target 612 is located within the body on a plane 608, substantially perpendicular to the X-ray propagation direction, which can be taken as the optical axis of the imaging device. We assume that we know the coordinates of plane 608 relative to the body and relative to the imaging device. When the X-ray source 602 is operated, a projection image 614 of the target is formed on image intensifier 601 of the fluoroscopy device. A tool 620 is guided within the body. If the orientation of the tool, defined by its 3D location and 2D angles, is known, it is possible to determine a point of intersection 624 of a projection 622 of the tool direction where it meets plane 608, as well as a location for display of a simulated image 626 of point 624 in the fluoroscopy image. The result is a fluoroscopy image with an added feature which shows where the tool, if advanced in its current direction, will hit the plane 608 containing the target. Hence, by manipulating needle 620 so that the simulated image 626 coincides with the image 614 of the target, it becomes possible to reliably achieve alignment of the tool in three dimensions while using only a single 2D fluoroscopy view.
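A minimal geometric sketch of this step, under assumed coordinates and a simple pinhole-style projection through the X-ray source (the patent itself gives no equations, so everything below is illustrative), might look as follows: the extrapolated tool direction is intersected with the reference plane, and that 3D point is then projected onto the detector plane.

```python
import numpy as np

def ray_plane_intersection(tool_tip, tool_dir, plane_point, plane_normal):
    """Point where the extrapolated tool direction meets the reference plane."""
    tool_dir = tool_dir / np.linalg.norm(tool_dir)
    denom = tool_dir @ plane_normal
    if abs(denom) < 1e-9:
        raise ValueError("tool direction is parallel to the reference plane")
    t = ((plane_point - tool_tip) @ plane_normal) / denom
    return tool_tip + t * tool_dir                      # point 624 in the figure's terms

def project_to_detector(point, source, det_origin, det_u, det_v, det_normal):
    """Project a 3D point onto the detector plane through the X-ray source (point 626)."""
    hit = ray_plane_intersection(source, point - source, det_origin, det_normal)
    return np.array([(hit - det_origin) @ det_u,
                     (hit - det_origin) @ det_v])       # 2D detector coordinates

# Assumed example geometry (all numbers are illustrative, not from the patent).
tip, direction = np.array([0., 5., 20.]), np.array([0.1, -0.3, -1.0])
plane_pt, plane_n = np.array([0., 0., 0.]), np.array([0., 0., 1.])   # plane 608 at target depth
source = np.array([0., 0., 60.])                                     # X-ray source 602
det_o, det_u = np.array([0., 0., -40.]), np.array([1., 0., 0.])
det_v, det_n = np.array([0., 1., 0.]), np.array([0., 0., 1.])

p624 = ray_plane_intersection(tip, direction, plane_pt, plane_n)
p626 = project_to_detector(p624, source, det_o, det_u, det_v, det_n)
print("intersection with target plane:", p624, "| overlay position on image:", p626)
```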
It is also possible to define a path which goes through multiple anatomical landmarks, as in the case of guiding a vertebroplasty needle illustrated in Figures 7a and 7b. Two easily identified landmarks are used: the center of the pedicle 701 and the center of the vertebra 702. From the CT image shown in Figure 7a, two planes are defined: plane 720 intersecting target point 702 and defined by its vertical depth 722 from the skin, and plane 730 intersecting the center of the pedicle 701, defined by its vertical depth 732 from the skin. In the A-P fluoroscope view of Figure 7b, the extrapolation of the tool direction (the projected virtual path) is shown as line 700, and the points of intersection of the tool direction with planes 730 and 720 are indicated by dashed lines 705 and 706 traversing the tool direction line. The correct needle orientation is identified uniquely when two conditions are satisfied: the intersection of the virtual path 700 with line 705 coincides with the center of the image of the pedicle 701, and the intersection of the virtual path 700 with line 706 coincides with the center of the vertebra 702. When the intersection markers are spaced too far apart, for example when the relevant intersection point falls at point 703 instead of at the center of the pedicle, the angle is too shallow and needs to be increased. If the distance between the intersection markers is too short, for instance when the intersection point falls at point 704 instead of at the center of the pedicle, the angle is too steep and needs to be reduced.
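The shallow/steep feedback described above can be expressed as a simple comparison of spacings on the display. The sketch below is a hypothetical helper following that spacing rule; the tolerance and the 2D coordinates are assumed values:

```python
import numpy as np

def angle_feedback(marker_pedicle_2d, marker_vertebra_2d,
                   image_pedicle_2d, image_vertebra_2d, tol=1.0):
    """Compare the spacing of the two intersection markers (from planes 730/720)
    with the spacing of the landmark images, following the rule in the text."""
    marker_span = np.linalg.norm(np.subtract(marker_vertebra_2d, marker_pedicle_2d))
    landmark_span = np.linalg.norm(np.subtract(image_vertebra_2d, image_pedicle_2d))
    if abs(marker_span - landmark_span) <= tol:
        return "aligned"
    return ("too shallow - increase angle" if marker_span > landmark_span
            else "too steep - reduce angle")

# Illustrative 2D positions on the fluoroscope display (assumed numbers).
print(angle_feedback((10.0, 4.0), (10.0, 16.0), (10.0, 5.0), (10.0, 15.0)))
```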
The invention as described herein can be implemented with a wide range of tracking systems for determining the position of the tool and the fluoroscope relative to the frame of reference in which the reference plane(s) are defined. A number of tracking systems suitable for this purpose are well known in the art, and are commercially available from various sources. According to certain embodiments of the present invention, it is believed to be particularly advantageous to employ a system which measures the tool position and fluoroscope position relative to the skin surface, for example, by use of an artifact attached to the skin. In a most preferred subset of implementations, the system employs an artifact which includes features visible in the fluoroscope image, deployed adjacent to, and preferably in a position delineating, the planned point of insertion of the tool through the skin.
By way of one non-limiting example, Figure 8 illustrates the general setup according to an embodiment of the invention. A reference sticker 800, preferably of a flexible material, is attached to the body of the patient at the location of the entry point of the needle. A plurality of radio-opaque fiducial markers 805 may be embedded in the sticker. A fluoroscopy imaging device 810 is used to produce 2D images of the interior of the body. The radio-opaque fiducial markers are directly visible in the fluoroscope image, allowing tracking of the fluoroscope directly from data of the fluoroscope image. In order to fully determine the relative positions of the markers (which are not predetermined due to the flexibility of the reference sticker), two non-parallel initial fluoroscope views would be needed. A discussion of suitable calculation techniques for determining the position of the fluoroscope from such images may be found in US patent application publication no. 2008/0208041 entitled "System and Method for Optical Position Measurement and Guidance of a Rigid or Semi-Flexible Tool to a Target" to the present inventor, which is hereby incorporated in its entirety as if fully set out herein. Unlike a light-based system, it will be noted that the X-ray source of the fluoroscope defines the effective focal point of the imaging geometry.
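The patent defers the actual pose calculation to the '041 publication; purely as an illustration of the general idea, a standard perspective-n-point solver such as OpenCV's solvePnP can recover a projection device's pose from at least four coplanar markers with known sticker-frame coordinates. All marker coordinates, detected image points and intrinsic parameters below are assumptions, and the generic pinhole model is used as a stand-in for the fluoroscope's X-ray geometry:

```python
import numpy as np
import cv2  # OpenCV; generic PnP used here only as a stand-in for the '041 calculation

# Assumed 3D fiducial coordinates in the sticker frame (mm), coplanar, at least four.
object_points = np.array([[0, 0, 0], [40, 0, 0], [40, 40, 0], [0, 40, 0]], dtype=np.float64)

# Assumed detected marker centers in the fluoroscope image (pixels).
image_points = np.array([[312, 240], [498, 244], [502, 431], [308, 428]], dtype=np.float64)

# Assumed pinhole-style intrinsics, with the X-ray source acting as the focal point.
K = np.array([[1200.0, 0.0, 384.0],
              [0.0, 1200.0, 384.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, distCoeffs=None)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation: sticker frame -> imaging-device frame
    print("pose of the imaging device relative to the sticker:\n", R, "\n", tvec.ravel())
```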
Alternatively, tracking of the fluoroscope may be performed purely optically, employing one or more video cameras 822 and 824 attached to the fluoroscope imaging device at a known position and orientation with respect to the device. In this case, the tracking is preferably also performed according to the approach described in the aforementioned '041 publication.

FIG. 10 shows a schematic representation of a system for facilitating navigation of a tool to an inter-body target using a fluoroscope according to an embodiment of the present invention, and suitable for implementing a method according to an embodiment of the present invention. As shown here, the system has a processing system 1000 including at least one processor 1002. A fluoroscopic imaging device or "fluoroscope" 1004 and a display 1006 are associated with processing system 1000. Display 1006 and/or processing system 1000 may be integrated components implemented as an original part of fluoroscope 1004, but may be conveniently implemented by connection of a standard computer to a standard data output interface 1008 of a conventional fluoroscope 1004 including a standard C-arm imaging device 1010, other standard controls and typically its own dedicated display 1012.
The system also includes a tracking sensor arrangement associated with processing system 1000 so as to form a tracking system configured to determine the position of C-arm imaging device 1010 and of a surgical tool 1014 relative to a frame of coordinates. For one particularly preferred implementation employing optical tracking, the tracking sensor arrangement includes a tracking camera 1016 mounted on surgical tool 1014 and a further one or two tracking cameras 1018 associated with the C-arm imaging device 1010, as well as a sticker carrying fiducial markers (not shown). The various tracking cameras provide images of the fiducial markers to processing system 1000 for tracking of the tool and fluoroscope positions.
In certain preferred embodiments, processing system 1000 receives a CT image via a data input 1020 (e.g., a media reader or network connection) and images from the fluoroscopy imaging device output interface 1008. Where optical tracking is used, as mentioned above, the computer system preferably also receives images from video cameras mounted on the surgical tool and/or on the fluoroscope for determining the needle path, defined by its entry point on the body and its angle with respect to the reference sticker, as well as the orientation of the fluoroscope. Using at least one CT cross-section image of the body, as shown in Figure 9, the entry point 910 and at least one of targets 920 or 930 are marked using a pointing device such as a computer mouse or other user input device(s) 1022. For each of the targets, the vertical depth distance of the target from the entry point is determined: 942 for target 920 and 944 for target 930, both measured from entry point 910. The horizontal distance 940 of the entry point from either target 920 or target 930 is also determined.
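A minimal sketch of how the depth distances 942 and 944 and the horizontal distance 940 could be read off the marked CT slice (the voxel indices, pixel spacing and the choice of the row axis as the depth direction are assumed values for illustration):

```python
import numpy as np

# Assumed positions marked on one axial CT slice, in voxel (row, col) indices,
# with the row index increasing with depth below the skin in this illustrative setup.
entry_point = np.array([120, 256])       # point 910 on the skin surface
target_pedicle = np.array([180, 248])    # point 920 (center of the pedicle)
target_vertebra = np.array([230, 252])   # point 930 (center of the vertebral body)
pixel_spacing_mm = np.array([0.7, 0.7])  # assumed in-plane CT spacing (row, col), mm/voxel

def offsets_mm(target, entry):
    """Row/column offsets of a target from the entry point, converted to millimetres."""
    return (target - entry) * pixel_spacing_mm

depth_942 = offsets_mm(target_pedicle, entry_point)[0]            # vertical depth to target 920
depth_944 = offsets_mm(target_vertebra, entry_point)[0]           # vertical depth to target 930
horizontal_940 = abs(offsets_mm(target_pedicle, entry_point)[1])  # lateral offset of entry point

print(f"depth 942: {depth_942:.1f} mm, depth 944: {depth_944:.1f} mm, "
      f"horizontal distance 940: {horizontal_940:.1f} mm")
```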
The location of the reference sticker with respect to the fluoroscopy imaging device may be determined using technologies known in the prior art. US patent application publication no. 2008/0208041 "System and Method for Optical Position Measurement and Guidance of a Rigid or Semi-Flexible Tool to a Target" teaches the use of a single miniature camera attached to a needle for calculating the orientation of the needle relative to the sticker. Using the video images obtained from the video cameras 822 and 824, the orientation of the sticker may also be calculated using the same mathematics. It should be noted that, if two video cameras are used, or if two views are taken with the fluoroscope in two different positions, the coordinates of the fiducials 805 on sticker 800 need not be known a priori, but may be determined from a video image pair. Other relevant publications which may provide useful reference for additional or alternative tracking techniques, and which are hereby incorporated in their entirety by reference, include: US Patent 5,799,055 "Apparatus and Method for Planning a Stereotactic Surgical Procedure Using Coordinated Fluoroscopy", which teaches measuring the orientation of the sticker in the fluoroscopy system of coordinates directly from an image taken by the imaging device, where the coordinates of the fiducials are known a priori; and US Patent Application publication no. US 2006/0094958 "Method and Apparatus for Calibrating Non-Linear Instruments", which describes a system for measuring the relative location of a sensor on a body, which may be the sticker with a sensor attached to it, and a fluoroscopy imaging device with a sensor attached to it, both measured by a tracking device placed at a distance from the body and the imaging device.

The entire workflow of a procedure performed according to an embodiment of the present invention is typically as follows: (a) the target and the entry point are identified in the CT scan, from which the vertical distance from the entry point to the target is determined; (b) based on the horizontal distance 940, with the addition of fluoroscopy images of the body, the sticker is attached at the entry point; (c) the location and orientation of the sticker in the fluoroscopy system of coordinates is determined using one of the technologies described above; (d) a tool is placed at the entry point and its angular direction is determined using one of the above technologies; (e) the point of intersection of the needle direction with the at least one plane defined by the vertical distance of the target from the entry point is displayed; and (f) by aiming the needle so that said intersection point coincides with the image of the target, the correct angle can readily be achieved so that the needle is correctly aligned with the target or with multiple targets defined along the desired path.
It should be noted that the system and method described herein do not require registration of the body position to the CT image used during planning. Instead, the CT scan is used to derive the depth from the skin surface of the reference planes containing the targets. The targets themselves are chosen to be features which are directly visible in the fluoroscope images, and the depth information is used in derivation of the locations for display of the additional symbols 705 and 706, as described above.
Furthermore, although exemplified here with reference to a case where the surface(s) passing through the target location(s) are depths derived from volumetric image data (e.g., CT data), the reference surfaces are not necessarily defined as depths or derived from volumetric image data. For example, an alternative implementation may employ an oblique or lateral fluoroscope direction to view the target location(s) from a direction significantly non-parallel with the viewing direction employed for navigation, and the user may designate (e.g., by manual input with a cursor) a plane or other surface containing each target corresponding to a line in the oblique or lateral view. So long as the plane thus defined is inclined at more than about 30 degrees, and preferably more than about 60 degrees, to the optical axis of the primary view used for navigation during the procedure, the display of the extrapolated tool direction with the plane may be generated and used as a navigational aid exactly as described above.
Although described herein in the context of a preferred implementation for use in spinal surgery, it should be noted that the present invention may equally be employed to advantage in other surgical procedures performed under fluoroscopic imaging.
Similarly, although we have referred throughout to fluoroscopic imaging, the invention is equally applicable to any and all other imaging systems which provide a real-time 2D image representing internal properties of a 3D volume as viewed along an imaging direction, whether based on X-ray transmission, ultrasound reflection or any other imaging technology.
It will be appreciated that the above descriptions are intended only to serve as examples, and that many other embodiments are possible within the scope of the present invention as defined in the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A method for facilitating navigation of a tool to an inter-body target using a fluoroscope, the method comprising the steps of:
(a) determining a position of the fluoroscope relative to a frame of coordinates, said position defining an imaging axis of the fluoroscope;
(b) defining relative to said frame of coordinates a reference surface passing through the inter-body target, said reference surface being significantly non-parallel to the imaging axis of the fluoroscope;
(c) tracking a position of the tool relative to said frame of coordinates; and
(d) generating an indication on a display in the context of an image from the fluoroscope, said indication designating a location at which a current direction of the tool intersects the reference plane, thereby facilitating angular alignment of the tool towards the inter-body target.
2. The method of claim 1, wherein said image from the fluoroscope is a frozen image, the method further comprising generating on the display a representation of the current position of the tool in the context of said image as derived from said tracking.
3. The method of claim 1, wherein said reference surface is defined as a plane at a defined depth below a surface of the body, said defined depth being derived from previously sampled three-dimensional imaging data of the body without registration between said three-dimensional imaging data and said frame of coordinates.
4. The method of claim 1, wherein said frame of coordinates is defined relative to a set of at least four optical fiducial markers applied to an external surface of the body.
5. The method of claim 4, wherein said step of tracking includes: sampling images of said optical fiducial markers by use of a camera mounted on the tool, and processing said images to derive a position of said camera relative to said fiducial markers.
6. The method of claim 4, wherein said optical fiducial markers include at least four radio-opaque markers, and wherein said determining a position of the fluoroscope includes: identifying a location of said radio-opaque markers in an image derived by the fluoroscope; and processing said image to derive a position of the fluoroscope relative to said radio-opaque markers.
7. A system for facilitating navigation of a tool to an inter-body target using a fluoroscope, the system comprising:
(a) a processing system including at least one processor;
(b) a fluoroscopic imaging device associated with said processing system, said fluoroscopic imaging device defining an imaging axis;
(c) a display associated with said processing system for displaying images sampled by said fluoroscopic imaging device; and
(d) a tracking sensor arrangement associated with said processing system so as to form a tracking system configured to determine the position of said fluoroscopic imaging device and of the tool relative to a frame of coordinates;
wherein said processing system is configured to:
(i) derive from outputs of said tracking sensor arrangement a position of the fluoroscopic imaging device relative to said frame of coordinates in which a fluoroscope image was generated; (ii) track a position of the tool relative to said frame of coordinates; and
(iii) generate an indication on the display in the context of the fluoroscope image, said indication designating a location at which a current direction of the tool intersects a reference surface passing through the inter-body target, said reference surface being significantly non-parallel to the imaging axis of the fluoroscope, thereby facilitating angular alignment of the tool towards the inter-body target.
8. The system of claim 7, wherein said image from the fluoroscope is a frozen image, the processing system being further configured to generate on the display a representation of the current position of the tool in the context of said image as derived from said tracking sensor arrangement.
9. The system of claim 7, further comprising a set of at least four optical fiducial markers for application to an external surface of the body, and wherein said tracking sensor arrangement comprises a camera associated with a proximal part of the tool and directed towards a point of entry of the tool into the body.
10. The system of claim 9, wherein said reference surface is defined as a plane at a defined depth below a surface of the body to which said at least four optical fiducial markers are applied.
11. The system of claim 9, wherein said optical fiducial markers include at least four radio-opaque markers, and wherein said processing system determines a position of the fluoroscope by identifying a location of said radio-opaque markers in an image derived by the fluoroscope; and processing said image to derive a position of the fluoroscope relative to said radio-opaque markers.
PCT/IB2011/054644 2010-10-18 2011-10-18 System and method for facilitating navigation of a tool using a fluoroscope WO2012052929A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US39395310P 2010-10-18 2010-10-18
US61/393,953 2010-10-18

Publications (2)

Publication Number Publication Date
WO2012052929A2 true WO2012052929A2 (en) 2012-04-26
WO2012052929A3 WO2012052929A3 (en) 2012-06-14

Family

ID=45975672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/054644 WO2012052929A2 (en) 2010-10-18 2011-10-18 System and method for facilitating navigation of a tool using a fluoroscope

Country Status (1)

Country Link
WO (1) WO2012052929A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10575756B2 (en) 2014-05-14 2020-03-03 Stryker European Holdings I, Llc Navigation system for and method of tracking the position of a work target
CN116616876A (en) * 2023-06-15 2023-08-22 中国人民解放军总医院第一医学中心 Puncture path intelligent planning method, device, equipment and medium in PVP operation
US12004850B2 (en) 2013-08-15 2024-06-11 Intuitive Surgical Operations, Inc. Graphical user interface for catheter positioning and insertion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6491699B1 (en) * 1999-04-20 2002-12-10 Surgical Navigation Technologies, Inc. Instrument guidance method and system for image guided surgery
US20050281385A1 (en) * 2004-06-02 2005-12-22 Johnson Douglas K Method and system for improved correction of registration error in a fluoroscopic image
US20070191680A1 (en) * 2006-02-14 2007-08-16 Fujifilm Corporation Endoscopic apparatus and diagnosis system
US20080283771A1 (en) * 2007-05-17 2008-11-20 General Electric Company System and method of combining ultrasound image acquisition with fluoroscopic image acquisition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6491699B1 (en) * 1999-04-20 2002-12-10 Surgical Navigation Technologies, Inc. Instrument guidance method and system for image guided surgery
US20050281385A1 (en) * 2004-06-02 2005-12-22 Johnson Douglas K Method and system for improved correction of registration error in a fluoroscopic image
US20070191680A1 (en) * 2006-02-14 2007-08-16 Fujifilm Corporation Endoscopic apparatus and diagnosis system
US20080283771A1 (en) * 2007-05-17 2008-11-20 General Electric Company System and method of combining ultrasound image acquisition with fluoroscopic image acquisition

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12004850B2 (en) 2013-08-15 2024-06-11 Intuitive Surgical Operations, Inc. Graphical user interface for catheter positioning and insertion
US10575756B2 (en) 2014-05-14 2020-03-03 Stryker European Holdings I, Llc Navigation system for and method of tracking the position of a work target
US11540742B2 (en) 2014-05-14 2023-01-03 Stryker European Operations Holdings Llc Navigation system for and method of tracking the position of a work target
CN116616876A (en) * 2023-06-15 2023-08-22 中国人民解放军总医院第一医学中心 Puncture path intelligent planning method, device, equipment and medium in PVP operation
CN116616876B (en) * 2023-06-15 2024-01-09 中国人民解放军总医院第一医学中心 Puncture path intelligent planning method, device, equipment and medium in PVP operation

Also Published As

Publication number Publication date
WO2012052929A3 (en) 2012-06-14

Similar Documents

Publication Publication Date Title
US20230027758A1 (en) Apparatus and methods for use with skeletal procedures
JP5328137B2 (en) User interface system that displays the representation of tools or buried plants
JP5121401B2 (en) System for distance measurement of buried plant
EP3127485B1 (en) System for local three dimensional volume reconstruction using a standard fluoroscope
US8320992B2 (en) Method and system for superimposing three dimensional medical information on a three dimensional image
US6718194B2 (en) Computer assisted intramedullary rod surgery system with enhanced features
EP2004071B1 (en) Targeting device, computer readable medium and program element
US11992349B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US20170065248A1 (en) Device and Method for Image-Guided Surgery
US8335553B2 (en) CT-free spinal surgical imaging system
US20090080737A1 (en) System and Method for Use of Fluoroscope and Computed Tomography Registration for Sinuplasty Navigation
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
EP3815613A1 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11672607B2 (en) Systems, devices, and methods for surgical navigation with anatomical tracking
WO2012052929A2 (en) System and method for facilitating navigation of a tool using a fluoroscope
Luís et al. Navigated Spinal Fusion 31
CN114533267A (en) 2D image surgery positioning navigation system and method
IL186417A (en) Method and system for superimposing three dimensional medical information on a three dimensional image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11833951

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11833951

Country of ref document: EP

Kind code of ref document: A2