WO2016108110A1 - Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods - Google Patents

Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods

Info

Publication number: WO2016108110A1
Application number: PCT/IB2015/059497
Authority: WIPO (PCT)
Prior art keywords: target, quantitative, display, data, tracking
Other languages: French (fr)
Inventors: John Allan BRACKEN, Shyam Bharat, Ameet Kumar Jain
Original assignee: Koninklijke Philips N.V.
Application filed by: Koninklijke Philips N.V.

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25: User interfaces for surgical systems
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2063: Acoustic tracking systems, e.g. using ultrasound
    • A61B 2034/2074: Interface software

Definitions

  • Another embodiment may involve using the relative position/orientation information together with the x-ray system geometry information (which is known by the x-ray C-arm system) and homography to generate and display virtual x-ray images of the device (and of the target and/or anatomy as well) at any C-arm angle from a single x-ray projection run (e.g. an anterior-posterior x-ray image sequence). These virtual x-ray images would give the cardiologist a sense of how a device would be positioned and oriented relative to the target at any C-arm angle, without actually having to move the C-arm or x-ray the patient. This would help in device deployment and positioning, treatment planning and decision making. An optimal C-arm angle could also be manually or automatically determined for a particular task in an intervention; the C-arm could then be automatically moved to this position for that task, improving workflow and reducing ionizing radiation use. A simplified projection sketch is given below.
  • Graphics rendering software to display light and shadow over structures in 3D TEE images may enhance specific anatomical structures and targets. The quantitative device and target annotation position information described herein may be used to place a virtual light source emanating from the device 'tip' such that it shines at a specific orientation or angle on the target in the TEE images. The light display on the target in the TEE image would of course change in real time as the device is moved and re-oriented, elucidating additional anatomical and depth information as well as additional visual information on where the device is directed relative to the target.
  • TEE probe heads can also be tracked using the previously described technology. Another embodiment may therefore use a combination of position/orientation data from the probe head and from the target annotation, so that information and instructions can be sent to the TEE display to help the echocardiographer orient and adjust the probe for an optimal view of the target for a specific interventional task (e.g. septum crossing) or a pre-determined clinical view of a target.
  • A further embodiment may incorporate a kinematic (motion/mechanical) model of a particular device (e.g. a catheter); such a model can be developed from finite element analysis, for example. This kinematic model (or multiple kinematic models of different devices) can be pre-loaded on the image guidance system. By comparing the target annotation and device position/orientation data with this kinematic model, clearer instructions could be given to the cardiologist to guide the device to the target, since the mechanical limitations and constraints of the device would be factored into the intervention.
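A minimal sketch of the projection geometry behind such virtual x-ray views, assuming a simplified pinhole C-arm model; the rotation order, angle convention and source/detector distances below are illustrative assumptions, not the actual calibration of any x-ray system:

    import numpy as np

    def carm_project(points, lao_deg, cran_deg, sid_mm=1200.0, sod_mm=800.0):
        """Project 3D points into a virtual x-ray view at a chosen C-arm angle.

        points:   (N, 3) world coordinates (mm), registered to the isocenter
        lao_deg:  primary rotation angle (LAO positive, RAO negative)
        cran_deg: secondary rotation angle (cranial positive, caudal negative)
        sid_mm:   source-to-detector distance; sod_mm: source-to-isocenter distance
        """
        a, b = np.radians(lao_deg), np.radians(cran_deg)
        ry = np.array([[np.cos(a), 0.0, np.sin(a)],
                       [0.0, 1.0, 0.0],
                       [-np.sin(a), 0.0, np.cos(a)]])
        rx = np.array([[1.0, 0.0, 0.0],
                       [0.0, np.cos(b), -np.sin(b)],
                       [0.0, np.sin(b), np.cos(b)]])
        p = points @ (rx @ ry).T                      # world -> C-arm frame
        depth = sod_mm - p[:, 2]                      # distance of each point from the source
        return p[:, :2] * (sid_mm / depth)[:, None]   # detector-plane coordinates (mm)

    # Example: re-project a tracked tip and a target marker into a 30 degree LAO view.
    tip_and_target = np.array([[0.0, 0.0, 0.0], [5.0, 2.0, -10.0]])
    virtual_view = carm_project(tip_and_target, lao_deg=30.0, cran_deg=0.0)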
  • Embodiments of the invention may also be directed to a non-transitory computer-readable storage medium having stored therein machine readable instructions configured to be executed by a processor (e.g. determination unit 26) for the determination and display of quantitative clinician guidance 30 between an interventional device 28 within a patient and an anatomical target thereof. The processor executes a process comprising: acquiring device tracking data from at least one tracking sensor 21 associated with the interventional device 28; acquiring target data corresponding to an annotation 22 of the anatomical target from an image guidance system 23; determining and updating, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data; and outputting the relative device-to-target quantitative positional and orientation information for display to provide quantitative clinician guidance 30 for the interventional device 28.
  • a 'computer-readable storage medium' as used herein encompasses any tangible storage medium which may store instructions which are executable by a processor of a computing device.
  • the computer-readable storage medium may be referred to as a computer-readable non-transitory storage medium.
  • the computer-readable storage medium may also be referred to as a tangible computer-readable medium.
  • a computer-readable storage medium may also be able to store data which is able to be accessed by the processor of the computing device.
  • Examples of computer-readable storage media include, but are not limited to: a floppy disk, a magnetic hard disk drive, a solid state hard disk, flash memory, a USB thumb drive, Random Access Memory (RAM), Read Only Memory (ROM), an optical disk, a magneto-optical disk, and the register file of the processor.
  • Examples of optical disks include Compact Disks (CD) and Digital Versatile Disks (DVD), for example CD-ROM, CD-RW, CD-R, DVD-ROM, DVD-RW, or DVD-R disks.
  • the term computer readable-storage medium also refers to various types of recording media capable of being accessed by the computer device via a network or communication link.
  • data may be retrieved over a modem, over the internet, or over a local area network.
  • References to a computer-readable storage medium should be interpreted as possibly referring to multiple computer-readable storage media.
  • Various executable components of a program or programs may be stored in different locations.
  • the computer-readable storage media may, for instance, be multiple computer-readable storage media within the same computer system, or distributed amongst multiple computer systems or computing devices.
  • Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to: RAM memory, registers, and register files. References to 'computer memory' or 'memory' should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
  • Computer storage is any non-volatile computer-readable storage medium.
  • Examples of computer storage include, but are not limited to: a hard disk drive, a USB thumb drive, a floppy drive, a smart card, a DVD, a CD-ROM, and a solid state hard drive.
  • computer storage may also be computer memory or vice versa.
  • References to 'computer storage' or 'storage' should be interpreted as possibly including multiple storage devices or components.
  • the storage may include multiple storage devices within the same computer system or computing device.
  • the storage may also include multiple storages distributed amongst multiple computer systems or computing devices.
  • a 'processor' as used herein encompasses an electronic component which is able to execute a program or machine executable instruction.
  • References to the computing device comprising "a processor” should be interpreted as possibly containing more than one processor or processing core.
  • the processor may for instance be a multi-core processor.
  • a processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems.
  • the term computing device should also be interpreted to possibly refer to a collection or network of computing devices each comprising a processor or processors. Many programs have their instructions performed by multiple processors that may be within the same computing device or which may even be distributed across multiple computing devices.
  • a 'user interface' as used herein is an interface which allows a user or operator to interact with a computer or computer system.
  • a 'user interface' may also be referred to as a 'human interface device'.
  • a user interface may provide information or data to the operator and/or receive information or data from the operator.
  • a user interface may enable input from an operator to be received by the computer and may provide output to the user from the computer.
  • the user interface may allow an operator to control or manipulate a computer, and the interface may allow the computer to indicate the effects of the operator's control or manipulation.
  • the display of data or information on a display or a graphical user interface is an example of providing information to an operator.
  • a touch screen, keyboard, mouse, trackball, touchpad, pointing stick, graphics tablet, joystick, gamepad, webcam, headset, gear stick, steering wheel, wired glove, wireless remote control, and accelerometer are all examples of user interface components which enable the receiving of information or data from an operator.
  • a 'hardware interface' as used herein encompasses an interface which enables the processor of a computer system to interact with and/or control an external computing device and/or apparatus.
  • a hardware interface may allow a processor to send control signals or instructions to an external computing device and/or apparatus.
  • a hardware interface may also enable a processor to exchange data with an external computing device and/or apparatus. Examples of a hardware interface include, but are not limited to: a universal serial bus, IEEE 1394 port, parallel port, IEEE 1284 port, serial port, RS-232 port, IEEE-488 port, Bluetooth connection, Wireless local area network connection, TCP/IP connection, Ethernet connection, control voltage interface, MIDI interface, analog input interface, and digital input interface.
  • a 'display' or 'display device' as used herein encompasses an output device or a user interface adapted for displaying images or data.
  • a display may output visual, audio, and/or tactile data.
  • Examples of a display include, but are not limited to: a computer monitor, a television screen, a touch screen, a tactile electronic display, a Braille screen, a cathode ray tube (CRT), a storage tube, a bistable display, electronic paper, a vector display, a flat panel display, a vacuum fluorescent display (VFD), light-emitting diode (LED) displays, an electroluminescent display (ELD), plasma display panels (PDP), a liquid crystal display (LCD), organic light-emitting diode (OLED) displays, a projector, and a head-mounted display.
  • While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Abstract

Apparatus, systems, methods, and a computer-readable storage medium are provided for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof. The approach includes the use of at least one tracking sensor associated with the interventional device, and a determination unit configured to receive device tracking data from the at least one tracking sensor associated with the interventional device, receive target data corresponding to an annotation of the anatomical target, and determine and update, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data. A display unit is configured to display quantitative clinician guidance for the interventional device based upon the relative device-to-target quantitative positional and orientation information from the determination unit.

Description

RELATIVE POSITION/ORIENTATION TRACKING AND VISUALIZATION BETWEEN AN INTERVENTIONAL DEVICE AND PATIENT ANATOMICAL TARGETS IN IMAGE GUIDANCE SYSTEMS AND METHODS

TECHNICAL FIELD
The invention relates to the field of medical technology, and in particular to a method and apparatus for interventional treatment including the determination and display of relative quantitative positional and orientation information between an interventional device and target annotations in an imaging system.
BACKGROUND
During interventional medical treatment (e.g. cardiology procedures), devices are often used and deployed to treat a patient. Examples of such interventional devices include catheters, guidewires, stents, valve clips, plugs and prosthetic valves. In particular, these kinds of devices are often used to treat structural heart disease patients. Live x-ray imaging (fluoroscopy) and live transesophageal echocardiography (TEE) images are required to provide image guidance for these complex procedures. TEE is a form of ultrasound imaging. This information is helpful, but provides only qualitative real-time information about the position and orientation of a device used to treat the patient. Having additional quantitative positional and orientation information (i.e. quantifiable data) of a device easily displayed and visualized as it approaches an anatomical target to be treated would be very helpful to give the clinician (e.g. cardiologist) more confidence that the device is positioned correctly and can be deployed appropriately.
Figure 1 illustrates how a fused live x-ray/TEE imaging system can be used to guide a device 10 to a target 12 in a qualitative manner. The x-ray imaging 15 shows devices, while the TEE images 17 show detailed anatomical target information. In this case, a trans-septal puncture through the inter-atrial septum is being performed during a mitral clip case. A marker labeled 'Septum' was placed manually on a point of interest on the inter-atrial septum in the TEE images in the fused x-ray/TEE image guidance system. This point is fixed in 3D space; with current fused x-ray/TEE imaging systems, its position does not track a moving target in the images. Since the TEE images were registered to the x-ray images by automatically aligning a 3D model of the TEE probe with x-ray images of the probe (outline 14 shown on bottom left of figure), the corresponding position of the 'Septum' marker is also shown in the x-ray images, which also clearly show the catheter 16 at the septum position.
All this information is purely qualitative, and additional quantitative information on how the catheter is positioned and oriented relative to a target is not available or displayed. This extra information would be very helpful to simplify complex tasks and provide more confidence to the cardiologist during difficult structural cardiac interventions.
SUMMARY
Embodiments of the invention may provide an apparatus, systems, methods, and computer-readable storage medium for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof.
An embodiment that may achieve this is directed to a system for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof, the system including at least one tracking sensor associated with the interventional device, and a determination unit configured to receive device tracking data from the at least one tracking sensor associated with the interventional device, receive target data corresponding to an annotation of the anatomical target, and determine and update, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data. A display unit is configured to display quantitative clinician guidance for the interventional device based upon the relative device-to-target quantitative positional and orientation information from the determination unit.
In an embodiment, the at least one tracking sensor comprises at least one of a transducer, electromagnetic device, radiofrequency device and optical sensor. In an embodiment, such a tracking sensor is positioned at a distal end of the interventional device.
In an embodiment, each of the device tracking data and the target data comprises three-dimensional coordinate data.
In an embodiment, an image guidance system provides the determination unit with the target data. The image guidance system may be an x-ray device, an ultrasound device, a transesophageal echocardiography (TEE) device and/or a fused x-ray/TEE device.
In an embodiment, the display unit comprises one or more display monitors configured to display the quantitative clinician guidance and images of the interventional device, anatomical target and annotation from the image guidance system.
In an embodiment, the relative device-to-target quantitative positional and orientation information comprises at least relative device-to-target distance and angular orientation information.
Another embodiment is directed to a method for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof. The method includes acquiring device tracking data from at least one tracking sensor associated with the interventional device, and acquiring target data corresponding to an annotation of the anatomical target from an image guidance system. The method includes determining and updating, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data. Also, the relative device-to-target quantitative positional and orientation information is output for display to provide quantitative clinician guidance for the interventional device.
Another embodiment is directed to a non-transitory computer-readable storage medium having stored therein machine readable instructions configured to be executed by a processor for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof. The processor executes a process comprising: acquiring device tracking data from at least one tracking sensor associated with the interventional device; acquiring target data corresponding to an annotation of the anatomical target from an image guidance system; determining and updating, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data; and outputting the relative device-to-target quantitative positional and orientation information for display to provide quantitative clinician guidance for the interventional device.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be more readily understood from the detailed description of exemplary embodiments presented below considered in conjunction with the accompanying drawings, as follows.
FIG. 1 is a schematic block diagram illustrating features of a known x-ray/TEE imaging system.
FIG. 2 is a schematic block diagram illustrating details of the system for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof in accordance with features of an embodiment of the present invention.
FIG. 3 is a flowchart illustrating various steps in a method for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof in accordance with features of an embodiment of the present invention.
DETAILED DESCRIPTION
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the present invention are shown. The present invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided as teaching examples of the invention.
It is to be understood that the terminology used herein is for purposes of describing particular embodiments only, and is not intended to be limiting. Any defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
As used in the specification and appended claims, the terms 'a', 'an' and 'the' include both singular and plural referents, unless the context clearly dictates otherwise. Thus, for example, 'a device' includes one device and plural devices.
As used in the specification and appended claims, and in addition to their ordinary meanings, the terms 'substantial' or 'substantially' mean to within acceptable limits or degree. For example, 'substantially cancelled' means that one skilled in the art would consider the cancellation to be acceptable.
As used in the specification and the appended claims and in addition to its ordinary meaning, the term 'approximately' means to within an acceptable limit or amount to one having ordinary skill in the art. For example, 'approximately the same' means that one of ordinary skill in the art would consider the items being compared to be the same.
As used herein, the statement that two or more parts or components are "coupled" shall mean that the parts are joined or operate together either directly or indirectly, i.e., through one or more intermediate parts or components, so long as a link occurs. As used herein, "directly coupled" means that two elements are directly in contact with each other. As used herein, "fixedly coupled" or "fixed" means that two components are coupled so as to move as one while maintaining a constant orientation relative to each other.
Directional phrases used herein, such as, for example and without limitation, top, bottom, left, right, upper, lower, front, back, and derivatives thereof, relate to the orientation of the elements shown in the drawings and are not limiting upon the claims unless expressly recited therein. Relative terms, such as "above," "below," "top," "bottom," "upper" and "lower" may be used to describe the various elements' relationships to one another, as illustrated in the accompanying drawings. These relative terms are intended to encompass different orientations of the device and/or elements in addition to the orientation depicted in the drawings. For example, if the device were inverted with respect to the view in the drawings, an element described as "above" another element, for example, would now be "below" that element. Similarly, if the device were rotated by 90° with respect to the view in the drawings, an element described "above" or "below" another element would now be "adjacent" to the other element; where "adjacent" means either abutting the other element, or having one or more layers, materials, structures, etc., between the elements.
Like numbered elements in these figures are either equivalent elements or perform the same function. Elements which have been discussed previously will not necessarily be discussed in later figures if the function is equivalent.
Initially, it is noted that in a typical interventional medical treatment (e.g. a cardiology procedure), devices are used and deployed to treat a patient. Interventional devices include catheters, guidewires, stents, valve clips, plugs and prosthetic valves, for example. These kinds of devices are typically used to treat structural heart disease patients. Live x-ray imaging (fluoroscopy) and live transesophageal echocardiography (TEE) images may be needed to provide image guidance for these complex procedures.
Referring to FIG. 2, a system 20 for the determination and display of quantitative clinician guidance 30 in accordance with features of an embodiment of the invention will be described. FIG. 2 schematically illustrates the system 20 for the determination and display of quantitative clinician guidance 30 between an interventional device 28 within a patient and an anatomical target thereof. The system 20 includes one or more tracking sensors 21 associated with the interventional device 28, and a determination unit 26 configured to receive device tracking data from the tracking sensor 21 associated with the interventional device 28, receive target data corresponding to an annotation 22 of the anatomical target, and determine and update, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data. A display unit 25, 27, 29 is configured to display quantitative clinician guidance 30 for the interventional device 28 based upon the relative device-to-target quantitative positional and orientation information from the determination unit 26. A custom or dedicated display 32 may also be provided to present the quantitative clinician guidance 30.
"Real time" is a level of responsiveness that a user (e.g. the clinician) senses as sufficiently immediate.
In various embodiments, the tracking sensor 21 may include one or more of a transducer, electromagnetic device, radiofrequency device and optical sensor, for example. In an embodiment, such a tracking sensor may be positioned at a distal end of the interventional device 28. For example, a mini-transducer could be attached to or integrated with a catheter at the tip thereof. Such a mini-transducer could also be attached to a supplemental device (e.g. a cannula) that is inserted along with the interventional device 28.
In an embodiment, each of the device tracking data and the target data comprises three-dimensional coordinate data.
In an embodiment, an image guidance system 23 provides the determination unit 26 with the target data. The image guidance system 23 may be, for example, an x-ray device, an ultrasound device, a transesophageal echocardiography (TEE) device and/or a fused x-ray/TEE device.
In an embodiment, the display unit 25, 27, 29 comprises one or more display monitors configured to display the quantitative clinician guidance 30 and images of the interventional device 28, anatomical target and annotation 22 from the image guidance system. As shown, the display 25 is an x-ray display, the display 27 is an ultrasound or TEE display and the display 29 represents a fused x-ray/TEE display.
In an embodiment, the relative device-to-target quantitative positional and orientation information comprises at least relative device-to-target distance and angular orientation information, for example.
Further details of a method in accordance with features of the present approach are discussed with additional reference to the flowchart of FIG. 3. The method is for the determination and display of quantitative clinician guidance 30 between an interventional device 28 within a patient and an anatomical target thereof. The method begins (block 40) and includes acquiring 41 device tracking data from at least one tracking sensor 21 associated with the interventional device 28, and acquiring 42 target data corresponding to an annotation 22 of the anatomical target from an image guidance system 23. The method includes determining and updating 43, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data. Also, the relative device-to-target quantitative positional and orientation information is output 44 for display 45 to provide quantitative clinician guidance 30 for the interventional device 28.
In the case of a marker, this information could be a coordinate of the annotation 22 in the 3D space of either the patient and/or the catheterization laboratory (cath-lab) imaging system 23. In the case of a more complex annotation such as an ellipse, circle or curve describing the anatomical target of interest, multiple coordinates could be stored to define the outline and orientation of the target. Concurrently, the tracking technology on the device 28 could also be registered with either the patient or cath-lab coordinate system and provide real time positional and orientation information of the device. The dedicated determination unit 26 (or computation unit) could then use this information to compute relative positions and orientations between the device and target annotation in real-time. This information could then be presented and displayed to both the cardiologist and echocardiographer in a simple, effective and clinically appropriate way to provide better guidance and workflow for an intervention. This visualized quantitative information (e.g. quantitative clinician guidance 30) could also be used to assist with clinical decision making during the procedure. This displayed information could be the position and orientation information directly, instructions and warnings for device guidance based on this information, or even additional visualizations of devices and targets based on automated analysis of this quantitative information.
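By way of illustration, here is a minimal sketch of the relative-geometry computation such a determination unit might perform. The data layout (a tracked tip position, a unit vector along the device axis, and a single target marker coordinate, all already registered to one common frame) is an assumption of this sketch rather than anything prescribed above:

    import numpy as np

    def device_to_target(tip_pos, tip_dir, target_pos):
        """Relative device-to-target distance and angular orientation.

        tip_pos:    (3,) tracked device-tip position (mm)
        tip_dir:    (3,) unit vector along the device axis at the tip
        target_pos: (3,) annotated target coordinate (mm)
        All inputs are assumed to be in one common, registered frame.
        """
        offset = target_pos - tip_pos
        distance = float(np.linalg.norm(offset))   # tip-to-target distance (mm)
        # Angle between the device axis and the straight line to the target:
        cos_a = np.dot(tip_dir, offset) / max(distance, 1e-9)
        angle_deg = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
        return distance, angle_deg

    # Example: a tip about 12 mm from the target, aimed about 24 degrees off the target line.
    d, a = device_to_target(np.array([0.0, 0.0, 0.0]),
                            np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 5.0, 11.0]))

Re-running such a computation on every tracking and annotation update yields the real-time stream of relative values that the display units can render.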
Thus, as set forth above, the present approach may be used to generate and display the relative position and orientation between the anatomical target annotations 22 and the device 28. Target annotation (i.e. markers, ellipses, curves) 3D co-ordinate data from the imaging system 23 (e.g. fused x-ray/TEE or TEE-only system) could be recorded and sent to the determination unit 26 and updated in real time as needed. Simultaneously, corresponding 3D co-ordinate data from the tracking sensors 21 on the device could also be sent to the determination unit 26 and updated accordingly. The tracking sensors 21 on the device 28 could use a variety of different technologies, including miniature transducers/sensors (e.g. InSitu by Philips), electromagnetic (EM) devices, radiofrequency or optical tracking. Ultrasound tracking technology is attractive in this approach, since it may provide device position estimates directly in the TEE frame of reference, therefore needing no additional registration. EM technology is another option, though it may require the generation of a common EM-based coordinate system, to which all imaging modalities that are used should be registered.
Once these data are sent to the determination unit 26, the unit will then calculate and output the relative position and orientation information between a device and target annotation and send real-time updates of this information back to the cath-lab display for the cardiologist and echocardiographer. This information could be displayed on the main screen 29 of the fused x-ray/TEE imaging system 23 and/or directly on the x-ray 25 and TEE-only 27 imaging systems themselves. As discussed above, it is also possible to allocate a completely separate display 32 for the determination unit 26 that can also be placed on the main screen in the cath-lab. This information would be updated and displayed in real-time during the procedure.
The present approach provides additional quantitative relative positional and orientation information of devices and targets when using fused x-ray/TEE imaging (or TEE imaging only), e.g. in the field of interventional cardiology. However, this approach could also be extended to other medical applications and devices that use combined x-ray and ultrasound imaging, such as brachytherapy. This technique may be an enhancement to an image guidance tool (e.g. the Philips EchoNavigator, which uses images received from a Philips Allura x-ray C-arm system and from TEE images obtained on the CX50, EPIQ and iE33 ultrasound systems). An approach utilizing only TEE images is also contemplated, where both tracking and anatomy visualization are done using ultrasound exclusively. Thus, x-ray fluoroscopy may not be needed, thereby not exposing the patient to ionizing radiation. Reduced x-ray use may be important in the field of cardiac interventions.
As an additional embodiment, the determination unit 26 can receive coordinates from multiple target annotations 22 or multiple devices 28 and then compute multiple corresponding relative position/orientation information points between any combination of them. This information can then be displayed. The user will have the option of selecting the target annotations and devices for which this information is obtained.
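A sketch of how this many-to-many bookkeeping could look; the dictionary layout and the example names are hypothetical:

    from itertools import product
    import numpy as np

    def relative_table(devices, targets, selected_pairs=None):
        """Tip-to-target distances for every selected (device, target) combination.

        devices: {name: (3,) tracked tip position}
        targets: {name: (3,) annotation coordinate}
        selected_pairs: optional iterable of (device_name, target_name);
                        defaults to every combination.
        """
        pairs = selected_pairs if selected_pairs is not None else product(devices, targets)
        return {(dev, tgt): float(np.linalg.norm(targets[tgt] - devices[dev]))
                for dev, tgt in pairs}

    # Example: one catheter against two annotations; the user picks which pairs to show.
    table = relative_table({"clip catheter": np.zeros(3)},
                           {"septum": np.array([0.0, 5.0, 11.0]),
                            "mitral annulus": np.array([8.0, -3.0, 20.0])})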
As another embodiment, a color-coded warning system can be included with the displayed position/orientation information to notify the cardiologist and echocardiographer whether or not a device is within an acceptable position for deployment (e.g. green: can deploy; yellow: getting very close to deployment position; red: well outside tolerance). This warning system can be further enhanced by specifying the clinical direction in which the device is or is not within a suitable deployment position, such as the lateral, medial, left, right, anterior, posterior, superior or inferior directions in the patient. A notification can also be displayed to tell the cardiologist how to move or adjust the device in a specific direction to get within deployment tolerance (this adjustment could also be automated). The quantitative position/orientation information can also be enhanced to display one or several of these specific clinical directions for the device position/orientation with respect to the target. The position or orientation of a device relative to a target annotation can be displayed as a color bar, for example. The degree of color could indicate the distance the device is from a target (in a specific direction) and/or the orientation of the device. The bar could change color as the device moves towards or away from the target or as its orientation changes. Alternatively, the bar could contain a fixed color set arranged as a gradient, with a moving arrow or other similar indicator overlaid on the bar showing the position and/or orientation of the device relative to the target.
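One way the traffic-light mapping and directional hinting could be implemented; the tolerance values are illustrative placeholders, and the LPS (left-posterior-superior) patient axis convention is an assumption of this sketch:

    import numpy as np

    def deployment_status(distance_mm, angle_deg,
                          green=(2.0, 5.0),      # (mm, degrees) deployment tolerance
                          yellow=(5.0, 10.0)):   # (mm, degrees) "very close" zone
        """Map relative position/orientation onto a color-coded warning."""
        if distance_mm <= green[0] and angle_deg <= green[1]:
            return "green"     # can deploy
        if distance_mm <= yellow[0] and angle_deg <= yellow[1]:
            return "yellow"    # getting very close to deployment position
        return "red"           # well outside tolerance

    def direction_hint(offset_lps):
        """Name the dominant patient direction of the remaining tip-to-target offset."""
        axes = [("right", "left"),          # -x / +x in an LPS frame
                ("anterior", "posterior"),  # -y / +y
                ("inferior", "superior")]   # -z / +z
        i = int(np.argmax(np.abs(offset_lps)))
        neg, pos = axes[i]
        return pos if offset_lps[i] > 0 else neg

    # Example: 4 mm off at 6 degrees, dominantly toward the patient's left.
    status = deployment_status(4.0, 6.0)               # -> "yellow"
    hint = direction_hint(np.array([3.5, -1.0, 1.2]))  # -> "left"

The same distance value could drive the color bar described above, either by recoloring the bar or by moving an indicator along a fixed gradient.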
Another embodiment may involve using the device position information from the tracking sensor 21 on the interventional device 28 to automatically place and track an annotation on the tip or some other position on the device within the fused x-ray/TEE imaging system. In this case, annotations may be placed not only on anatomical targets, but on devices as well. The annotation would always remain on the device during the procedure, without relying on simultaneous x-ray/TEE image-based registration. Since the tracking sensor is always active, this is a very robust way of tracking a specific part of the device in the fused x-ray/TEE imaging system, without having to constantly re-register the images by activating TEE and x-ray at the same time. In TEE-only systems, this would be less of a concern, since only one imaging modality is available and an annotation could be tracked more easily without requiring registration of different imaging modalities. The annotation on the device, and the fact that it would track with device motion, could also help to make the device more visible in the TEE images. To track the device tip, some sensor technology may entail the use of two or more sensors to estimate the orientation of the device shaft and thereby extrapolate the position of the device tip. If a six-degree-of-freedom (6-DOF) EM sensor is used, device tip tracking may be done with just one such sensor.
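A minimal sketch of the two-sensor tip extrapolation mentioned above follows; the tip offset (the distance from the distal sensor to the physical tip) is an assumed, device-specific calibration value:

```python
# Illustrative tip extrapolation from two shaft-mounted sensors.
import numpy as np

def extrapolate_tip(proximal_pos, distal_pos, tip_offset_mm):
    """Estimate the tip position by extending the shaft direction
    (proximal -> distal sensor) beyond the distal sensor."""
    shaft = np.asarray(distal_pos, float) - np.asarray(proximal_pos, float)
    direction = shaft / np.linalg.norm(shaft)  # unit vector along the shaft
    return np.asarray(distal_pos, float) + tip_offset_mm * direction
```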
A further embodiment may include using the relative distance/orientation information between a tracked device and an annotated target to develop real-time trajectories or paths between the device and target, to help guide positioning and deployment of the device. These trajectories or paths could be updated in real time as the target moves and as the device is positioned. They could be further enhanced using deformable models of the anatomy being targeted (and models of the surrounding anatomy as well). Combined with the target annotations, these models would move with the anatomy and could provide additional information to update the trajectories for the device on the display in real time, along with the ability to show the cardiologist whether or not such a trajectory for device guidance is even possible. The model information (combined with the device and annotation position/orientation data) could be used to generate and display details about where and how a device will hit an anatomical target for a currently displayed trajectory. This information could then be used to deliver instructions to the cardiologist on how to manipulate a device such that a possible trajectory could be created and followed to guide the device to the intended target.
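As a simplified sketch of such a trajectory, assuming a straight-line path re-sampled each frame (real trajectories would bend around anatomy, e.g. using the deformable models just mentioned):

```python
# Straight-line device-to-target path, re-sampled every frame (illustrative).
import numpy as np

def straight_path(device_pos, target_pos, n_points=50):
    """Sample n_points along the current device-to-target line;
    returns an (n_points, 3) array of 3D waypoints."""
    t = np.linspace(0.0, 1.0, n_points)[:, None]
    p0 = np.asarray(device_pos, float)
    p1 = np.asarray(target_pos, float)
    return (1.0 - t) * p0 + t * p1
```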
Another embodiment may involve using the device tracking sensor information to deform or displace the annotations on an anatomical target as a result of a device interacting with the target. This could also apply to deformation and displacement of anatomical models, if used. An example is the tenting caused by pushing a catheter against the inter-atrial septum: the tenting deforms the septum, and the device tracking information can be used to adjust the position or shape of the annotation on the septum accordingly. This could, of course, apply to other anatomical structures that a device interacts with. Marker and annotation tracking on anatomical targets in TEE images may require the coordinates of target annotations to be updated in real time so that relative position/orientation information is displayed accurately during a procedure. The device tracking information could also help to enhance image-based tracking of annotations and markers if the TEE or x-ray image quality is degraded at specific time points during the procedure.
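One simple way to model such tenting is sketched below, under the assumption of a Gaussian falloff of displacement around the contact point; the falloff width sigma is an illustrative parameter, not a disclosed value:

```python
# Illustrative "tenting" deformation of annotation vertices near a contact point.
import numpy as np

def tent_annotation(points, contact, push_dir, depth_mm, sigma_mm=5.0):
    """points: (N, 3) annotation vertices; contact: 3D contact point;
    push_dir: unit vector of the device push; depth_mm: tent depth."""
    pts = np.asarray(points, float)
    r = np.linalg.norm(pts - np.asarray(contact, float), axis=1)
    weight = np.exp(-(r / sigma_mm) ** 2)  # displacement falls off with distance
    return pts + depth_mm * weight[:, None] * np.asarray(push_dir, float)
```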
Another embodiment may involve using the relative position/orientation information between the device and target annotation to generate and display the 2D distance between the target and the device on 2D x-ray fluoroscopy images. This distance (and its display) would be continually updated as the device moves and re-orients towards the target. Currently, distance measurements are fixed on fused x-ray/TEE imaging systems, and the measurement would need to be redone manually if the relative position/orientation of the device with respect to the target annotation changes.
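A hedged sketch of this continually updated 2D distance, assuming a pinhole model with a 3x4 projection matrix for the current C-arm pose and an assumed pixel spacing (both illustrative):

```python
# Project device and target into the fluoroscopy image and measure distance.
import numpy as np

def project(P, point3d):
    """Pinhole projection of a 3D point to 2D pixel coordinates,
    given a 3x4 projection matrix P."""
    x = P @ np.append(np.asarray(point3d, float), 1.0)
    return x[:2] / x[2]

def image_distance_mm(P, device_pos, target_pos, mm_per_pixel=0.3):
    d2d = project(P, device_pos) - project(P, target_pos)
    return float(np.linalg.norm(d2d)) * mm_per_pixel
```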
Using a combination of the device and target annotation position/orientation information, the x-ray system geometry information (known to the x-ray C-arm system) and homography, it would also be possible to generate and display virtual x-ray images of the device (and the target and/or anatomy as well) at any C-arm angle from a single x-ray projection run (e.g. an anterior-posterior x-ray image sequence). These virtual x-ray images would give the cardiologist a sense of how a device would be positioned and oriented relative to the target at any C-arm angle, without actually having to move the C-arm or x-ray the patient. This would help in device deployment and positioning, treatment planning and decision making. It would also be possible to overlay the target onto the virtual x-ray image. With these details, an optimal C-arm angle could be determined manually or automatically for a particular task in an intervention. The C-arm could then be moved automatically to this position for that task, improving workflow and reducing the use of ionizing radiation.
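The following sketch illustrates the idea of a virtual view at a hypothetical C-arm pose; the angle convention, the orthographic projection, and all names are simplifying assumptions rather than the actual homography-based reconstruction:

```python
# Illustrative virtual projection of tracked 3D points at a chosen C-arm pose.
import numpy as np

def carm_rotation(lao_rao_deg, cran_caud_deg):
    """Rotation for assumed LAO/RAO and cranial/caudal angles (illustrative)."""
    a, b = np.radians([lao_rao_deg, cran_caud_deg])
    Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
    Rx = np.array([[1.0, 0.0,        0.0],
                   [0.0, np.cos(b), -np.sin(b)],
                   [0.0, np.sin(b),  np.cos(b)]])
    return Rx @ Rz

def virtual_projection(points3d, lao_rao_deg, cran_caud_deg):
    """Rotate points into the virtual gantry frame, then drop the depth axis."""
    R = carm_rotation(lao_rao_deg, cran_caud_deg)
    rotated = np.asarray(points3d, float) @ R.T
    return rotated[:, :2]
```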
Graphics rendering software that displays light and shadow over structures in 3D TEE images may enhance specific anatomical structures and targets. As another embodiment, the quantitative device and target annotation position information described herein (along with such graphics rendering software) may be used to place a virtual light source emanating from the device 'tip' such that it shines at a specific orientation or angle on the target in the TEE images. The light displayed on the target in the TEE image would, of course, change in real time as the device is moved and re-oriented. The virtual light on the TEE image would elucidate additional anatomical and depth information, along with additional visual cues showing where the device is directed relative to the target.
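A minimal shading sketch for such a tip-mounted virtual light, assuming a simple Lambertian (cosine) model with inverse-square falloff; the model and names are illustrative, not the rendering software's actual method:

```python
# Illustrative Lambertian shading from a virtual light at the device tip.
import numpy as np

def tip_light_intensity(tip_pos, surface_point, surface_normal):
    """Relative brightness of a surface point lit from the device tip."""
    l = np.asarray(surface_point, float) - np.asarray(tip_pos, float)
    r = float(np.linalg.norm(l))
    l = l / r
    n = np.asarray(surface_normal, float)
    n = n / np.linalg.norm(n)
    return max(0.0, float(np.dot(n, -l))) / (r * r)  # cosine term, 1/r^2 falloff
```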
TEE probe heads can also be tracked using the previously described technology. In another embodiment, by combining position/orientation data from the probe head with position/orientation data from the target annotation, information and instructions could be sent to the TEE display to help the echocardiographer orient and adjust the probe for an optimal view of the target for a specific interventional task (e.g. septum crossing) or a pre-determined clinical view of a target.
Also, it would be possible to combine the relative annotation and device position/orientation data with a previously constructed kinematic (motion/mechanical) model of a particular device (e.g. a catheter). Such a model could be developed from finite element analysis, for example. This kinematic model (or multiple kinematic models of different devices) can be pre-loaded on the image guidance system. By comparing the target annotation and device position/orientation data against this kinematic model, clearer instructions could be given to the cardiologist to guide the device to the target, since the mechanical limitations and constraints of the device would be factored into the intervention.
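As a toy version of such a kinematic check, one could approximate the catheter's mechanical limit by a minimum bend radius and test whether a single circular arc, tangent to the current device axis and passing through the target, respects it. The single-arc model and the minimum-bend-radius parameter are simplifying assumptions:

```python
# Illustrative reachability test against an assumed minimum bend radius.
import numpy as np

def arc_reachable(device_pos, device_axis, target_pos, min_bend_radius_mm):
    """Radius of the circular arc tangent to the device axis through the
    target; 'reachable' if it is no tighter than the device allows."""
    d = np.asarray(target_pos, float) - np.asarray(device_pos, float)
    axis = np.asarray(device_axis, float)
    axis = axis / np.linalg.norm(axis)
    chord = float(np.linalg.norm(d))
    sin_theta = float(np.linalg.norm(np.cross(axis, d))) / chord
    if sin_theta < 1e-9:
        return True  # already aimed straight at the target
    radius = chord / (2.0 * sin_theta)  # chord = 2R*sin(theta) for a tangent arc
    return radius >= min_bend_radius_mm
```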
Embodiments of the invention may also be directed to a non-transitory computer- readable storage medium having stored therein machine readable instructions configured to be executed by a processor (e.g. determination unit 26) for the determination and display of quantitative clinician guidance 30 between an interventional device 28 within a patient and an anatomical target thereof. The processor executes a process comprising: acquiring device tracking data from at least one tracking sensor 21 associated with the interventional device 28; acquiring target data corresponding to an annotation 22 of the anatomical target from an image guidance system 23; determining and updating, in real time, relative device- to-target quantitative positional and orientation information based upon the device tracking data and the target data; and outputting the relative device-to-target quantitative positional and orientation information for display to provide quantitative clinician guidance 30 for the interventional device 28.
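The stored process can be pictured as a simple real-time loop; in this sketch the sensor, guidance-system, and display interfaces are placeholders standing in for the actual system interfaces, with only the earlier illustrative relative_pose computation being concrete:

```python
# Illustrative real-time guidance loop for the stored process.
def guidance_loop(sensor, guidance_system, display, running=lambda: True):
    while running():
        device_pos, device_axis = sensor.read()           # device tracking data
        target_pos = guidance_system.annotation_coords()  # target annotation data
        dist, ang = relative_pose(device_pos, device_axis, target_pos)
        display.update(distance_mm=dist, angle_deg=ang)   # quantitative guidance
```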
A 'computer-readable storage medium' as used herein encompasses any tangible storage medium which may store instructions which are executable by a processor of a computing device. The computer-readable storage medium may be referred to as a computer-readable non-transitory storage medium. The computer-readable storage medium may also be referred to as a tangible computer-readable medium. In some embodiments, a computer-readable storage medium may also be able to store data which is able to be accessed by the processor of the computing device. Examples of computer-readable storage media include, but are not limited to: a floppy disk, a magnetic hard disk drive, a solid state hard disk, flash memory, a USB thumb drive, Random Access Memory (RAM), Read Only Memory (ROM), an optical disk, a magneto-optical disk, and the register file of the processor. Examples of optical disks include Compact Disks (CD) and Digital Versatile Disks (DVD), for example CD-ROM, CD-RW, CD-R, DVD-ROM, DVD-RW, or DVD-R disks. The term 'computer-readable storage medium' also refers to various types of recording media capable of being accessed by the computing device via a network or communication link. For example, data may be retrieved over a modem, over the internet, or over a local area network. References to a computer-readable storage medium should be interpreted as possibly referring to multiple computer-readable storage media. Various executable components of a program or programs may be stored in different locations. The computer-readable storage medium may, for instance, be multiple computer-readable storage media within the same computer system. The computer-readable storage medium may also be computer-readable storage media distributed amongst multiple computer systems or computing devices.
'Computer memory' or 'memory' is an example of a computer-readable storage medium. Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to: RAM memory, registers, and register files. References to 'computer memory' or 'memory' should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
'Computer storage' or 'storage' is an example of a computer-readable storage medium. Computer storage is any non-volatile computer-readable storage medium. Examples of computer storage include, but are not limited to: a hard disk drive, a USB thumb drive, a floppy drive, a smart card, a DVD, a CD-ROM, and a solid state hard drive. In some embodiments computer storage may also be computer memory or vice versa. References to 'computer storage' or 'storage' should be interpreted as possibly including multiple storage devices or components. For instance, the storage may include multiple storage devices within the same computer system or computing device. The storage may also include multiple storages distributed amongst multiple computer systems or computing devices.
A 'processor' as used herein encompasses an electronic component which is able to execute a program or machine executable instruction. References to the computing device comprising "a processor" should be interpreted as possibly containing more than one processor or processing core. The processor may for instance be a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems. The term computing device should also be interpreted to possibly refer to a collection or network of computing devices each comprising a processor or processors. Many programs have their instructions performed by multiple processors that may be within the same computing device or which may even be distributed across multiple computing devices.
A 'user interface' as used herein is an interface which allows a user or operator to interact with a computer or computer system. A 'user interface' may also be referred to as a 'human interface device.' A user interface may provide information or data to the operator and/or receive information or data from the operator. A user interface may enable input from an operator to be received by the computer and may provide output to the user from the computer. In other words, the user interface may allow an operator to control or manipulate a computer, and the interface may allow the computer to indicate the effects of the operator's control or manipulation. The display of data or information on a display or a graphical user interface is an example of providing information to an operator. The receiving of data through a touch screen, keyboard, mouse, trackball, touchpad, pointing stick, graphics tablet, joystick, gamepad, webcam, headset, gear stick, steering wheel, wired glove, wireless remote control, or accelerometer are all examples of user interface components which enable the receiving of information or data from an operator.
A 'hardware interface' as used herein encompasses an interface which enables the processor of a computer system to interact with and/or control an external computing device and/or apparatus. A hardware interface may allow a processor to send control signals or instructions to an external computing device and/or apparatus. A hardware interface may also enable a processor to exchange data with an external computing device and/or apparatus. Examples of a hardware interface include, but are not limited to: a universal serial bus, IEEE 1394 port, parallel port, IEEE 1284 port, serial port, RS-232 port, IEEE-488 port, Bluetooth connection, Wireless local area network connection, TCP/IP connection, Ethernet connection, control voltage interface, MIDI interface, analog input interface, and digital input interface.
A 'display' or 'display device' as used herein encompasses an output device or a user interface adapted for displaying images or data. A display may output visual, audio, and/or tactile data. Examples of a display include, but are not limited to: a computer monitor, a television screen, a touch screen, a tactile electronic display, a Braille screen, a cathode ray tube (CRT), a storage tube, a bistable display, electronic paper, a vector display, a flat panel display, a vacuum fluorescent display (VFD), light-emitting diode (LED) displays, an electroluminescent display (ELD), plasma display panels (PDP), a liquid crystal display (LCD), organic light-emitting diode (OLED) displays, a projector, and a head-mounted display.

While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

What is claimed is:
1. A system for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof, the system comprising:
at least one tracking sensor associated with the interventional device;
a determination unit configured to
receive device tracking data from the at least one tracking sensor associated with the interventional device,
receive target data corresponding to an annotation of the anatomical target, and
determine and update, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data; and
a display unit configured to display quantitative clinician guidance for the interventional device based upon the relative device-to-target quantitative positional and orientation information from the determination unit.
2. The system of claim 1, wherein the at least one tracking sensor comprises at least one of a transducer, electromagnetic device, radiofrequency device and optical sensor.
3. The system of claim 1, wherein the at least one tracking sensor is positioned at a distal end of the interventional device.
4. The system of claim 1, wherein each of the device tracking data and the target data comprises three-dimensional coordinate data.
5. The system of claim 1, further comprising an image guidance system to provide the determination unit with the target data.
6. The system of claim 5, wherein the image guidance system comprises at least one of an x-ray device, an ultrasound device, a transesophageal echocardiography (TEE) device and a fused x-ray/TEE device.
7. The system of claim 5, wherein the display unit comprises one or more display monitors configured to display the quantitative clinician guidance and images of the interventional device, anatomical target and annotation from the image guidance system.
8. The system of claim 1, wherein the relative device-to-target quantitative positional and orientation information comprises at least relative device-to-target distance and angular orientation information.
9. A method for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof, the method comprising:
acquiring device tracking data from at least one tracking sensor associated with the interventional device;
acquiring target data corresponding to an annotation of the anatomical target from an image guidance system;
determining and updating, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data; and
outputting the relative device-to-target quantitative positional and orientation information for display to provide quantitative clinician guidance for the interventional device.
10. The method of claim 9, wherein the at least one tracking sensor comprises at least one of a transducer, electromagnetic device, radiofrequency device and optical sensor.
11. The method of claim 9, wherein the at least one tracking sensor is positioned at a distal end of the interventional device.
12. The method of claim 9, wherein each of the device tracking data and the target data comprises three-dimensional coordinate data.
13. The method of claim 9, wherein the image guidance system comprises at least one of an x-ray device, an ultrasound device, a transesophageal echocardiography (TEE) device and a fused x-ray/TEE device.
14. The method of claim 9, further comprising displaying, on one or more display monitors, the quantitative clinician guidance and images of the interventional device, anatomical target and annotation from the image guidance system.
15. The method of claim 9, wherein the relative device-to-target quantitative positional and orientation information comprises at least relative device-to-target distance and angular orientation information.
16. A non-transitory computer-readable storage medium having stored therein machine readable instructions configured to be executed by a processor for the determination and display of quantitative clinician guidance between an interventional device within a patient and an anatomical target thereof, the machine readable instructions causing the processor to execute a process comprising:
acquiring device tracking data from at least one tracking sensor associated with the interventional device;
acquiring target data corresponding to an annotation of the anatomical target from an image guidance system;
determining and updating, in real time, relative device-to-target quantitative positional and orientation information based upon the device tracking data and the target data; and
outputting the relative device-to-target quantitative positional and orientation information for display to provide quantitative clinician guidance for the interventional device.
17. The non-transitory computer-readable storage medium of claim 16, wherein the tracking data is acquired from the at least one tracking sensor comprising at least one of a transducer, electromagnetic device, radiofrequency device and optical sensor.
18. The non-transitory computer-readable storage medium of claim 16, wherein each of the device tracking data and the target data comprises three-dimensional coordinate data.
19. The non-transitory computer-readable storage medium of claim 16, wherein acquiring target data from the image guidance system comprises acquiring target data from at least one of an x-ray device, an ultrasound device, a transesophageal echocardiography (TEE) device and a fused x-ray/TEE device.
20. The non-transitory computer-readable storage medium of claim 16, wherein outputting comprises displaying, on one or more display monitors, the quantitative clinician guidance and images of the interventional device, anatomical target and annotation from the image guidance system.
21. The non-transitory computer-readable storage medium of claim 16, wherein the relative device-to-target quantitative positional and orientation information comprises at least relative device-to-target distance and angular orientation information.
PCT/IB2015/059497 2014-12-31 2015-12-10 Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods WO2016108110A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462098570P 2014-12-31 2014-12-31
US62/098,570 2014-12-31

Publications (1)

Publication Number Publication Date
WO2016108110A1 true WO2016108110A1 (en) 2016-07-07

Family

ID=55025297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/059497 WO2016108110A1 (en) 2014-12-31 2015-12-10 Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods

Country Status (1)

Country Link
WO (1) WO2016108110A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1571581A1 (en) * 2003-01-30 2005-09-07 Surgical Navigation Technologies, Inc. Method and apparatus for preplanning a surgical procedure
US20080243142A1 (en) * 2007-02-20 2008-10-02 Gildenberg Philip L Videotactic and audiotactic assisted surgical methods and procedures
US20100179418A1 (en) * 2008-06-16 2010-07-15 Matthias Mueller Instrument aligning method using a free reference
US20110237936A1 (en) * 2010-03-25 2011-09-29 Medtronic, Inc. Method and Apparatus for Guiding an External Needle to an Implantable Device
US20120259209A1 (en) * 2011-04-11 2012-10-11 Harhen Edward P Ultrasound guided positioning of cardiac replacement valves

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10828114B2 (en) 2016-10-18 2020-11-10 Synaptive Medical (Barbados) Inc. Methods and systems for providing depth information
WO2018072003A1 (en) * 2016-10-21 2018-04-26 Synaptive Medical (Barbados) Inc. Methods and systems for providing depth information
GB2570835A (en) * 2016-10-21 2019-08-07 Synaptive Medical Barbados Inc Methods and systems for providing depth information
GB2570835B (en) * 2016-10-21 2021-11-03 Synaptive Medical Inc Methods and systems for providing depth information
US11672609B2 (en) 2016-10-21 2023-06-13 Synaptive Medical Inc. Methods and systems for providing depth information
US11628014B2 (en) 2016-12-20 2023-04-18 Koninklijke Philips N.V. Navigation platform for a medical device, particularly an intracardiac catheter
CN111278381A (en) * 2017-08-28 2020-06-12 皇家飞利浦有限公司 Automatic field of view update for location tracked interventional devices
CN111278381B (en) * 2017-08-28 2024-04-09 皇家飞利浦有限公司 Automatic field of view update for location tracked interventional devices
EP4197446A1 (en) * 2021-12-20 2023-06-21 GE Precision Healthcare LLC Methods and system for positioning a c-arm

Similar Documents

Publication Publication Date Title
US11754971B2 (en) Method and system for displaying holographic images within a real object
JP2021049416A (en) Image registration and guidance using concurrent x-plane imaging
US10166079B2 (en) Depth-encoded fiducial marker for intraoperative surgical registration
US10575755B2 (en) Computer-implemented technique for calculating a position of a surgical device
US10390890B2 (en) Navigational feedback for intraoperative waypoint
US9436993B1 (en) System and method for fused image based navigation with late marker placement
US10238361B2 (en) Combination of ultrasound and x-ray systems
CN103619278B (en) The system guiding injection during endoscopic surgery
Zhang et al. Electromagnetic tracking for abdominal interventions in computer aided surgery
CN113614844A (en) Dynamic intervention three-dimensional model deformation
US20160163105A1 (en) Method of operating a surgical navigation system and a system using the same
US20080123910A1 (en) Method and system for providing accuracy evaluation of image guided surgery
Andrews et al. Registration techniques for clinical applications of three-dimensional augmented reality devices
CN105611877A (en) Method and system for guided ultrasound image acquisition
JP6078559B2 (en) Imaging device for imaging objects
Meng et al. A remote‐controlled vascular interventional robot: system structure and image guidance
EP3544538B1 (en) System for navigating interventional instrumentation
WO2016108110A1 (en) Relative position/orientation tracking and visualization between an interventional device and patient anatomical targets in image guidance systems and methods
Traub et al. Advanced display and visualization concepts for image guided surgery
US10898272B2 (en) Visualizing navigation of a medical device in a patient organ using a dummy device and a physical 3D model
US20200359994A1 (en) System and method for guiding ultrasound probe
US20220270247A1 (en) Apparatus for moving a medical object and method for providing a control instruction
Koo et al. Simulation Method for the Physical Deformation of a Three-Dimensional Soft Body in Augmented Reality-Based External Ventricular Drainage
WO2012001550A1 (en) Method and system for creating physician-centric coordinate system
Perrotta Development and validation of an in-vitro tracking system for estimating the 3D position of a steerable needle for neurosurgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15816888

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15816888

Country of ref document: EP

Kind code of ref document: A1