US20140316256A1 - Display Of An Acquired Cine Loop For Procedure Navigation - Google Patents

Display Of An Acquired Cine Loop For Procedure Navigation

Info

Publication number
US20140316256A1
US20140316256A1 (application US14/319,973)
Authority
US
United States
Prior art keywords
drf
location
patient
image
tracked
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/319,973
Inventor
Michael R. Neidert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Inc
Original Assignee
Medtronic Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medtronic Inc filed Critical Medtronic Inc
Priority to US14/319,973
Assigned to MEDTRONIC, INC. Assignment of assignors interest (see document for details). Assignors: NEIDERT, MICHAEL R.
Publication of US20140316256A1
Legal status (current): Abandoned

Classifications

    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 5/066: Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or x-ray imaging
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 5/742: Details of notification to user or communication with user or patient using visual displays
    • A61B 2017/00243: Type of minimally invasive operation: cardiac
    • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body; augmented reality, i.e. correlating a live optical image with another image
    • A61B 5/062: Determining position of a probe within the body employing means separate from the probe, using a magnetic field
    • A61B 6/4435: Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being coupled by a rigid structure

Definitions

  • the present disclosure relates to acquisition of image data of a subject, and particularly to acquisition and display of image data based upon a physical location of a tracked portion of the subject.
  • Image data can be collected of a subject, such as a patient, at a selected rate and over a period of time.
  • a “cine loop” image data can include a plurality of frames that are collected at specific time points of the subject.
  • the frames can be collected at a specific rate, such as about 10 to about 50 frames per second, to acquire a plurality of image data information of the subject.
  • the plurality of frames can be displayed at a selected rate to illustrate motion of the subject over time, similar to a cinematic movie displaying a plurality of frames to represent motion or change in position of a particular object from one frame to the next.
  • the display of the image data on a display device can represent the structure of the subject.
  • structures within the subject may move over time.
  • a heart in the patient may move over time, such that a single static representation or illustration of the heart may not represent the actual position of the heart at a certain moment. Determining what frame of a plurality of collected frames, such as from the cine loop, to display to illustrate the exact location or configuration of the heart in a specific time point, can be troublesome.
  • a structure of a subject such as a heart or heart wall of a patient, can be illustrated over time based upon a determination of a position of the heart wall.
  • a tracking device such as a reference tracking device
  • the tracked location of the reference tracking device can be related to a frame in a cine loop image data of the heart.
  • the cine loop image data can include a plurality of frames where each one relates to a tracked or determined location of the reference tracking device. Accordingly, subsequently tracking the reference tracking device can be used to determine which of the frames should be displayed on a display device to illustrate the current location or configuration of the heart.
  • the reference tracking device can be connected to any appropriate structure, including a heart wall, diaphragm, abdomen wall, or the like, and a correlation between the location of the reference tracking device and selected frames can be made.
  • the correlation can be used to display the appropriate frame on a display device that relates to a later determined and tracked position of the reference tracking device.
  • FIG. 1 is an environmental view of a subject with an imaging and navigation system
  • FIG. 2 is a flowchart of a method for collecting and displaying cine loop images
  • FIG. 3 is a diagram illustrating a correlation between a reference tracking device and cine loop images
  • FIG. 4 is a diagram illustrating a correlation between a tracked location of a reference tracking device, a physiological measurement, and a cine loop image frame
  • FIG. 5 is a diagram illustrating a predictive correlation between a determined location of a reference tracking device and a cine loop image frame
  • FIG. 6 is a diagram illustrating a sequential correlation between a determined location of a reference tracking device and a cine loop image frame and a further tracking and non-sequential selection of images from the cine loop image frames.
  • a cine loop can refer to a plurality of images of any portion acquired at a selected rate.
  • the plurality of images can then be viewed in sequence at a selected rate to indicate motion or movement of the portion.
  • the portion can be an anatomical portion, such as a heart, or a non-anatomical portion, such as a moving engine or other moving system.
  • FIG. 1 is a diagram illustrating an overview of a navigation system 10 that can be used for various procedures.
  • the navigation system 10 can be used to track the location of an item, such as an implant or an instrument (e.g. instrument 80 as discussed herein), relative to a subject, such as a patient 14 .
  • the navigation system 10 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, ablation instruments, stent placement, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc.
  • Non-human or non-surgical procedures may also use the instrument 80 and the navigation system 10.
  • the instruments may be used to navigate or map any region of the body.
  • the navigation system 10 and the various tracked items may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
  • the navigation system 10 can interface with or integrally include an imaging system 12 that is used to acquire pre-operative, intra-operative, or post-operative, or real-time image data of the patient 14 . It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject.
  • the navigation system 10 can be used to track various tracking devices, as discussed herein, to determine locations of the patient 14 . The tracked locations of the patient 14 can be used to determine or select images for display to be used with the navigation system 10 .
  • the initial discussion is directed to the navigation system 10 and the exemplary imaging system 12 .
  • the imaging system 12 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo., USA.
  • the imaging device 12 includes imaging portions such as a generally annular gantry housing 20 that encloses an image capturing portion 22 .
  • the image capturing portion 22 may include an x-ray source or emission portion 26 and an x-ray receiving or image receiving portion 28 .
  • the emission portion 26 and the image receiving portion 28 are generally spaced about 180 degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion 22 .
  • the image capturing portion 22 can be operable to rotate 360 degrees during image acquisition.
  • the image capturing portion 22 may rotate around a central point or axis, allowing image data of the patient 14 to be acquired from multiple directions or in multiple planes.
  • the imaging system 12 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference.
  • the imaging system 12 can also include or be associated with various image processing systems, as discussed herein.
  • Other possible imaging systems can include C-arm fluoroscopic imaging systems which can also be used to generate three-dimensional views of the patient 14 .
  • the patient 14 can be fixed onto an operating table 29 , but is not required to be fixed to the table 29 .
  • the table 29 can include a plurality of straps 29 s .
  • the straps 29 s can be secured around the patient 14 to fix the patient 14 relative to the table 29 .
  • Various apparatuses may be used to position the patient 14 in a static position on the operating table 29 . Examples of such patient positioning devices are set forth in commonly assigned U.S. patent application Ser. No. 10/405,068, published as U.S. Pat. App. Pub. No. 2004-0199072 on Oct. 7, 2004, entitled “An Integrated Electromagnetic Navigation And Patient Positioning Device”, filed Apr. 1, 2003 which is hereby incorporated by reference.
  • Other known apparatuses may include a Mayfield® clamp.
  • the navigation system 10 includes a tracking system 30 that can be used to track instruments relative to the patient 14 or within a navigation space.
  • the navigation system 10 can use image data from the imaging system 12 and information from the tracking system 30 to illustrate locations of the tracked instruments, as discussed herein.
  • the tracking system 30 can include a plurality of types of tracking systems including an optical tracking system that includes an optical localizer 40 and/or an electromagnetic (EM) tracking system that can include an EM localizer 42 that communicates with or through an EM controller 44 .
  • the optical tracking system 40 and the EM tracking system with the EM localizer 42 can be used together to track multiple instruments or used together to redundantly track the same instrument.
  • tracking devices can be tracked with the tracking system 30 and the information can be used by the navigation system 10 to allow for an output system to output, such as a display device to display, a position of an item.
  • tracking devices can include a patient or reference tracking device 48 (to track the patient 14), an imaging device tracking device 50 (to track the imaging device 12), and an instrument tracking device 52 (to track the instrument 80), to allow selected portions of the operating theater to be tracked relative to one another with the appropriate tracking system, including the optical localizer 40 and/or the EM localizer 42.
  • the reference tracking device 48 can be positioned on an instrument 82 (e.g. a catheter) to be positioned within the patient 14 , such as within a heart 15 of the patient 14 .
  • any of the tracking devices 48 - 52 can be optical or EM tracking devices, or both, depending upon the tracking localizer used to track the respective tracking devices. It will be further understood that any appropriate tracking system can be used with the navigation system 10 . Alternative tracking systems can include radar tracking systems, acoustic tracking systems, ultrasound tracking systems, and the like.
  • An exemplary EM tracking system can include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo.
  • Exemplary tracking systems are also disclosed in U.S. Pat. No. 7,751,865, issued Jul. 6, 2010 and entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION”; U.S. Pat. No. 5,913,820, titled “Position Location System,” issued Jun. 22, 1999 and U.S. Pat. No. 5,592,939, titled “Method and System for Navigating a Catheter Probe,” issued Jan. 14, 1997, all herein incorporated by reference.
  • shielding systems include those in U.S. Pat. No. 7,797,032, issued on Sep. 14, 2010 and U.S. Pat. No. 6,747,539, issued on Jun. 8, 2004; distortion compensation systems can include those disclosed in U.S. patent application Ser. No. 10/649,214, filed on Jan. 9, 2004, published as U.S. Pat. App. Pub. No. 2004/0116803, all of which are incorporated herein by reference.
  • the localizer 42 and the various tracking devices can communicate through the EM controller 44 .
  • the EM controller 44 can include various amplifiers, filters, electrical isolation, and other systems.
  • the EM controller 44 can also control the coils of the localizer 42 to either emit or receive an EM field for tracking.
  • a wireless communications channel such as that disclosed in U.S. Pat. No. 6,474,341, entitled “Surgical Communication Power System,” issued Nov. 5, 2002, herein incorporated by reference, can be used as opposed to being coupled directly to the EM controller 44 .
  • the tracking system may also be or include any appropriate tracking system, including a STEALTHSTATION® TRIA®, TREON®, and/or S7™ Navigation System having an optical localizer, similar to the optical localizer 40, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo.
  • Further alternative tracking systems are disclosed in U.S. Pat. No. 5,983,126, to Wittkampf et al. titled “Catheter Location System and Method,” issued Nov. 9, 1999, which is hereby incorporated by reference.
  • Other tracking systems include acoustic, radiation, radar, etc., tracking or navigation systems.
  • the imaging system 12 can further include a support housing or cart 56 that can house a separate image processing unit 58 .
  • the cart can be connected to the gantry 20 .
  • the navigation system 10 can include a navigation processing unit 60 that can communicate or include a navigation memory 62 .
  • the navigation processing unit 60 can include a processor (e.g. a computer processor) that executes instructions to determine locations of the tracking devices 48 - 52 based on signals from the tracking devices.
  • the navigation processing unit 60 can receive information, including image data, from the imaging system 12 and tracking information from the tracking systems 30 , including the respective tracking devices 48 - 52 and the localizers 40 - 42 .
  • Image data can be displayed as an image 64 on a display device 66 of a workstation or other computer system 68.
  • the workstation 68 can include appropriate input devices, such as a keyboard 70 . It will be understood that other appropriate input devices can be included, such as a mouse, a foot pedal or the like which can be used separately or in combination. Also, all of the disclosed processing units or systems can be a single processor (e.g. a single central processing chip) that can execute different instructions to perform different tasks.
  • the image processing unit 58 processes image data from the imaging system 12 and transmits it to the navigation processor 60 . It will be further understood, however, that the imaging system 12 need not perform any image processing and it can transmit the image data directly to the navigation processing unit 60 . Accordingly, the navigation system 10 may include or operate with a single or multiple processing centers or units that can access single or multiple memory systems based upon system design.
  • the imaging system 12 can generate image data that can be registered to the patient space or navigation space.
  • the position of the patient 14 relative to the imaging system 12 can be determined by the navigation system 10 with the patient tracking device 48 and the imaging system tracking device 50 to assist in registration. Accordingly, the position of the patient 14 relative to the imaging system 12 can be determined.
  • the imaging system 12 can know its position and be repositioned to the same position within about 10 microns. This allows for a substantially precise placement of the imaging system 12 and precise determination of the position of the imaging device 12 . Precise positioning of the imaging portion 22 is further described in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference.
  • Subject or patient space and image space can be registered by identifying matching points or fiducial points in the patient space and related or identical points in the image space.
  • Because the position of the imaging device 12 is known, either through tracking or its "known" position (e.g. the O-arm® imaging device sold by Medtronic, Inc.), or both, the image data is generated at a precise and known position. This can allow the image data to be automatically or "inherently registered" to the patient 14 upon acquisition of the image data.
  • the position of the patient 14 is known precisely relative to the imaging system 12 due to the accurate positioning of the imaging system 12 . This allows points in the image data to be known relative to points of the patient 14 because of the known precise location of the imaging system 12 .
  • registration can occur by matching fiducial points in image data with fiducial points on the patient 14 .
  • Registration of image space to patient space allows for the generation of a translation map between the patient space and the image space.
  • registration can occur by determining points that are substantially identical in the image space and the patient space.
  • the identical points can include anatomical fiducial points or implanted fiducial points. Exemplary registration techniques are disclosed in U.S. patent application Ser. No. 12/400,273, filed on Mar. 9, 2009, incorporated herein by reference.
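  • As an illustration of the fiducial-based registration described above, the sketch below computes a rigid translation map (rotation plus translation) between matched patient-space and image-space points using a standard least-squares (SVD/Kabsch) fit. The patent does not specify an algorithm; the function names and fiducial coordinates here are hypothetical.

```python
import numpy as np

def rigid_registration(patient_pts, image_pts):
    """Least-squares rigid fit (rotation R, translation t) so that
    R @ patient_point + t ~= image_point for the matched fiducials."""
    p_mean = patient_pts.mean(axis=0)
    i_mean = image_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (patient_pts - p_mean).T @ (image_pts - i_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = i_mean - R @ p_mean
    return R, t

def to_image_space(point_xyz, R, t):
    """Map a tracked patient-space location into image (cine frame) space."""
    return R @ np.asarray(point_xyz, dtype=float) + t

if __name__ == "__main__":
    # Hypothetical matched fiducial points (mm) in patient space and image space.
    patient_fiducials = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 0], [0, 0, 40]], float)
    true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
    image_fiducials = patient_fiducials @ true_R.T + np.array([10.0, -5.0, 2.5])
    R, t = rigid_registration(patient_fiducials, image_fiducials)
    print(to_image_space([25.0, 30.0, 10.0], R, t))   # tracked point in image space
```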
  • the navigation system 10 with or including the imaging system 12 , can be used to perform selected procedures. Selected procedures can use the image data generated or acquired with the imaging system 12 . Further, the imaging system 12 can be used to acquire image data at different times relative to a procedure. As discussed herein, image data can be acquired of the patient 14 prior to the procedure for collection of automatically registered image data or cine loop image data. Also, the imaging system 12 can be used to acquire images for confirmation of a portion of the procedure.
  • a system or flowchart 100 is illustrated to show a process or method of acquiring image data of a subject, such as the patient 14 and for subsequent illustration of the image data on the display 68 and navigation of a procedure with the instrument 80 .
  • the subject may also be any other imagable portion, such as a fuselage, non-human animal, etc.
  • a plurality of images can be acquired of the patient 14 , including the heart 15 of the patient 14 , which can be displayed on the display 66 to illustrate a current location of the instrument 80 that is positioned within the heart 15 of the patient 14 .
  • image data can be acquired of any portion of the patient 14 , particularly a portion of the patient that moves over time.
  • a portion of the patient 14 can include lungs, diaphragm, or other circulatory systems that may move over time.
  • the image data can be acquired of the patient 14 and can be registered to the patient to allow for a cinematic "cine loop" illustration of the patient or portion of the patient when the instrument 80 is positioned within the patient 14, such as within the heart 15.
  • the cine loop can include a plurality of images collected in succession and operable to be displayed in succession to illustrate a representation of motion and/or change in configuration of a structure, such as the heart 15 .
  • the method 100 can begin by collecting automatically registered images in block 102 .
  • the automatically registered images can be acquired with the imaging system 12 , illustrated in FIG. 1 .
  • the imaging system 12 can collect images of the patient 14 , including the heart 15 , that are registered to the position of the patient 14 .
  • Various methods and mechanisms, as discussed above, can be used to automatically register the images acquired with the imaging system 12 relative to the patient 14 .
  • the position of the patient 14 and the imaging device 12 can be known during image acquisition.
  • the images collected and automatically registered in block 102 allow intraoperative image data acquisition of the patient 14 for positioning various portions or items within the patient 14 .
  • a dynamic reference frame, such as the dynamic reference frame 48 discussed above, can be navigated into the subject in block 104.
  • the navigation of the internal dynamic reference frame 48 into the patient 14 can be done under fluoroscope guidance, such as with the imaging system 12 .
  • the internal dynamic reference frame 48 can be positioned in the patient 14 using any generally known technique, and can be tracked using the tracking system 30 .
  • the DRF 48 can be fixed or placed on a portion of the patient in block 106.
  • the internal DRF 48 can be fixed to the patient 14 in any appropriate location, such as affixing the internal DRF 48 to the right ventricle in the heart, such as at the right ventricle apex, or to the coronary sinus.
  • the internal DRF 48 can be positioned in any appropriate location within the heart 15, and the location of the internal DRF 48 can then be used to track and dynamically determine motion of the heart 15 within the patient 14.
  • the internal DRF 48 can be fixed to a structure or wall of the heart 15, such as with a helix screw.
  • the internal DRF 48 can be tracked with the tracking system 30, which may be the EM tracking system. Accordingly, a movement of the internal DRF 48 can be determined over time as the heart beats at its normal rhythm, or a rhythm imposed upon the heart 15 during the operative procedure. As will be discussed in greater detail herein, the internal DRF 48 can move to different three-dimensional positions within the heart 15 as the heart 15 moves over time. Accordingly, as the heart 15 beats over time, the internal DRF 48 can move to a plurality of three-dimensional locations. However, the beating of the heart 15 is generally cyclic and the internal DRF 48 will generally move within a cycle, such as within a limited range of motion within the heart 15.
  • the DRF 48 will pass through the same positions over time as the heart 15 beats. For example, when the heart 15 is contracted, the DRF 48 will generally be at the same location during each contraction. Nevertheless, the movement of the heart or a portion of the heart to which the internal DRF 48 is attached can be continuously tracked with the tracking system 30 using the internal DRF 48 that is fixed or placed in the heart 15.
  • a determination block can determine whether cine loop image data was acquired prior to placing the DRF in block 108 . Although two possibilities can occur from the decision block 108 , initially the following discussion will be directed to following a NO path 110 to a collection of cine loop image data while simultaneously tracking locations of the placed DRF 48 in block 112 .
  • Collecting the cine loop image data while simultaneously tracking the DRF 48 includes imaging the patient 14 over time and tracking the location of the DRF 48, placed in the patient 14 in block 106, over the same time period.
  • an imaging system such as the imaging system 12 that can include the O-arm® imaging system or other fluoroscopic imaging systems, can acquire image data of the patient 14 , including the heart 15 , over a specified time period.
  • a selected frame rate of images can be acquired of the patient 14 , for example, about 30 frames (e.g. image frames) per second can be captured. It will be understood, however, that any appropriate frame rate, such as 10 frames per second, 20 frames per second, or 60 frames per second, can be collected.
  • the frames can be displayed in sequence to illustrate a motion of the patient 14, such as the heart 15, over time, similar to a cinematic movie where a plurality of frames are shown in a selected sequence and at a selected speed to achieve an illusion of motion on a display. Accordingly, the plurality of frames collected can be displayed on a display device, such as the display device 66, to illustrate motion of the heart 15 over time in a "cine loop".
  • the time period can be any appropriate time period, and each frame can be collected at a time point tx.
  • a time period can be from time point t1 to tn, where each frame is collected at a time point tx between t1 and tn.
  • the DRF 48 positioned in the patient 14 can be tracked over the same time period. Additionally, the location of the DRF 48 can be tracked at a rate similar to, faster than, or slower than the rate of acquisition of the image data. Nevertheless, it can be selected to track or determine the location of the DRF 48 during or for each of the frames at time points t1 to tn. Accordingly, each of the frames collected in the image data can be correlated to a tracked location of the DRF 48.
  • the DRF 48 can be tracked in any appropriate degrees of freedom of movement, including an x, y, z three-dimensional position and, if selected, orientation coordinates.
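  • As a purely illustrative sketch of block 112, the snippet below records a timestamped DRF pose alongside each acquired cine frame. The grab_frame and read_drf_pose functions stand in for the imaging and tracking interfaces, which are not defined by the patent, and are stubbed with synthetic data here.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class DrfPose:
    t: float            # time of the tracking sample (s)
    xyz: np.ndarray     # x, y, z position (mm)
    orient: np.ndarray  # orientation coordinates, if tracked

@dataclass
class CineRecord:
    t: float            # frame time point t_x
    frame: np.ndarray   # pixel data of the image frame
    drf: DrfPose        # DRF pose tracked for this frame

def grab_frame(t):
    """Stub for the imaging system: returns a synthetic blank frame."""
    return np.zeros((128, 128), dtype=np.uint8)

def read_drf_pose(t):
    """Stub for the tracking system: synthetic cyclic (cardiac-like) motion."""
    phase = 2.0 * np.pi * 1.2 * t                 # roughly 72 beats per minute
    xyz = np.array([5.0 * np.sin(phase), 3.0 * np.cos(phase), 0.0])
    return DrfPose(t, xyz, np.zeros(2))

def acquire_cine_with_drf(duration_s=2.0, fps=30):
    """Block 112: collect cine frames while simultaneously tracking the DRF."""
    records = []
    for k in range(int(duration_s * fps)):
        t = k / fps                               # time point t_x
        records.append(CineRecord(t, grab_frame(t), read_drf_pose(t)))
    return records

if __name__ == "__main__":
    loop = acquire_cine_with_drf()
    print(len(loop), "frames; first tracked DRF position:", loop[0].drf.xyz)
```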
  • a correlation can be made between the multiple frames collected of the image data and the tracked locations of the DRF 48 in block 114 . As discussed above, it can be selected to correlate each of the frames of the image data acquired to a tracked location of the DRF 48 .
  • the following discussion relates to correlation of images to tracked locations of the DRF 48 . It will be understood, that alternatively, a procedure can be navigated with images that are correlated using various techniques including gating frame selection techniques as disclosed in U.S. patent application Ser. No. 12/183,688, filed on Jul. 31, 2008, and published as U.S. Pat. App. Pub. No. 2010/0030061, published on Feb. 4, 2010, incorporated herein by reference.
  • the location of the DRF 48 can be tracked for the period and at each of the time points t1 through tn.
  • Each of these DRF 48 locations can be correlated to a cine loop image or frame, as discussed above.
  • the plurality of images can be placed in a cine loop to illustrate motion of the heart 15 .
  • each of the tracked locations at each of the time points relates to an image or frame in the cine loop at each of the time points t1 through tn, as illustrated by the arrows 3a.
  • the correlation can be done during the tracking of the DRF 48 and the acquisition of the images or after the acquisition of the images and the tracked location of the DRF 48 .
  • the location of the DRF 48 can be tracked or determined and correlated substantially simultaneously.
  • the DRF 48 can be tracked by the tracking system 30 and the cine loop images can be acquired by the imaging system 12.
  • the specific times for each of the images at time points t1 through tn can be saved with each of the images, and the specific times of the tracked location of the DRF 48 at time points t1 through tn can be saved by the tracking system 30.
  • An appropriate system, such as the navigation system 10 including the processor system 68, can then correlate the two specific times for a tracked location of the DRF 48 and an image in the cine loop acquired by the imaging system 12.
  • the system, such as the processor system 68, can execute instructions to perform the correlation based on the time points of the acquisition of the cine loop image data (e.g. frames) and the determined locations of the DRF 48 at the same time points.
  • each frame will have at least one DRF 48 location that is correlated with that frame.
  • the location of the DRF 48 can be used to select or determine which correlated frame illustrates the configuration and/or location of the heart 15 .
  • when the DRF 48 is determined to be at a location, that location can be used to select which of the frames, based on the correlated DRF 48 location, should be illustrated.
  • the acquisition of the image data by the imaging device 12 and the tracked location of the DRF 48 can be coordinated and correlated with the navigation system 10 .
  • the beginning of the tracking and the beginning of the image acquisition may happen at a time 0 and each time a frame of the cine loop is collected by the imaging system 12 , a tracked location of the DRF 48 can be saved.
  • the imaging and tracked location determination can happen simultaneously and be coordinated as such. They can then be correlated within the navigation system 10 .
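  • One minimal way to realize the correlation of block 114, assuming simple nearest-in-time matching of the frame times saved by the imaging system against the pose times saved by the tracking system (the patent does not mandate a particular matching rule), is sketched below.

```python
import numpy as np

def correlate_frames_to_drf(frame_times, drf_times, drf_poses):
    """Block 114: for each cine frame time t_x, pick the tracked DRF sample
    whose timestamp is closest, giving one DRF location per frame.

    frame_times : (F,) acquisition times saved by the imaging system
    drf_times   : (S,) timestamps saved by the tracking system (sorted)
    drf_poses   : (S, D) pose per sample, e.g. x, y, z (plus orientation)
    Returns an (F, D) array: the DRF pose correlated with each frame.
    """
    frame_times = np.asarray(frame_times)
    drf_times = np.asarray(drf_times)
    idx = np.searchsorted(drf_times, frame_times)
    idx = np.clip(idx, 1, len(drf_times) - 1)
    # Step back one sample wherever the earlier neighbour is closer in time.
    left_closer = (frame_times - drf_times[idx - 1]) < (drf_times[idx] - frame_times)
    idx = idx - left_closer
    return np.asarray(drf_poses)[idx]

if __name__ == "__main__":
    # Hypothetical: frames at 30 fps, DRF sampled at 100 Hz, both from time 0.
    frame_times = np.arange(0, 1, 1 / 30)
    drf_times = np.arange(0, 1, 1 / 100)
    drf_poses = np.column_stack([np.sin(2 * np.pi * drf_times),
                                 np.cos(2 * np.pi * drf_times),
                                 np.zeros_like(drf_times)])
    per_frame_pose = correlate_frames_to_drf(frame_times, drf_times, drf_poses)
    print(per_frame_pose.shape)   # one (x, y, z) DRF location per frame
```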
  • the correlated cine loop images and DRF tracked locations can be stored in block 116 .
  • the stored correlated images and tracked locations can be stored within the memory system of the navigation memory 62 or as a part of the image processing unit 58 . Nevertheless, the images can be stored in a selected memory, such as a substantially permanent memory or flash memory, for display and/or recall at a selected time.
  • the stored images can be correlated to the tracked locations of the DRF 48 such as with a look-up table or index.
  • various frames of the cine loop can be displayed based upon a tracked location of the DRF 48 .
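  • A simple way to realize the storage and look-up table of block 116 is to persist the frames, the per-frame DRF poses, and the frame indices together, as sketched below; the file name and data layout are illustrative only, not the patent's specified format.

```python
import numpy as np

def store_correlation(path, frames, drf_poses):
    """Block 116: persist the cine loop together with its correlated DRF locations.
    frames    : (F, H, W) stacked image frames
    drf_poses : (F, D) DRF pose correlated with each frame
    """
    frame_ids = np.arange(len(frames))
    np.savez_compressed(path, frames=frames, drf_poses=drf_poses, frame_ids=frame_ids)

def load_correlation(path):
    """Recall the stored cine loop and its look-up table for later navigation."""
    data = np.load(path)
    return data["frames"], data["drf_poses"], data["frame_ids"]

if __name__ == "__main__":
    frames = np.zeros((60, 128, 128), dtype=np.uint8)        # placeholder cine loop
    poses = np.random.default_rng(0).normal(size=(60, 5))    # x, y, z + two orientation values
    store_correlation("cine_correlation.npz", frames, poses)
    f, p, ids = load_correlation("cine_correlation.npz")
    print(f.shape, p.shape, ids[:5])
```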
  • the decision block 108 on whether the cine loop image data was acquired prior to placing the DRF 48 can follow a YES path 120. Accordingly, prior to continuing the discussion of performing a procedure with image data correlated with the DRF 48 location, acquiring image data prior to placing the DRF 48 in block 106 will be discussed, by following the YES path 120 from the decision block 108 on whether the cine loop image data was acquired prior to placing the DRF 48.
  • a simultaneous measurement of selected physiological subject data and a tracked location of the DRF can occur in block 122 .
  • a path can be followed to recall or collect cine loop images simultaneously while measuring the selected physiological data of the subject in block 124 .
  • tracked and determined locations of the DRF 48 that is placed in block 106 can be correlated to a physiological measurement of the subject and/or can be correlated to image data that is acquired of the subject. Accordingly, the processes in blocks 122 or 124 may happen in any selected order and one need not occur before the other.
  • simultaneously measuring the subject data and tracking the location of the DRF 48 in block 122 will be selected to occur first.
  • the DRF 48 is placed in the subject, such as the patient 14, in block 106.
  • a selected physiological measurement of the patient 14 can then be made after the DRF 48 is positioned in the patient 14 .
  • the physiological measurement of the patient can be any appropriate measurement, such as a respiration or heart rhythm measurement, various polarizations or depolarizations within the heart, or any selected physiological measurements.
  • a physiological measurement can include measurements of the patient 14 that relate to the physiology of the patient 14 based on the patient's 14 anatomy.
  • an electrocardiogram (EKG) can be made of the patient 14 that relates to a rhythm of the heart 15 of the patient 14.
  • the measurement of the heart 15 with an EKG can be made while tracking the location of the DRF 48 .
  • a correlation between the tracked location of the DRF 48 and various peaks (e.g. amplitudes) and valleys (e.g. amplitudes) in the EKG can then be made or saved. Accordingly, the location of the DRF 48 as it relates to any particular portion of the measured EKG can be determined.
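  • A small illustrative sketch of block 122: record the EKG amplitude and the tracked DRF pose on a common time base, so that each pose can later be related to a point (e.g. a peak) in the cardiac cycle. The waveform, motion model, and sampling rate are invented for the example; only the idea of pairing the two streams comes from the text above.

```python
import numpy as np

def simulate_ekg(t):
    """Toy EKG-like waveform; stands in for the physiological monitor."""
    return np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t)

def simulate_drf_pose(t):
    """Toy cyclic DRF motion; stands in for the tracking system output."""
    phase = 2 * np.pi * 1.2 * t
    return np.stack([5 * np.sin(phase), 3 * np.cos(phase), np.zeros_like(t)], axis=-1)

def measure_ekg_and_track_drf(duration_s=5.0, rate_hz=100):
    """Block 122: simultaneously measure the EKG and track the DRF location,
    keeping both streams on the same time base so they can be correlated."""
    t = np.arange(0, duration_s, 1 / rate_hz)
    return t, simulate_ekg(t), simulate_drf_pose(t)

if __name__ == "__main__":
    t, ekg, poses = measure_ekg_and_track_drf()
    # Tag the DRF pose at each EKG peak (simple local-maximum detection).
    peaks = np.flatnonzero((ekg[1:-1] > ekg[:-2]) & (ekg[1:-1] > ekg[2:])) + 1
    for i in peaks[:3]:
        print(f"EKG peak at t={t[i]:.2f} s, DRF position {poses[i]}")
```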
  • Cine loop image frames can be acquired of the patient 14 using any appropriate imaging system, such as magnetic resonance imaging (MRI) or computed tomography (CT) imaging devices.
  • the DRF can be placed and an imaging system, such as the O-arm® imaging system, can image the subject with the DRF in place, but the DRF is not required to be in place while imaging occurs, according to various embodiments.
  • the DRF 48 cannot be positioned in the patient 14 during the acquisition of the cine loop frames (e.g. due to magnetic coils in a magnet of an MRI), or can be selected to not be positioned in the patient 14 during the acquisition.
  • the cine loop data can include the acquisition of a plurality of image frames of the patient 14 over a selected period of time, such as between time points t1 through tn.
  • physiological measurement of the patient 14 can also be collected.
  • the cine loop can be acquired while measuring an EKG of the patient 14 .
  • the physiological measurement, or a selected portion thereof, can be correlated to each frame or image in the cine loop that is collected in block 124.
  • a physiological measurement, such as an EKG measurement of the patient 14, can be made at each of the time points t1 through tn, as illustrated in column 4m.
  • An image in the cine loop can also be collected at each of the time points t1 through tn, as illustrated in the right column 4r.
  • a correlation between the amplitude in column 4m at time point t1 and the image at time point t1 can be made while acquiring images, or at any selected point thereafter, similar to that discussed above.
  • the physiology of the patient 14 can be measured beginning at time 0, and the image data can be acquired beginning at time 0; moving through the time period, the physiological measurements can continue, as can the acquisition of image frames.
  • the correlation of the image frames to the physiology measurements can be made at any selected time either during or after the acquisition of both the image data and the physiological measurements.
  • the acquisition of the image data and the simultaneous measurement of the physiological data of the patient in block 124 can occur before or after a simultaneous measurement of the same physiological data of the patient 14 and the tracking of a location of the DRF 48 in block 122. Nevertheless, after both the image data is collected along with a selected physiological measurement and the tracked measurement of the DRF 48 is made along with the collection of the same physiologic data, a correlation of the cine loop frames and the location of the DRF 48 can be made in block 126. As illustrated in FIG. 4, in the left hand column 4l, the tracked location of the DRF 48 for each of the time points t1 through tn is illustrated.
  • arrows 4a1 show the correlation of the physiological measurement and the frames of the cine loop at the various time points. Due to the correlation of the tracked location of the DRF 48 to the physiological data and the physiological data to the cine loop image frames, the tracked location of the DRF 48 can be correlated to each of the frames of the cine loop image data, as illustrated by the connecting arrow 130 in FIG. 4.
  • individual tracked locations of the DRF 48 can be directly correlated to individual images in the acquired cine loop image data as illustrated in FIG. 4 .
  • the correlated cine loop images in block 116 can include correlated images with a tracked location of the DRF 48 that were acquired prior to positioning the DRF 48 within the patient 14 .
  • although the DRF 48 can be positioned within the patient 14 subsequent to acquisition of the cine loop image data, the position of the DRF 48 will be correlated to the various frames in the cine loop image data based upon a correlating data set, such as the measured physiological data in column 4m of FIG. 4.
  • each frame or a selected plurality of the image frames from the acquired cine loop can be correlated to a selected or tracked location of the DRF 48 .
  • the correlation can be direct (e.g. based on tracking the DRF 48 while acquiring the image data) or indirect (e.g. based on using an intermediate measurement, such as a physiological measurement).
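  • The indirect correlation of block 126 can be sketched as a join through the shared physiological signal: each tracked DRF sample is matched to the cine frame whose EKG state at acquisition is closest to the EKG state recorded with that sample. Matching on (amplitude, slope) is one possible rule chosen here for illustration, not something the patent prescribes, and all names are hypothetical.

```python
import numpy as np

def phase_features(ekg, t):
    """Describe each EKG sample by (value, time-derivative) so the rising and
    falling parts of the cardiac cycle can be told apart."""
    return np.column_stack([ekg, np.gradient(ekg, t)])

def correlate_drf_to_frames_via_ekg(drf_t, drf_ekg, drf_poses, frame_t, frame_ekg):
    """Block 126 (indirect correlation): for every tracked DRF sample, find the
    cine frame whose EKG state at acquisition best matches the EKG state that
    was recorded with that DRF sample, yielding a DRF-location -> frame table."""
    d_feat = phase_features(drf_ekg, drf_t)        # from block 122
    f_feat = phase_features(frame_ekg, frame_t)    # from block 124
    # Nearest-neighbour match in (value, slope) space; one frame per DRF sample.
    dists = np.linalg.norm(d_feat[:, None, :] - f_feat[None, :, :], axis=-1)
    return drf_poses, dists.argmin(axis=1)

if __name__ == "__main__":
    frame_t = np.arange(0, 1, 1 / 30)              # one cardiac cycle of frames
    drf_t = np.arange(0, 1, 1 / 60)                # DRF tracked at a different rate
    ekg = lambda t: np.sin(2 * np.pi * t)          # toy shared physiological signal
    poses = np.column_stack([np.cos(2 * np.pi * drf_t), np.sin(2 * np.pi * drf_t)])
    _, frame_idx = correlate_drf_to_frames_via_ekg(drf_t, ekg(drf_t), poses,
                                                   frame_t, ekg(frame_t))
    print(frame_idx[:10])                          # frame correlated with each DRF sample
```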
  • the location of the DRF 48 at time t1 is correlated to an image that relates to the same time t1.
  • a later tracked location of the DRF 48 in block 140 can be used to select which of the images to display or analyze.
  • the DRF 48 can be further tracked in block 140 .
  • the tracking system 30 can track the location of the DRF 48 within the patient 14 , such as within the heart 15 .
  • the location of the DRF 48 within the heart 15 can be determined to track the motion or position, within the heart 15, of the related structure to which the DRF 48 is attached.
  • the DRF 48 can be positioned at the right ventricle apex, and the DRF 48 can therefore be used to determine the position of the right ventricle apex over time as the location of the DRF 48 is tracked.
  • a recall and/or display of a selected frame from the cine loop image data can be made based on the tracked location of the DRF 48 in block 142.
  • a tracked location of the DRF 48 has been correlated to each image in the cine loop image data. Accordingly, if the location of the DRF 48, including x, y, z coordinates and any orientation coordinates relating to, for example, time point t3, is tracked with the tracking system 30, then the image that relates to time point t3 can be displayed on the display device 66. Similarly, as illustrated in FIG. 4, a measured physiology of the patient can be used to select, from the cine loop, the image that relates to the same time period, such as the image at time point t3.
  • Accordingly, as illustrated in FIGS. 3 and 4, a tracked location of the DRF 48 can be used, in block 142, to select an image that correlates to the tracked location of the DRF 48 from the data stored in block 116.
  • the correlated image can then be displayed on the display 66 based upon the tracked location of the DRF 48 .
  • the sampling rate of the location of the DRF 48 can be any appropriate sampling rate, such as one that correlates to the speed at which the frames in the cine loop were collected. Accordingly, if the images in the cine loop were collected at 30 frames per second, then the position of the DRF 48 can be sampled at about 30 times per second. Each time the position of the DRF 48 is sampled, a determination of the position of the DRF 48 can be made, and a determination of which of the time points t1-tn the tracked location from block 140 relates to can be made.
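  • At run time, blocks 140 and 142 reduce to a nearest-location query against the stored correlation each time the DRF 48 is sampled. A minimal sketch, with a Euclidean distance metric and a synthetic stored correlation chosen purely for illustration:

```python
import numpy as np

def select_frame_for_pose(tracked_pose, stored_poses):
    """Block 142: given the currently tracked DRF pose, return the index of the
    cine frame whose correlated DRF location is closest (Euclidean distance)."""
    d = np.linalg.norm(np.asarray(stored_poses) - np.asarray(tracked_pose), axis=1)
    return int(d.argmin())

if __name__ == "__main__":
    # Stored correlation: one DRF location per frame (points on a synthetic cycle).
    t = np.arange(0, 1, 1 / 30)
    stored_poses = np.column_stack([5 * np.sin(2 * np.pi * t),
                                    3 * np.cos(2 * np.pi * t),
                                    np.zeros_like(t)])
    # Sample the DRF at roughly the cine frame rate and pick the frame to display.
    for k in range(5):
        current = stored_poses[(7 * k) % len(t)] + 0.05   # newly tracked location
        print(f"tracked sample {k}: display frame {select_frame_for_pose(current, stored_poses)}")
```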
  • a predictive image can be selected. For example, if the navigation system 10 determines that the DRF 48 is tracked in block 140 at the location of time point t1, then the image at time point t3 can be selected for display in block 142.
  • the predictive time period can be any appropriate time period, such as one to five or one to ten time points (i.e. subsequent images) later relative to the determined location of the DRF 48.
  • the time delay can be any appropriate delay, such as about 1 to 10 seconds, including about 1-5 seconds, and for example about 3 seconds.
  • the tracked location of the DRF in block 140 that relates to time point t1 can be used to predict that the image to be displayed in block 142 should be one that is about 3 seconds later, rather than the image that directly correlates to the determined actual tracked location of the DRF 48.
  • a one-to-one correlation (that being, the tracked location of the DRF at time point t1 and a display of the image for time point t1) is not required for display on the display 66.
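  • The predictive selection described above amounts to displaying a frame a fixed number of time points (or an equivalent time delay) ahead of the frame that matches the current tracked location. A sketch with the offset expressed in frames; the specific values follow the examples above but are not requirements:

```python
def select_predictive_frame(matched_frame, n_frames, offset_frames=2):
    """Display a frame 'offset_frames' time points ahead of the matched frame
    (e.g. show the image for t3 when the tracked location matches t1).
    The cine loop is cyclic, so the offset wraps around the loop."""
    return (matched_frame + offset_frames) % n_frames

def offset_from_delay(delay_s, fps):
    """Convert a time delay (e.g. the roughly 1-10 s range above) into frames."""
    return int(round(delay_s * fps))

if __name__ == "__main__":
    n = 30                         # frames in one acquired cine loop
    matched = 0                    # frame matching the current tracked DRF location (t1)
    print(select_predictive_frame(matched, n))                            # two frames ahead (t3)
    print(select_predictive_frame(matched, n, offset_from_delay(0.1, fps=30)))
```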
  • images may also be selected or recalled, in block 142 , out of sequence from the acquisition sequence.
  • a cine loop or plurality of images can be acquired in a selected order: image t1, image t2, to image tn.
  • Each of the images, image t1 to image tn, can be correlated to a specific tracked location (x, y, z, θ, φ)t1 to (x, y, z, θ, φ)tn of the DRF 48. While tracking the DRF 48 in block 140, the location of the DRF 48 may not move in the same sequence as when the cine loop was acquired.
  • the heart 15 of the patient 14 may not beat in a normal or average sinus rhythm due to the procedure or other trauma to the patient 14.
  • the locations (x, y, z, θ, φ) of the DRF 48 may be out of order relative to the acquired cine loop of images. Nevertheless, because each image t1 to image tn is correlated to a specific DRF location (e.g. by appropriate methods as discussed above), when one DRF location (x, y, z, θ, φ) is determined, the image to which it relates can be selected for display or analysis.
  • the tracked locations of the DRF can include, for example, (x, y, z, θ, φ)t5, (x, y, z, θ, φ)t7, (x, y, z, θ, φ)t1, and (x, y, z, θ, φ)t22, in that order.
  • the image that relates to each of these specific tracked locations can be displayed in the order as tracked, or as selected, including image t5, image t7, image t1, and image t22.
  • tracking the DRF 48 in block 140 does not simply synchronize the display of the image data, but actually allows for a specific determination and selection of an image for display. It is further understood that interpolation and "guessing" or averaging algorithms can be used even if the DRF 48 is not tracked at a location identical to a location determined for the correlation.
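  • Because selection is driven purely by the tracked location, the displayed frames need not follow the acquisition order, and a nearest or weighted-average rule can stand in for the interpolation or "guessing" mentioned above when no stored location matches exactly. A sketch of both ideas; the inverse-distance weighting is one possible averaging choice, not the patent's:

```python
import numpy as np

def frames_for_tracked_sequence(tracked_poses, stored_poses):
    """Select a frame for each newly tracked DRF pose, in the order tracked;
    the result may be out of the acquisition order (e.g. t5, t7, t1, t22)."""
    stored = np.asarray(stored_poses)
    return [int(np.linalg.norm(stored - p, axis=1).argmin())
            for p in np.asarray(tracked_poses)]

def soft_frame_estimate(tracked_pose, stored_poses, k=2):
    """When no stored location matches exactly, average the k nearest correlated
    frame indices, weighted by inverse distance (a simple interpolation rule)."""
    stored = np.asarray(stored_poses)
    d = np.linalg.norm(stored - np.asarray(tracked_pose), axis=1)
    nearest = np.argsort(d)[:k]
    weights = 1.0 / (d[nearest] + 1e-9)
    return float(np.average(nearest, weights=weights))   # fractional frame index

if __name__ == "__main__":
    t = np.arange(0, 1, 1 / 30)
    stored = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    tracked = stored[[5, 7, 1, 22]] + 0.01                # arrives out of acquisition order
    print(frames_for_tracked_sequence(tracked, stored))   # -> [5, 7, 1, 22]
    print(soft_frame_estimate(stored[3] * 1.02, stored))  # lands near frame 3, fractionally
```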
  • a decision of whether a procedure is to be performed may be made in decision block 150 . If no procedure is to be performed then a NO path 148 can be followed to end the method in block 154 . However, if it is determined to perform a procedure then a YES path 158 can be followed to track a location of the instrument 80 in block 160 .
  • the location of the instrument 80 can be based upon a tracked location of the tracking device 52 that is connected with the instrument 80 .
  • the tracked location of the instrument 80 with the tracking device 52 can be used to determine the location of the instrument 80 within the heart 15 .
  • the location of the instrument 80 can then be displayed as an icon 166 representing the location of the tracked instrument 80 in block 162 . As illustrated in FIG.
  • the icon 166 can be displayed on the display device 68 relative to the image 64 of the heart 15 .
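  • Displaying the icon 166 in blocks 160 and 162 amounts to mapping the tracked instrument location into the coordinates of the currently selected frame and drawing a marker there. The sketch below assumes a rigid registration (R, t) such as the one computed in the earlier registration sketch and a simple in-plane pixel mapping; this plumbing is illustrative, not the patent's specified implementation.

```python
import numpy as np

def to_pixel(image_xyz, mm_per_pixel=0.5, origin_px=(64, 64)):
    """Project a registered image-space point (mm) onto frame pixel coordinates,
    assuming the frame lies in the x-y plane of image space (illustrative only)."""
    col = int(round(origin_px[1] + image_xyz[0] / mm_per_pixel))
    row = int(round(origin_px[0] - image_xyz[1] / mm_per_pixel))
    return row, col

def draw_instrument_icon(frame, instrument_xyz_patient, R, t):
    """Blocks 160/162: map the tracked instrument location into image space via
    the registration (R, t) and burn a small cross icon into the selected frame."""
    img_xyz = R @ np.asarray(instrument_xyz_patient, dtype=float) + t
    row, col = to_pixel(img_xyz)
    out = frame.copy()
    r0, r1 = max(row - 3, 0), min(row + 4, out.shape[0])
    c0, c1 = max(col - 3, 0), min(col + 4, out.shape[1])
    out[r0:r1, col] = 255      # vertical bar of the cross icon
    out[row, c0:c1] = 255      # horizontal bar of the cross icon
    return out

if __name__ == "__main__":
    frame = np.zeros((128, 128), dtype=np.uint8)   # the frame selected in block 142
    R, t = np.eye(3), np.zeros(3)                  # identity registration for the demo
    shown = draw_instrument_icon(frame, [10.0, -5.0, 0.0], R, t)
    print("icon drawn, non-zero pixels:", int((shown > 0).sum()))
```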
  • the image of the heart 64 can be the selected frame from the cine loop in block 142 based upon the tracked location of the DRF 48 from block 140 .
  • the display of the icon 166 representing the tracked location of the instrument in block 162 can relate to a substantially instantaneous, natural, and accurate location of the instrument 80 within the heart 15.
  • This is an alternative to illustrating a single image of the heart 15 as the image 64, which makes it appear that the heart 15 is static and motionless, and where the instrument 80 may not be in the exact illustrated location due to the movement of the heart 15.
  • the imprecise location illustration of the instrument 80 may be due to the fact that the heart 15 may differ in location from a static image of the heart that is displayed on the display device 66 due to motion of the heart 15 .
  • the image on the display device 66 can change over time and be based upon a substantially mechanically tracked location of the heart 15 .
  • because the DRF 48 can be connected directly to, or substantially near, a selected portion of the heart 15, a physical location of the heart 15 is tracked with the DRF 48. This can allow for a substantially precise and appropriate selection of the image in block 142 for display on the display device 66.
  • the heart 15 may undergo changes due to the procedure that is occurring, but the physical location of the heart 15 can be tracked with a tracking device 48 connected to the heart 15 . Accordingly, even assuming that the heart 15 of the patient 14 may have a rhythm that is interrupted due to the procedure that is occurring on the patient 14 , such as due to the positioning of the instrument 80 within the patient, the physical location of the heart 15 that is tracked with the DRF 48 can be used to select an image of the cine loop to be displayed on the display device 66 .
  • an appropriate image can be selected for display on the display device 66 that is unaffected by change in the physiology of the patient 14 due to the procedure and is based substantially or mostly on a physical location of the portion of the patient, such as the heart 15 , on which the procedure is occurring.
  • acquisition of the cine loop image frames and/or the determination of the location of the DRF need not be gated to the patient 14 .
  • the determination of the location of the instrument 80 in the patient 14 can be tracked over time and illustrated relative to an image frame of the cine loop that can change over time due to the selection based on the tracked location of the DRF 48 .
  • a static image is not necessary, nor is location determination limited to only a specific time, as the location of the DRF 48 can be determined at the same time as the location of the instrument 80.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to acquiring image data of a subject and selecting image data to be displayed. The image data can include a plurality of frames that relate to a specific location of a tracking device positioned within the subject. The determined location of the tracking device can be used to determine which frame of the image data to display at a selected time.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation application of U.S. patent application Ser. No. 13/020,543 filed on Feb. 3, 2011, the entire disclosure of which is incorporated by reference herein.
  • FIELD
  • The present disclosure relates to acquisition of image data of a subject, and particularly to acquisition and display of image data based upon a physical location of a tracked portion of the subject.
  • BACKGROUND
  • This section provides background information related to the present disclosure which is not necessarily prior art.
  • Image data can be collected of a subject, such as a patient, at a selected rate and over a period of time. For example, a “cine loop” image data can include a plurality of frames that are collected at specific time points of the subject. Generally, the frames can be collected at a specific rate, such as about 10 to about 50 frames per second, to acquire a plurality of image data information of the subject. The plurality of frames can be displayed at a selected rate to illustrate motion of the subject over time, similar to a cinematic movie displaying a plurality of frames to represent motion or change in position of a particular object from one frame to the next.
  • During the procedure, such as a surgical procedure on a patient, the display of the image data on a display device can represent the structure of the subject. However, in various instances, structures within the subject may move over time. For example, a heart in the patient may move over time, such that a single static representation or illustration of the heart may not represent the actual position of the heart at a certain moment. Determining what frame of a plurality of collected frames, such as from the cine loop, to display to illustrate the exact location or configuration of the heart in a specific time point, can be troublesome.
  • SUMMARY
  • This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
  • A structure of a subject, such as a heart or heart wall of a patient, can be illustrated over time based upon a determination of a position of the heart wall. For example, a tracking device, such as a reference tracking device, can be connected to a wall or a portion of the heart of the patient. The tracked location of the reference tracking device can be related to a frame in a cine loop image data of the heart. The cine loop image data can include a plurality of frames where each one relates to a tracked or determined location of the reference tracking device. Accordingly, subsequently tracking the reference tracking device can be used to determine which of the frames should be displayed on a display device to illustrate the current location or configuration of the heart. It will be understood that the reference tracking device can be connected to any appropriate structure, including a heart wall, diaphragm, abdomen wall, or the like, and a correlation between the location of the reference tracking device and selected frames can be made. The correlation can be used to display the appropriate frame on a display device that relates to a later determined and tracked position of the reference tracking device.
  • Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
  • DRAWINGS
  • The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
  • FIG. 1 is an environmental view of a subject with an imaging and navigation system;
  • FIG. 2 is a flowchart of a method for collecting and displaying cine loop images;
  • FIG. 3 is a diagram illustrating a correlation between a reference tracking device and cine loop images;
  • FIG. 4 is a diagram illustrating a correlation between a tracked location of a reference tracking device, a physiological measurement, and a cine loop image frame;
  • FIG. 5 is a diagram illustrating a predictive correlation between a determined location of a reference tracking device and a cine loop image frame; and
  • FIG. 6 is a diagram illustrating a sequential correlation between a determined location of a reference tracking device and a cine loop image frame and a further tracking and non-sequential selection of images from the cine loop image frames.
  • Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
  • DETAILED DESCRIPTION
  • Example embodiments will now be described more fully with reference to the accompanying drawings. As discussed herein, a cine loop can refer to a plurality of images of any portion acquired at a selected rate. The plurality of images can then be viewed in sequence at a selected rate to indicate motion or movement of the portion. The portion can be an anatomical portion, such as a heart, or a non-anatomical portion, such as a moving engine or other moving system.
  • FIG. 1 is a diagram illustrating an overview of a navigation system 10 that can be used for various procedures. The navigation system 10 can be used to track the location of an item, such as an implant or an instrument (e.g. instrument 80 as discussed herein), relative to a subject, such as a patient 14. It should further be noted that the navigation system 10 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, ablation instruments, stent placement, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc. Non-human or non-surgical procedures may also use the instrument 80 and the navigation system 10. Moreover, the instruments may be used to navigate or map any region of the body. The navigation system 10 and the various tracked items may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
  • The navigation system 10 can interface with or integrally include an imaging system 12 that is used to acquire pre-operative, intra-operative, post-operative, or real-time image data of the patient 14. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. The navigation system 10 can be used to track various tracking devices, as discussed herein, to determine locations of the patient 14. The tracked locations of the patient 14 can be used to determine or select images for display with the navigation system 10. The initial discussion, however, is directed to the navigation system 10 and the exemplary imaging system 12.
  • In the example shown, the imaging system 12 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo., USA. The imaging device 12 includes imaging portions such as a generally annular gantry housing 20 that encloses an image capturing portion 22. The image capturing portion 22 may include an x-ray source or emission portion 26 and an x-ray receiving or image receiving portion 28. The emission portion 26 and the image receiving portion 28 are generally spaced about 180 degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion 22. The image capturing portion 22 can be operable to rotate 360 degrees during image acquisition. The image capturing portion 22 may rotate around a central point or axis, allowing image data of the patient 14 to be acquired from multiple directions or in multiple planes.
  • The imaging system 12 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference. The imaging system 12 can also include or be associated with various image processing systems, as discussed herein. Other possible imaging systems can include C-arm fluoroscopic imaging systems which can also be used to generate three-dimensional views of the patient 14.
  • The patient 14 can be fixed onto an operating table 29, but is not required to be fixed to the table 29. The table 29 can include a plurality of straps 29s. The straps 29s can be secured around the patient 14 to fix the patient 14 relative to the table 29. Various apparatuses may be used to position the patient 14 in a static position on the operating table 29. Examples of such patient positioning devices are set forth in commonly assigned U.S. patent application Ser. No. 10/405,068, published as U.S. Pat. App. Pub. No. 2004-0199072 on Oct. 7, 2004, entitled “An Integrated Electromagnetic Navigation And Patient Positioning Device”, filed Apr. 1, 2003 which is hereby incorporated by reference. Other known apparatuses may include a Mayfield® clamp.
  • The navigation system 10 includes a tracking system 30 that can be used to track instruments relative to the patient 14 or within a navigation space. The navigation system 10 can use image data from the imaging system 12 and information from the tracking system 30 to illustrate locations of the tracked instruments, as discussed herein. The tracking system 30 can include a plurality of types of tracking systems, including an optical tracking system that includes an optical localizer 40 and/or an electromagnetic (EM) tracking system that can include an EM localizer 42 that communicates with or through an EM controller 44. The optical tracking system 40 and the EM tracking system with the EM localizer 42 can be used together to track multiple instruments or to redundantly track the same instrument. Various tracking devices, including those discussed further herein, can be tracked with the tracking system 30, and the information can be used by the navigation system 10 to allow an output system to output, such as a display device to display, a position of an item. Briefly, the tracking devices, which can include a patient or reference tracking device 48 (to track the patient 14), an imaging device tracking device 50 (to track the imaging device 12), and an instrument tracking device 52 (to track the instrument 80), allow selected portions of the operating theater to be tracked relative to one another with the appropriate tracking system, including the optical localizer 40 and/or the EM localizer 42. The reference tracking device 48 can be positioned on an instrument 82 (e.g. a catheter) to be positioned within the patient 14, such as within a heart 15 of the patient 14.
  • It will be understood that any of the tracking devices 48-52 can be optical or EM tracking devices, or both, depending upon the tracking localizer used to track the respective tracking devices. It will be further understood that any appropriate tracking system can be used with the navigation system 10. Alternative tracking systems can include radar tracking systems, acoustic tracking systems, ultrasound tracking systems, and the like.
  • An exemplary EM tracking system can include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo. Exemplary tracking systems are also disclosed in U.S. Pat. No. 7,751,865, issued Jul. 6, 2010 and entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION”; U.S. Pat. No. 5,913,820, titled “Position Location System,” issued Jun. 22, 1999; and U.S. Pat. No. 5,592,939, titled “Method and System for Navigating a Catheter Probe,” issued Jan. 14, 1997, all herein incorporated by reference.
  • Further, for EM tracking systems it may be necessary to provide shielding or distortion compensation systems to shield or compensate for distortions in the EM field generated by the EM localizer 42. Exemplary shielding systems include those in U.S. Pat. No. 7,797,032, issued on Sep. 14, 2010 and U.S. Pat. No. 6,747,539, issued on Jun. 8, 2004; distortion compensation systems can include those disclosed in U.S. patent application Ser. No. 10/649,214, filed on Jan. 9, 2004, published as U.S. Pat. App. Pub. No. 2004/0116803, all of which are incorporated herein by reference.
  • With an EM tracking system, the localizer 42 and the various tracking devices can communicate through the EM controller 44. The EM controller 44 can include various amplifiers, filters, electrical isolation, and other systems. The EM controller 44 can also control the coils of the localizer 42 to either emit or receive an EM field for tracking. A wireless communications channel, however, such as that disclosed in U.S. Pat. No. 6,474,341, entitled “Surgical Communication Power System,” issued Nov. 5, 2002, herein incorporated by reference, can be used as opposed to being coupled directly to the EM controller 44.
  • It will be understood that the tracking system may also be or include any appropriate tracking system, including a STEALTHSTATION® TRIA®, TREON®, and/or S7™ Navigation System having an optical localizer, similar to the optical localizer 40, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colo. Further alternative tracking systems are disclosed in U.S. Pat. No. 5,983,126, to Wittkampf et al., titled “Catheter Location System and Method,” issued Nov. 9, 1999, which is hereby incorporated by reference. Other tracking systems include acoustic, radiation, radar, etc. tracking or navigation systems.
  • The imaging system 12 can further include a support housing or cart 56 that can house a separate image processing unit 58. The cart can be connected to the gantry 20. The navigation system 10 can include a navigation processing unit 60 that can communicate with or include a navigation memory 62. The navigation processing unit 60 can include a processor (e.g. a computer processor) that executes instructions to determine locations of the tracking devices 48-52 based on signals from the tracking devices. The navigation processing unit 60 can receive information, including image data, from the imaging system 12 and tracking information from the tracking system 30, including the respective tracking devices 48-52 and the localizers 40-42. Image data can be displayed as an image 64 on a display device 66 of a workstation or other computer system 68 (e.g. a laptop, desktop, or tablet computer, which may have a central processor that acts as the navigation processing unit 60 by executing instructions). The workstation 68 can include appropriate input devices, such as a keyboard 70. It will be understood that other appropriate input devices can be included, such as a mouse, a foot pedal, or the like, which can be used separately or in combination. Also, all of the disclosed processing units or systems can be a single processor (e.g. a single central processing chip) that can execute different instructions to perform different tasks.
  • The image processing unit 58 processes image data from the imaging system 12 and transmits it to the navigation processor 60. It will be further understood, however, that the imaging system 12 need not perform any image processing and it can transmit the image data directly to the navigation processing unit 60. Accordingly, the navigation system 10 may include or operate with a single or multiple processing centers or units that can access single or multiple memory systems based upon system design.
  • In various embodiments, the imaging system 12 can generate image data that can be registered to the patient space or navigation space. In various embodiments, the position of the patient 14 relative to the imaging system 12 can be determined by the navigation system 10 with the patient tracking device 48 and the imaging system tracking device 50 to assist in registration. Accordingly, the position of the patient 14 relative to the imaging system 12 can be determined.
  • Alternatively, or in addition to tracking the imaging system 12, the imaging system 12, such as the O-arm® imaging system, can know its position and be repositioned to the same position within about 10 microns. This allows for a substantially precise placement of the imaging system 12 and precise determination of the position of the imaging device 12. Precise positioning of the imaging portion 22 is further described in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference.
  • Subject or patient space and image space can be registered by identifying matching points or fiducial points in the patient space and related or identical points in the image space. When the position of the imaging device 12 is known, either through tracking or its “known” position (e.g. O-arm® imaging device sold by Medtronic, Inc.), or both, the image data is generated at a precise and known position. This can allow image data that is automatically or “inherently registered” to the patient 14 upon acquisition of the image data. Essentially, the position of the patient 14 is known precisely relative to the imaging system 12 due to the accurate positioning of the imaging system 12. This allows points in the image data to be known relative to points of the patient 14 because of the known precise location of the imaging system 12.
  • Alternatively, manual or automatic registration can occur by matching fiducial points in image data with fiducial points on the patient 14. Registration of image space to patient space allows for the generation of a translation map between the patient space and the image space. According to various embodiments, registration can occur by determining points that are substantially identical in the image space and the patient space. The identical points can include anatomical fiducial points or implanted fiducial points. Exemplary registration techniques are disclosed in U.S. patent application Ser. No. 12/400,273, filed on Mar. 9, 2009, incorporated herein by reference.
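  • For illustration only, point-based registration of this kind can be sketched as a rigid, least-squares fit between matched fiducial points in patient space and image space; the SVD-based solution and the function name register_rigid below are assumptions for this sketch, not drawn from the incorporated references.

```python
# A minimal sketch of rigid point-based registration, assuming matched
# fiducial points have already been identified in patient space and image
# space. Names are illustrative only.
import numpy as np

def register_rigid(patient_pts, image_pts):
    """Return rotation R and translation t mapping patient space to image space."""
    P = np.asarray(patient_pts, dtype=float)   # N x 3 fiducials, patient space
    Q = np.asarray(image_pts, dtype=float)     # N x 3 matching fiducials, image space
    p_c, q_c = P.mean(axis=0), Q.mean(axis=0)  # centroids of each point set
    H = (P - p_c).T @ (Q - q_c)                # 3 x 3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_c - R @ p_c
    return R, t

# Usage: map a point known in patient space into the image (the translation map).
# R, t = register_rigid(patient_fiducials, image_fiducials)
# image_point = R @ patient_point + t
```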
  • Once registered, the navigation system 10 with or including the imaging system 12, can be used to perform selected procedures. Selected procedures can use the image data generated or acquired with the imaging system 12. Further, the imaging system 12 can be used to acquire image data at different times relative to a procedure. As discussed herein, image data can be acquired of the patient 14 prior to the procedure for collection of automatically registered image data or cine loop image data. Also, the imaging system 12 can be used to acquire images for confirmation of a portion of the procedure.
  • With reference to FIG. 2, a system or flowchart 100 is illustrated to show a process or method of acquiring image data of a subject, such as the patient 14, for subsequent illustration of the image data on the display device 66 and navigation of a procedure with the instrument 80. It will be understood that the subject may also be any other imagable portion, such as a fuselage, non-human animal, etc. In the method 100, a plurality of images can be acquired of the patient 14, including the heart 15 of the patient 14, which can be displayed on the display 66 to illustrate a current location of the instrument 80 that is positioned within the heart 15 of the patient 14. It will be understood, however, that image data can be acquired of any portion of the patient 14, particularly a portion of the patient that moves over time. For example, such a portion of the patient 14 can include the lungs, the diaphragm, or the circulatory system, each of which may move over time. The image data can be acquired of the patient 14 and can be registered to the patient to allow for a cinematic “cine loop” illustration of the patient or a portion of the patient when the instrument 80 is positioned within the patient 14, such as within the heart 15. The cine loop can include a plurality of images collected in succession and operable to be displayed in succession to illustrate a representation of motion and/or change in configuration of a structure, such as the heart 15.
  • With continuing reference to FIG. 2, and additional reference to FIG. 1, the method 100 can begin by collecting automatically registered images in block 102. The automatically registered images can be acquired with the imaging system 12, illustrated in FIG. 1. As discussed above, the imaging system 12 can collect images of the patient 14, including the heart 15, that are registered to the position of the patient 14. Various methods and mechanisms, as discussed above, can be used to automatically register the images acquired with the imaging system 12 relative to the patient 14. In one example, the position of the patient 14 and the imaging device 12 can be known during image acquisition.
  • The images collected and automatically registered in block 102 allow intraoperative image data acquisition of the patient 14 for positioning various portions or items within the patient 14. For example, a dynamic reference frame, such as the dynamic reference frame 48 discussed above, can be navigated into the subject in block 104. The navigation of the internal dynamic reference frame 48 into the patient 14 can be done under fluoroscopic guidance, such as with the imaging system 12. Alternatively, it can be understood that the internal dynamic reference frame 48 can be positioned in the patient 14 using any generally known technique, and can be tracked using the tracking system 30. The DRF 48 can be fixed or placed in a portion of the patient in block 106. The internal DRF 48 can be fixed to the patient 14 in any appropriate location, such as by affixing the internal DRF 48 to the right ventricle of the heart, such as at the right ventricle apex, or to the coronary sinus. The internal DRF 48 can be positioned in any appropriate location within the heart 15, and the location of the internal DRF 48 can then be used to track and dynamically determine motion of the heart 15 within the patient 14. Also, the internal DRF 48 can be fixed to a structure or wall of the heart 15, such as with a helix screw.
  • As is understood, the internal DRF 48 can be tracked with the tracking system 30, which may be the EM tracking system. Accordingly, a movement of the internal DRF 48 can be determined over time as the heart beats at its normal rhythm, or at a rhythm imposed upon the heart 15 during the operative procedure. As will be discussed in greater detail herein, the internal DRF 48 can move to different three-dimensional positions within the heart 15 as the heart 15 moves over time. Accordingly, as the heart 15 beats over time, the internal DRF 48 can move to a plurality of three-dimensional locations. However, the beating of the heart 15 is generally cyclic and the internal DRF 48 will generally move within a cycle, such as within a limited range of motion within the heart 15. Also, due to the cyclic nature, the DRF 48 will pass through the same positions over time as the heart 15 beats. For example, when the heart 15 is contracted, the DRF 48 will generally be at the same location during each contraction. Nevertheless, the movement of the heart, or of a portion of the heart to which the internal DRF 48 is attached, can be continuously tracked with the tracking system 30 using the internal DRF 48 that is fixed or placed in the heart 15.
  • Once the internal DRF 48 is positioned within the patient 14 in block 106, a determination block can determine whether cine loop image data was acquired prior to placing the DRF in block 108. Although two possibilities can occur from the decision block 108, initially the following discussion will be directed to following a NO path 110 to a collection of cine loop image data while simultaneously tracking locations of the placed DRF 48 in block 112.
  • Collecting the cine loop image data while simultaneously tracking the DRF 48 includes imaging the patient 14 over time and tracking the location of the DRF 48, placed in the patient 14 in block 106, over the same time period. According to various embodiments, an imaging system, such as the imaging system 12 that can include the O-arm® imaging system or other fluoroscopic imaging systems, can acquire image data of the patient 14, including the heart 15, over a specified time period. During the specified time period a selected frame rate of images can be acquired of the patient 14; for example, about 30 frames (e.g. image frames) per second can be captured. It will be understood, however, that any appropriate frame rate, such as 10 frames per second, 20 frames per second, or 60 frames per second, can be collected. Regardless of the frame rate, the frames can be displayed in sequence to illustrate a motion of the patient 14, such as the heart 15, over time, similar to a cinematic movie in which a plurality of frames are shown in a selected sequence and at a selected speed to achieve an illusion of motion on a display. Accordingly, the plurality of frames collected can be displayed on a display device, such as the display device 66, to illustrate motion of the heart 15 over time in a “cine loop”. The time period can be any appropriate time period and each frame can be collected at a time point tx. For example, a time period can extend from time point t1 to tn, where each frame is collected at a time point tx between t1 and tn.
  • During the acquisition of the image data in block 112, the DRF 48 positioned in the patient 14 can be tracked over the same time period. Additionally, the location of the DRF 48 can be tracked at a rate similar to, faster than, or slower than the rate of acquisition of the image data. Nevertheless, it can be selected to track or determine the location of the DRF 48 during or for each of the frames at time points t1 to tn. Accordingly, a location of the DRF 48 for each of the frames collected in the image data can be correlated to a tracked location of the DRF 48. The DRF 48 can be tracked in any appropriate degrees of freedom of movement, including an x, y, z three-dimensional position and, if selected, orientation. One possible layout of the two collected data streams is sketched below.
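  • For illustration, the simultaneously collected data of block 112 can be pictured as two timestamped streams, one of image frames and one of DRF samples, each recorded at its own rate. The class and field names in the sketch below are assumptions made for this example, not part of any navigation-system API.

```python
# Illustrative data layout for block 112: timestamped frames from the imaging
# system and timestamped DRF samples from the tracking system.
from dataclasses import dataclass

@dataclass
class Frame:
    t: float        # acquisition time of the frame, in seconds
    pixels: object  # image payload, e.g. a 2-D array

@dataclass
class DrfSample:
    t: float            # time at which the DRF location was determined
    x: float            # three-dimensional position of the DRF
    y: float
    z: float
    phi: float = 0.0    # optional orientation components
    theta: float = 0.0

frames = []        # appended by the imaging system at, e.g., ~30 frames per second
drf_samples = []   # appended by the tracking system at its own sampling rate
```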
  • Once the image data is collected while simultaneously tracking the DRF 48 in block 112, a correlation can be made between the multiple frames collected of the image data and the tracked locations of the DRF 48 in block 114. As discussed above, it can be selected to correlate each of the frames of the image data acquired to a tracked location of the DRF 48. The following discussion relates to correlation of images to tracked locations of the DRF 48. It will be understood that, alternatively, a procedure can be navigated with images that are correlated using various techniques, including gating frame selection techniques as disclosed in U.S. patent application Ser. No. 12/183,688, filed on Jul. 31, 2008, and published as U.S. Pat. App. Pub. No. 2010/0030061, published on Feb. 4, 2010, incorporated herein by reference.
  • As illustrated in FIG. 3, in the left column 3l the location of the DRF 48 can be tracked for the period and at each of the time points t1 through tn. Each of these DRF 48 locations can be correlated to a cine loop image or frame, as discussed above. The plurality of images can be placed in a cine loop to illustrate motion of the heart 15. As illustrated in FIG. 3, each of the tracked locations in the left column 3l at each of the time points relates to an image or frame in the cine loop at each of the time points t1 through tn, as illustrated by the arrows 3a.
  • The correlation can be done during the tracking of the DRF 48 and the acquisition of the images, or after the acquisition of the images and the tracked locations of the DRF 48. For example, every time an image is acquired, the location of the DRF 48 can be tracked or determined and correlated substantially simultaneously. Alternatively, the DRF 48 can be tracked by the tracking system 30 while the cine loop images are acquired by the imaging system 12. The specific times for each of the images at time points t1 through tn can be saved with each of the images, and the specific times of the tracked locations of the DRF 48 at times t1 through tn can be saved by the tracking system 30. An appropriate system, such as the navigation system 10 including the processor system 68, can then correlate the two specific times for a tracked location of the DRF 48 and an image in the cine loop acquired by the imaging system 12. In other words, as a high level example, if an image is acquired at 1100 hours and 5 seconds, and a tracked location of the DRF 48 is acquired at 1100 hours and 5 seconds, the frame and the tracked location of the DRF 48 can be correlated. It will be understood that the system, such as the processor system 68, can execute instructions to perform the correlation based on the time points of the acquisition of the cine loop image data (e.g. frames) and the determined locations of the DRF 48 at the same time points. Generally, each frame will have at least one DRF 48 location that is correlated with that frame. Thus, the location of the DRF 48 can be used to select or determine which correlated frame illustrates the configuration and/or location of the heart 15. In other words, at a later time when the DRF 48 is determined to be at a location, that location can be used to select which of the frames, based on the correlated DRF 48 location, should be illustrated.
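  • As one concrete way to picture the time-based correlation of block 114, the sketch below pairs each frame with the DRF sample nearest to it in time, assuming the Frame and DrfSample records from the earlier sketch and a common time base between the imaging and tracking systems; the function name is an assumption for this example.

```python
# A minimal sketch of time-based correlation: each frame is paired with the
# DRF sample whose timestamp is closest to the frame's acquisition time.
import bisect

def correlate_by_time(frames, drf_samples):
    """Return one (frame, drf_sample) pair per frame."""
    samples = sorted(drf_samples, key=lambda s: s.t)
    times = [s.t for s in samples]
    pairs = []
    for frame in frames:
        i = bisect.bisect_left(times, frame.t)
        # consider the neighboring samples just before and just after frame.t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(samples)]
        best = min(candidates, key=lambda j: abs(times[j] - frame.t))
        pairs.append((frame, samples[best]))
    return pairs
```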
  • It will also be understood, however, that the acquisition of the image data by the imaging device 12 and the tracked location of the DRF 48 can be coordinated and correlated with the navigation system 10. For example, the beginning of the tracking and the beginning of the image acquisition may happen at a time 0 and each time a frame of the cine loop is collected by the imaging system 12, a tracked location of the DRF 48 can be saved. Thus, the imaging and tracked location determination can happen simultaneously and be coordinated as such. They can then be correlated within the navigation system 10.
  • After the correlation of the image data to the tracked locations of the DRF 48, as illustrated in FIG. 3, occurs in block 114, the correlated cine loop images and DRF tracked locations can be stored in block 116. The correlated images and tracked locations can be stored within the navigation memory 62 or as a part of the image processing unit 58. The images can be stored in a selected memory, such as a substantially permanent memory or flash memory, for display and/or recall at a selected time. The stored images can be correlated to the tracked locations of the DRF 48, such as with a look-up table or index, as sketched below.
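  • One simple way to hold the stored correlation of block 116 for later recall is a look-up table keyed by frame index, as in the sketch below; the dictionary layout is an illustrative stand-in for however the navigation memory 62 actually indexes the data.

```python
# Sketch of a look-up table for block 116: frame index -> frame plus the
# tracked DRF location correlated with that frame.
def build_lookup(pairs):
    """pairs: list of (frame, drf_sample) from the correlation step."""
    return {
        idx: {"frame": frame, "location": (s.x, s.y, s.z, s.phi, s.theta)}
        for idx, (frame, s) in enumerate(pairs)
    }
```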
  • As discussed further herein, various frames of the cine loop can be displayed based upon a tracked location of the DRF 48. However, it will be understood that the decision block 108, on whether the cine loop image data was acquired prior to placing the DRF 48, can also follow a YES path 120. Accordingly, prior to continuing the discussion of performing a procedure with image data correlated with the DRF 48 location, acquiring image data prior to placing the DRF 48 in block 106 will be discussed by following the YES path 120 from the decision block 108.
  • After a determination of following the YES path 120, due to the determination that the cine loop image data was acquired prior to placing the DRF 48 in block 108, one of two possibilities can occur. One possibility is that a simultaneous measurement of selected physiological subject data and a tracked location of the DRF can occur in block 122. Alternatively, a path can be followed to recall or collect cine loop images simultaneously while measuring the selected physiological data of the subject in block 124. The reason that either path to block 122 or 124 can be followed after the determination of the YES path 120 will be clear after the following discussion. Regardless, tracked and determined locations of the DRF 48 that is placed in block 106 can be correlated to a physiological measurement of the subject and/or can be correlated to image data that is acquired of the subject. Accordingly, the processes in blocks 122 or 124 may happen in any selected order and one need not occur before the other.
  • For the following discussion, simultaneously measuring the subject data and tracking the location of the DRF 48 in block 122 will be selected to occur first. Generally, the DRF 48 is placed in the subject, such as the patient 14, in block 106. A selected physiological measurement of the patient 14 can then be made after the DRF 48 is positioned in the patient 14. The physiological measurement of the patient can be any appropriate measurement, such as respiration, a heart rhythm measurement, various polarizations or depolarizations within the heart, or any other selected physiological measurement. Generally, a physiological measurement can include measurements of the patient 14 that relate to the physiology of the patient 14 based on the patient's 14 anatomy. For example, an electrocardiogram (EKG) can be made of the patient 14 that relates to a rhythm of the heart 15 of the patient 14. The measurement of the heart 15 with an EKG can be made while tracking the location of the DRF 48. A correlation between the tracked location of the DRF 48 and various peaks (e.g. amplitudes) and valleys (e.g. amplitudes) in the EKG can then be made or saved. Accordingly, the location of the DRF 48 as it relates to any particular portion of the measured EKG can be determined.
  • Cine loop image frames can be acquired of the patient 14 using any appropriate imaging system, such as a magnetic resonance imaging (MRI) or computed tomography (CT) imaging device. Thus, a cine loop, or any plurality of images, need not be acquired with the DRF in the portion of the subject being imaged. According to various embodiments, the DRF can be placed and an imaging system, such as the O-arm® imaging system, can image the subject with the DRF in place, but the DRF is not required to be in place while imaging occurs. In various embodiments, the DRF 48 cannot be positioned in the patient 14 during the acquisition of the cine loop frames (e.g. due to the magnetic coils in the magnet of an MRI) or can simply be selected not to be positioned in the patient 14 during the acquisition. As discussed above, the cine loop data can include the acquisition of a plurality of image frames of the patient 14 over a selected period of time, such as between time points t1 through tn. During the acquisition of the cine loop frames in block 124, a physiological measurement of the patient 14 can also be collected.
  • In one example, as illustrated in FIG. 4, the cine loop can be acquired while measuring an EKG of the patient 14. The physiological measurement, or a selected portion thereof, can be correlated to each image or frame in the cine loop that is collected in block 124. As illustrated in FIG. 4, a physiological measurement, such as an EKG measurement of the patient 14, can include a selected amplitude or peak that is measured at each time point t1 through tn, as illustrated in the middle column 4m. An image in the cine loop can also be collected at each of the time points t1 through tn, as illustrated in the right column 4r. As illustrated by the arrows 4a1, a correlation between the amplitude in column 4m at time point t1 and the image at time point t1 can be made while acquiring images or at any selected point thereafter, similar to that discussed above. For example, the physiology of the patient 14 can be measured beginning at time 0, the image data can be acquired beginning at time 0, and, moving through the time period, the physiological measurements can continue as can the acquisition of image frames. Accordingly, the correlation of the image frames to the physiology measurements can be made at any selected time, either during or after the acquisition of both the image data and the physiological measurements.
  • It is understood that the acquisition of the image data and the simultaneous measurement of the physiological data of the patient in block 124 can be collected before or after the simultaneous measurement of the same physiological data of the patient 14 and the tracking of a location of the DRF 48 in block 122. Nevertheless, after both the image data is collected along with the selected physiological measurement and the tracked measurement of the DRF 48 is made along with the collection of the same physiological data, a correlation of the cine loop frames and the location of the DRF 48 can be made in block 126. As illustrated in FIG. 4, in the left hand column 4l, the tracked location of the DRF 48 for each of the time points t1 through tn is illustrated. These locations can be correlated to the measured physiological data at time points t1 through tn, as illustrated by the arrows 4a2. As discussed above, the arrows 4a1 show the correlation of the physiological measurement and the frames of the cine loop at the various time points. Due to the correlation of the tracked location of the DRF 48 to the physiological data and of the physiological data to the cine loop image frames, the tracked location of the DRF 48 can be correlated to each of the frames of the cine loop image data, as illustrated by the connecting arrow 130 in FIG. 4.
  • Thus, individual tracked locations of the DRF 48 can be directly correlated to individual images in the acquired cine loop image data, as illustrated in FIG. 4. The correlated cine loop images in block 116 can include correlated images with a tracked location of the DRF 48 that were acquired prior to positioning the DRF 48 within the patient 14. Although the DRF 48 can be positioned within the patient 14 subsequent to acquisition of the cine loop image data, the position of the DRF 48 will be correlated to the various frames in the cine loop image data based upon a correlating data set, such as the measured physiological data in column 4m of FIG. 4.
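  • The indirect correlation of FIG. 4 can be pictured as composing two correlations: one from block 122 (DRF location against the physiological measurement) and one from block 124 (the physiological measurement against the image frames). The sketch below matches records by nearest EKG amplitude purely for illustration; matching by phase within the cardiac cycle, or any other rule, could be used instead, and the record layouts are assumptions.

```python
# Sketch of indirect correlation through a shared physiological signal.
def correlate_via_physiology(drf_records, frame_records):
    """
    drf_records:   list of (ekg_value, drf_location) pairs from block 122
    frame_records: list of (ekg_value, frame) pairs from block 124
    Returns a list of (frame, drf_location) pairs.
    """
    pairs = []
    for ekg_f, frame in frame_records:
        # pick the DRF record whose EKG value is closest to this frame's value
        _, location = min(drf_records, key=lambda r: abs(r[0] - ekg_f))
        pairs.append((frame, location))
    return pairs
```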
  • Regardless of the method of correlation, including those discussed above, each frame, or a selected plurality of the image frames from the acquired cine loop, regardless of the source of the images, can be correlated to a selected or tracked location of the DRF 48. The correlation can be direct (e.g. based on tracking the DRF 48 while acquiring the image data) or indirect (e.g. based on using an intermediate measurement, such as a physiological measurement). As illustrated in FIGS. 3 and 4, the location of the DRF 48 at time t1 is correlated to an image that relates to the same time t1. As discussed further herein, a later tracked location of the DRF 48 in block 140 can be used to select which of the images to display or analyze. In other words, the images in columns 3r and 4r need not be selected, displayed, or analyzed in the order acquired; they can be selected in any order based on the instantaneous or delay-compensated determined position of the DRF 48.
  • Once the correlation of the cine loop image data to the respective tracked locations of the DRF 48 is made in block 116, the DRF 48 can be further tracked in block 140. As discussed above, the tracking system 30 can track the location of the DRF 48 within the patient 14, such as within the heart 15. The location of the DRF 48 within the heart 15 can be tracked to determine the motion or position of the structure within the heart 15 to which the DRF 48 is attached. As discussed above, the DRF 48 can be positioned at the right ventricle apex, and tracking the location of the DRF 48 can therefore determine the position of the right ventricle apex over time. Once the DRF 48 is being tracked within the patient 14, and the correlation of the DRF 48 locations with frames in the cine loop image data has been made in block 116, a recall and/or display of a selected frame from the cine loop image data can be made based on the tracked location of the DRF 48 in block 142.
  • Returning reference to FIGS. 3 and 4, a tracked location of the DRF 48 has been correlated to each image in the cine loop image data. Accordingly, if the location of the DRF 48, including x, y, z coordinates and any orientation coordinates, relating to, for example, time point t3, is tracked with the tracking system 30, then the image that relates to time point t3 can be displayed on the display device 66. Similarly, as illustrated in FIG. 4, a measured physiology of the patient can be used to select the image from the cine loop that relates to the same time point, such as time point t3. Accordingly, as illustrated in FIGS. 3 and 4, a tracked location of the DRF 48 from block 140 can be used in block 142 to select, from the data stored in block 116, an image that correlates to the tracked location of the DRF 48. The correlated image can then be displayed on the display 66 based upon the tracked location of the DRF 48.
  • It will be understood that the tracked location of the DRF 48 changes over time due to motion of the patient 14, including motion of the heart 15. The sampling rate of the location of the DRF 48 can be any appropriate sampling rate, such as one that correlates to the speed at which the frames in the cine loop were collected. Accordingly, if the images in the cine loop were collected at 30 frames per second, then the position of the DRF 48 can be sampled at about 30 times per second. Each time the position of the DRF 48 is sampled, a determination of the position of the DRF 48 can be made, along with a determination of which of the time points t1-tn the tracked location from block 140 corresponds to. A further determination can be made as to which of the images relates to the tracked location of the DRF 48 at the determined time point, and that frame can be displayed on the screen at substantially the same rate at which the DRF 48 is sampled. Accordingly, as the DRF 48 moves within the heart 15, the image 64 on the display device 66 can be updated at substantially the same rate. This can allow the image 64 on the display device 66 to substantially mimic a cine loop of the motion of the heart 15. This will illustrate a more realistic and non-static position of the heart 15 over time. It will be understood that the display can be refreshed at any appropriate rate, such as less than 30 frames or more than 30 frames per second, based on a selection of the user 54 or any appropriate selection and/or the frame rate of the acquisition of image frames of the cine loop.
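  • A minimal sketch of the recall step follows: each newly tracked DRF location from block 140 selects, from the stored look-up table, the frame whose correlated location is nearest in three-dimensional space (block 142). The Euclidean nearest-neighbor rule and the helper names track_drf and display are assumptions for this example.

```python
# Sketch of blocks 140-142: select the stored frame whose correlated DRF
# location is nearest to the currently tracked location.
import math

def select_frame(lookup, tracked_xyz):
    """lookup: dict of idx -> {'frame': ..., 'location': (x, y, z, phi, theta)}"""
    def dist(entry):
        return math.dist(entry["location"][:3], tracked_xyz)
    return min(lookup.values(), key=dist)["frame"]

# Display loop: sample the DRF at roughly the cine loop frame rate and show
# the matching frame so the displayed image follows the tracked heart wall.
# while navigating:
#     frame = select_frame(lookup, track_drf())   # track_drf() is assumed
#     display(frame)                              # display() is assumed
```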
  • Additionally, it will be understood that various processing and detection delays may be present. Accordingly, as illustrated in FIG. 5, a predictive image can be selected. For example, if the navigation system 10 determines that the DRF 48 is tracked in block 140 at the location of time point t1, then the image at time point t3 can be selected for display in block 142. The predictive time period can be any appropriate time period, such as one to five or one to ten time points (i.e. subsequent frames) later than the frame correlated to the determined location of the DRF 48. The time delay can be any appropriate delay, such as about 1 to 10 seconds, including about 1 to 5 seconds, and for example about 3 seconds. Thus, the tracked location of the DRF in block 140 that relates to time point t1 can be used to predict that the image to be displayed in block 142 can be one that is about 3 seconds later, rather than the image that directly correlates to the determined actual tracked location of the DRF 48. Thus, a one-to-one correlation, that is, displaying the image for time point t1 when the tracked location of the DRF relates to time point t1, is not required for display on the display 66.
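  • The predictive selection of FIG. 5 can be sketched as an index offset: the frame displayed is a fixed number of frames later than the one matching the tracked location, with the delay expressed in frames. Wrapping around the cyclic loop, and the example figures of 3 seconds at 30 frames per second, are assumptions drawn from the values mentioned above.

```python
# Sketch of predictive frame selection: compensate for processing and
# detection delay by displaying a frame a fixed offset later in the loop.
import math

def select_frame_predictive(lookup, tracked_xyz, delay_s=3.0, fps=30):
    offset = int(round(delay_s * fps))        # delay expressed as a frame count
    n = len(lookup)                           # lookup keys assumed to be 0..n-1
    idx = min(lookup,
              key=lambda i: math.dist(lookup[i]["location"][:3], tracked_xyz))
    return lookup[(idx + offset) % n]["frame"]  # wrap around the cyclic loop
```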
  • As a further example and explanation, images may also be selected or recalled, in block 142, out of sequence from the acquisition sequence. As illustrated in FIG. 6, a cine loop or plurality of images can be acquired in a selected order: image t1, image t2, to image tn. Each of the images image t1 to image tn can be correlated to a specific tracked location x,y,z,φ,θ t1 to x,y,z,φ,θ tn of the DRF 48. While tracking the DRF 48 in block 140, the location of the DRF 48 may not move in the same sequence as when the cine loop was acquired. As discussed above, for example, the heart 15 of the patient 14 may not beat in a normal or average sinus rhythm due to the procedure or other trauma to the patient 14. Thus, the locations x,y,z,φ,θ of the DRF 48 may be out of order of the acquired cine loop of images. Nevertheless, because each image t1 to image tn is correlated to a specific DRF location (e.g. by appropriate methods as discussed above), when one DRF location x,y,z,φ,θ is determined, the image to which it relates can be selected for display or analysis.
  • As illustrated in FIG. 6, the tracked locations of the DRF can include, in order, tracked locations x,y,z,φ,θ t5, x,y,z,φ,θ t7, x,y,z,φ,θ t1, and x,y,z,φ,θ t22. The image that relates to each of these specific tracked locations can be displayed in the order as tracked, or as selected, including image t5, image t7, image t1, and image t22. Thus, tracking the DRF 48 in block 140 does not simply synchronize the display of the image data, but actually allows for a specific determination and selection of an image for display. It is further understood that interpolation, “guessing”, or averaging algorithms can be used even if the DRF 48 is not tracked at a location identical to a location determined for the correlation.
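  • Because the look-up is by location rather than by time, feeding tracked locations in any order selects frames out of their acquisition order, as FIG. 6 illustrates. The sketch below also shows one simple averaging strategy, taking the mean frame index of the few nearest correlated locations, for the case in which the tracked location matches no stored location exactly; this rule is an illustrative choice only, and the loc_t5 through loc_t22 names are hypothetical.

```python
# Sketch of out-of-order selection with a simple averaging fallback when the
# tracked location lies between stored correlated locations.
import math

def select_frame_averaged(lookup, tracked_xyz, k=3):
    # rank stored entries by distance from the tracked location
    ranked = sorted(lookup.items(),
                    key=lambda kv: math.dist(kv[1]["location"][:3], tracked_xyz))
    nearest = ranked[:k]
    # average the frame indices of the k nearest locations; lookup keys are
    # assumed contiguous (0..n-1), so the rounded mean is a valid index
    idx = round(sum(i for i, _ in nearest) / len(nearest))
    return lookup[idx]["frame"]

# Tracked out of acquisition order -> frames selected out of order:
# for loc in [loc_t5, loc_t7, loc_t1, loc_t22]:    # hypothetical locations
#     display(select_frame_averaged(lookup, loc))   # frames at or near t5, t7, t1, t22
```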
  • This allows for exceptional accuracy and true display of the position of a moving structure, such as the heart 15, which will not be altered based on a change in physiology of the patient 14. Also, the physiology of the patient 14 need not be tracked or monitored to determine which image to select for display. Rather, the tracked location of the DRF 48 alone can be used for the selection.
  • A decision of whether a procedure is to be performed may be made in decision block 150. If no procedure is to be performed, then a NO path 148 can be followed to end the method in block 154. However, if it is determined to perform a procedure, then a YES path 158 can be followed to track a location of the instrument 80 in block 160. The location of the instrument 80 can be based upon a tracked location of the tracking device 52 that is connected with the instrument 80. The tracked location of the instrument 80 with the tracking device 52 can be used to determine the location of the instrument 80 within the heart 15. The location of the instrument 80 can then be displayed as an icon 166 representing the location of the tracked instrument 80 in block 162. As illustrated in FIG. 1, the icon 166 can be displayed on the display device 66 relative to the image 64 of the heart 15. The image of the heart 64 can be the frame selected from the cine loop in block 142 based upon the tracked location of the DRF 48 from block 140. Thus, the display of the icon 166 representing the tracked location of the instrument in block 162 can relate to a substantially instantaneous, natural, and accurate location of the instrument 80 within the heart 15. This is in the alternative to illustrating a single image of the heart 15 as the image 64, which makes it appear that the heart 15 is static and motionless and in which the instrument 80 may not be in the exact illustrated location due to the movement of the heart 15. The imprecise location illustration of the instrument 80 may be due to the fact that the heart 15 may differ in location from a static image of the heart that is displayed on the display device 66 due to motion of the heart 15.
  • By allowing the DRF 48 to be tracked in block 140 and an image to be selected based upon the tracked location of the DRF 48 in block 142, the image on the display device 66 can change over time and be based upon a substantially mechanically tracked location of the heart 15. Because the DRF 48 can be connected directly to, or substantially near, a selected portion of the heart 15, a physical location of the heart 15 is tracked with the DRF 48. This can allow for a substantially precise and appropriate selection of the image in block 142 for display on the display device 66.
  • The heart 15 may undergo changes due to the procedure that is occurring, but the physical location of the heart 15 can be tracked with a tracking device 48 connected to the heart 15. Accordingly, even assuming that the heart 15 of the patient 14 may have a rhythm that is interrupted due to the procedure that is occurring on the patient 14, such as due to the positioning of the instrument 80 within the patient, the physical location of the heart 15 that is tracked with the DRF 48 can be used to select an image of the cine loop to be displayed on the display device 66. Accordingly, an appropriate image can be selected for display on the display device 66 that is unaffected by change in the physiology of the patient 14 due to the procedure and is based substantially or mostly on a physical location of the portion of the patient, such as the heart 15, on which the procedure is occurring.
  • Accordingly, acquisition of the cine loop image frames and/or the determination of the location of the DRF need not be gated to the patient 14. For example, the location of the instrument 80 in the patient 14 can be tracked over time and illustrated relative to an image frame of the cine loop that can change over time due to the selection based on the tracked location of the DRF 48. Neither a static image nor a limited window of location determination is necessary, as the location of the DRF 48 can be determined at the same time as the location of the instrument 80.
  • The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.

Claims (5)

What is claimed is:
1. A method of preparing images for viewing:
acquiring a plurality of locations of a reference tracking device fixed to a structure over time while acquiring image frames of the structure;
acquiring a plurality of measurements of a selected feature of the structure with a measuring device during the acquisition of the image frames over time; and
correlating with a correlation system at least a first image frame of the acquired image frames to at least one of a first location of the plurality of locations of the reference tracking device or a first selected measurement of the plurality of measurements of the measuring device.
2. The method of claim 1, further comprising:
displaying at least one of the first frame or the second frame based on at least one of a subsequent acquisition of a location of the reference tracking device or acquisition of a measurement with the measuring device.
3. A system to prepare images for viewing, comprising:
an imaging device configured to acquire image frames of a structure over time;
a reference tracking device fixed to the structure, wherein the reference tracking device is configured to move with the structure over time;
a measuring device configured to measure a feature of the structure during an acquisition of the image frames over time;
a correlation system configured to correlate at least a first image frame of the acquired image frames to at least one of a first location of the reference tracking device or a first selected measurement by the measuring device.
4. The system of claim 3, wherein the correlation system is further configured to correlate at least a second image frame of the acquired image frames to at least one of a second location of the reference tracking device or a second selected measurement by the measuring device.
5. The system of claim 4, further comprising:
a selection system to select for display at least one of the first frame or the second frame based on at least one of a subsequent tracked location of the reference tracking device or measurement of the measuring device.
US14/319,973 2011-02-03 2014-06-30 Display Of An Acquired Cine Loop For Procedure Navigation Abandoned US20140316256A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/319,973 US20140316256A1 (en) 2011-02-03 2014-06-30 Display Of An Acquired Cine Loop For Procedure Navigation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/020,543 US8768019B2 (en) 2011-02-03 2011-02-03 Display of an acquired cine loop for procedure navigation
US14/319,973 US20140316256A1 (en) 2011-02-03 2014-06-30 Display Of An Acquired Cine Loop For Procedure Navigation

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/020,543 Continuation US8768019B2 (en) 2011-02-03 2011-02-03 Display of an acquired cine loop for procedure navigation

Publications (1)

Publication Number Publication Date
US20140316256A1 true US20140316256A1 (en) 2014-10-23

Family

ID=45569741

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/020,543 Active 2032-10-15 US8768019B2 (en) 2011-02-03 2011-02-03 Display of an acquired cine loop for procedure navigation
US14/319,973 Abandoned US20140316256A1 (en) 2011-02-03 2014-06-30 Display Of An Acquired Cine Loop For Procedure Navigation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/020,543 Active 2032-10-15 US8768019B2 (en) 2011-02-03 2011-02-03 Display of an acquired cine loop for procedure navigation

Country Status (2)

Country Link
US (2) US8768019B2 (en)
WO (1) WO2012106063A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104248449B (en) * 2013-06-28 2018-11-20 通用电气公司 Detect method and apparatus, playback control methods and the equipment and ultrasonic machine of start frame
CN112971985B (en) 2014-07-03 2024-09-03 圣犹达医疗用品国际控股有限公司 Local magnetic field generator
US20150237327A1 (en) * 2015-04-30 2015-08-20 3-D XRay Technologies, L.L.C. Process for creating a three dimensional x-ray image using a single x-ray emitter
US10568598B2 (en) * 2016-06-29 2020-02-25 Wisconsin Alumni Research Foundation Fluoroscopy system resolving slowly evolving conditions
EP3685756A1 (en) * 2019-01-24 2020-07-29 Koninklijke Philips N.V. Methods and systems for investigating blood vessel characteristics

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913820A (en) 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US6702736B2 (en) * 1995-07-24 2004-03-09 David T. Chen Anatomical visualization system
US5697377A (en) 1995-11-22 1997-12-16 Medtronic, Inc. Catheter mapping system and method
US8442618B2 (en) 1999-05-18 2013-05-14 Mediguide Ltd. Method and system for delivering a medical device to a selected position within a lumen
US7386339B2 (en) * 1999-05-18 2008-06-10 Mediguide Ltd. Medical imaging and navigation system
US6493573B1 (en) 1999-10-28 2002-12-10 Winchester Development Associates Method and system for navigating a catheter probe in the presence of field-influencing objects
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6747539B1 (en) 1999-10-28 2004-06-08 Michael A. Martinelli Patient-shielding and coil system
WO2002082375A2 (en) 2001-04-06 2002-10-17 Stephen Solomon Cardiological mapping and navigation system
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US7945304B2 (en) * 2001-11-20 2011-05-17 Feinberg David A Ultrasound within MRI scanners for guidance of MRI pulse sequences
EP1474040B1 (en) 2002-02-15 2007-10-24 Breakaway Imaging, Llc Gantry ring with detachable segment for multidimensional x-ray-imaging
CN100398066C (en) 2002-03-13 2008-07-02 分离成像有限责任公司 Systems and methods for quasi-simultaneous multi-planar X-ray imaging
WO2003081220A2 (en) 2002-03-19 2003-10-02 Breakaway Imaging, Llc Computer tomograph with a detector following the movement of a pivotable x-ray source
US7001045B2 (en) 2002-06-11 2006-02-21 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7314446B2 (en) * 2002-07-22 2008-01-01 Ep Medsystems, Inc. Method and apparatus for time gating of medical images
AU2003262726A1 (en) 2002-08-21 2004-03-11 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7260426B2 (en) 2002-11-12 2007-08-21 Accuray Incorporated Method and apparatus for tracking an internal target region without an implanted fiducial
US20040199072A1 (en) 2003-04-01 2004-10-07 Stacy Sprouse Integrated electromagnetic navigation and patient positioning device
AU2006238292B2 (en) * 2005-03-31 2010-04-15 Olympus Medical Systems Corp. Surgery assisting apparatus and treatment assisting apparatus
US8303505B2 (en) * 2005-12-02 2012-11-06 Abbott Cardiovascular Systems Inc. Methods and apparatuses for image guided medical procedures
US8180428B2 (en) * 2007-10-03 2012-05-15 Medtronic, Inc. Methods and systems for use in selecting cardiac pacing sites
US20100030061A1 (en) 2008-07-31 2010-02-04 Canfield Monte R Navigation system for cardiac therapies using gating
US10069668B2 (en) * 2009-12-31 2018-09-04 Mediguide Ltd. Compensation of motion in a moving organ using an internal position reference sensor

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6273896B1 (en) * 1998-04-21 2001-08-14 Neutar, Llc Removable frames for stereotactic localization
US20110054308A1 (en) * 1999-05-18 2011-03-03 Amit Cohen Method and system for superimposing virtual anatomical landmarks on an image
US20100097373A1 (en) * 2000-08-23 2010-04-22 Corpak Medsystems, Inc. Catheter locator apparatus and method of use
US20040073105A1 (en) * 2002-07-29 2004-04-15 Hamilton Craig A. Cardiac diagnostics using wall motion and perfusion cardiac MRI imaging and systems for cardiac diagnostics
US20070055142A1 (en) * 2003-03-14 2007-03-08 Webler William E Method and apparatus for image guided position tracking during percutaneous procedures
US20050038337A1 (en) * 2003-08-11 2005-02-17 Edwards Jerome R. Methods, apparatuses, and systems useful in conducting image guided interventions
US20050281385A1 (en) * 2004-06-02 2005-12-22 Johnson Douglas K Method and system for improved correction of registration error in a fluoroscopic image
US20080139930A1 (en) * 2005-01-31 2008-06-12 Koninklijke Philips Electronics, N.V. System And Method For The Guidance Of A Catheter In Electrophysiologic Interventions
US7835784B2 (en) * 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US20100226537A1 (en) * 2007-10-01 2010-09-09 Koninklijke Philips Electronics N.V. Detection and tracking of interventional tools
US20090226069A1 (en) * 2008-03-07 2009-09-10 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US20100137709A1 (en) * 2008-11-21 2010-06-03 Cyberheart, Inc. Test Object For The Validation of Tracking In The Presence of Motion
US20130303887A1 (en) * 2010-08-20 2013-11-14 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Galloway. "The process and development of image-guided procedures". Annu. Rev. Biomed. Eng. 2001. 3:83-108. (Year: 2001) *

Also Published As

Publication number Publication date
WO2012106063A1 (en) 2012-08-09
US20120201432A1 (en) 2012-08-09
US8768019B2 (en) 2014-07-01

Similar Documents

Publication Publication Date Title
US9579161B2 (en) Method and apparatus for tracking a patient
US10939053B2 (en) System and method for radio-frequency imaging, registration, and localization
US8010177B2 (en) Intraoperative image registration
US7831096B2 (en) Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US10413377B2 (en) Flexible skin based patient tracker for optical navigation
US8694075B2 (en) Intra-operative registration for navigated surgical procedures
US8364242B2 (en) System and method of combining ultrasound image acquisition with fluoroscopic image acquisition
US9320569B2 (en) Systems and methods for implant distance measurement
US8428690B2 (en) Intracardiac echocardiography image reconstruction in combination with position tracking system
JP6412608B2 (en) Interventional imaging
US20080119712A1 (en) Systems and Methods for Automated Image Registration
EP2097031A2 (en) System and method for detecting status of imaging device
GB2426656A (en) Medical imaging with region of interest tracking using imaging device movement
US20140316256A1 (en) Display Of An Acquired Cine Loop For Procedure Navigation
JP2014523295A5 (en)
WO2008130354A1 (en) Intraoperative image registration
US20100298695A1 (en) System and Method for Cardiac Lead Placement
US9477686B2 (en) Systems and methods for annotation and sorting of surgical images
US20240341860A1 (en) System and method for illustrating a pose of an object
US20240328784A1 (en) Tracking device and method of using the same
WO2024214057A1 (en) System and method for illustrating a pose of an object

Legal Events

Date Code Title Description
AS Assignment

Owner name: MEDTRONIC, INC., MINNESOTA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NEIDERT, MICHAEL R.;REEL/FRAME:033260/0454

Effective date: 20110204

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION