WO2024089502A1 - Système et procédé pour illustrer une pose d'un objet (System and method for illustrating a pose of an object) - Google Patents


Info

Publication number
WO2024089502A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
instrument
relative
image
pose
Prior art date
Application number
PCT/IB2023/060058
Other languages
English (en)
Inventor
Elliott SCHMIDT
Can Cinbis
Kenneth Gardeski
Joshua Blauer
Varun A. BHATIA
Ryan P. Lahm
Brett D. JACKSON
Tarek D. Haddad
Original Assignee
Medtronic, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medtronic, Inc. filed Critical Medtronic, Inc.
Publication of WO2024089502A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B 5/066 Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or x-ray imaging
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B 8/5292 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings, for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/0883 Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the heart
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • A61B 8/145 Echo-tomography characterised by scanning multiple planes
    • A61B 8/40 Positioning of patients, e.g. means for holding or immobilising parts of the patient's body
    • A61B 8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means

Definitions

  • the present disclosure relates to acquisition of image data of a subject, and particularly to acquisition and display of image data collected from a tracked imaging system.
  • An imaging system can be used to image various portions of a subject.
  • the subject can include a patient, such as a human patient.
  • the portions selected to be imaged can be internal portions that are covered by skin or other tissue.
  • a location of imaged portions of the subject may be selected to be known within the imaging system. The locations can be defined or established relative to instruments placed in the subject (e.g., a location of a heart wall relative to a catheter) or a location of the imaged portion relative to the instrument acquiring the image data.
  • An imaging modality may be used to generate image data that are used to render images of an interior of the subject. For example, x-rays may be used to generate image data of an interior of a subject. The images may be displayed for viewing by a user.
  • the subject disclosure relates to illustrating a position, including a pose of an instrument relative to a patient.
  • the patient may be a human or nonhuman patient. Further the patient or subject may be a non-living object where the location of an instrument within the subject is selected to be tracked.
  • a representation of the patient may be fit or morphed to illustrate or represent a size of the patient. Therefore, relative portions of the patient, such as relative portions of the anatomy, are determined or known relative to one another based upon the patient that is the current patient during a procedure. An avatar or indicia of the patient may be generated or altered to a specific patient substantially in real time during a procedure.
  • An instrument can be tracked relative to the patient.
  • a representation of the instrument may be superimposed on the avatar of the patient.
  • the avatar of the patient may provide a visual illustration of various portions of the anatomy of the patient without requiring image data being acquired of the patient.
  • the display being viewed by the user may be generated without requiring image data of the patient, such as fluoroscopy image data.
  • FIG. 1 is an environmental view of a patient with an imaging and navigation system;
  • FIG. 2 is a screen shot of a representation of an instrument at a tracked pose relative to an avatar of the patient, according to various embodiments;
  • FIG. 3 is a schematic view of a patient in an operating theater;
  • FIG. 4 is a schematic view of a user and a patient in an operating theater, according to various embodiments;
  • FIG. 5 is a detailed view of an instrument being moved relative to a patient;
  • FIG. 6 is a screen shot of a display illustrating a position of an instrument relative to an avatar;
  • FIG. 7 is a flowchart of a method, according to various embodiments;
  • FIG. 8 is a schematic illustration of an atlas and an image;
  • FIG. 9 is an illustration of images in two planes of a heart, according to various embodiments including ultrasound;
  • FIGS. 10A and 10B illustrate movement of a tracked instrument over time;
  • FIG. 11 is a flowchart illustrating a method of showing movement of a tracked instrument over time, according to various embodiments;
  • FIG. 12A illustrates a tracked pose of an instrument relative to an image in a plane in two orientations, according to various embodiments;
  • FIG. 12B illustrates a tracked pose of an instrument relative to an image in a plane in two orientations, according to various embodiments;
  • FIG. 13 is a schematic illustration of a magnet;
  • FIG. 14 is a schematic illustration of the magnet relative to various instruments, according to various embodiments;
  • FIG. 15 is a schematic illustration of a magnetic sensing assembly;
  • FIG. 16 is a schematic illustration of an operating room with magnetic sensing assemblies, according to various embodiments; and
  • FIG. 17 is a flowchart illustrating a method of tracking an instrument with a permanent magnetic field.
  • Fig. 1 is a diagram illustrating an overview of a navigation system 10 that can be used for various procedures.
  • the navigation system 10 can be used to track the location of an item, such as an implant or an instrument, and at least one imaging system 12 relative to a subject, such as a subject 14 that may include a human patient and/or other living or non-living subject.
  • the navigation system 10 may be used to navigate any type of instrument, implant, or delivery system, including: catheters, stylets, leads for cardiac rhythm management devices such as pacemakers, leadless pacemakers and delivery systems therefor, guide wires, arthroscopic systems, ablation instruments, stents, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, mechanical parts, Transesophageal Echocardiography (TEE), intra-cardiac echocardiography (ICE), etc.
  • Non-human or non-surgical procedures may also use the navigation system 10 to track a non-surgical or non-human intervention of the instrument or imaging device.
  • the instruments may be used to navigate or map any region of the body.
  • the navigation system 10 and the various tracked items may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
  • The navigation system 10 can interface with or integrally include an imaging system 12 that is used to acquire pre-operative, intra-operative, post-operative, or real-time image data of the patient 14.
  • the imaging system 12 can be an ultrasound imaging system (as discussed further herein) that has a tracking device 22 attached thereto.
  • the tracking device 22 may be tracked with the tracking system to determine a pose of the imaging system 12.
  • The pose may include an orientation (e.g., three or more degrees of orientation, such as yaw, pitch, and roll) and/or a position (e.g., three degrees of freedom in physical space, such as along the x-, y-, and z-axes).
  • the tracking system 10 may further determine appropriate pose information regarding the tracking device 22.
  • the pose of the imaging system 12 can then be determined based on the tracked pose of the tracking device 22.
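The pose arithmetic described above can be sketched in a few lines. This is a minimal illustration, not the disclosed implementation: the Z-Y-X Euler convention, the helper name `pose_matrix`, and all numeric values are assumptions.

```python
import numpy as np

def pose_matrix(x, y, z, yaw, pitch, roll):
    """4x4 homogeneous transform from a position (x, y, z) and
    Z-Y-X Euler angles (yaw, pitch, roll) in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# Pose of the tracking device 22 in the localizer frame, reported by the
# tracking system each frame (positions in mm, angles in radians, made up).
T_tracker = pose_matrix(100.0, 50.0, 20.0, np.pi / 2, 0.0, 0.0)

# Fixed offset of the transducer's imaging origin relative to the tracking
# device, known from a one-time calibration.
T_calib = pose_matrix(0.0, 0.0, 30.0, 0.0, 0.0, 0.0)

# Composing the two gives the pose of the imaging system 12.
T_probe = T_tracker @ T_calib
```

The six numbers fed into `pose_matrix` are exactly the "three degrees of orientation plus three degrees of position" decomposition mentioned above.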
  • The imaging system 12 may be used to generate image data to provide images for viewing with a selected display device 26, which may comprise any appropriate display device, including an augmented and/or virtual reality display device worn/used by the user 18 such as those disclosed in U.S. Pat. App. Pub. No. US 2018/0078316 A1, incorporated herein by reference.
  • the navigation system 10 can be used to track various tracking devices, as discussed herein, to determine locations of the patient 14.
  • the tracked poses of the patient 14 can be used to determine or select images for display to be used with the navigation system 10.
  • the initial discussion, however, is directed to the navigation system 10 and the exemplary imaging system 12.
  • the imaging system 12 includes an ultrasound (US) imaging system with an US housing 16 that is held by a user 18 while collecting image data of the subject 14. It will be understood, however, that the US housing 16 can also be held by a stand or robotic system while collecting image data.
  • the US housing 16 and included transducer can be any appropriate US imaging system 12, such as the M-TURBO® sold by SonoSite, Inc. having a place of business at Bothell, Washington.
  • Associated with, such as attached directly to or molded into, the US housing 16 or the US transducer housed within the housing 16 is at least one imaging system tracking device 22.
  • the tracking device 22 may be any appropriate tracking device such as an electromagnetic tracking device and/or an optical tracking device.
  • the tracking devices can be used together (e.g., to provide redundant tracking information) or separately. Also, only one of the two tracking devices may be present. It will also be understood that various other tracking devices can be associated with the US housing 16, as discussed herein, including acoustic, ultrasound, radar, electrical impedance, and other tracking devices. Also, the tracking device 22 can include linkages or a robotic portion that can determine a location relative to a reference frame.
  • A secondary imaging system may also be present and/or used to generate image data of the patient 14.
  • the second imaging system may include an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA, a C-arm imaging system, or other appropriate imaging system.
  • The second imaging system can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference.
  • the second imaging system need not be present and the imaging system 12, which may be only the US transducer, may be the only imaging system to generate images to be displayed.
  • various other image data may also be displayed such as an avatar 120 of the patient 14.
  • the imaging system 12 may be tracked, as discussed above.
  • Various tracking systems may include and/or require calibration of the imaging system.
  • the pose of the tracking device 22 relative to a plane of the US imaging system may be determined and/or known.
  • Various systems and methods are disclosed in U.S. Patent Nos. 6,379,302; 6,669,635; 6,968,224; 7,085,400; 7,831,082; 8,320,653; 8,811,662; and 9,138,204, and U.S. Patent Application Publication Nos. 2004/0097805 A1, 2019/0183577 A1, and 2021/0369394 A1, all of which are incorporated herein by reference.
  • the patient 14 can be fixed in a pose relative to a selected object, such as onto an operating table 40, but is not required to be fixed to the table 40.
  • the table 40 can include a plurality of straps 42.
  • the straps 42 can be secured around the patient 14 to fix the patient 14 relative to the table 40.
  • Various apparatuses may be used to position the patient 14 in a static position on the operating table 40. Examples of such patient positioning devices are set forth in commonly assigned U.S. Pat. App. No. 10/405,068, entitled "An Integrated Electromagnetic Navigation And Patient Positioning Device", filed April 1, 2003 and published as U.S. Pat. App. Pub. No. 2004/0199072 on October 7, 2004, which is hereby incorporated by reference.
  • Other known apparatuses may include a Mayfield® clamp.
  • the navigation system 10 includes at least one tracking system.
  • the tracking system can include at least one localizer.
  • the tracking system can include an electromagnetic (EM) localizer 50.
  • the tracking system can be used to track instruments relative to the patient 14 or within a navigation space.
  • the navigation system 10 can use image data from the imaging system 12 and information from the tracking system to illustrate locations of the tracked instruments, as discussed herein.
  • the tracking system can also include a plurality of types of tracking systems including an optical localizer 52 in addition to and/or in place of the EM localizer 50.
  • the EM localizer 50 can communicate with or through a localizer communication 54 that may be wired or wireless.
  • the EM localizer 50 may also include and/or take the form of an alternative pad or flat EM localizer 55.
  • The optical localizer 52 and the EM localizer 50 can be used together to track multiple instruments, or together to redundantly track the same instrument.
  • Various tracking devices can be tracked and the information can be used by the navigation system 10 to allow for an output system to output, such as a display device to display, a position of an item.
  • tracking devices can include a patient or reference tracking device 56 to track the patient 14, an instrument tracking device 60 to track an instrument 62, and/or other appropriate tracking devices for one or more portions.
  • Patient or reference tracking device 56 and instrument tracking device 60 may include those disclosed in U.S. Pat. Nos. 8,060,185 and 8,644,907, both incorporated herein by reference.
  • the tracking devices allow selected portions of the operating theater to be tracked relative to one another with the appropriate tracking system, including the optical localizer 52 and/or the EM localizer 50.
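To picture how tracked portions are related to one another: each localizer reports poses in its own frame, and a change of frame expresses the instrument in the frame of the patient reference tracking device. The sketch below assumes rigid 4x4 homogeneous transforms; the helper name and all coordinates are invented for illustration.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform with identity rotation (illustrative)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Poses reported by the localizer, both in the localizer's coordinate frame
# (positions in mm, made up for illustration).
T_reference = translation(200.0, 100.0, 0.0)   # patient reference device 56
T_instrument = translation(210.0, 130.0, 5.0)  # instrument tracking device 60

# Re-expressing the instrument relative to the reference device makes the
# displayed pose insensitive to motion of the patient or of the localizer.
T_instr_in_patient = np.linalg.inv(T_reference) @ T_instrument
```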
  • the reference tracking device 56 can also or alternatively be positioned on an instrument and positioned within the patient 14, such as within a heart 15 of the patient 14.
  • any of the tracking devices 22, 56, 60 can be optical or EM tracking devices, or both, depending upon the tracking localizer used to track the respective tracking devices. It will be further understood that any appropriate tracking system can be used with the navigation system 10.
  • Alternative tracking systems can include radar tracking systems, acoustic tracking systems, ultrasound tracking systems, electrical impedance tracking systems, radio frequency beacon, and the like.
  • Exemplary tracking systems include those disclosed in U.S. Pat. Nos. 7,676,268; 8,532,734; and 8,494,608, all incorporated herein by reference.
  • Each of the different tracking systems can be respective different tracking devices and localizers operable with the respective tracking modalities.
  • An exemplary EM tracking system can include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado. Exemplary tracking systems are also disclosed in U.S. Patent No. 7,751,865, issued July 6, 2010 and entitled "Method and Apparatus for Surgical Navigation"; U.S. Patent No. 5,913,820, titled "Position Location System," issued June 22, 1999; and U.S. Patent No. 5,592,939, titled "Method and System for Navigating a Catheter Probe," issued January 14, 1997, all herein incorporated by reference.
  • The navigation system 10 may include shielding or distortion compensation systems to shield or compensate for distortions in the EM field generated by the EM localizer 50.
  • Exemplary shielding systems include those in U.S. Pat. No. 7,797,032, issued on September 14, 2010 and U.S. Pat. No. 6,747,539, issued on June 8, 2004; distortion compensation systems can include those disclosed in U.S. Pat. App. No. 10/649,214, filed on January 9, 2004 and published as U.S. Pat. App. Pub. No. 2004/0116803, all of which are incorporated herein by reference.
  • the EM localizer 50 and the various tracking devices can communicate through an EM controller.
  • the EM controller can include various amplifiers, filters, electrical isolation, and other systems.
  • the EM controller can also control the coils of the EM localizer 50 to either emit or receive an EM field for tracking.
  • A wireless communications channel, such as that disclosed in U.S. Patent No. 6,474,341, entitled "Surgical Communication Power System," issued November 5, 2002, herein incorporated by reference, can be used as opposed to being coupled directly to the EM controller.
  • the EM controller can be incorporated into a navigation processing system 70.
  • the tracking system may also be or include any appropriate tracking system, including a STEALTHSTATION® TRIA®, TREON®, and/or S7TM Navigation System having an optical localizer, similar to the optical localizer 52, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado.
  • Further alternative tracking systems are disclosed in U.S. Patent No. 5,983,126, to Wittkampf et al. titled “Catheter Location System and Method,” issued November 9, 1999, which is hereby incorporated by reference.
  • Other tracking systems include an acoustic, radiation, radar, electrical impedance, etc. tracking or navigation systems.
  • The navigation system 10 can include a navigation processing unit or module 74 that can communicate with or include a navigation memory 76, which may be included in the navigation processing system 70.
  • the navigation processing system 70 may further include a display device 77.
  • the navigation processing unit 74 can include a processor (e.g., microprocessor, a central processing unit, etc.). In various embodiments, the navigation processing unit may execute instructions to determine one or more poses of the tracking devices based on signals from the tracking devices.
  • The navigation processing unit 74 can receive information, including image data, from the imaging system 12 and tracking information from the tracking systems, including the respective tracking devices and/or the localizers 50, 52. Image data can be displayed as an image 78 on the display device 26.
  • the display device may be separate from and/or integrated into the navigation system, thus, the display device 26 may include the display device 77.
  • The navigation processing system 70 can include appropriate input devices, such as a keyboard 84. It will be understood that other appropriate input devices can be included, such as a mouse or a foot pedal 88, which can be used separately or in combination. Also, all of the disclosed processing units or systems can be a single processor module (e.g., a single central processing chip) that can execute different instructions to perform different tasks.
  • An image processing unit or module can process image data from the imaging system 12 and a separate first image processor (not illustrated) can be provided to process or pre-process image data from the imaging system 12.
  • the image data from the image processor can then be transmitted to the navigation processor 74.
  • the navigation system 10 may include or operate with a single or multiple processing centers or units that can access single or multiple memory systems based upon system design.
  • the imaging system 12 can generate image data that may be used to compose the image 78 and define an image space that can be registered to a patient space or navigation space that is defined by and/or relative to the patient 14.
  • The position of the patient 14 relative to the imaging system 12 can be determined by the navigation system 10 with the patient tracking device 56 and the imaging system tracking device(s) 22 to assist in and/or maintain registration.
  • Registration can occur by matching fiducial points in image data with fiducial points on the patient 14. Registration of image space to patient space allows for the generation of a translation map between the patient space and the image space. According to various embodiments, registration can occur by determining points that are substantially identical in the image space and the patient space. The identical points can include anatomical fiducial points or implanted fiducial points. Exemplary registration techniques are disclosed in U.S. Pat. App. No. 12/400,273, filed on March 9, 2009, now published as U.S. Pat. App. Pub. No. 2010/0228117 and in U.S. Pat. No. 9,737,235, issued August 22, 2017, both incorporated herein by reference.
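The fiducial-point matching described above is commonly solved as a least-squares rigid registration. Below is a minimal sketch using the SVD-based Kabsch algorithm; this is a generic technique, not necessarily the method of the cited references, and the fiducial coordinates are made up.

```python
import numpy as np

def register_points(patient_pts, image_pts):
    """Least-squares rigid transform (R, t) mapping patient-space fiducials
    onto image-space fiducials, via the SVD-based Kabsch algorithm."""
    p_mean = patient_pts.mean(axis=0)
    q_mean = image_pts.mean(axis=0)
    H = (patient_pts - p_mean).T @ (image_pts - q_mean)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Hypothetical fiducial pairs: here image space is simply patient space
# shifted by (5, -2, 10) mm, so the recovered rotation is the identity.
patient = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0],
                    [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
image = patient + np.array([5.0, -2.0, 10.0])
R, t = register_points(patient, image)
```

The pair (R, t) is the "translation map" between the two spaces: any tracked point p in patient space maps to `R @ p + t` in image space.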
  • the imaging system 12 can be used with an un-navigated or navigated procedure.
  • a localizer and/or digitizer including either or both of an optical localizer 52 and/or an electromagnetic localizer 50, 55 can be used to generate a field and/or receive and/or send a signal within a navigation domain relative to the patient 14.
  • the navigated space or navigational domain relative to the patient 14 can be registered to the image 78.
  • Correlation allows registration of a navigation space defined within the navigational domain with an image space defined by the image 78.
  • the patient tracker or dynamic reference frame 56 can be connected to the patient 14 to allow for a dynamic registration and maintenance of registration of the patient 14 to the image 78.
  • The navigation system 10, with or including the imaging system 12, can be used to perform selected procedures. Selected procedures can use the image data generated or acquired with the imaging system 12. Further, the imaging system 12 can be used to acquire image data at different times relative to a procedure. As discussed herein, image data can be acquired of the patient 14 prior to the procedure for collection of automatically registered image data or cine loop image data. Also, the imaging system 12 can be used to acquire images for confirmation of a portion of the procedure. Thus, image data may be acquired at any appropriate time and may be registered to the patient 14.
  • Registration and navigated procedures are discussed in U.S. Patent No. 8,238,631, incorporated herein by reference.
  • A graphic representation 90 (e.g., an icon, indicium, animation, or other visual representation) may be displayed relative to, including overlaid (e.g., superimposed) on, the image 78.
  • the graphic representation can be generated using any suitable software and/or hardware, including without limitation the processing unit 74.
  • the instrument 62 could incorporate one or more electromagnetic coil sensors along a portion of its length (e.g., the portion to be rendered in the graphic representation.)
  • a shape sensing fiber e.g. one made up of several fiber Bragg gratings
  • an algorithm that renders the pose, curve and/or shape of the instrument 62 could be employed, e.g., one that uses the flexure formula to take the known bending characteristics of a particular instrument 62 into account; the pose of the sensor(s) (e.g., electromagnetic sensor(s)) is determined and the resulting curve/pose/shape of the instrument 62 is generated.
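The source does not specify the curve-rendering algorithm beyond the flexure-formula remark; as a rough illustration, discrete sensor poses can be joined into a densified polyline for display. The function name, coil coordinates, and sampling density below are hypothetical.

```python
def densify_polyline(sensor_points, samples_per_segment=10):
    """Approximate the curve of a flexible instrument by linearly
    interpolating between successive tracked 3-D sensor positions."""
    curve = []
    for p, q in zip(sensor_points, sensor_points[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            curve.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    curve.append(tuple(sensor_points[-1]))  # include the final sensor point
    return curve

# Hypothetical electromagnetic coil positions along a catheter (mm)
coils = [(0.0, 0.0, 0.0), (10.0, 2.0, 0.0), (20.0, 8.0, 1.0)]
curve = densify_polyline(coils, samples_per_segment=5)
```

A production renderer would more likely fit a spline or a physics-informed bend model; linear interpolation is only the simplest stand-in.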
  • the pose, shape and/or position of the rendered portion of the instrument 62 can be displayed and, if desired, labeled.
  • the graphical representation can be captured in the ultrasound image and provided to the processing unit.
  • the image 78 may be an appropriate image and may include one or more 2D images, such as 2D images that are acquired at different planes. Images may also be a 3D image, or any appropriate image as discussed herein.
  • the imaging plane of the US imaging system 12 can also be determined.
  • imaged portions can be located within the patient 14.
  • a position of an imaged portion of the heart 15, or other imaged portion, can also be tracked as disclosed in U.S. Patent Nos. 6,379,302; 6,669,635; 6,968,224; 7,085,400; 7,831,082; 8,320,653; 8,811,662; and 9,138,204, all of which are incorporated herein by reference.
  • the patient 14 may be positioned on the table 40, as noted above.
  • the display 26 and/or the display 77 may display various information regarding the patient 14 and/or other information selected by the user 18.
  • the display 26 may illustrate the image 78 that may be acquired with the imaging system 12.
  • the image 78 may also be displayed on the display 77.
  • the instrument 62 may be tracked and the graphic representation 90 of the instrument 62 may be displayed on the display 26, such as relative to the image 78.
  • the graphic representation 90 may also be displayed relative to a patient avatar 120.
  • the patient avatar 120 may be illustrated on a selected display, as illustrated in Fig. 2, such as the display 26.
  • the patient avatar 120 may be based on a general avatar that includes various features of the patient 14, as noted herein, but is sized to the current and specific patient.
  • the avatar 120 may be illustrated as a two-dimensional (2D) image and/or a three-dimensional (3D) image.
  • the avatar 120 may be any appropriate shape to assist in identifying various positions and relative positions of portions of the patient 14.
  • the avatar 120 is sized to the specific patient as disclosed herein.
  • the patient avatar 120 may generally illustrate a portion of the patient 14 and may be sized relative to the patient 14.
  • the patient avatar may illustrate an inferior portion of a neck 122, a pectoral region 124, and other appropriate regions of the patient 14.
  • the patient avatar 120 may be displayed on the display device 26 to illustrate a general representation of the patient 14 without requiring an image acquisition of the same portion of the patient 14. Therefore, the patient avatar 120 may provide a general roadmap or indication of the position of various tracked portions, as discussed further herein.
  • the avatar 120 may however, as discussed herein, be morphed to match a relative size of the patient 14 to provide an indication of relative anatomical portions of the patient 14 and a pose of the tracked portions, such as the instrument 62, relative thereto.
  • the avatar 120 may be generally sized to the patient 14 according to one or more appropriate procedures.
  • the avatar 120 may be sized or morphed in real time to match relative dimensions of the patient 14.
  • the relative positions of the anatomic portions may include a relative position of a suprasternal notch and an exterior shoulder or inferior jugular vein, or any other suitable anatomical location(s).
  • the user 18 may provide or include information regarding the patient 14.
  • the navigation processing system 70 may include various inputs that allow the user 18 to input information. Therefore, as illustrated in Fig. 3, the user 18 may input information such as a height, a weight, and a sex of the patient 14.
  • the input information is used by the processor module 74 to execute selected instructions to size the patient avatar 120 to the actual patient 14.
  • the resizing of the avatar 120 may be based upon recalling of a selected or known predetermined size from a database, reconfiguring or resizing the patient avatar 120 based upon the input information, or other appropriate information. Therefore, the avatar 120 may be sized to substantially match the patient 14 without acquiring an image of the patient 14 for display on the display device 26.
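The ratio-based resizing described above might be sketched as a uniform scaling of a template avatar's landmark positions; the template landmark names, reference height, and coordinates below are invented for illustration.

```python
# Hypothetical template avatar defined at a reference height; landmarks in mm.
TEMPLATE_HEIGHT_MM = 1750.0
TEMPLATE_LANDMARKS = {
    "suprasternal_notch": (0.0, 1440.0),
    "left_shoulder": (-200.0, 1430.0),
    "right_shoulder": (200.0, 1430.0),
}

def size_avatar(patient_height_mm, template=TEMPLATE_LANDMARKS,
                template_height_mm=TEMPLATE_HEIGHT_MM):
    """Uniformly scale template landmark positions by the ratio of the
    patient's height to the template height."""
    ratio = patient_height_mm / template_height_mm
    return {name: (x * ratio, y * ratio) for name, (x, y) in template.items()}

sized = size_avatar(1600.0)
```

Weight and sex could select among several templates or apply non-uniform scale factors; the single-ratio version above is the minimal case.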
  • the user 18 may also identify or size the avatar 120 to the patient 14 with a tracked indicator or tracked member 130.
  • the tracked indicator 130, which may be an indicator or probe usable to touch a surface of the patient 14, may be moved by the user 18 to various predetermined positions on the patient 14 as indicated on the avatar 120.
  • the tracked indicator 130 may be tracked with any appropriate tracking system, including those discussed herein according to various embodiments such as the localizer system of U.S. Patent No. 5,983,126, hereby incorporated by reference.
  • the user 18 may move the tracked indicator 130 to various points, such as the exterior shoulder portions 132’, 134’ as illustrated by the icons 132 and 134.
  • the user 18 may move the tracked indicator 130 to other portions such as a suprasternal notch 136 and a sternum 140 of the patient 14 or outer boundary positions. It is understood that the tracked indicator 130 may be moved to any appropriate position and the above-noted positions are merely exemplary. Nevertheless, based upon the tracked position of the tracked indicator 130, the avatar 120 may be sized to morph the identified points to a representation of the patient 14 for display on the display device 26. Also, the points on the avatar 120 may be registered to the patient 14 by identifying them in the patient space relative to the patient 14 and identifying them on the avatar 120.
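One plausible way to morph the avatar to tracked landmark points, assuming no rotation between avatar and patient coordinates, is a least-squares fit of a uniform scale and translation. The landmark values below are hypothetical.

```python
def fit_scale_translation(template_pts, tracked_pts):
    """Fit a uniform scale and 2-D translation mapping template landmark
    points onto tracked patient points (least-squares, no rotation)."""
    n = len(template_pts)
    cx_t = sum(p[0] for p in template_pts) / n
    cy_t = sum(p[1] for p in template_pts) / n
    cx_m = sum(p[0] for p in tracked_pts) / n
    cy_m = sum(p[1] for p in tracked_pts) / n
    num = sum((p[0] - cx_t) * (q[0] - cx_m) + (p[1] - cy_t) * (q[1] - cy_m)
              for p, q in zip(template_pts, tracked_pts))
    den = sum((p[0] - cx_t) ** 2 + (p[1] - cy_t) ** 2 for p in template_pts)
    s = num / den
    return s, (cx_m - s * cx_t, cy_m - s * cy_t)

def apply_transform(point, s, t):
    """Map a template point into patient space with the fitted scale/offset."""
    return (s * point[0] + t[0], s * point[1] + t[1])

# Hypothetical shoulder/notch landmarks: template avatar vs. tracked indicator
template = [(-200.0, 0.0), (200.0, 0.0), (0.0, 10.0)]
tracked = [(-150.0, 50.0), (150.0, 50.0), (0.0, 57.5)]
s, t = fit_scale_translation(template, tracked)
```

A full similarity or affine fit (with rotation) would be needed if the avatar and patient frames are not axis-aligned.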
  • the avatar 120 may also be morphed to actual measurements taken of the patient 14. For example, a distance between the shoulder points 132’, 134’ on the patient 14 may be measured, such as with a tape measure. The measured distance may be input to the navigation processing system 70. The avatar 120 may be sized based upon the measurements.
  • the avatar 120 may be matched or morphed to substantially mimic the shape of the patient 14.
  • the avatar 120 may then be used as an illustration of a pose of the tracked instrument, such as the instrument 62, during a procedure on the patient 14.
  • the avatar 120 may provide a representation of a position of the tracked instrument 62 without requiring an image acquisition of the patient 14, such as with a fluoroscopic imaging system.
  • the avatar 120, as it is morphed to the specific patient, may be displayed as a patient specific avatar.
  • the avatar 120 may include general information, but is displayed and registered to the patient 14 as a patient specific avatar 120 including relevant locations and relative positions of various portions of the patient 14.
  • the specific patient avatar illustrates the relative positions of the anatomy of the patient 14 that is the current patient. For example, the position of the heart 15 relative to the suprasternal notch.
  • the avatar 120 may be registered to the patient 14 in any appropriate manner, including those discussed above.
  • the tracked indicator 130 is tracked to determine positions of the patient 14 for marking or changing the avatar 120.
  • after the various points are determined both on the patient 14 and on the avatar 120, the patient 14 may be registered to the avatar 120.
  • the patient tracker 56 may be positioned on the patient 14 at an appropriate position.
  • the position of the patient tracker 56 may be at the suprasternal notch 136’ which relates to a point on the avatar 120, such as the suprasternal notch 136.
  • the position of the suprasternal notch 136 in the avatar 120 may be determined or known to be the position of the patient tracker 56 to allow for registration of the avatar 120 relative to the patient 14.
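The single-landmark registration described here might be sketched as a pure translation that brings the avatar's suprasternal notch onto the tracked position of the patient tracker 56; coordinates are illustrative only, and a real system would also account for orientation.

```python
def register_by_landmark(avatar_points, avatar_notch, tracked_notch):
    """Translate every avatar point so the avatar's suprasternal notch
    lands on the tracked patient-tracker position (rotation ignored)."""
    dx = tracked_notch[0] - avatar_notch[0]
    dy = tracked_notch[1] - avatar_notch[1]
    dz = tracked_notch[2] - avatar_notch[2]
    return [(x + dx, y + dy, z + dz) for (x, y, z) in avatar_points]

# Hypothetical avatar points (notch at origin) and tracked notch position
registered = register_by_landmark(
    [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)],
    avatar_notch=(0.0, 0.0, 0.0),
    tracked_notch=(5.0, 5.0, 0.0),
)
```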
  • the patient tracker 56 may maintain registration even when the patient 14 moves relative to the localizer(s). It is understood that any appropriate number of patient trackers 56 may be utilized; each may be used for registration, and/or more than one of the patient trackers 56 may be used together.
  • the localizers may also or alternatively be fixed or positioned relative to the patient 14 in an appropriate manner.
  • the localizer 58 may be fixed relative to the table 40 such that it is immovable during the procedure.
  • the field generated with the EM localizer 58 can be at a known position relative to the patient 14 and allow for a registration of the avatar 120 to the patient 14 based upon the known position of the EM localizer 58 relative to the patient 14.
  • the localizer 55 may also be positioned under the patient 14, such as between the patient 14 and the bed 40. Localizer 55 may include field generating portions, such as coils, that generate a field relative to the patient 14.
  • the flat or panel localizer 55 may be positioned at a known position relative to the patient 14, such as aligning the shoulder edges 132’, 134’ to known or predetermined portions of the panel localizer 55. As the panel localizer 55 is at a known position relative to the patient 14, and the avatar 120 is morphed to the patient 14, the avatar 120 may be registered to the patient 14 due to the position of the panel localizer 55.
  • the avatar 120 may be registered to the patient 14. Registration of the avatar 120 relative to the patient 14 allows the avatar 120 to be used to assist navigation and guiding of the instrument 62, or any appropriate instrument relative to the patient 14. As the tracked position of the instrument 62 is determined with the navigation system 10, the pose of the instrument 62 may be illustrated with a graphical representation, such as the graphic representation 90 relative to the image 78 and/or the avatar 120.
  • the instrument’s graphic representation 90 is displayed superimposed on the avatar 120.
  • the user 18 may view the display 26 and understand the tracked position of the instrument 62 relative to the patient 14 by viewing the graphic representation 90 of the instrument 62 in the avatar 120 on display 26.
  • the user 18 may then understand the relative position of the instrument 62 to the patient 14 and determine that the instrument 62 is generally moving and/or is positioned in a selected or procedure planned position.
  • the user 18, thus, may understand this by viewing the display 26 and the avatar 120 without requiring additional or any image data being acquired of the patient 14.
  • image data may be acquired of the patient 14.
  • the imaging system 12 may acquire image data of the patient 14.
  • the imaging system 12 may generally be a non-ionizing imaging system, such as an ultrasound imaging system.
  • the ultrasound image may generally be collected at a single plane relative to the housing 16.
  • the US imaging system 12 with the tracker 22 may be tracked relative to the patient 14 so that the image may also be registered to the patient 14.
  • the image 78 may also be displayed superimposed on the avatar 120, such as the image superimposition 78i.
  • the image may also be displayed separately from the avatar, as also illustrated in Fig. 2.
  • because the image 78 may be registered to the patient 14, a single image acquisition may occur and the image 78 may be displayed for the procedure. Further, as the image 78 is registered to the patient, the graphic representation 90 of the instrument 62 may be displayed at the tracked pose on the image 78.
  • the US imaging system 12 may be used at one point in time to acquire the image data, but it may be used to illustrate the selected portion of the patient 14, such as the heart, for an entire procedure. A constant imaging of the patient 14 is not required to illustrate or know a position of the instrument 62 as the instrument 62 is tracked and the image 78 is registered to the patient 14 due to the US tracker 22.
  • the user 18 may perform the procedure on the patient 14 with the tracked instrument(s).
  • the procedure may be any appropriate procedure, such as placing a pacing lead and/or a catheter to assist in placing the lead. Other appropriate procedures may also be performed.
  • the user 18 may form an incision, such as an incision 150 on the patient 14, and move the instrument 62 into the patient 14 through the incision 150.
  • the tracked portion 60 on the instrument 62 may be tracked so that a position of the instrument 62 may be displayed relative to the avatar 120 and/or the image 78, as noted above.
  • the instrument 62 may then be moved relative to the patient 14.
  • the instrument 62 may be a catheter and/or lead for an implant, or a stylet or a delivery system for an implant such as a leadless pacemaker or a stent.
  • an implant, including an electrode may be positioned relative to one or more portions of the heart 15 of the patient 14.
  • the position of the instrument 62 may be illustrated with the graphic representation 90 over time and/or instantaneously relative to the avatar 120.
  • the graphic representation 90 illustrates the pose of the instrument 62 tracked relative to the patient 14.
  • the graphic representation 90 may be viewed by the user 18 to understand a real-time position of the instrument 62 within the patient 14 via the avatar 120.
  • the avatar 120 may be generated without requiring image data of the patient 14, such as with fluoroscopy.
  • the avatar 120 may also assist the user 18 in identifying or determining an orientation of the instrument 62 as being in a selected or not-selected pose relative to the patient 14. For example, as illustrated in Fig.
  • the avatar 120 may be viewed by the user 18 during a procedure.
  • the user 18 may view the representation 90 of the instrument 62 as it moves in a selected direction and/or position.
  • the tracked pose of the instrument 62 may be updated in substantially real time and displayed relative to the avatar 120. Therefore, the graphic representation 90 of the instrument 62, when it is moving in a not-selected position and/or pose, may also be illustrated and understood by the user 18.
  • a representation 90x may illustrate movement and/or pose toward a neck 122 of the patient 14. This may be a not-selected pose when the instrument 62 is selected to be moving toward the heart 15 of the patient 14. Therefore, the user 18 may view the graphic representation 90x and understand that the instrument 62 is moving away from the heart 15 of the patient 14.
  • a representation of the heart 15 may also be displayed on the avatar 120 including the image 78 and/or with a heart representation 15a that is included in the avatar 120 and appropriately located with the patient specific information.
  • the user 18 may view and understand the position of the instrument 62 during the placement of an implant, which may include, for example, a stimulation lead for a selected system.
  • the lead may include a pacing electrode for a cardiac stimulation system, such as the AZURE®, ADAPTA®, or other appropriate heart pacemaker or defibrillation systems sold by Medtronic, Inc., having a place of business in Minnesota, USA.
  • a method 200 for determining and displaying a patient specific avatar is illustrated.
  • the process may begin in Block 210 and include recalling a standard avatar in Block 214.
  • the processor such as the processor 74, may recall a standard avatar including various anatomical features, including exterior dimensions of a patient's anatomy and selected internal anatomy such as a heart, physical location of the heart, jugular vein, or the like.
  • Information regarding the patient, such as the specific patient may be received in Block 218.
  • the information regarding the patient may be any appropriate information, including that discussed above. Therefore, the information may include general dimensions of the current patient, including a height, weight, sex, or the like.
  • Information received in Block 218 may also include specific tracked points of predetermined portions (e.g., shoulder edges) of the patient 14. Further the received information may include measurements of the patient. Regardless, the received information of the patient may be patient specific information regarding the patient for a current or real-time procedure.
  • a patient specific avatar is then generated in Block 230.
  • the generation of a patient specific avatar may include morphing or fitting the standard avatar based upon the received information.
  • the method may include altering dimensions of the standard avatar based upon a ratio given the information received regarding the current patient. The ratio may be based upon the known dimensions of the current patient based upon known or average dimensions of various anatomical portions relative to one another based upon the dimensions of the specific patient. This may also include having predetermined average positions and relative positions of various anatomical features.
  • the tracked position of the instrument may identify specific anatomical locations.
  • the standard avatar may then be morphed based upon the tracked specific locations for the specific patient.
  • the avatar may be morphed based upon selected instructions including an algorithm to generate the patient specific avatar in Block 230.
  • the patient specific avatar may be displayed in Block 234.
  • the display 26 may display the patient specific avatar for selected procedures.
  • the display of the patient specific avatar may allow the user 18 to identify and understand relative positions or locations of various portions of the current patient 14.
  • An instrument may then be tracked, optionally and according to various embodiments, in Block 238. Although the tracking of the instrument is an optional part of the method 200, such tracking may allow for information regarding a pose of an instrument to be determined. Regardless of whether the instrument is tracked, the method 200 may include receiving tracking information in Block 240. [0075] Receiving tracking information in Block 240 may include tracking one or more tracking devices to determine a pose of various elements or portions, such as the pose of the instrument 62 via the instrument tracker 60 and/or the patient 14 via the patient tracker 56. The patient specific avatar may be registered to the patient based upon the known location of one or more specific portion(s) of the patient, such as the positioning of the patient tracker 56 in the suprasternal notch of the patient 14. The suprasternal notch of the avatar 120 may then be registered for determination of the pose of the instrument 62 relative to the patient tracker 56. Therefore, the pose of the instrument 62 may be displayed on the display device 26 such as the superimposed graphic representation 90 on the avatar 120.
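The mapping from a tracked instrument pose to patient (avatar) coordinates is not spelled out above; a common approach is to compose the localizer-frame poses of the patient tracker and instrument tracker. The sketch below assumes rigid 4x4 homogeneous transforms and uses translation-only example poses for simplicity.

```python
def mat_mul(a, b):
    """Multiply two 4x4 matrices (lists of lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform: transpose the rotation, negate-rotate
    the translation."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]],
            [0.0, 0.0, 0.0, 1.0]]

def translation(x, y, z):
    """Translation-only rigid transform."""
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical poses measured in the localizer frame
T_loc_instr = translation(100.0, 50.0, 0.0)  # instrument tracker 60
T_loc_pat = translation(80.0, 40.0, 0.0)     # patient tracker 56
# Instrument pose expressed relative to the patient tracker:
T_pat_instr = mat_mul(invert_rigid(T_loc_pat), T_loc_instr)
```

Because the avatar is registered to the patient tracker, `T_pat_instr` is what would drive where the graphic representation 90 is drawn on the avatar 120.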
  • the pose of the instrument may be superimposed on the avatar 120 in Block 244. Again, this allows a user 18 to view and understand a pose of the tracked instrument 62 on a representation of the patient 14 without requiring image data to be acquired to the patient 14. While image data of the patient 14 may be acquired, such as with the US imaging system 12, such image information is not required. The image data information from the US imaging system 12 may assist in finalizing or determining a specific location of the instrument 62 relative to a selected portion, such as a specific portion of the heart 15 of the patient 14. However, the position of the instrument 62 may be determined without the image data. Thus, the user may understand a position of the instrument 62, including a movement of the instrument 62 toward a selected position, such as the RV septum of the patient 14.
  • the method 200 may then end in Block 250. In ending the method in Block 250, it is understood that the method 200 may be repeated as often as necessary or selected to allow for up-to-date and/or real-time tracking and illustration of a pose of the instrument 62 relative to the avatar 120. Thus, the method 200 may be repeated at a selected rate and for a selected period, such as selected by the user 18.
  • moving the instrument 62 relative to the patient 14, such as relative to the heart 15, may be accomplished while tracking a pose of the instrument 62.
  • tracking a pose of the instrument 62 can involve attempting to determine a physical location and/or orientation of the instrument 62 in the physical space.
  • the tracked pose may include a location, which includes X, Y, and Z axis positions, and an orientation, which may include yaw, pitch, and roll orientations in space.
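A 6-DOF pose as described (X, Y, Z plus yaw, pitch, roll) is often packed into a 4x4 homogeneous transform. The Z-Y-X angle convention below is an assumption, since the source does not fix one.

```python
import math

def pose_to_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 transform from a 6-DOF pose (angles in radians;
    intrinsic Z-Y-X: yaw about Z, then pitch about Y, then roll about X)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]
```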
  • the physical space may also be referred to as navigation space and is generally defined relative to the patient 14 and/or the heart 15.
  • the patient tracker 56 allows the pose of the patient 14, including the heart 15, to be tracked. Tracking a pose relative to a portion of the patient 14, including the heart 15, may include systems such as those disclosed in U.S. Pat. No. 7,697,972 issued April 13, 2010; U.S. Pat. No. 8,046,052 issued October 25, 2011; U.S. Pat. No. 8,180,428 issued May 15, 2012; and U.S. Pat. No. 8,663,120 issued March 4, 2014, all of which are herein incorporated by reference.
  • the instrument 62 may be tracked relative to the patient 14 which may be displayed such as within the graphic representation 90 relative to selected portions including the avatar 120, the image 78, and/or an atlas 78m, as illustrated in Fig. 8.
  • the atlas 78m may be based upon the selected information, such as based on a plurality of images of patients that have been averaged and portions thereof identified.
  • the atlas 78m may be a model based on the image data from a plurality of images.
  • the atlas 78m (also referred to as a model) may have selected portions therein that are identified.
  • the atlas 78m may include the following identified locations: a right atrium 300, a superior vena cava ostium 304, a tricuspid valve 308, a right ventricle 312, and a right ventricle septum 316.
  • the image 78 is an example of one image in a set of images that can be used to build the atlas 78m.
  • the atlas 78m may also be generated using the patient’s pre-procedure imaging like a CT or MRI.
  • the image 78 may be obtained with a selected imaging system, including the US imaging system 12, as discussed above.
  • a patient specific anatomy model may be generated using multiple 2D/3D views/planes stitched together, for example to reconstruct the endocardium of the heart.
  • a clinician may identify various relevant portions or locations within the image 78.
  • the user 18 may identify portion(s) or location(s) in the image 78 shown in the right-hand side of Fig.
  • other locations identified may include the endocardium or a blood pool boundary. The location identification may be done manually by the user 18 and/or with the assistance of automatic location identification, such as segmenting a valve, etc.
  • Multiple images similar to the image 78 (e.g., multiple similar images from a population of patients other than the patient 14), with similar identified portion(s) or location(s), may be combined to build the atlas 78m to illustrate the identified portion(s) or location(s) on a model of the imaged anatomy or heart, which model represents a typical or average patient anatomy.
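Building the atlas from identified locations across a patient population might, in the simplest case, be a per-landmark average. Landmark names and coordinates below are invented for illustration.

```python
def build_atlas(landmark_sets):
    """Average corresponding 2-D landmark positions identified across a
    population of patient images to form an atlas/model."""
    atlas = {}
    for name in landmark_sets[0]:
        pts = [s[name] for s in landmark_sets]
        n = len(pts)
        atlas[name] = tuple(sum(p[i] for p in pts) / n for i in range(2))
    return atlas

# Hypothetical landmark identifications from two patient images
patients = [
    {"tricuspid_valve": (40.0, 60.0), "svc_ostium": (50.0, 20.0)},
    {"tricuspid_valve": (44.0, 64.0), "svc_ostium": (54.0, 24.0)},
]
atlas = build_atlas(patients)
```

A practical atlas pipeline would first register the images to a common frame before averaging; the sketch assumes that step has already been done.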
  • the atlas 78m may be built only from images of the anatomy of the patient 14, and the portion(s) or location(s) of interest applied to the atlas based on identification within the images of the patient’s own anatomy, and/or based on a compilation of images from other patients’ anatomy and an estimated and/or average position of the portion(s) or location(s) of interest within the compilation of images.
  • the atlas 78m may also be morphed to actual measurements taken of the patient 14.
  • a processing module or unit such as the processor 74, may execute selected instructions to register or identify various portions in the image 78 and/or on the atlas 78m for viewing by the user 18. For example, in the atlas 78m the right ventricle septum may be identified as a selected point including a target 330.
  • the image 78 may also, therefore, have the target 330x identified thereon to illustrate a proposed or selected position of the right ventricle septum (RV septum), e.g., to serve as a target for delivery of a catheter, stylet and/or pacing lead, or a leadless pacemaker.
  • the predetermined portions in the model 78m may be used to assist in identifying portions of the image 78 based on one or more portions identified by the user 18 and/or appropriate image identification systems. Therefore, the model 78m may be used to assist in identifying portions within the image 78 to assist in the procedure.
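Using user-identified portions to locate other portions via the model 78m could be sketched as fitting a per-axis scale and offset from two known correspondences (e.g., the tricuspid valve and the superior vena cava entrance) and mapping the model's RV-septum target through it. All names and coordinates are hypothetical.

```python
def map_target(model_pts, image_pts, model_target):
    """Estimate per-axis scale and offset from two corresponding
    landmarks identified in both the model and the image, then map
    the model's target point into image coordinates."""
    (mx1, my1), (mx2, my2) = model_pts
    (ix1, iy1), (ix2, iy2) = image_pts
    sx = (ix2 - ix1) / (mx2 - mx1)
    sy = (iy2 - iy1) / (my2 - my1)
    tx = ix1 - sx * mx1
    ty = iy1 - sy * my1
    return (sx * model_target[0] + tx, sy * model_target[1] + ty)

# Hypothetical correspondences: model landmarks -> image landmarks
target_in_image = map_target(
    model_pts=[(0.0, 0.0), (10.0, 10.0)],
    image_pts=[(100.0, 200.0), (120.0, 240.0)],
    model_target=(5.0, 5.0),
)
```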
  • the user may view the image 78 and identify or evaluate the target 330x prior to and/or during the procedure.
  • the image 78 may also include a plurality of images that are generated with the imaging system 12.
  • the US imaging system 12 may acquire images of the patient 14, including the heart 15, in a plane. As the US housing 16 is moved, the plane of the image relative to the patient 14 may also change.
  • the image 78 may include a plurality of images that are illustrated in one or more planes. For example, with reference to Fig. 9, schematic images that may represent an approximately coronal to the body co-ordinate system image 78c may be acquired and displayed and/or an image that is approximately axial to the body co-ordinate system 78ax may be displayed.
  • the two images 78ap and 78ml may both be displayed and registered to the patient 14, as discussed above.
  • the tracked position of the instrument 62 may be displayed relative to one or both of the images 78ml and 78ap. Also, both of the images 78ml and 78ap may assist in registering or relating of the images to the model 78m. Additionally, the user 18 may view the images, as discussed above, and identify portions or locations thereon and of the model 78m that may be used to assist in identifying various other portions. For example, the user 18 may identify the tricuspid valve 308’ and the superior vena cava entrance 304’ and the model 78m may be used to identify various other portions or locations, such as the target 330x on the RV septum 330.
  • the two images 78ap and 78ml may also assist in registration to the model 78m.
  • the identification of anatomy in the images 78 and/or one or both of the images and 78ap and 78ml may be assisted by the registration of more than one image relative to the model or atlas 78m.
  • the ultrasound imaging system 12 is registered to the patient 14; thus, the images acquired with the imaging system 12 may also be registered to the patient 14.
  • the registration of the images to the patient may be based upon the tracking of the imaging system 12, such as with the imaging system tracking device 22 and the tracking of the patient 14, such as with the patient tracking device 56.
  • the instrument 62 may be tracked relative to the patient 14 and its pose may be displayed relative to the image 78. As discussed above, the pose of the instrument 62 may also be displayed relative to the avatar 120.
  • the graphic representation 90 allows the user 18 to view the pose of the instrument 62 relative to both of the image 78 and the avatar 120.
  • the instrument 62 may be tracked in real time. Therefore, the instrument may be displayed relative to the avatar 120 and/or the image 78 to provide a real time graphical representation of the instrument 62 relative to the heart 15, based upon a graphical representation displayed relative to the image 78. In addition to viewing the tracked pose of the instrument 62, the graphical representation may show the pose of the instrument 62 over time. Therefore, the graphic representation 90 may illustrate or provide a sense of an entire path of the instrument 62 as it moves from the incision 150 in the patient 14 to the current location within the patient 14. Further, the graphic representation 90 may provide a real-time feedback to the user 18 of a movement of the instrument 62.
  • the images 78 may be acquired or based upon image data acquired at a selected time. For example, prior to the incision 150, image data may be acquired with the imaging system 12 to generate the image 78. Image 78 may thus be registered to the patient 14 and displayed with a display device 26 either alone and/or superimposed on the avatar 120. Nevertheless, the image 78 may not be a real time or current image and may be a static image. The instrument 62, however, may be tracked in real time and provide a current determination of the pose of the instrument 62. The graphic representation 90 may, therefore, be updated in real time and at a selected rate to illustrate a current position of the instrument 62 and/or a current shape or condition of the patient anatomy.
  • the display of the graphical representation may also be updated in a selected rate to provide a perception of motion of the instrument 62.
  • a refresh of the pose of the instrument 62 and of the graphic representation 90 may be 10 times a second, 30 times a second, or any appropriate rate such as about one time a second to about 120 times per second.
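The stated refresh range (about 1 to about 120 updates per second) maps to an update interval as sketched below; the clamping behavior is an assumption, not stated in the source.

```python
def refresh_interval(rate_hz, min_hz=1.0, max_hz=120.0):
    """Clamp a requested display refresh rate to the supported range and
    return the corresponding update interval in seconds."""
    rate = max(min_hz, min(max_hz, rate_hz))
    return 1.0 / rate
```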
  • the pose of the instrument 62 may be illustrated to show a motion within the display device 26.
  • the image 78 may be displayed on the display device 26.
  • the image 78 may not change over time as the image 78 may be a static image that is acquired at a selected time, such as prior to positioning the instrument 62 within the patient 14. Nevertheless, the tracking device 60 may be tracked in real time on the instrument 62.
  • the graphic representation 90 may include a first or initial time graphic representation 90t1.
  • the graphic representation 90t1 may represent a first graphic representation at a selected time, which may be arbitrarily referred to as a first or initial time.
  • a second graphic representation 90t2 may be displayed on the display device 26, as illustrated in Fig. 10B.
  • the graphic representation 90t2 may be displayed or have a selected position that is displaced a certain distance 350 relative to the first graphic representation 90t1.
  • the distance 350 may be any appropriate distance and may be based upon the various physical features, such as movement of a beating heart.
  • the heart 15 of the patient 14 may beat (i.e., move due to the heartbeat), as is understood by one skilled in the art. Therefore, the graphic representation 90t2, including a terminal end 90x, may be shown in different positions based upon movements of the instrument 62. For example, as one skilled in the art will understand, when the terminal end of the instrument 62 contacts the RV septum 316, the terminal end of the instrument 62 may move due to the beating of the heart 15.
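The displacement 350 between the representations at two times can be quantified as the Euclidean distance between rendered tip positions; the function name and coordinates below are illustrative.

```python
import math

def tip_displacement(p1, p2):
    """Euclidean distance between the instrument tip's rendered 3-D
    position at two successive times (e.g., 90t1 and 90t2)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))
```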
  • the instrument 62 may appear to move in real time.
  • Real-time tracking of the instrument 62 with the tracking device 60 may allow for a representation of the movement of the instrument 62 to be displayed on the display device 26.
  • the movement of the instrument may be updated and displayed at any appropriate rate to provide a perception of movement to the user 18.
  • the user 18 may be provided with additional information regarding a position of the instrument 62, such as being in contact with the RV septum 316.
  • the process 370 may be a process that is executed by one or more of the processing units, such as processing unit 74.
  • the method 370 may begin at start Block 374.
  • the process 370 may determine a first position or pose of the instrument in Block 378. Determining the first pose of the instrument may be performed with the navigation system, as discussed above.
  • the pose of the instrument 62 may be known relative to the patient 14, or portions thereof such as the heart 15, in light of the tracking of the patient with the patient tracker 58.
  • an output for display of the determined pose of the instrument may be made in Block 382.
  • the output of the determined pose may be based upon the registration of the image 78 and/or the avatar 120 to the physical space defined by the patient 14, as discussed above.
  • the output may display the graphic representation 90 at a first position, for example the first pose 90t1, as illustrated in Fig. 10A.
  • a wait or pause for a selected time may occur in Block 384.
  • the wait for a selected time may be any appropriate time, such as about one second to about 120 seconds.
  • the wait time may be any appropriate time to assist in illustrating a motion to the user 18, as discussed above.
  • the second pose may be determined in Block 388.
  • the second pose may be determined in the same manner as the first pose was determined in Block 378.
  • An output for display of the second pose may be made in Block 392 to display the graphical representation of the instrument 62 at a second time, such as the graphic representation 90t2. As noted above, therefore, the pose of the instrument 62 may be displayed over a series of the events and/or time to show movement to the user 18.
  • the process 370 may provide an illustration to the user 18 for an appropriate period of time. If tracking is not to be continued, a NO path 400 may be followed to end the process in Block 404.
  • the process 370 may be used to illustrate and determine a pose of the instrument 62 over time for illustration relative to the image 78, as illustrated in Figs. 10A and 10B.
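The display loop of process 370 (Blocks 378 through 392) might be sketched as follows. This is a minimal illustration, not the disclosed implementation: `get_pose` and `display` are hypothetical callbacks standing in for the navigation system's pose determination and the display device output.

```python
import time

def track_and_display(get_pose, display, period_s=1.0, n_updates=3):
    """Sketch of process 370: repeatedly determine the instrument pose
    and output it for display, pausing between determinations so the
    user perceives motion (e.g., motion due to the beating heart)."""
    poses = []
    for _ in range(n_updates):
        pose = get_pose()      # Blocks 378/388: determine a pose
        display(pose)          # Blocks 382/392: output for display
        poses.append(pose)
        time.sleep(period_s)   # Block 384: wait a selected time
    return poses
```

With `period_s` set near one second, successive graphic representations such as 90t1 and 90t2 would be drawn at a rate that conveys the displacement 350 to the user.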
  • the display 26 may display the image of the patient 14 that is acquired with the imaging system 12.
  • the image 78 may be an image that is acquired in a plane from the ultrasound housing 16. Therefore, rotation of the image may change a perspective of the image 78.
  • the image may be viewed over the plane 78f or from an edge 78e.
  • the edge image 78e may allow for a viewing and understanding of a perspective of the instrument 62, as illustrated by the graphic representation 90, relative to a plane of the image.
  • the image acquired with the imaging system 12 may be a static image that is acquired with the imaging system 12. It is understood, however, that a live or current image may also be acquired and displayed. Nevertheless, the position of the instrument 62 relative to the plane of the image may be illustrated by rotating the image, such as showing it from an edge as illustrated in the edge image 78e.
  • the distance of the instrument 62 that is within the plane of the image may also be illustrated.
  • first and second distance icons 410, 412 may be included or illustrated superimposed on both of the images 78e, 78f.
  • the distance icons may be used to illustrate the distance over which the instrument 62 lies within the plane of the image.
  • the user 18 may also understand which portion of the instrument is within the plane of the image and that other portions are not within the plane of the image.
  • the distance icon 410, 412 may be sized depending upon the length or portion of the instrument 62 that is within the plane of the image 78. Therefore, the first distance icon 410 may have a first distance or size 420.
  • the second distance icon 412 may have a second distance or size 424.
  • the two dimensions 420, 424 may be different or the same depending upon the amount of the instrument or length of the instrument that is within the plane of the image 78. If the instrument has a greater length within the plane, the distance icons 410, 412 may have greater dimensions. If any dimension of the instrument is smaller in the plane, then the dimension icons may be smaller.
  • the dimension icons 410, 412 may be displayed, such as superimposed on, the image 78 to represent the distance of the instrument within the image plane. This may allow the user 18 to understand or determine the distance of the instrument 62 within a plane of the imaging system and/or assist in moving of the imaging system to acquire a different perspective than in the image 78 relative to the instrument 62.
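One plausible way to compute the dimension used to size the distance icons 410, 412 — the length of the tracked instrument lying within the image plane — is to intersect the instrument's tracked segment with a thin slab about the plane. The patent does not specify this computation; the slab half-thickness here is an assumed stand-in for the ultrasound plane's effective thickness.

```python
import numpy as np

def in_plane_length(p0, p1, plane_point, normal, half_thickness):
    """Length of the instrument segment p0->p1 lying within a slab of
    +/- half_thickness about the image plane."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    d0 = np.dot(p0 - plane_point, n)   # signed distances to the plane
    d1 = np.dot(p1 - plane_point, n)
    if d0 == d1:                       # segment parallel to the plane
        return float(np.linalg.norm(p1 - p0)) if abs(d0) <= half_thickness else 0.0
    # Parameter range of the segment inside the slab, clipped to [0, 1].
    ts = sorted(((-half_thickness - d0) / (d1 - d0),
                 (half_thickness - d0) / (d1 - d0)))
    lo, hi = max(ts[0], 0.0), min(ts[1], 1.0)
    return max(hi - lo, 0.0) * float(np.linalg.norm(p1 - p0))
```

The returned length could then drive the on-screen sizes 420, 424 of the icons, growing when more of the instrument lies in the plane and shrinking otherwise.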
  • the display 26 may display the image of the patient 14 that is acquired with the imaging system 12 at more than one location. In other words, more than one image may be displayed for each position of the imaging system 12 at which image data is acquired. Moreover, one or more image data portions may be merged and/or used in a reconstruction of a displayed image. As discussed above and herein, all of the images and/or the reconstructed image may be registered to the patient 14 due at least to the tracking of the imaging system 12.
  • the image 78 may be an image that is acquired in a plane from the ultrasound housing 16. Therefore, rotation of the image may change a perspective of the image 78.
  • the image may be viewed as acquired at a first location (e.g., inferior from the suprasternal notch) over a plane 78f’ or from an edge 78e’. These two views may provide information to the user, as discussed above, including the location and/or dimension at which the instrument graphic representation 90 passes through the plane 410’, 412’, 420’, 424’.
  • the image may be viewed as acquired at a second location (e.g., between selected ribs) as over a plane 78f” or from an edge 78e”.
  • These two views may also provide information to the user, as discussed above, including the location and/or dimension at which the instrument graphic representation 90 passes through the plane 410”, 412”, 420”, 424”.
  • the graphic representation 90 is illustrated to pass through the planes at the different positions and/or dimensions 410’, 412’, 420’, 424’, 410”, 412”, 420”, 424”, and this information is displayed for the user.
  • any appropriate number of location images may be acquired and/or displayed and two is merely exemplary.
  • the edge image may allow for a viewing and understanding of a perspective of the instrument 62, as illustrated by the graphic representation 90, relative to a plane of the image.
  • the image acquired with the imaging system 12 may be a static image that is acquired with the imaging system 12. It is understood, however, that a live or current image may also be acquired and displayed. Nevertheless, the position of the instrument 62 relative to the plane of the image may be illustrated by rotating the image, such as showing it from an edge as illustrated in the edge image.
  • the instrument 62 may be tracked with any appropriate tracking system.
  • tracking and/or navigation systems include those disclosed above and those disclosed in U.S. Pat. No. 7,697,972 issued April 13, 2010; U.S. Pat. No. 8,046,052 issued October 25, 2011 ; U.S. Pat. No. 8,180,428 issued May 15, 2012; and U.S. Pat. No. 8,663,120 issued March 4, 2014, all of which are herein incorporated by reference.
  • the navigation system may allow for a determination of a pose of the instrument 62 in a physical location and/or orientation of the instrument 62 in the physical space.
  • the tracked pose may include a location which includes X, Y, and Z axis positions and/or orientation which may include yaw, pitch, and roll orientations in space.
  • a tracking and navigation system may also include one or more permanent magnets and/or magnetometers.
  • a magnet 500 may be positioned with, in or on the instrument 62 as the instrument tracking device 60, according to various embodiments.
  • the magnet 500 may be fixed relative to the instrument, which can comprise at least one of a catheter 62c, a dilator 62d, or other portion or device that may be temporarily placed in and then removed from the patient 14.
  • the magnet 500 may also be placed on a lead 62I, or a leadless pacemaker or a delivery system therefor (not shown) that may be inserted or implanted in the patient 14.
  • the magnet 500 may emit a magnetic field 504 relative thereto that may be detected and used to track the instrument 62.
  • the magnet 500 may be a magnet that has a selected dimension 508 as a first dimension, a second dimension 510, and a third dimension 512.
  • the three dimensions 508, 510, and 512 may allow for generation of the magnetic field 504 to define an axis 518 relative to the magnet 500. Therefore, the axis 518 may be determined relative to the magnet 500 when the field 504 is sensed, as discussed herein.
  • the magnet 500 may be magnetized in any appropriate orientation; thus, the magnet 500 may also define an axis 518l that is a long axis of the magnet 500 if the magnet is magnetized along its long axis.
  • the magnet 500 may be formed of a selected material that enables the magnet to generate a magnetic field on its own, without need for any external power. Thus, the magnet 500, in various embodiments, forms the magnetic field alone. In other words, the magnet 500 may be the only member forming the magnetic field according to various embodiments.
  • the magnet 500 may be formed of selected materials, including more than one material, such as rare earth materials. Rare earth magnets include neodymium-iron-boron magnets.
  • the field 504 of the magnet 500 may be an appropriate strength, such as about 0.001 Tesla (T) to about 5 Tesla, including about 0.1 T to about 3T. Detectability of the field could be around 1 nT. According to various embodiments, magnetic tagging may be used such as disclosed in U.S. Pat. No. 6,329,916.
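The field 504 falls off rapidly with distance from the magnet. A standard point-dipole approximation — an assumption here, since the patent does not specify a field model — gives the flux density that an exterior sensor would see at distances large compared with the magnet's dimensions 508, 510, 512:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(moment, r_vec):
    """Magnetic flux density (tesla) of a point dipole with moment
    `moment` (A*m^2) at displacement `r_vec` (m) from the magnet:
    B = mu0/(4*pi) * (3(m.rhat)rhat - m) / |r|^3."""
    m = np.asarray(moment, float)
    r = np.asarray(r_vec, float)
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0 / (4 * np.pi) * (3 * np.dot(m, rhat) * rhat - m) / rn**3
```

Because the field decays as the cube of the distance, a sensed magnitude around the stated 1 nT detectability floor bounds how far from the patient's skin the sensing assemblies can usefully be placed.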
  • the field 504 of the magnet 500 may be sensed exterior to the patient 14, as discussed further herein, when the magnet 500 is within the patient 14.
  • the magnet 500 may include a dimension or size that allows it to be inserted into the patient 14, such as into and relative to the heart 15 of the patient 14. Thus, the magnet 500 may be positioned relative to the heart 15 and be sensed exterior to the patient 14, as discussed herein.
  • the magnet 500 may be positioned on, in or relative to one or more instruments 62 that is positioned within the patient 14.
  • the magnet 500 may be used to determine a pose of the instrument 62.
  • the pose may include up to six degrees of freedom that may include x, y, and z axis and/or orientation including yaw, pitch, and roll.
  • the lead 62I may be positioned within the patient 14 and implanted into the heart 15.
  • the lead 62I may be permanently positioned, such as within the patient 14 during the procedure and retained in the patient 14 after the procedure.
  • the lead 62I may include various portions including a fixation member 520 and a conductor 522.
  • the magnet 500 may be fixed relative to the lead 62I.
  • the magnet 500 may include the axis 518 that is at a known orientation relative to various portions of the lead 62I including a long axis 524 of the lead 62I and a selected distance from the fixation member 520.
  • the pose of the lead 62I may be determined based upon sensing the magnetic field 504, determining the axis 518 and the known or predetermined location of the magnet 500 relative to other portions of the lead 62I.
  • the magnet 500 may also be positioned on, in or relative to other instruments, such as the catheter 62c.
  • the catheter 62c may include an interior cannula 528 through which various members may be passed, such as the lead 62I or other instruments. The magnet 500, however, may be positioned within the catheter 62c, such as in a wall thereof.
  • the magnet 500 may define the field 504 that defines the axis 518.
  • the orientation of the axis 518 may be known relative to a portion of the catheter 62c, such as a long axis of the catheter 530. Further, a distance of the axis 518 relative to a distal end 532 of the catheter 62c may also be known. Thus, a pose of the catheter 62c may be determined based upon sensing the field of the magnet 500 and determining the location and/or orientation of the axis 518.
  • the magnet 500 can also be shaped with a hole along an axis of the magnet (e.g., the long axis 518l) to allow passage of a guide wire, lead, etc., assuming a cylindrical shape for the magnet and coaxial placement of it within the catheter.
  • the dilator 62d may also incorporate the magnet 500.
  • the dilator 62d may include a distal end 536 that is enlarged relative to an elongated portion or shaft 538. The end 536, therefore, may allow for inclusion of the magnet 500 therein.
  • the magnet 500 may include a selected volume that may be maintained within the distal end 536. Thus, the magnet 500 may generate the field that defines the axis 518.
  • the location and/or orientation of the axis 518 may again be known relative to a distal tip 540 of the dilator 62d and also relative to a long axis 542 of the dilator 62d. This allows a pose of the dilator 62d, particularly the end 536 thereof, to be determined with the navigation system when sensing the field 504 of the magnet 500.
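Once the axis 518 is sensed, recovering the position of a distal feature (the tip 540, the distal end 532, or the fixation member 520) is a simple frame composition. A minimal sketch, assuming the axis direction is available as a vector and the offset along it is a calibration constant fixed when the magnet is mounted in the instrument:

```python
import numpy as np

def tip_from_magnet(magnet_pos, axis_dir, tip_offset):
    """Position of a distal feature lying `tip_offset` metres along the
    sensed axis 518 from the magnet position. The offset is a
    hypothetical calibration constant, known at manufacture."""
    d = np.asarray(axis_dir, float)
    d = d / np.linalg.norm(d)
    return np.asarray(magnet_pos, float) + tip_offset * d
```

The same composition applies to the lead 62I, catheter 62c, and dilator 62d, with each instrument contributing its own predetermined offset and axis relationship (524, 530, 542).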
  • the magnet 500 may be associated with, such as fixed to or incorporated into, one or more of the instruments 62.
  • the instrument 62 may be positioned within the patient 14 and the magnetic field 504 of the magnet 500 may be sensed.
  • the magnetic field 504 may be sensed with an appropriate instrument, such as a magnetometer.
  • the field sensed with the magnetometer may then be used to determine a pose of the magnet 500 and, therefore, a pose of the related instrument, such as that described above.
  • each instrument may have more than one magnet 500.
  • the catheter 62c may have a plurality of magnets spaced along a length of the catheter 62c.
  • a pose of each of the magnets 500 along the length and, therefore, each portion including one of the magnets may be determined. This may allow a pose of more than one portion of the instrument to be determined simultaneously.
  • only one magnet 500 may be provided or a plurality of the magnets 500 may be provided on an instrument and/or relative to the patient 14.
  • the magnet 500 may be understood to be a magnetic field generator that may generate a magnetic field as a permanent magnet.
  • a permanent magnet includes material that generates the magnetic field 504 alone with no external force or power.
  • the permanent magnet and/or field generated thereby therefore, may differ from a magnetic field that is generated with a coil through which a direct current (DC) or alternating current (AC) is passed.
  • a direct or alternating current may be passed through a coil of wire to generate a magnetic field.
  • the coil must be connected to a power source to generate and drive the current through the wire and to form a field. Only the permanent magnet, however, generates the magnetic field with no current being supplied thereto.
  • the magnetic field of the permanent magnet may be sensed by a magnetic field sensing device, such as a Hall effect sensor, an anisotropic magnetoresistive (AMR) sensor, and/or other appropriate DC magnetic field sensor.
  • AMR sensors may include Analog Devices Anisotropic Magnetoresistive sensors, which are targeted for typical EM localization applications.
  • the AMR sensors also sense DC magnetic fields.
  • a six degree of freedom (6DOF) configuration affixed to the skin at one or more, including three or more, discrete locations should be able to track a position of the magnet 500 in the patient 14.
  • a fiducial marker may be incorporated with the sensor so each could be precisely localized with a tracking system (e.g., an infrared (IR) optical tracking system) and a point probe.
  • IR reflective spheres could be used for real-time direct tracking of the sensors rather than periodically touching off on the fiducial markers with a probe.
  • a network of such sensors may work similar to a cell tower system used for continuous telecommunications coverage.
  • a magnetic sensor may include those as noted above.
  • a magnetic sensor may be oriented relative to a fixed position and sense a field along at least one axis.
  • a magnetic sensing assembly 550 may include a selected number of magnetic sensors, e.g., three orthogonally oriented Hall effect sensors, as disclosed in U.S. Patent No. 7,650,178, incorporated herein by reference.
  • the magnetic sensing assembly may include three magnetic sensors including a first magnetic sensor 560, a second magnetic sensor 564, and a third magnetic sensor 568. It is understood, however, that the magnetic sensing assembly 550 may include more or fewer than three magnetic sensors.
  • the assembly 550 may include four or more sensors or two or fewer. More sensors may allow for greater resolution of location and orientation, if selected.
  • three magnetic sensors 560 - 568 may be oriented within the magnetic sensing assembly 550 to sense a magnetic field along one of three respective axes.
  • the first magnetic sensor 560 may sense the field along a first axis 570
  • the second magnetic sensor may sense the field along a second axis 574
  • the third sensor 568 may sense a magnetic field along a third axis 578.
  • Each of the three axes 570 - 578 may be oriented substantially orthogonal to one another. Therefore, the three-dimensional position of a sensed magnetic field may be determined relative to a point defined relative to the magnetic sensing assembly 550. For example, the three-dimensional position may be determined relative to an origin point 590 established relative to the magnetic field sensing assembly 550.
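The three scalar readings of the orthogonal sensors can be combined into a field vector at the origin point 590 and, given the assembly's registered orientation, re-expressed in the patient frame. This is a sketch under the assumption that the registration supplies a rotation matrix for each assembly; the patent does not prescribe this representation.

```python
import numpy as np

def field_in_patient_frame(readings, R_assembly_to_patient):
    """Combine the scalar readings of sensors 560, 564, 568 (taken
    along the orthogonal axes 570, 574, 578) into a vector at origin
    590, then rotate it into the patient frame using the assembly's
    registered orientation matrix."""
    b_local = np.asarray(readings, float)
    R = np.asarray(R_assembly_to_patient, float)
    return R @ b_local
```

Expressing every assembly's reading in one common frame is what lets readings from the several assemblies 550a-550d be compared or fused.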
  • the pose (including a position and/or orientation) of the axis 518 of the magnet 500 may be known relative to an orientation of the magnet 500 and also relative to a configuration of the instrument 62.
  • An orientation of the magnet 500 may be determined and a related orientation of the instrument 62 may also be determined relative to the origin point 590.
  • the pose of the magnet 500 may be determined and the associated pose of the instrument 62 may also be determined.
  • the magnetic sensing assembly 550 may sense the magnetic field of the magnet 500 and a determination of the pose of the magnet and the related instrument may be made based thereon.
  • each of the three magnetic sensors 560, 564, 568 may sense the magnetic field independently when the magnet 500 is in any specific position relative thereto. Therefore, a determination may be made, based upon the sensed field, of the location and orientation of the magnet 500 relative to the magnetic sensing assembly 550 including the origin point 590. The sensed field may then be used by the navigation system to determine the pose of the instrument 62 associated with the magnet 500.
  • the magnetic sensing assembly 550 may be positioned on or near the patient 14 at any appropriate location. With reference to Fig. 16, the magnetic sensing assembly 550 may be provided on the patient 14 at a single location or at a plurality of locations.
  • the magnetic sensing assembly 550 may be positioned on the patient 14 at four positions or locations relative to the heart 15 of the patient 14. As illustrated in Fig. 16, four different magnetic sensing assemblies may be positioned, referenced as 550a, 550b, 550c, and 550d. Each of the magnetic sensing assemblies 550 may be positioned at different locations relative to the heart 15. Therefore, as the instrument 62 moves relative to the heart, the magnet 500 may be sensed by one or more of the magnetic sensing assembl(ies) 550 during the procedure.
  • the navigation system may determine the pose of the instrument and display it with the graphic representation 90 on the display device 26.
  • the graphic representation 90 may be displayed relative to the avatar 120 and/or the image 78.
  • the user 18 may understand the pose of the instrument 62 during the procedure.
  • the pose of the instrument 62 may be determined based upon a sensed magnetic field sensed by any one of the magnetic assemblies 550 and/or two or more of the magnetic assemblies 550.
  • the navigation system may determine the pose of the instrument 62 based upon the received one or more signals.
  • the pose of the magnet 500 may be determined and a pose of the instrument 62 may be determined based thereon.
  • the pose of the instrument 62 may be displayed with a graphical representation on the display device 26, including the superimposed graphic representation 90.
  • the pose of the instrument 62 may be determined with the first magnetic sensing assembly 550a when the magnetic sensing assembly 550a is nearest the magnet 500. As the instrument 62 moves past the magnetic sensing assembly 550a and closer to the magnetic sensing assembly 550b, the magnetic sensing assembly 550b may be used to determine the pose of the instrument 62.
  • the choice of the specific magnetic sensing assembly, if more than one is provided, used for the determination of the pose of the instrument 62 at a particular moment in time may be based upon the strength of the field sensed from the magnet 500 associated with the instrument 62.
  • the navigation system may determine the magnetic sensing assembly that sensed the strongest magnetic field and make the determination based upon that magnetic sensing assembly signal. It is understood, however, that the pose of the instrument 62 may be determined at any particular time based upon the signal received from more than one of the magnetic sensing assemblies 550.
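The selection of the assembly with the strongest sensed field might be sketched as follows; the dictionary keys are hypothetical labels for the assemblies 550a-550d.

```python
import math

def strongest_assembly(measurements):
    """Return the label of the sensing assembly whose 3-axis reading
    has the largest field magnitude -- the basis for choosing which
    assembly's signal to use for the pose determination."""
    def magnitude(b):
        return math.sqrt(sum(c * c for c in b))
    return max(measurements, key=lambda k: magnitude(measurements[k]))
```

As the instrument advances from near 550a toward 550b, the returned label changes once 550b's sensed magnitude exceeds 550a's, which matches the hand-off behavior described above.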
  • a process 600 for displaying the pose of the instrument 62 is disclosed.
  • the process 600 may be included in an algorithm and/or system having instructions that are executed with the processing unit, such as the processing unit 74. Therefore, it is understood that the process 600 may be substantially an automatic process. Nevertheless, the process 600 may include various manual inputs, such as those from the user 18.
  • the process 600 may begin at start Block 604.
  • Starting in Block 604 may include initializing the system, preparing the patient 14, or otherwise associating the instrument 62 in a navigation space relative to the patient 14.
  • a zeroing step may be made to account for the effects of earth’s magnetic field.
  • the magnet 500 should be away from the sensors, and the measured output due to earth’s magnetic field should be subtracted to make all sensor outputs zero.
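The zeroing step can be sketched as a baseline subtraction. This assumes Earth's field is locally constant over the workspace and that a baseline is captured for each assembly while the magnet 500 is far away; the assembly labels are hypothetical.

```python
import numpy as np

def zero_sensors(baseline_readings):
    """Capture a baseline per sensing assembly (magnet absent) and
    return a correction function that subtracts the ambient (Earth)
    field from subsequent readings, so outputs read zero when only
    the ambient field is present."""
    baseline = {k: np.asarray(v, float) for k, v in baseline_readings.items()}
    def corrected(name, reading):
        return np.asarray(reading, float) - baseline[name]
    return corrected
```

Earth's field (tens of microtesla) can dwarf the magnet's field at the sensor, so this subtraction is what makes the weak dipole signal recoverable.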
  • a recall of a pose determination system may be made in Block 610.
  • the pose determination system may be any appropriate pose determination system.
  • an algorithm may determine a pose of the instrument relative to one or more of the magnetic sensing assemblies 550. The algorithm, therefore, may be recalled or loaded into memory for access by the processing unit during execution of the method 600.
  • a machine learning system including an artificial intelligence or neural network, may also be trained and recalled. The machine learning system may have been trained based upon moving in the magnet relative to one or more of the magnetic sensing assemblies 550 to train a machine learning system. The machine learning system may then be used to determine a pose of the magnet 500 during use of the magnet 500 in a procedure on the patient 14.
  • the machine learning system may be trained in any appropriate time, e.g., any time prior to the instant procedure, to determine a pose of the magnet relative to one or more of the sensing assemblies 550.
  • the trained system may also be updated with information from one or more procedures to assist in enhancing a validity of the trained system.
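A drastically simplified stand-in for such a trained determination system is a lookup trained on fields recorded at known magnet positions. A nearest-neighbour table is used here purely for illustration of the train-then-infer structure; an actual machine learning system (e.g., a neural network) would generalize between training positions. `field_fn` is a hypothetical function mapping a magnet position to the sensed 3-axis field.

```python
import numpy as np

def train_lookup(positions, field_fn):
    """Training: record the field produced at the sensor for each known
    magnet position. Inference: estimate an unknown position by nearest
    neighbour over the recorded fields."""
    fields = np.array([field_fn(p) for p in positions])
    positions = np.asarray(positions, float)
    def predict(measured):
        i = np.argmin(np.linalg.norm(fields - np.asarray(measured, float), axis=1))
        return positions[i]
    return predict
```

Updating the trained system with data from later procedures, as described above, would correspond to appending new (position, field) pairs or retraining.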
  • a field from a magnet may be sensed in Block 614.
  • a permanent magnet may have a DC magnetic field and may be sensed by any appropriate sensing system, including the magnetic sensing assembly 550. Therefore, once the process 600 starts in Block 604, a field may be sensed in Block 614.
  • more than one magnetic sensing assembly 550 may be provided and employed during a procedure. Thus, the magnetic field may be sensed by more than one magnetic sensing assembly 550.
  • the process 600 allows for determination of whether the magnetic field is sensed by more than one sensing assembly 550 in Block 620.
  • the determination may be made that the magnetic field is sensed by only one magnetic sensing assembly, by a plurality of magnetic sensing assemblies, or by no magnetic sensing assemblies.
  • the process 600 may allow for a determination of whether the magnetic field is sensed by one or more of the magnetic sensing assemblies.
  • a NO path 624 may be followed.
  • the NO path 624 leads to a determination of which sensing assembly senses the strongest field in Block 628.
  • the determination of which sensing assembly senses the strongest field may allow for a choice of which sensing assembly to utilize to determine a pose of the magnet(s).
  • the determination of the pose of the magnet(s) may be based upon the strongest sensed field even when there is a plurality of the sensing assemblies 550.
  • signals weaker than the best signal need not be discarded; they can be used to improve position and orientation registration.
  • a YES path 634 may be followed.
  • a signal, after following the YES path 634, may be sent from the selected sensing assembly in Block 640.
  • the signal from the sensing assembly may include a signal that is an aggregate of all of the three orthogonal axes of all the sensing portions 560, 564, and 568 or an individual signal from each of the sensing portions. The signal, therefore, may be based upon the sensed field from the magnet 500.
  • the signal may be evaluated according to the recalled determination system in Block 644.
  • the determination system may be any appropriate determination system including a specific algorithm, a trained machine learning system (e.g., a neural network), or other appropriate determination system.
  • the determination system may be included in instructions that are executed by the processing unit 74.
  • the pose may, therefore, be determined based upon the signal sensed from the selected sensing assembly from Block 640.
  • the evaluation allows for an output of the determined pose relative to the sensing assembly in Block 648.
  • the pose relative to the sensing assembly 550 allows for a determination of the pose of the magnet 500 relative to the sensing assembly 550 from which the signal is sensed based upon sensing the magnetic field.
  • the pose relative to the sensing assembly 550 may be used to determine the pose of the magnet relative to other portions, such as the image 78 and/or the avatar 120.
  • a recall of the registered pose of the sensing assembly may be made in Block 654.
  • the pose of the sensing assembl(ies) 550 may be determined by tracking the sensing assembl(ies) 550 relative to the patient tracker 56, the localizer(s) 50, 55, or other appropriate mechanisms.
  • the sensing assembl(ies) may be positioned at known or predetermined positions relative to the patient 14 including the heart 15.
  • the sensing assembl(ies) 550 may be positioned at a specific geometric position relative to the heart 15 of the patient 14.
  • the sensing assemblies 550 may have known poses relative to the heart 15.
  • An image of the heart, such as the image 78, may then have a known pose relative to the sensing assemblies 550.
  • a signal sensed by the sensing assemblies 550 from the magnets associated with the instrument 62 may then be known relative to the sensing assemblies 550 for determining a pose to generate the graphic representation 90.
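Composing the registered assembly pose with the magnet position determined relative to that assembly yields the position in the image frame used to draw the graphic representation 90. A minimal sketch, assuming the registration supplies a rotation matrix and translation vector (assembly frame to image frame) for each assembly:

```python
import numpy as np

def magnet_pose_in_image(R_sa, t_sa, p_magnet_in_sa):
    """Map a magnet position expressed relative to a sensing assembly
    550 into the image 78 frame, given the assembly's registered
    rotation R_sa and translation t_sa."""
    return np.asarray(R_sa, float) @ np.asarray(p_magnet_in_sa, float) + np.asarray(t_sa, float)
```

Because each assembly 550a-550d carries its own registered pose, a magnet position determined against any one of them lands in the same image frame.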
  • the system may further recall a pose of the magnet relative to the instrument in Block 660.
  • the position of the magnet 500 relative to the instrument 62 is known. As illustrated in Fig. 13, the axis 518 may be defined relative to the magnetic field 504. It is understood, however, that the longitudinal axis 518l may also be determined relative to the magnet 500.
  • the magnetic field 504 may be rotated substantially 90° relative to the magnet, as illustrated in Fig. 13, allowing the axis 518l to be the axis defined by the magnetic field 504.
  • the pose of the magnet relative to a portion of the instrument, such as the respective long axes 524, 530, 542 of the respective instruments, may be recalled in Block 660.
  • a determination of the pose of the instrument may be made in Block 664 and may be displayed in Block 668.
  • Display of the determined pose may be made as discussed and illustrated above, such as Fig. 16.
  • the graphic representation 90 of the instrument 62 may be displayed as the pose of the instrument based upon the determined pose of the magnet relative to the magnetic sensing assemblies 550.
  • the user 18 may therefore view the display device 26 and understand the pose of the instrument 62 relative to portions of the patient 14 that are not visible, such as the heart 15 within the patient 14.
  • a determination of whether tracking is to be continued may be made in Block 672. If tracking is to continue, a YES path 674 may be followed to sense a magnetic field from the magnet in Block 614. Thus, the process 600 may loop to provide a real time sensing and display of the instrument 62 relative to a selected portion, such as the image 78, with a display device 26. If tracking is not to be continued, a NO path 680 may be followed to end the process in Block 682. Thus, the process may end when tracking is complete, such as may be selected by the user 18.
  • the process 600 may allow for a tracking and display of a pose of the instrument 62 over time, based on the position of the magnet 500 relative to the instrument 62 as it is moved relative to the tracking magnetic sensing assembl(ies) 550.
  • the magnetic sensing assembl(ies) 550 and the magnet 500 allow the pose of the instrument 62 to be tracked and determined without requiring an external energy emitting system. Rather, the permanent magnet generates the magnetic field that is sensed by the magnetic sensing assembl(ies) to determine the pose of the magnet 500.
  • a field may not be generated by the localizer to define the navigation space but rather the magnetic field from the magnet 500 in the patient 14 may be sensed to make the pose determination.
  • Example embodiments are provided so that this disclosure will be thorough and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
  • the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit that may also be referred to as a processor.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • The instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • Accordingly, the terms processor or processor module may refer to any of the foregoing structure or to any other physical structure suitable for implementation of the described techniques. The techniques could also be fully implemented in one or more circuits or logic elements.
  • Example 1 A system to track a device relative to a heart of a subject, comprising: an ultrasound imaging system configured to acquire image data of the heart of the subject; a tracking system; a tracking device configured to be tracked by the tracking system; an instrument configured to be moved relative to the heart of the subject, wherein the tracking device is associated with the instrument; a processing unit configured to execute instructions to: register the image data to the subject; generate an image of the heart of the subject; and generate a graphical representation of the tracked pose of the instrument relative to the generated image.
  • Example 2 The system of Example 1, wherein the ultrasound imaging system is configured to generate image data of the subject in a plane relative to an ultrasound transducer.
  • Example 3 The system of Example 2, wherein the generated graphical representation includes (i) a representation of the instrument and (ii) a representation of a dimension that the instrument intersects the plane of the image data acquired with the imaging system.
  • Example 4 The system of Example 3, further comprising: an imaging system tracking device; wherein a pose of the plane is determined by tracking the ultrasound imaging system, wherein the imaging system tracking device is associated with the ultrasound transducer.
  • Example 5 The system of Example 1, wherein the processing unit is configured to execute further instructions to: recall an atlas of a heart operable to be related to the heart of the subject; relate the atlas to the generated image of the heart; and determine a location of at least one portion of the heart in the generated image data based on the recalled atlas of the heart.
  • Example 6 The system of Example 5, wherein the atlas is registered to the generated image of the heart at least by an identification of a landmark in both the atlas and the generated image of the heart.
  • Example 7 The system of Example 6, wherein the image data includes ultrasound image data in at least a first plane and a second plane.
  • Example 8 The system of Example 1, wherein the instrument includes at least one of a catheter or a lead.
  • Example 9 The system of Example 8, wherein the catheter includes a cannula through which the lead is configured to pass to the heart of the subject.
  • Example 10 The system of Example 1, wherein the tracking system is configured to track the pose of the tracking device over time; wherein the generated graphical representation of the tracked pose of the instrument relative to the generated image changes over time based on the tracked pose of the tracking device over time.
  • Example 11 A method to track a device relative to a heart of a subject, the method comprising: acquiring ultrasound image data of the heart of the subject from an ultrasound imaging system; tracking a pose of a tracking device with a tracking system; providing an instrument configured to be moved relative to the heart of the subject, wherein the tracking device is associated with the instrument; executing instructions with a processing unit to: register the image data to the subject; generate an image of the heart of the subject; and generate a graphical representation of the tracked pose of the instrument relative to the generated image.
  • Example 12 The method of Example 11, wherein acquiring the ultrasound image data includes acquiring ultrasound image data in at least a plane relative to an ultrasound transducer.
  • Example 13 The method of Example 12, wherein executing instructions with the processing unit to generate the graphical representation includes generating: (i) a representation of the instrument, and (ii) a representation of a dimension that the instrument intersects the plane of the image data acquired with the ultrasound imaging system.
  • Example 14 The method of Example 13, further comprising: tracking an imaging system tracking device; determining a pose of the plane by tracking the imaging system tracking device associated with the ultrasound transducer of the ultrasound imaging system.
  • Example 15 The method of Example 11, wherein executing instructions with the processing unit includes executing further instructions to: recall an atlas of a heart operable to be related to the heart of the subject; relate the atlas to the generated image of the heart; and determine a location of at least one portion of the heart in the generated image data based on the recalled atlas of the heart.
  • Example 16 The method of Example 15, further comprising: registering the atlas to the generated image of the heart at least by an identification of a landmark in both the atlas and the generated image of the heart.
  • Example 17 The method of Example 16, wherein acquiring the image data includes acquiring ultrasound image data in at least a first plane and a second plane.
  • Example 18 The method of Example 11, further comprising: determining a pose of at least one of a catheter or a lead by tracking the tracking device.
  • Example 19 The method of Example 18, further comprising: providing the catheter to include a cannula through which the lead is configured to pass to the heart of the subject.
  • Example 20 The method of Example 11, further comprising: tracking the pose of the tracking device over time; executing the instructions with the processing unit to generate the graphical representation of the tracked pose of the instrument relative to the generated image to change over time based on the tracked pose of the tracking device over time.
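Examples 5, 6, 15, and 16 describe relating an atlas of a heart to the generated image by identifying a landmark in both. One standard way to realize such a landmark-based registration, offered here only as a sketch (the Kabsch algorithm is not named in the disclosure, and `rigid_register` is an invented name), is a least-squares rigid alignment of paired landmark coordinates:

```python
import numpy as np

def rigid_register(atlas_pts, image_pts):
    """Least-squares rigid transform (R, t) mapping atlas landmarks onto
    image landmarks (Kabsch algorithm); both inputs are (N, 3) arrays."""
    ca, ci = atlas_pts.mean(0), image_pts.mean(0)
    H = (atlas_pts - ca).T @ (image_pts - ci)   # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ci - R @ ca
    return R, t

# Synthetic check: recover a known rotation and translation
rng = np.random.default_rng(0)
atlas = rng.normal(size=(6, 3))                 # six mock landmarks
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
image = atlas @ R_true.T + t_true               # landmarks as seen in the image
R, t = rigid_register(atlas, image)
```

Given at least three non-collinear landmark pairs, the recovered R and t map atlas coordinates into image space; scaling or deformable steps would be layered on top for a real atlas fit.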

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Gynecology & Obstetrics (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to a system for displaying a representation of an instrument. The representation of the instrument may be displayed relative to a representation of a patient. The representation of the patient may be adjusted in real time to represent positions of two portions of the patient relative to one another.
PCT/IB2023/060058 2022-10-28 2023-10-06 System and method for illustrating a pose of an object WO2024089502A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263420274P 2022-10-28 2022-10-28
US63/420,274 2022-10-28

Publications (1)

Publication Number Publication Date
WO2024089502A1 true WO2024089502A1 (fr) 2024-05-02

Family

ID=88412346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/060058 WO2024089502A1 (fr) System and method for illustrating a pose of an object

Country Status (1)

Country Link
WO (1) WO2024089502A1 (fr)

Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5577502A (en) * 1995-04-03 1996-11-26 General Electric Company Imaging of interventional devices during medical procedures
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US5913820A (en) 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US5983126A (en) 1995-11-22 1999-11-09 Medtronic, Inc. Catheter location system and method
US6329916B1 (en) 1995-04-02 2001-12-11 Flying Null Limited Magnetic marker or tag
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US20040097805A1 (en) 2002-11-19 2004-05-20 Laurent Verard Navigation system for cardiac therapies
US6747539B1 (en) 1999-10-28 2004-06-08 Michael A. Martinelli Patient-shielding and coil system
US20040116803A1 (en) 2001-06-04 2004-06-17 Bradley Jascob Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US20040199072A1 (en) 2003-04-01 2004-10-07 Stacy Sprouse Integrated electromagnetic navigation and patient positioning device
US6940941B2 (en) 2002-02-15 2005-09-06 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US7001045B2 (en) 2002-06-11 2006-02-21 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7085400B1 (en) 2000-06-14 2006-08-01 Surgical Navigation Technologies, Inc. System and method for image based sensor calibration
US7106825B2 (en) 2002-08-21 2006-09-12 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7108421B2 (en) 2002-03-19 2006-09-19 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US7188998B2 (en) 2002-03-13 2007-03-13 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US7650178B2 (en) 2004-04-30 2010-01-19 University Of Basel Magnetic field sensor-based navigation system to track MR image-guided interventional procedures
US7676268B2 (en) 2006-11-30 2010-03-09 Medtronic, Inc. Medical methods and systems incorporating wireless monitoring
US7751865B2 (en) 2003-10-17 2010-07-06 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US20100228117A1 (en) 2009-03-09 2010-09-09 Medtronic Navigation, Inc System And Method For Image-Guided Navigation
US7797032B2 (en) 1999-10-28 2010-09-14 Medtronic Navigation, Inc. Method and system for navigating a catheter probe in the presence of field-influencing objects
US8060185B2 (en) 2002-11-19 2011-11-15 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8180428B2 (en) 2007-10-03 2012-05-15 Medtronic, Inc. Methods and systems for use in selecting cardiac pacing sites
US8238631B2 (en) 2009-05-13 2012-08-07 Medtronic Navigation, Inc. System and method for automatic registration between an image and a subject
US8494608B2 (en) 2008-04-18 2013-07-23 Medtronic, Inc. Method and apparatus for mapping a structure
US8532734B2 (en) 2008-04-18 2013-09-10 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US8663120B2 (en) 2008-04-18 2014-03-04 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US8811662B2 (en) 2011-04-29 2014-08-19 Medtronic Navigation, Inc. Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker
US9138204B2 (en) 2011-04-29 2015-09-22 Medtronic Navigation, Inc. Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker
US20180078316A1 (en) 2016-09-22 2018-03-22 Medtronic Navigation, Inc. System for Guided Procedures
US20190183577A1 (en) 2017-12-15 2019-06-20 Medtronic, Inc. Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools
US20190307518A1 (en) * 2018-04-06 2019-10-10 Medtronic, Inc. Image-based navigation system and method of using same
US20210369394A1 (en) 2020-05-29 2021-12-02 Medtronic, Inc. Intelligent Assistance (IA) Ecosystem

Patent Citations (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913820A (en) 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US6329916B1 (en) 1995-04-02 2001-12-11 Flying Null Limited Magnetic marker or tag
US5577502A (en) * 1995-04-03 1996-11-26 General Electric Company Imaging of interventional devices during medical procedures
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US5983126A (en) 1995-11-22 1999-11-09 Medtronic, Inc. Catheter location system and method
US6968224B2 (en) 1999-10-28 2005-11-22 Surgical Navigation Technologies, Inc. Method of detecting organ matter shift in a patient
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US6669635B2 (en) 1999-10-28 2003-12-30 Surgical Navigation Technologies, Inc. Navigation information overlay onto ultrasound imagery
US6747539B1 (en) 1999-10-28 2004-06-08 Michael A. Martinelli Patient-shielding and coil system
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US7797032B2 (en) 1999-10-28 2010-09-14 Medtronic Navigation, Inc. Method and system for navigating a catheter probe in the presence of field-influencing objects
US7831082B2 (en) 2000-06-14 2010-11-09 Medtronic Navigation, Inc. System and method for image based sensor calibration
US8320653B2 (en) 2000-06-14 2012-11-27 Medtronic Navigation, Inc. System and method for image based sensor calibration
US7085400B1 (en) 2000-06-14 2006-08-01 Surgical Navigation Technologies, Inc. System and method for image based sensor calibration
US20040116803A1 (en) 2001-06-04 2004-06-17 Bradley Jascob Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US6940941B2 (en) 2002-02-15 2005-09-06 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US7188998B2 (en) 2002-03-13 2007-03-13 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US7108421B2 (en) 2002-03-19 2006-09-19 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US7001045B2 (en) 2002-06-11 2006-02-21 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7106825B2 (en) 2002-08-21 2006-09-12 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US7697972B2 (en) 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20040097805A1 (en) 2002-11-19 2004-05-20 Laurent Verard Navigation system for cardiac therapies
US8046052B2 (en) 2002-11-19 2011-10-25 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US8060185B2 (en) 2002-11-19 2011-11-15 Medtronic Navigation, Inc. Navigation system for cardiac therapies
US20120059249A1 (en) * 2002-11-19 2012-03-08 Medtronic Navigation, Inc. Navigation System for Cardiac Therapies
US20040199072A1 (en) 2003-04-01 2004-10-07 Stacy Sprouse Integrated electromagnetic navigation and patient positioning device
US7751865B2 (en) 2003-10-17 2010-07-06 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US7650178B2 (en) 2004-04-30 2010-01-19 University Of Basel Magnetic field sensor-based navigation system to track MR image-guided interventional procedures
US7676268B2 (en) 2006-11-30 2010-03-09 Medtronic, Inc. Medical methods and systems incorporating wireless monitoring
US8180428B2 (en) 2007-10-03 2012-05-15 Medtronic, Inc. Methods and systems for use in selecting cardiac pacing sites
US8494608B2 (en) 2008-04-18 2013-07-23 Medtronic, Inc. Method and apparatus for mapping a structure
US8532734B2 (en) 2008-04-18 2013-09-10 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US8663120B2 (en) 2008-04-18 2014-03-04 Regents Of The University Of Minnesota Method and apparatus for mapping a structure
US9737235B2 (en) 2009-03-09 2017-08-22 Medtronic Navigation, Inc. System and method for image-guided navigation
US20100228117A1 (en) 2009-03-09 2010-09-09 Medtronic Navigation, Inc System And Method For Image-Guided Navigation
US8238631B2 (en) 2009-05-13 2012-08-07 Medtronic Navigation, Inc. System and method for automatic registration between an image and a subject
US8811662B2 (en) 2011-04-29 2014-08-19 Medtronic Navigation, Inc. Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker
US9138204B2 (en) 2011-04-29 2015-09-22 Medtronic Navigation, Inc. Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker
US20180078316A1 (en) 2016-09-22 2018-03-22 Medtronic Navigation, Inc. System for Guided Procedures
US20190183577A1 (en) 2017-12-15 2019-06-20 Medtronic, Inc. Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools
US20190307518A1 (en) * 2018-04-06 2019-10-10 Medtronic, Inc. Image-based navigation system and method of using same
US20210369394A1 (en) 2020-05-29 2021-12-02 Medtronic, Inc. Intelligent Assistance (IA) Ecosystem

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
AN INTEGRATED ELECTROMAGNETIC NAVIGATION AND PATIENT POSITIONING DEVICE, 1 April 2003 (2003-04-01)
WITTKAMPF ET AL., CATHETER LOCATION SYSTEM AND METHOD, 9 November 1999 (1999-11-09)

Similar Documents

Publication Publication Date Title
US9597154B2 Method and apparatus for optimizing a computer assisted surgical procedure
EP2152183B1 (fr) Apparatus for electromagnetic navigation of a magnetic stimulation probe
US9642555B2 Subcutaneous lead guidance
EP2331001B1 (fr) System for tracking a patient
EP1583469B1 (fr) Method and system for registering a medical situation associated with a first coordinate system, in a second coordinate system, using an MPS system
US8010177B2 Intraoperative image registration
US7313430B2 Method and apparatus for performing stereotactic surgery
US8734466B2 Method and apparatus for controlled insertion and withdrawal of electrodes
US20100030063A1 System and method for tracking an instrument
WO2011053432A1 (fr) System and method for assessing electrode position and spacing
EP2432388B1 (fr) System for cardiac lead placement
WO2008130354A1 (fr) Registration of an image during an intervention
WO2024089502A1 (fr) System and method for illustrating a pose of an object
WO2024089504A1 (fr) System operable to determine a pose of an instrument
WO2024089503A1 (fr) System and method for illustrating a pose of an object
US20240341860A1 System and method for illustrating a pose of an object
WO2024214057A1 (fr) System and method for illustrating the pose of an object
WO2024215790A1 (fr) System and method for positioning an element relative to a subject
WO2024215791A1 (fr) System and method for positioning an element relative to a subject

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23789764

Country of ref document: EP

Kind code of ref document: A1