US20100125284A1 - Registered instrument movement integration - Google Patents

Registered instrument movement integration

Info

Publication number
US20100125284A1
US20100125284A1 (application US12/507,766)
Authority
US
United States
Prior art keywords
instrument
method
workspace
operator
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/507,766
Inventor
Neal A. Tanner
Christopher M. Sewell
Current Assignee
Hansen Medical Inc
Original Assignee
Hansen Medical Inc
Priority date
Filing date
Publication date
Priority to US11645408P
Application filed by Hansen Medical Inc
Priority to US12/507,766
Assigned to HANSEN MEDICAL, INC. Assignors: SEWELL, CHRISTOPHER M.; TANNER, NEAL A.
Publication of US20100125284A1
Application status: Abandoned

Classifications

    • A61B 34/71: Manipulators specially adapted for use in surgery, operated by drive cable mechanisms
    • A61B 34/30: Surgical robots
    • A61B 34/37: Master-slave robots
    • A61B 34/76: Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B 90/11: Instruments for stereotaxic surgery with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B 18/24: Surgical instruments applying laser energy, the beam being directed along or through a flexible conduit, with a catheter
    • A61B 2017/003: Steerable instruments for minimally invasive surgery, mounted on or guided by flexible, e.g. catheter-like, means
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/301: Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B 2090/0818: Redundant systems, e.g. using two independent measuring systems and comparing the signals
    • A61B 2090/306: Devices for illuminating a surgical field using optical fibres
    • A61B 2090/3614: Image-producing devices, e.g. surgical cameras, using optical fibre

Abstract

Systems and methods are disclosed whereby elongate medical instruments may be registered to adjacent tissue structures and other structures, and may be navigated and operated in a coordinated fashion to maximize ranges of motion, ease of use, and other factors. A method for registering an instrument relative to nearby structures may comprise moving a portion of the instrument between two in situ positions; tracking the portion during this movement with both a kinematic model and a localization-sensor-based configuration; determining the orientation of the tracked portion relative to both the instrument coordinate system used in the kinematic modeling and a localization coordinate reference frame; and adjusting the orientation of the instrument coordinate reference frame to minimize the difference between the orientations determined via the kinematic model and the localization sensors. Methods and configurations for navigating coupled and registered instrument sets are also disclosed.

Description

    RELATED APPLICATION DATA
  • The present application claims the benefit under 35 U.S.C. §119 to U.S. Provisional Patent Application Ser. No. 61/116,454, filed Nov. 20, 2008. The foregoing application is hereby incorporated by reference into the present application in its entirety.
  • FIELD OF THE INVENTION
  • The invention relates generally to remotely steerable medical instrument systems, such as telerobotic surgical systems, and more particularly to registration and navigation of such systems in a three-dimensional environment adjacent tissue and other structures, in furtherance of minimally invasive diagnostic and therapeutic procedures.
  • BACKGROUND
  • Minimally invasive medical techniques often rely on steerable elongate instruments, such as steerable catheters, to conduct procedures. One of the challenges in conducting diagnostic and/or interventional cases with minimally invasive instruments is understanding where pertinent medical instrumentation is located and/or how it is oriented relative to nearby tissue structures and other instrumentation. Imaging modalities such as radiography, fluoroscopy, and ultrasound may not be ideally suited for understanding the detailed positioning and orientation of instruments in real or near-real time. For example, it is possible to use multiple planes and/or imaging field-of-view perspectives with modalities such as fluoroscopy to determine the location and orientation of instrumentation that appears in the images relative to anatomy also featured in the images, but multiplanar imaging may not be convenient or accurate enough to facilitate real-time navigation of minimally invasive instruments through various anatomical spaces. Further, it is possible to utilize kinematic models of instruments to understand the positions and orientations of portions of such instruments, but compliance, control mechanism slack, repositioning, and other factors may lead to the desire to recalibrate kinematic-based position and/or orientation models relative to the actual anatomy from time to time. Embodiments are presented herein to address these and other challenges.
  • SUMMARY
  • One embodiment is directed to a method for navigating a coupled instrument set, comprising registering an instrument set, comprising a first instrument having a first instrument workspace, movably coupled to a second instrument having a second instrument workspace, relative to a three-dimensional map of nearby anatomical structures; presenting an operator with a user interface configured to allow for selection of various anatomical destinations wherein operation of the instrument set is desired; and upon selection of an anatomical destination by an operator, assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination. Registering an instrument set may comprise registering the instrument set relative to a master input device and an operator display; and registering the instrument set relative to a three-dimensional map of nearby anatomical structures. Registering the instrument set relative to a three-dimensional map of nearby anatomical structures may comprise navigating the registered instrument set to one or more known anatomical landmarks also featured on the three-dimensional map of nearby anatomical structures, and aligning the map with the instrument workspace of one or more of the instruments comprising the instrument set. Registering the instrument set may comprise registering the first instrument relative to a master input device and operator display, then registering the second instrument relative to the first instrument.
Registering an instrument set may comprise moving a portion of the first instrument between a first position in situ and a second position in situ relative to a first instrument coordinate reference frame; tracking movement of the portion relative to the first instrument coordinate reference frame using a kinematic model, and also tracking movement of the portion relative to a localization coordinate reference frame using one or more localization sensors coupled to the portion; determining the orientation of the portion relative to both the first instrument coordinate reference frame and the localization coordinate reference frame; and adjusting the orientation of the first instrument coordinate reference frame to minimize the difference between determined orientations using the kinematic model and localization sensors. Presenting an operator with a user interface configured to allow for selection of various anatomical destinations may comprise presenting the operator with a set of menu driven software selections, presenting the operator with a set of hardware interfaces associated with various anatomical destinations, or presenting the operator with a map of anatomical destinations which may be selected using a master input device. Assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination may comprise limiting commanded movement to movement along a single path. In one embodiment the single path is a line. In another embodiment the single path is a curved path. Limiting commanded movement may comprise imparting haptic feedback to the operator through a haptic master input device. Limiting commanded movement may comprise providing a haptic groove feedback configuration to the operator. Limiting commanded movement may comprise providing a gravity well feedback configuration to the operator. 
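The frame-adjustment step just described can be sketched numerically. The following Python fragment is purely illustrative and not part of the disclosed system; the single-motion-sample simplification and all function names are my own assumptions. Given one tip motion expressed in the instrument (kinematic) coordinate frame and the same motion as reported by the localization sensors, it computes the rotation (via Rodrigues' formula) that re-orients the instrument coordinate reference frame so the two measurements agree:

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b
    (Rodrigues' formula for a single vector pair)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)            # rotation axis (unnormalized)
    c = float(np.dot(a, b))       # cosine of the rotation angle
    if np.isclose(c, 1.0):        # vectors already aligned
        return np.eye(3)
    if np.isclose(c, -1.0):       # antiparallel: axis is ambiguous
        raise ValueError("180-degree case needs an explicit axis choice")
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])   # skew-symmetric cross-product matrix
    return np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))

def registration_correction(kinematic_motion, localized_motion):
    """Rotation that re-orients the instrument coordinate reference frame
    so the kinematically tracked tip motion agrees with the motion
    measured by the localization sensors."""
    return rotation_between(np.asarray(kinematic_motion, float),
                            np.asarray(localized_motion, float))
```

With several motion samples, a least-squares fit over all vector pairs (e.g., the Kabsch algorithm) would yield a more robust frame correction than this single-pair version.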
Assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination may comprise limiting commanded movement to movement along a single plane. Limiting commanded movement may comprise imparting haptic feedback to the operator through a haptic master input device. Assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination may comprise avoiding contact between the first or second instruments and certain predetermined anatomical zones of preferred contact avoidance. Avoiding contact may comprise imparting haptic feedback to the operator through a haptic master input device. Assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination may comprise automatically navigating the first and second instruments to the selected anatomical destination. Automatically navigating may comprise taking into account predetermined movement limitations of the first and second instruments. Automatically navigating may comprise taking into account predetermined anatomical contact limitations. Assisting the operator in repositioning the instrument set may comprise sequentially assisting the operator in repositioning one of the first or second instruments toward the selected anatomical destination, then assisting the operator in repositioning the remaining one of the first or second instruments toward the selected anatomical destination. The first and second instruments may be coaxially and movably coupled to one another, and subsequent to positioning of the first, the second may be advanced along the path of the first.
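The movement-limiting modes above (a single permitted line or plane, with haptic "groove" or "gravity well" feedback) could be approximated as follows. This is an illustrative sketch under assumed function names and a made-up stiffness constant, not the patent's implementation:

```python
import numpy as np

def project_to_line(p, origin, direction):
    """Project a commanded position onto the single permitted path (a line)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    p = np.asarray(p, float)
    o = np.asarray(origin, float)
    return o + np.dot(p - o, d) * d

def project_to_plane(p, origin, normal):
    """Project a commanded position onto the single permitted plane."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    p = np.asarray(p, float)
    o = np.asarray(origin, float)
    return p - np.dot(p - o, n) * n

def haptic_restoring_force(command, constrained, stiffness=50.0):
    """Spring-like force on the master input device pulling the operator's
    hand back toward the permitted path: a 'gravity well' feel whose pull
    grows with distance from the constraint."""
    return stiffness * (np.asarray(constrained, float) - np.asarray(command, float))
```

The commanded instrument motion would be driven by the projected point, while the restoring force is rendered on the haptic master so the operator feels the constraint.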
The first and second instruments may comprise robotically navigable catheters, and assisting the operator in repositioning the instrument set may comprise electromechanically navigating the first instrument toward the selected anatomical destination, and subsequently following the path established by the first instrument with electromechanical advancement of the second instrument. The first instrument may be repositioned to the selected anatomical destination before movement of the second. The first and second instruments may be alternately and incrementally positioned toward the selected anatomical destination in a plurality of sequences.
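The alternating, incremental advancement of coaxially coupled instruments might be modeled in one dimension (insertion depth) as below; this sketch and its parameters (step size, maximum protrusion offset) are illustrative assumptions, not taken from the disclosure:

```python
def advance_alternately(first, second, target, step=5.0, max_offset=5.0):
    """1-D insertion-depth sketch of alternating incremental advancement:
    the inner (first) instrument leads, but is never allowed to protrude
    more than max_offset beyond the outer (second) instrument.
    Depths are in arbitrary consistent units (e.g., mm)."""
    sequence = []
    while first < target or second < target:
        if first < target and first - second < max_offset:
            first = min(first + step, target)     # advance the leader
            sequence.append(("first", first))
        else:
            second = min(second + step, target)   # follower catches up
            sequence.append(("second", second))
    return sequence
```

With a tight protrusion limit the two instruments alternate step for step; with a generous limit the first is repositioned fully before the second follows, covering both sequencing variants described above.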
  • Another embodiment is directed to a method for navigating a coupled instrument set, comprising registering an instrument set, comprising a first instrument having a first instrument workspace, movably coupled to a second instrument having a second instrument workspace, relative to a three-dimensional map of nearby anatomical structures; presenting an operator with a user interface configured to allow the operator to input navigation commands of the first instrument relative to nearby anatomical structures; and upon input of a navigation command by an operator for the first instrument, moving the first instrument as commanded, while also moving the second instrument so as to keep the first instrument close to a preferred portion of the workspace of the first instrument by repositioning the second instrument. Registering an instrument set may comprise registering the instrument set relative to a master input device and an operator display; and registering the instrument set relative to a three-dimensional map of nearby anatomical structures. Registering the instrument set relative to a three-dimensional map of nearby anatomical structures may comprise navigating the registered instrument set to one or more known anatomical landmarks also featured on the three-dimensional map of nearby anatomical structures, and aligning the map with the instrument workspace of one or more of the instruments comprising the instrument set. Registering the instrument set may comprise registering the first instrument relative to a master input device and operator display, then registering the second instrument relative to the first instrument.
Registering an instrument set may comprise moving a portion of the first instrument between a first position in situ and a second position in situ relative to a first instrument coordinate reference frame; tracking movement of the portion relative to the first instrument coordinate reference frame using a kinematic model, and also tracking movement of the portion relative to a localization coordinate reference frame using one or more localization sensors coupled to the portion; determining the orientation of the portion relative to both the first instrument coordinate reference frame and the localization coordinate reference frame; and adjusting the orientation of the first instrument coordinate reference frame to minimize the difference between determined orientations using the kinematic model and localization sensors. The method may further comprise secondarily moving the second instrument so as to keep the second instrument as close as possible to a preferred portion of the workspace of the second instrument by repositioning the second instrument. Current positions of the instrument workspaces for the first and second instruments may be illustrated in a user interface relative to the instrument set and nearby anatomical structures. In one embodiment the preferred portion of the workspace of the first instrument may be the center of said workspace. In another embodiment the preferred portion of the workspace of the first instrument may be a forward-oriented conical volume of said workspace. In one embodiment the preferred portion of the workspace of the second instrument may be the center of said workspace. In another embodiment the preferred portion of the workspace of the second instrument may be a forward-oriented conical volume of said workspace. The first and second instruments may be coaxially coupled.
In one embodiment the second instrument may be moved so as to keep the first instrument close to the preferred portion of the workspace of the first instrument only after the first instrument crosses a threshold of misalignment with the preferred portion of the workspace.
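The threshold-triggered repositioning described in this embodiment can be sketched as follows; the function name and threshold semantics are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def recenter_workspace(tip, center, threshold):
    """If the inner-instrument tip has drifted past `threshold` from the
    preferred (here: central) portion of its workspace, return the base
    displacement to command to the outer instrument so the workspace
    recenters on the tip; otherwise return None (no repositioning)."""
    offset = np.asarray(tip, float) - np.asarray(center, float)
    if np.linalg.norm(offset) > threshold:
        return offset   # displacement for the outer instrument's driver
    return None
```

Deferring the outer-instrument motion until the misalignment threshold is crossed avoids continuous small corrections while still keeping the inner instrument near the preferred portion of its workspace.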
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates one embodiment of a robotic catheter system.
  • FIGS. 2A-2C illustrate aspects of one embodiment of a robotic catheter instrument set.
  • FIGS. 3A-3I illustrate embodiments of a registration or alignment technique in accordance with the present invention.
  • FIG. 4 illustrates one embodiment of a registration or alignment technique in accordance with the present invention.
  • FIG. 5 illustrates one embodiment of a registration or alignment technique in accordance with the present invention.
  • FIG. 6 illustrates one embodiment of a registration or alignment technique in accordance with the present invention.
  • FIG. 7 illustrates one embodiment of a navigation configuration in accordance with the present invention.
  • FIG. 8 illustrates one embodiment of a navigation configuration in accordance with the present invention.
  • FIG. 9 illustrates one embodiment of a navigation configuration in accordance with the present invention.
  • FIGS. 10A-10C illustrate one embodiment of a navigation configuration in accordance with the present invention.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, a system (14) is depicted wherein an operator (2) is seated at an operator workstation (6) in a position such that he has access to one or more displays (4), in addition to one or more input devices, such as a master input device (10) and an operator button console or pendant (12). A computing system or controller (8) comprising a processor is operably coupled via a cable (16) to a robotic instrument driver (40), which is coupled to an operating table (36) with a fixed mounting member (38). Similar systems have been described, for example, in U.S. patent application Ser. Nos. 11/073,363; 11/179,007; 11/176,598; 11/176,957; 11/481,433; 11/331,576; 11/637,951; 11/640,099; 11/678,001; 11/690,116; 11/804,585; 11/829,076; 11/833,969; 11/852,255; 11/906,746; 11/972,581; 12/032,626; 12/398,763; and 12/504,564, each of which is incorporated by reference in its entirety into this patent application. It is important to note that while certain aspects of the embodiments described herein are specifically applicable to electromechanically navigated medical instrument systems, other aspects, such as the registration aspects described below, are broadly applicable to steerable or navigable medical instruments which may or may not comprise electromechanical drive systems, and such variations are within the intended scope of this invention.
  • Referring again to FIG. 1, the computing system in the depicted exemplary system is operably coupled to a laser therapy system (28), a video system (26), and a lighting system (24) configured to provide endoscopic lighting for the video system (26) by respective cables (22, 20, 18) connecting such systems to the computing system (8). The computing system, via such couplings, is configured to control lighting, video image capture, and laser energy emission, preferably in response to commands input by the operator (2) to interfaces such as the pendant (12) or master input device (10) at the operator workstation (6). Other input devices, such as a foot pedal (not shown), may also be operably coupled to the computing system (8) to enable an operator (2) to execute commands, such as video capture, laser energy emission, and/or lighting, via such input. The laser system (28) is operably coupled to the depicted robotic instrument assembly (42) via a laser energy transmission fiber assembly, or “laser fiber”, (34) while the video system (26) is operably coupled to the instrument assembly (42) via an optics bundle (32) comprising a plurality of optical transmission fibers. The lighting system (24) is similarly operably coupled to the robotic instrument assembly (42) via a light transmission bundle (30) preferably comprising optical transmission fibers. Such a system may be broadly applied to various clinical and diagnostic scenarios pertinent to healthcare, including but not limited to interventions and diagnostics within the bloodstream, such as minimally invasive, endocardial cardiac ablation, valve repair, and other procedures.
  • Referring to FIGS. 2A-2C, aspects of the depicted elongate steerable instrument assembly (42) are described, such assembly being configured for endoscopic diagnosis and/or intervention in an environment wherein direct optical visualization (for example, with an optical image capture device such as a fiberscope or camera chip) is desired, such as with kidney stone interventions using trans-urethral endolumenal access.
  • Referring to FIG. 2A, an instrument assembly (42) is depicted comprising an inner elongate member, or “guide member”, (81) proximally coupled to a specialized inner instrument base housing (77) which is removably coupleable to an image capture device member (111) preferably comprising a camera chip (not shown). The midsection and distal portion of the inner elongate member (81) are shown slidably coupled and inserted through a working lumen defined through an outer elongate member, or “sheath member”, (79). Also depicted are the outer instrument base housing (75) and a clamp (83) configured to assist with coupling to aspects of an instrument driver (40, such as that shown in FIG. 1). FIG. 2B is a cross-sectional view of the instrument assembly (42) depicted in FIG. 2A. Referring to FIG. 2B, the inner elongate member (81) is threaded through a working lumen (181) defined by the outer elongate member (79). The geometric interaction of the outer elongate member working lumen (181), having a substantially square cross-sectional shape with rounded corner surfaces (99), and the outer shape of the inner elongate member (81), which in the depicted embodiment has a square cross-sectional outer shape with rounded corners (97), is designed to allow for slidable coupling of the two elongate members (for example, to allow insertion of one relative to the other without a great degree of load applied), while also preventing relative rolling, or rotation, of the two elongate members relative to each other, at least in the areas where they are coupled.
  • Referring again to FIG. 2B, a relatively complex embodiment is shown for illustrative purposes, wherein the outer elongate instrument member (79) defines four lumens (89) for four control elements (85), such as metallic, semi-metallic, polymeric, or natural pull or pushwires, to enable relatively sophisticated steering of the outer elongate instrument member (79), when such control elements (85) are coupled to a distal portion of the outer elongate instrument member (79), and also coupled to actuator motors within an instrument driver (40) via a mechanical interfacing with rotatable members coupled to the outer instrument base housing (75), as described in the aforementioned incorporated-by-reference applications. In other words, in one embodiment, the outer instrument may comprise a 4-wire electromechanically steerable sheath instrument capable of omnidirectional steering (for example, when three or four wires terminate at the same position distally), and capable of more complex shapes when one or more wires terminate more proximally than others. Preferably each wire is actuated utilizing an independently operable motor assembly in the instrument driver (40). In other embodiments, such as the embodiments described in the aforementioned incorporated-by-reference applications, the outer instrument may be much simpler: for example, with only one, two, or even zero control elements. The outer (79) and inner (81) elongate instrument members may comprise polymeric coextrusions.
  • Referring again to FIG. 2B, the depicted embodiment of the inner elongate instrument member is also relatively sophisticated, defining four instrumentation lumens (93) and a central, larger diameter, working lumen (91) preferably substantially aligned with the longitudinal axis of the inner elongate member (81) and sized to accommodate desired working tools, such as a mini-grasper tool, such as those available from suppliers such as Novare, Inc., or a collapsible basket tool, such as those available from suppliers such as Boston Scientific, Inc. Like the depicted embodiment of the outer elongate instrument member (79), the inner elongate instrument member (81) comprises four control elements (192), such as pushwires and/or pullwires made from metallic, semimetallic, polymeric, or natural materials, threaded through four control element lumens (87). As described above in reference to the outer elongate member (79), this embodiment may be omnidirectionally steerable and/or capable of complex curvatures, via operable coupling of such control elements (192) between distal portions of the inner elongate member (81) and actuation motors within an instrument driver (40). In other embodiments, a simpler configuration comprising one, two, or three control elements (192) may be desired.
  • Referring again to FIG. 2B, the four instrumentation lumens (93) defined within the depicted embodiment of the inner elongate instrument member (81) are configured to accommodate relatively fixed (in other words, the lumens are large enough to accommodate assembly of the instrument, but small enough to provide a relatively close fit thereafter to prevent significant relative motion) positioning of a light bundle (30) and video/optics bundle (32). Another instrumentation lumen (93) is more loosely and slidably coupled to a laser fiber (34), to allow for relative insertion, retraction, and sometimes roll (depending upon the curvature of the overall assembly) intraoperatively. The fourth instrumentation lumen (93) may be utilized as a saline or other fluid (for example, a contrast agent or medicinal fluid) infusion or flush channel (95) for intraoperative use. Referring to FIG. 2C, in one embodiment, it is desirable that about twelve centimeters of a more flexible, steerable distal portion (105) of the inner elongate instrument member (81) be able to protrude out of the distal end of the outer elongate instrument member (79), and that the inner elongate instrument member (81) be capable, with such protrusion, of forming a bend radius (103) of approximately eight millimeters, with a maximum bend angle (101) of approximately 250 degrees.
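As a quick arithmetic check (not part of the disclosure), the tightest quoted bend consumes an arc length of only about 35 mm, comfortably within the roughly twelve-centimeter steerable protrusion:

```python
import math

# Quoted figures from the embodiment above (illustrative check only)
bend_radius_mm = 8.0      # approximate bend radius
max_bend_deg = 250.0      # maximum bend angle
protrusion_mm = 120.0     # steerable distal protrusion (~12 cm)

# Arc length consumed by the tightest full bend: s = r * theta
arc_length_mm = bend_radius_mm * math.radians(max_bend_deg)
print(round(arc_length_mm, 1))        # ~34.9 mm
assert arc_length_mm < protrusion_mm  # bend fits well within the protrusion
```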
  • The system described above in reference to FIGS. 1-2C is very sophisticated and capable, but if it is not registered or aligned to pertinent coordinate systems, such as the world coordinate system to which other structures, such as other instruments and tissue structures and maps thereof, may be registered, its utility in medical intervention may not be fully realized. FIGS. 3A-3I illustrate a novel technique for addressing this registration/alignment challenge.
  • Referring to FIG. 3A, a simplified system is depicted, comprising an outer steerable instrument (79) coaxially and operably coupled with an inner steerable instrument (81), the two again comprising an instrument assembly (42) which is drivably coupled to an electromechanical instrument driver (40) which is operatively coupled, via an electronic communication link (16) such as a cable, to a system controller (8) comprising a processor. The instruments may comprise catheters, probes, or other elongate minimally invasive instruments. In the depicted embodiment, the inner steerable instrument distal tip (60) is coupled to one or more localization sensors (50) operatively coupled, via an electronic communication link (52) such as a small cable, to a localization system (44). The localization system (44) preferably is configured to determine, observe, or track the spatial coordinates of the one or more sensors (50) relative to a localization coordinate reference frame (46; the X′/Y′/Z′ coordinate reference frame) that preferably is a substantially absolute coordinate reference frame, in that it is preferably coupled to something relatively immovable relative to the world coordinate system (e.g., the floor of the operating room), such as a heavy operating table. The localization system may comprise a potential difference based system, such as those available under the tradename EnSite™ from St. Jude Medical, Inc. of St. Paul, Minn.; alternatively, the localization system may be ultrasound based, such as the RPM™ system from Boston Scientific Corporation; electromagnetic flux based, such as the systems available from the Biosense Webster division of Johnson & Johnson, Inc.; or Bragg-fiber based, such as the systems available from Luna Innovations, Inc.
The depicted localization system (44) is a potential difference based system, and operates by monitoring electrical potential differences between one or more localization sensors (50) coupled to an instrument portion, and two or more conductive skin patches (56), also connected to the system (44) by electronic communication links (64) such as small cables. Such a system is configured to provide positional information regarding the one or more sensors (50) in real or near-real time, but generally is not configured to provide roll orientation information. In other words, twisting of the instrument about its longitudinal axis may very well go undetected by such a system. The depicted instrument assembly (42) does not have a localization sensor coupled to the outer elongate instrument (79), but the position of the distal end (62) of the outer elongate instrument may be inferred through its relationship with the distal end (60) of the inner instrument, which has at least one localization sensor (50) coupled to it. Indeed, in one embodiment, at least two localization sensors are coupled longitudinally in sequence to the inner instrument (81), separated by a known distance roughly equivalent to the amount of inner instrument portion which may be extended or inserted beyond the distal tip (62) of the outer instrument (79), or in other embodiments approximately half or approximately one quarter of this distance, to enable the localization system and/or the control system to determine not only the position of the distal portion of the inner instrument (81), but also the spatial position/orientation of the longitudinal, or Y, axis of the instrument coordinate reference frame (48).
In other words, with two localization sensors, position information regarding each, and a known distance between each of the sensors and the instrument coordinate reference frame (48), the position of the instrument coordinate reference frame (48) and orientation of the Y axis thereof may be determined. In another embodiment, rather than having two localization sensors to characterize the system in this way, a single 5 degree of freedom sensor (providing position and also orientation information) may be utilized, given the distance between such sensor and the instrument coordinate reference frame, and an assumption regarding a known shape of the instrument body in between, such as a substantially straight position or shape. For convenience, the illustrations of FIGS. 3A-3E feature one localization sensor (50), which may be, for example, a single 5 degree of freedom localization sensor, or may be thought of as representing a series of two position-only localization sensors. Further, the locations of both instruments may be determined utilizing established kinematic relationships and some basic assumptions regarding lack of contact with outside forces, as described in the aforementioned incorporated by reference applications. As described briefly above, one of the challenges to accurate navigation relative to nearby tissue structures and other instruments is registering one or more coordinate systems pertinent to the operation of the instruments relative to other known coordinate systems, such as those of other tissue structures, or more absolute coordinate systems, such as that of the world, or that of a relatively stable and well-physically-grounded localization system. With a system such as that depicted in FIG. 3A, localization sensing and trajectory comparison may be utilized to accomplish such registration, as described in reference to FIGS. 3B-3I.
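The two-sensor arrangement described above can be sketched numerically. The following is an illustrative sketch only (the function name, parameters, and the straight-segment assumption are ours, not part of the disclosure): given the positions of two longitudinally sequenced sensors in the localization frame and a known arc-length offset back to the instrument coordinate reference frame, the longitudinal (Y) axis direction and the frame origin may be estimated.

```python
import numpy as np

def frame_from_two_sensors(p_distal, p_proximal, offset_to_frame):
    """Estimate the instrument frame origin and Y-axis direction from two
    longitudinally sequenced position-only localization sensors.

    p_distal, p_proximal: 3-vectors in the localization coordinate frame.
    offset_to_frame: known arc length from the proximal sensor back to the
    instrument coordinate reference frame, assuming the body between the
    sensors and the frame is substantially straight.
    """
    y_axis = np.asarray(p_distal, float) - np.asarray(p_proximal, float)
    y_axis = y_axis / np.linalg.norm(y_axis)        # unit vector along the instrument
    origin = np.asarray(p_proximal, float) - offset_to_frame * y_axis
    return origin, y_axis
```

The same estimate could instead come from a single 5-degree-of-freedom sensor, as the specification notes, with the direction read from the sensor's orientation output rather than differenced positions.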
  • Referring to FIG. 3B, a close-up view of the distal portions of the instruments depicted in FIG. 3A is illustrated. At the outset, the instruments (79, 81) are operated using a kinematics-based control paradigm wherein kinematic formulas associated with movement of mechanisms within the instrument driver (element 40 in FIG. 3A) that are coupled to the instrument assembly (42) may be utilized to infer where in space the distal portions (62, 60) of the instruments (79, 81) are relative to an instrument coordinate reference frame (48, X/Y/-Z as depicted) positioned at the distal tip (62) of the outer instrument (79), which has no localization sensor. When the inner instrument (81) is moved, its position may be determined with the localization sensor (50) and system, and also with the kinematics-based approach. In one embodiment, it is desirable to navigate the inner instrument (81) to a curved position, such as that depicted in FIG. 3C, which has enough curvature and insertion length to lie outside of a pre-prescribed sampling zone boundary, which may be displayed to an operator as a semitransparent cylindrical shape in a three dimensional virtual navigation environment, such as that described in the aforementioned incorporated by reference applications, to assist the operator in achieving such curvature. Referring to FIG. 3D, subsequent to achieving the curved position depicted in FIG. 3C, the inner instrument (81) may be retracted along the path it occupied as it was inserted into the curved position, a scenario which may be termed “autoretract” when electromechanically accomplished in accordance with the incorporated by reference applications.
As the inner instrument (81) is retracted, or autoretracted, a series of datapoints (68) may be collected using two position tracking schemas: the kinematics based approach relative to the instrument coordinate reference frame (48), and the localization based approach relative to the localization coordinate reference frame (46). Referring to FIG. 3E, the retracting or autoretracting may be stopped when the inner instrument (81) is in a straight position, substantially aligned with the longitudinal (or “Y”) axis of the outer instrument (79), and protruding a predetermined distance, such as between 8 and 20 millimeters, from the distal tip (62) of the outer instrument (79).
  • In another embodiment, the exact opposite pattern may be conducted, with the inner instrument (81) starting in a curved position, then retracted, or “autoretracted” back toward the outer instrument (79) distal tip (62). This order of events may be advantageous because retraction is generally a safe, noninterfering maneuver relative to other surrounding structures, such as tissue structures, and also because particularly with “autoretract” functionality, the trajectory generally is a straight line when projected in the XZ plane, as illustrated in FIG. 3F. In another embodiment, a retraction or autoretraction, followed by a nearly arbitrary manually navigated path (so long as such path is on some kind of curve that is not straight along the axis of the instrument), may be utilized to find all three orientation degrees of freedom, or used to just find the roll offset after an axial pointing direction is determined using two sequentially-longitudinally positioned localization sensors, or a single 5 degree of freedom localization/orientation sensor.
  • Referring to FIG. 3F, a sample set of data (70) acquired during autoretraction of an inner instrument relative to an instrument coordinate reference frame and determined using the kinematics based approach is depicted (positions projected in the XZ plane), with a best fit line (214) fitted through it. Since autoretraction is, by definition, electromechanical retraction straight back toward the outer instrument (79) along the path previously occupied during insertion, the kinematics based approach predictably yields a very clean dataplot with little fit error (such as root mean square, or “RMS”, error) between the fitted line (214) and the data.
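The line fitting and RMS fit-error computation used for both the kinematic and localization data plots may be sketched as follows. This is an illustrative example; the names and the PCA/SVD-based fitting choice are assumptions (the disclosure does not specify a fitting algorithm), but it fits a line to XZ-projected trajectory samples and reports the root-mean-square perpendicular residual as described.

```python
import numpy as np

def fit_line_rms(points):
    """Fit a straight line to 2-D points (e.g., XZ-projected retraction
    samples) and return (centroid, unit direction, rms_error).

    The principal direction of the point cloud is taken as the line
    direction, and rms_error is the root-mean-square perpendicular
    distance of the points from that line.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    # First right-singular vector = direction of maximum variance
    _, _, vt = np.linalg.svd(centered)
    direction = vt[0]
    # Perpendicular residuals: subtract each point's component along the line
    residuals = centered - np.outer(centered @ direction, direction)
    rms = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return centroid, direction, rms
```

For a clean autoretraction trace like FIG. 3F the returned RMS error would be near zero; the noisy localization trace of FIG. 3G would produce a much larger value, motivating the windowing schema described next.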
  • Referring to FIG. 3G, a sample set of data (72) acquired during the same autoretraction is depicted (positions projected in the XZ plane) for comparison, the plot in FIG. 3G being based upon the localization data and relative to the localization coordinate reference frame. As one can see, the localization data is relatively noisy. This noise may arise from nonlinearities built into the software code resident on the localization system, data artifacts associated with breathing of the patient, physical interference between the subject instrument set and other nearby structures, and other causes. Rather than simply fitting a line through all of the data and accepting a fitted line with a relatively large error, such as RMS error, it is preferred to address the data sequentially, starting with the data least likely to represent a scenario wherein aspects of the instrument set are in physical contact with other structures: the data closest to full retraction or autoretraction. It is also preferable, however, to base the fitted line on more rather than less data. In one embodiment, as a compromise, line fitting and fit quality analysis are conducted for a series of “data windows”. The first window fitted and analyzed is a series of points closest to retraction. Subsequently, one or more additional points immediately adjacent the previous data window are added to the window, a new line is fitted for the new data window, and fit quality analysis (such as RMS error calculation) is conducted. The data window is enlarged until the fit error increases past a predetermined threshold. In this embodiment, the largest data window having an acceptable fit is considered the “included data” (200), and the line fit therethrough (212) is deemed representative of the localization data for the subject retraction. The remaining data is considered “excluded data” (202).
In one embodiment, to ensure that a line perpendicular (not shown) to the selected line (212) through the included data (200) is not selected, the same localization data is plotted as determined X positions versus time (FIG. 3H) and determined Z positions versus the same time scale (FIG. 3I), and a similar data windowing schema is utilized to plot lines through included X data (204), as opposed to excluded X data (206), and included Z data (208), as opposed to excluded Z data (210), separately. This additional step of fitting X and Z separately, each as a function of time, preserves the time-dependent directionality of the data.
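The growing-window schema may be sketched as below. This is a hedged illustration: the function name, the ordering convention (most-retracted, most-trusted sample first), and any particular threshold value are our assumptions, but the logic mirrors the embodiment described above — enlarge the window from the retraction end until the fit error exceeds the threshold, then keep the largest acceptable window as the included data.

```python
import numpy as np

def windowed_line_fit(points, rms_threshold):
    """Grow a data window from the retraction end of a localization trace,
    refitting a line at each step, until the RMS fit error exceeds
    rms_threshold. `points` must be ordered with the most-retracted
    (most trusted) sample first. Returns (included, excluded, direction).
    """
    pts = np.asarray(points, dtype=float)

    def fit(p):
        # Line fit via principal direction; RMS of perpendicular residuals
        c = p.mean(axis=0)
        _, _, vt = np.linalg.svd(p - c)
        d = vt[0]
        resid = (p - c) - np.outer((p - c) @ d, d)
        return d, np.sqrt((resid ** 2).sum(axis=1).mean())

    best_end = 2                       # smallest meaningful window: two points
    direction, _ = fit(pts[:2])
    for end in range(3, len(pts) + 1):
        d, rms = fit(pts[:end])
        if rms > rms_threshold:
            break                      # fit degraded past the threshold: stop
        best_end, direction = end, d   # largest window with an acceptable fit
    return pts[:best_end], pts[best_end:], direction
```

The same routine can be run on (time, X) and (time, Z) pairs separately, as in FIGS. 3H and 3I, to preserve the time-dependent directionality of the data.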
  • In one very simplified embodiment, alignment could be conducted based on a much smaller set of data—such as two data points: one data point from a fully retracted inner instrument position, and one data point from an extended and curved inner instrument configuration. This embodiment would, of course, be more prone to inaccuracy due to noise in such a small dataset.
  • Subsequent to having a reliable line fitted through each of the kinematic-based data and the localization-based data, an orientation difference between the kinematic-based data coordinate system (the instrument coordinate reference frame—48) and the localization-based data coordinate system (the localization coordinate reference frame—46) may be determined, and this difference may be treated as an error in the orientation of the instrument coordinate reference frame (48) which may be minimized by reorienting the instrument coordinate reference frame (48). Subsequent to such minimization, the two coordinate systems should be registered, “aligned”, or “calibrated” relative to each other, and navigation of the instrument assembly (42) relative to the updated/reoriented instrument coordinate reference frame (48) should produce more predictable movements relative to other related coordinate systems and structures registered thereto. FIGS. 4-6 illustrate further aspects of registration embodiments.
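The orientation-difference minimization described above, restricted to the roll offset about the instrument's longitudinal Y axis (the component a potential-difference localization system cannot observe directly), might be sketched as follows. The signed-angle formulation and axis conventions here are illustrative assumptions, not the disclosed implementation: both fitted trajectory directions are assumed expressed in a common frame and projected into the XZ plane.

```python
import numpy as np

def roll_correction(dir_kinematic_xz, dir_localized_xz):
    """Compute the roll offset between the kinematically predicted and the
    localization-measured trajectory directions, both projected into the
    XZ plane, and return the 3x3 rotation about the longitudinal Y axis
    that reorients the instrument coordinate reference frame.
    """
    kx, kz = dir_kinematic_xz
    lx, lz = dir_localized_xz
    # Signed angle, within the XZ plane, from the kinematic direction
    # to the localized direction (2-D cross and dot products)
    theta = np.arctan2(kx * lz - kz * lx, kx * lx + kz * lz)
    c, s = np.cos(theta), np.sin(theta)
    # Rotation carrying X toward Z by theta; the Y axis is left fixed
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])
```

Applying the returned matrix to the instrument coordinate reference frame (48) aligns its XZ-plane trajectory prediction with the localization coordinate reference frame (46), after which commanded and observed motions should agree more closely.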
  • Referring to FIG. 4, certain aspects of the aforementioned embodiment are summarized in a flowchart. A presumed roll orientation of the instrument coordinate reference frame may not be correct, or may need to be updated (250); if the roll orientation of the instrument coordinate reference frame is not correct, commanded movements may not result in movements exactly as desired (252). This registration challenge may be addressed by putting the instrument coordinate reference frame back into orientation alignment relative to the world coordinate system or other reliable coordinate reference frame (254), such as the localization coordinate reference frame (256).
  • Referring to FIG. 5, an instrument, such as an inner elongate instrument (81), may be moved from a first position to a second position. For example, the instrument may be moved from a retracted or autoretracted and substantially straight position, to a second position, such as a curved position (258). Alternatively, the instrument may be moved from a curved and relatively inserted position to a retracted or autoretracted and substantially straight position. In another variation, the instrument may simply be moved from one position to another, without retraction or autoretraction, such as in an embodiment wherein subtle cyclic motion is overlaid upon realtime navigational movement, allowing the system to be constantly cycling and analyzing new data, and constantly updating registration and alignment of the instrument coordinate reference frame (48) relative to other coordinate reference frames, such as a localization coordinate reference frame. Subsequent to determining orientations relative to the move trajectory in two different coordinate reference frames (262), orientation of the instrument coordinate reference frame (48) may be adjusted, such as by transformation matrix or coordinate system rotation, to minimize the difference between determined orientations using one position and coordinate system versus the other (264). Such a process may be repeated (266) periodically, constantly, once per procedure, after an incident wherein the instrument appears to be stuck on an adjacent structure, and the like.
  • Referring to FIG. 6, a flowchart similar to that depicted in FIG. 5 is shown, with additional details regarding the data windowing technique described above in reference to FIGS. 3A-3I. Referring to FIG. 6, an instrument or portion thereof may be moved from one position to another (268), the movement tracked in two different ways and two different coordinate reference frames (270). Acquired data from the movement tracking may be analyzed using data windowing (272), preferably wherein the window is made as inclusive as possible without violating a predetermined fitted line fit quality threshold. The lines fitted through the included data (274) may be compared, and the orientation of the instrument coordinate reference frame may be adjusted (276). As with the embodiment described in reference to FIG. 5, this process may be repeated intraoperatively (278).
  • Such registration embodiments may be broadly applied. For example, in one embodiment, they may be applied to an instrument configuration comprising a localized intravascular ultrasound (“IVUS”) catheter coupled to another steerable catheter, such as through the working lumen of such steerable catheter. A localization sensor coupled to the IVUS catheter, and a kinematic model, may be used as described above to conduct movements and register the IVUS catheter and steerable catheter to various coordinate reference frames, to allow for coordinated, “instinctive” navigation relative to the coordinate systems of, for example, a master input device and/or display upon which IVUS and other images may be presented to the operator.
  • Having registered an instrument set to other pertinent coordinate systems and structures which are registered or aligned thereto, many intraoperative instrument coordination paradigms may be facilitated. Some of these are illustrated in FIGS. 7, 8, 9, and 10A-10C.
  • Referring to FIG. 7, subsequent to registering an instrument set relative to a three dimensional map of one or more nearby structures, such as anatomical structures or other medical instruments or foreign bodies (280), an operator may be presented with a user interface configured to allow for operator selection of various anatomical destinations wherein operation of the instrument set is desired (282). For example, having positioned an instrument set in the inferior vena cava and confirmed registration to a reliable coordinate system such as a localization coordinate system, an operator may utilize software-based controls (such as menus, or manual selection using a master input device and an anatomical map displayed for the operator) or hardware-based controls (for example, buttons on a pendant or keyboard hardware device) to select a trans-atrial-septal approach, after which the system may be configured to move/reposition the instrument set such that the workspaces of the instruments are optimized (284) relative to the selected anatomical destination (in one transseptal embodiment, with the outer instrument distal tip positioned substantially perpendicular to the atrial septal wall, and with the inner instrument workspace positioned to allow movement of the inner instrument as far across the atrial septal wall as possible given the limits of the inner instrument workspace). In other words, the system may be preconfigured to assist the operator in optimally positioning a registered instrument set for certain predetermined intraoperative procedures or portions thereof.
  • Referring to FIG. 8, another similar embodiment is illustrated wherein after registration (286) and selection (288), rather than fully automated movement of the instrument set, as in the embodiment described in relation to FIG. 7, the embodiment of FIG. 8 is configured to assist (290) the operator's repositioning of the instrument, using techniques such as guiding navigation of the instrument using a haptic master input device and a haptic groove or haptic gravity well. In other variations, movement may be limited by the system to avoid predetermined zones of preferred minimal or zero contact, to move only along a curve or line or within a plane, and the like—all to position the instrument set in a configuration optimized for the preselected anatomical destination.
  • Referring to FIG. 9, another embodiment is illustrated wherein after registering (292) an operably coupled instrument set to a three-dimensional map of one or more nearby structures, such as anatomical structures, other medical instruments, foreign bodies, and the like, the operator may be presented with a navigation mode wherein commands input (294), for example, at a master input device, may be utilized by the system to assist in optimal positioning of the instrument workspace of the first instrument (296) through automated repositioning of the second instrument. In other words, if the system interprets from an issued movement command that the operator wishes to take at least a portion of the first instrument, such as the inner instrument (81), to a certain location, the system may be configured to move not only the first instrument, but also the second operatively intercoupled instrument, such as the outer instrument (79), to place the first instrument workspace optimally. In the illustrated embodiment, such optimal positioning comprises moving the second instrument so as to keep the first instrument close to a preferred portion of the first instrument workspace, such as at the center of such workspace. In another embodiment, the second instrument may be moved so as to keep the first instrument close to the preferred portion of the workspace of the first instrument only after the first instrument crosses a threshold of misalignment with the preferred portion of the workspace. For example, with such an embodiment, the second instrument would remain in position until the first instrument reaches, say, 80% of the way (i.e., an 80% threshold) to the boundary of its instrument workspace, after which the second instrument would move to assist positioning of the first instrument closer to the center of its workspace.
Such embodiments may be configured to keep one instrument in the center of its workspace, in a forward oriented conical volume known to be easiest for the instrument to accurately and expediently navigate, etcetera.
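A minimal sketch of this threshold-following behavior is given below, assuming a spherical workspace model and a simple proportional step per control cycle — both assumptions of ours, since the disclosure does not specify a workspace shape or control law.

```python
import numpy as np

def follow_with_outer(inner_tip, workspace_center, workspace_radius,
                      threshold=0.8, gain=0.5):
    """Return a displacement for the outer (second) instrument.

    If the inner instrument tip has crossed `threshold` (e.g., 80%) of its
    workspace radius, step the outer instrument toward the inner tip to
    recenter the inner tip in its workspace; otherwise hold position.
    `gain` scales how aggressively the outer instrument follows per cycle.
    """
    offset = np.asarray(inner_tip, float) - np.asarray(workspace_center, float)
    dist = np.linalg.norm(offset)
    if dist < threshold * workspace_radius:
        return np.zeros(3)    # inside the comfort zone: outer instrument stays put
    return gain * offset      # step the outer frame toward the inner tip
```

Run each control cycle, this keeps the inner instrument near its preferred workspace region, in the spirit of FIGS. 10A-10C, where crossing the workspace threshold (220) triggers the outer instrument to advance.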
  • Referring to FIGS. 10A-10C, a related embodiment is depicted. As shown in FIG. 10A, an inner instrument (81) is being advanced toward a tissue structure (58). As the inner instrument (81) crosses an instrument workspace (216) threshold (an imaginary line depicted as element 220; in some embodiments this imaginary line may be presented to the operator in the user interface), as in FIG. 10A, the outer instrument (79) moves over toward the targeted tissue structure (58), thereby carrying the inner instrument with it, along with the inner instrument's workspace, which is shown in FIG. 10B as advanced over to the right toward the targeted tissue structure (58). In one embodiment, the outer instrument may be configured to advance over, or “swallow” over, the proximal exposed portion of the inner instrument to further increase stability and, depending upon the available remaining room in the insertion degree of freedom, to continue to advance the inner instrument (81) workspace (216) toward the targeted tissue structure.
  • In another embodiment, a shape representing the desired region of the second instrument's workspace may be modeled as an implicit surface, and rendered haptically using haptic implicit surface algorithms—and the direction of the resultant force may be used as the direction in which to move the first instrument. Such force direction may be altered if the motion of the first instrument is desired to be constrained in some way; for example, in one embodiment, it may be projected onto an instrument roll plane to prevent adding torque.
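The projection onto an instrument roll plane mentioned above amounts to a generic vector projection, sketched below. Which plane serves as the roll plane depends on the instrument kinematics, so the plane normal here is a stand-in, and the function name is ours.

```python
import numpy as np

def project_onto_plane(force, plane_normal):
    """Project a guidance-force vector onto a plane (e.g., an instrument
    roll plane), discarding the component along the plane normal so that
    the constrained direction receives no force.
    """
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)          # unit normal of the constraint plane
    f = np.asarray(force, float)
    return f - (f @ n) * n             # subtract the out-of-plane component
```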
  • While multiple embodiments and variations of the many aspects of the invention have been disclosed and described herein, such disclosure is provided for purposes of illustration only. For example, wherein methods and steps described above indicate certain events occurring in certain order, those of ordinary skill in the art having the benefit of this disclosure would recognize that the ordering of certain steps may be modified and that such modifications are in accordance with the variations of this invention. Additionally, certain of the steps may be performed concurrently in a parallel process when possible, as well as performed sequentially. Accordingly, embodiments are intended to exemplify alternatives, modifications, and equivalents that may fall within the scope of the claims.

Claims (37)

1. A method for navigating a coupled instrument set, comprising:
a. registering an instrument set, comprising a first instrument having a first instrument workspace, movably coupled to a second instrument having a second instrument workspace, relative to a three-dimensional map of nearby anatomical structures;
b. presenting an operator with a user interface configured to allow for selection of various anatomical destinations wherein operation of the instrument set is desired; and
c. upon selection of an anatomical destination by an operator, assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination.
2. The method of claim 1, wherein registering an instrument set comprises:
a. registering the instrument set relative to a master input device and operator display; and
b. registering the instrument set relative to a three-dimensional map of nearby anatomical structures.
3. The method of claim 2, wherein registering the instrument set relative to a three-dimensional map of nearby anatomical structures comprises navigating the registered instrument set to one or more known anatomical landmarks also featured on the three-dimensional map of nearby anatomical structures, and aligning the map with the instrument workspace of one or more of the instruments comprising the instrument set.
4. The method of claim 2, wherein registering the instrument set comprises registering the first instrument relative to a master input device and operator display, then registering the second instrument relative to the first instrument.
5. The method of claim 1, wherein registering an instrument set comprises:
a. moving a portion of the first instrument between a first position in situ and a second position in situ relative to a first instrument coordinate reference frame;
b. tracking movement of the portion relative to the first instrument coordinate reference frame using a kinematic model, and also tracking movement of the portion relative to a localization coordinate reference frame using one or more localization sensors coupled to the portion;
c. determining the orientation of the portion relative to both the first instrument coordinate reference frame and the localization coordinate reference frame; and
d. adjusting the orientation of the first instrument coordinate reference frame to minimize the difference between determined orientations using the kinematic model and localization sensors.
6. The method of claim 1, wherein presenting an operator with a user interface configured to allow for selection of various anatomical destinations comprises presenting the operator with a set of menu driven software selections, presenting the operator with a set of hardware interfaces associated with various anatomical destinations, or presenting the operator with a map of anatomical destinations which may be selected using a master input device.
7. The method of claim 1, wherein assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination comprises limiting commanded movement to movement along a single path.
8. The method of claim 7, wherein the single path is a line.
9. The method of claim 7, wherein the single path is a curved path.
10. The method of claim 7, wherein limiting commanded movement comprises imparting haptic feedback to the operator through a haptic master input device.
11. The method of claim 10, wherein limiting commanded movement comprises providing a haptic groove feedback configuration to the operator.
12. The method of claim 10, wherein limiting commanded movement comprises providing a gravity well feedback configuration to the operator.
13. The method of claim 1, wherein assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination comprises limiting commanded movement to movement along a single plane.
14. The method of claim 13, wherein limiting commanded movement comprises imparting haptic feedback to the operator through a haptic master input device.
15. The method of claim 1, wherein assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination comprises avoiding contact between the first or second instruments and certain predetermined anatomical zones of preferred contact avoidance.
16. The method of claim 15, wherein avoiding contact comprises imparting haptic feedback to the operator through a haptic master input device.
17. The method of claim 1, wherein assisting the operator in repositioning the instrument set such that the first instrument workspace and second instrument workspace are optimized relative to the selected anatomical destination comprises automatically navigating the first and second instrument to the selected anatomical destination.
18. The method of claim 17, wherein automatically navigating comprises taking into account predetermined movement limitations of the first and second instrument.
19. The method of claim 17, wherein automatically navigating comprises taking into account predetermined anatomical contact limitations.
20. The method of claim 1, wherein assisting the operator in repositioning the instrument set comprises sequentially assisting the operator in repositioning one of the first or second instruments toward the selected anatomical destination, then assisting the operator in repositioning the remaining one of the first or second instruments toward the selected anatomical destination.
21. The method of claim 20, wherein the first and second instruments are coaxially and movably coupled to one another, and wherein subsequent to positioning of the first, the second may be advanced along the path of the first.
22. The method of claim 21, wherein the first and second instruments comprise robotically navigable catheters, and wherein assisting the operator in repositioning the instrument set comprises electromechanically navigating the first instrument toward the selected anatomical destination, and subsequently following the path established by the first instrument with electromechanical advancement of the second instrument.
23. The method of claim 22, wherein the first instrument is repositioned to the selected anatomical destination before movement of the second.
24. The method of claim 22, wherein the first and second instruments are alternately and incrementally positioned toward the selected anatomical destination in a plurality of sequences.
25. A method for navigating a coupled instrument set, comprising:
a. registering an instrument set, comprising a first instrument having a first instrument workspace, movably coupled to a second instrument having a second instrument workspace, relative to a three-dimensional map of nearby anatomical structures;
b. presenting an operator with a user interface configured to allow the operator to input navigation commands of the first instrument relative to nearby anatomical structures; and
c. subsequent to input of a navigation command by an operator for the first instrument, moving the first instrument as commanded, while also repositioning the second instrument so as to keep the first instrument close to a preferred portion of the workspace of the first instrument.
26. The method of claim 25, wherein registering an instrument set comprises:
a. registering the instrument set relative to a master input device and operator display; and
b. registering the instrument set relative to a three-dimensional map of nearby anatomical structures.
27. The method of claim 26, wherein registering the instrument set relative to a three-dimensional map of nearby anatomical structures comprises navigating the registered instrument set to one or more known anatomical landmarks also featured on the three-dimensional map of nearby anatomical structures, and aligning the map with the instrument workspace of one or more of the instruments comprising the instrument set.
28. The method of claim 26, wherein registering the instrument set comprises registering the first instrument relative to a master input device and operator display, then registering the second instrument relative to the first instrument.
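Claim 27 registers the instrument set to the anatomical map by touching known landmarks and aligning the map with the instrument workspace. The patent does not name an algorithm; a common choice for such landmark alignment is a least-squares rigid fit. The sketch below is a simplified planar (2-D) Procrustes fit in pure Python; names, the 2-D restriction, and the closed-form rotation are illustrative assumptions, not the patent's method:

```python
import math

def register_landmarks_2d(instrument_pts, map_pts):
    """Estimate the planar rigid transform (rotation theta, translation t)
    that best maps landmark coordinates on the anatomical map onto the same
    landmarks as touched by the registered instrument (2-D least squares)."""
    n = len(instrument_pts)
    # Centroids of the instrument-frame and map-frame landmark sets.
    cix = sum(p[0] for p in instrument_pts) / n
    ciy = sum(p[1] for p in instrument_pts) / n
    cmx = sum(p[0] for p in map_pts) / n
    cmy = sum(p[1] for p in map_pts) / n
    # Closed-form optimal planar rotation from the centered correspondences.
    num = den = 0.0
    for (ix, iy), (mx, my) in zip(instrument_pts, map_pts):
        ax, ay = mx - cmx, my - cmy   # centered map landmark
        bx, by = ix - cix, iy - ciy   # centered instrument landmark
        num += ax * by - ay * bx
        den += ax * bx + ay * by
    theta = math.atan2(num, den)
    # Translation carries the rotated map centroid onto the instrument centroid.
    tx = cix - (cmx * math.cos(theta) - cmy * math.sin(theta))
    ty = ciy - (cmx * math.sin(theta) + cmy * math.cos(theta))
    return theta, (tx, ty)
```

A full 3-D implementation would typically use the Kabsch algorithm (SVD of the cross-covariance of the centered point sets) instead of the planar closed form.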
29. The method of claim 25, wherein registering an instrument set comprises:
a. moving a portion of the first instrument between a first position in situ and a second position in situ relative to a first instrument coordinate reference frame;
b. tracking movement of the portion relative to the first instrument coordinate reference frame using a kinematic model, and also tracking movement of the portion relative to a localization coordinate reference frame using one or more localization sensors coupled to the portion;
c. determining the orientation of the portion relative to both the first instrument coordinate reference frame and the localization coordinate reference frame; and
d. adjusting the orientation of the first instrument coordinate reference frame to minimize the difference between determined orientations using the kinematic model and localization sensors.
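Claim 29 registers the instrument by comparing motion tracked two ways — via a kinematic model in the instrument frame and via localization sensors in a localization frame — and then adjusting the instrument frame's orientation to minimize the discrepancy. As a rough planar analogue (the claim is three-dimensional; this sketch, its names, and the angle-only model are assumptions for illustration), the least-squares orientation offset between two sets of headings is the circular mean of the per-sample residuals:

```python
import math

def fit_frame_offset(kinematic_headings, sensed_headings):
    """Estimate the planar orientation offset of the instrument coordinate
    frame that minimizes the discrepancy between tip headings predicted by
    the kinematic model and headings measured by the localization sensors.
    For angles, the least-squares offset is the circular mean of the
    residuals, computed via summed sines and cosines to handle wraparound."""
    s = sum(math.sin(h_loc - h_kin)
            for h_kin, h_loc in zip(kinematic_headings, sensed_headings))
    c = sum(math.cos(h_loc - h_kin)
            for h_kin, h_loc in zip(kinematic_headings, sensed_headings))
    return math.atan2(s, c)
```

Applying the returned offset to the instrument frame brings the kinematic predictions into best agreement with the sensed orientations, mirroring step (d) of the claim in one rotational degree of freedom.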
30. The method of claim 25, further comprising secondarily moving the second instrument so as to keep the second instrument as close as possible to a preferred portion of the workspace of the second instrument.
31. The method of claim 25, wherein the current positions of the instrument workspaces for the first and second instruments are illustrated in a user interface relative to the instrument set and nearby anatomical structures.
32. The method of claim 25, wherein the preferred portion of the workspace of the first instrument is the center of said workspace.
33. The method of claim 25, wherein the preferred portion of the workspace of the first instrument is a forward-oriented conical volume of said workspace.
34. The method of claim 30, wherein the preferred portion of the workspace of the second instrument is the center of said workspace.
35. The method of claim 30, wherein the preferred portion of the workspace of the second instrument is a forward-oriented conical volume of said workspace.
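Claims 33 and 35 define the preferred workspace portion as a forward-oriented conical volume. A membership test for such a cone reduces to comparing the angle between a direction and the instrument's forward axis against the cone's half-angle. The sketch below is illustrative only; the function name, vector representation, and 30-degree default half-angle are assumptions, not values from the patent:

```python
import math

def in_forward_cone(tip_dir, forward_dir, half_angle_deg=30.0):
    """Return True when `tip_dir` lies within a forward-oriented conical
    volume of the workspace, i.e. when the angle between it and the
    instrument's forward axis is at most the cone's half-angle.
    Directions are 3-D vectors (tuples); they need not be normalized."""
    dot = sum(a * b for a, b in zip(tip_dir, forward_dir))
    norm = (math.sqrt(sum(a * a for a in tip_dir))
            * math.sqrt(sum(b * b for b in forward_dir)))
    # Clamp before acos to guard against floating-point rounding.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= half_angle_deg
```

A "center of the workspace" preference (claims 32 and 34) would instead compare the tip's distance from the workspace center against a radius threshold.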
36. The method of claim 25, wherein the first and second instruments are coaxially coupled.
37. The method of claim 25, wherein the second instrument is moved so as to keep the first instrument close to the preferred portion of the workspace of the first instrument only after the first instrument crosses a threshold of misalignment with the preferred portion of the workspace.
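Claim 25 keeps the first instrument near the preferred portion of its workspace by repositioning the second, and the final claim adds that this happens only after a misalignment threshold is crossed. A minimal one-dimensional control step capturing both behaviors might look like the following; the scalar model, the centered-workspace assumption, and all names are hypothetical simplifications:

```python
def reposition_follower(first_tip, second_base, threshold=2.0):
    """One control step for the coupled set: the first instrument's
    workspace is modeled as centered on the second instrument's position,
    so `first_tip - second_base` measures misalignment with the preferred
    (central) portion. The second instrument is moved only once that
    offset exceeds the threshold, and then re-centers the workspace on
    the first instrument's tip. Returns the second instrument's new
    position."""
    offset = first_tip - second_base
    if abs(offset) <= threshold:
        return second_base   # within tolerance: leave the second in place
    return first_tip         # re-center the workspace on the first tip
```

Running this once per operator navigation command yields dead-band behavior: small motions of the first instrument are absorbed by its workspace, while larger excursions trigger a follow-up move of the second instrument.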
US12/507,766 2008-11-20 2009-07-22 Registered instrument movement integration Abandoned US20100125284A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11645408P 2008-11-20 2008-11-20
US12/507,766 US20100125284A1 (en) 2008-11-20 2009-07-22 Registered instrument movement integration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/507,766 US20100125284A1 (en) 2008-11-20 2009-07-22 Registered instrument movement integration

Publications (1)

Publication Number Publication Date
US20100125284A1 true US20100125284A1 (en) 2010-05-20

Family

ID=42172603

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/507,766 Abandoned US20100125284A1 (en) 2008-11-20 2009-07-22 Registered instrument movement integration
US12/507,777 Active 2031-09-27 US8317746B2 (en) 2008-11-20 2009-07-22 Automated alignment
US13/678,280 Active US8657781B2 (en) 2008-11-20 2012-11-15 Automated alignment

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/507,777 Active 2031-09-27 US8317746B2 (en) 2008-11-20 2009-07-22 Automated alignment
US13/678,280 Active US8657781B2 (en) 2008-11-20 2012-11-15 Automated alignment

Country Status (1)

Country Link
US (3) US20100125284A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100125285A1 (en) * 2008-11-20 2010-05-20 Hansen Medical, Inc. Automated alignment
US20110113852A1 (en) * 2009-11-13 2011-05-19 Intuitive Surgical, Inc. Optical fiber shape sensor calibration
WO2012082193A1 (en) * 2010-12-16 2012-06-21 St. Jude Medical, Atrial Fibrillation Division, Inc. Proximity sensor interface in a robotic catheter system
US20140276938A1 (en) * 2013-03-15 2014-09-18 Hansen Medical, Inc. User input devices for controlling manipulation of guidewires and catheters
US20140275955A1 (en) * 2013-03-15 2014-09-18 Globus Medical, Inc. Surgical tool systems and method
WO2015135647A1 (en) * 2014-03-12 2015-09-17 Pittaluga Paul Medical device comprising a hydrophilic curved flexible tip for the treatment of varicose veins
EP2881048A4 (en) * 2012-07-31 2016-04-06 Olympus Corp Medical manipulator
US9710921B2 (en) 2013-03-15 2017-07-18 Hansen Medical, Inc. System and methods for tracking robotically controlled medical instruments
EP3119325A4 (en) * 2014-03-17 2017-11-22 Intuitive Surgical Operations, Inc. Systems and methods for control of imaging instrument orientation
US10123755B2 (en) 2013-03-13 2018-11-13 Auris Health, Inc. Reducing incremental measurement sensor error
US10130427B2 (en) 2010-09-17 2018-11-20 Auris Health, Inc. Systems and methods for positioning an elongate member inside a body
US10206746B2 (en) 2013-03-15 2019-02-19 Auris Health, Inc. User interface for active drive apparatus with finite range of motion
US10213264B2 (en) 2013-03-14 2019-02-26 Auris Health, Inc. Catheter tension sensing

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090036900A1 (en) * 2007-02-02 2009-02-05 Hansen Medical, Inc. Surgery methods using a robotic instrument system
IL184151D0 (en) 2007-06-21 2007-10-31 Diagnostica Imaging Software Ltd X-ray measurement method
JP5407036B2 (en) * 2008-09-02 2014-02-05 オリンパスメディカルシステムズ株式会社 Endoscope for treatment
JP5796982B2 (en) * 2011-03-31 2015-10-21 オリンパス株式会社 Control apparatus and control method of the surgical system
WO2013043344A1 (en) * 2011-09-21 2013-03-28 Boston Scientific Scimed, Inc. Systems and methods for preventing laser fiber misfiring within endoscopic access devices
US20130211244A1 (en) * 2012-01-25 2013-08-15 Surgix Ltd. Methods, Devices, Systems, Circuits and Associated Computer Executable Code for Detecting and Predicting the Position, Orientation and Trajectory of Surgical Tools
US8885904B2 (en) * 2012-04-19 2014-11-11 General Electric Company Systems and methods for landmark correction in magnetic resonance imaging
US20140188440A1 (en) * 2012-12-31 2014-07-03 Intuitive Surgical Operations, Inc. Systems And Methods For Interventional Procedure Planning
CN105101905B (en) * 2013-03-29 2017-08-04 奥林巴斯株式会社 Master-slave system

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030109780A1 (en) * 2001-06-07 2003-06-12 Inria Roquencourt Methods and apparatus for surgical planning
US20050222554A1 (en) * 2004-03-05 2005-10-06 Wallace Daniel T Robotic catheter system
US20060095022A1 (en) * 2004-03-05 2006-05-04 Moll Frederic H Methods using a robotic catheter system
US20060200026A1 (en) * 2005-01-13 2006-09-07 Hansen Medical, Inc. Robotic catheter system
US20070043338A1 (en) * 2004-03-05 2007-02-22 Hansen Medical, Inc Robotic catheter system and methods
US20070156123A1 (en) * 2005-12-09 2007-07-05 Hansen Medical, Inc Robotic catheter system and methods
US20070197896A1 (en) * 2005-12-09 2007-08-23 Hansen Medical, Inc Robotic catheter system and methods
US20070233044A1 (en) * 2006-02-22 2007-10-04 Hansen Medical, Inc. Apparatus for measuring distal forces on a working instrument
US20070265503A1 (en) * 2006-03-22 2007-11-15 Hansen Medical, Inc. Fiber optic instrument sensing system
US20080004633A1 (en) * 2006-05-19 2008-01-03 Mako Surgical Corp. System and method for verifying calibration of a surgical device
US20080027464A1 (en) * 2006-07-26 2008-01-31 Moll Frederic H Systems and methods for performing minimally invasive surgical operations
US20080058836A1 (en) * 2006-08-03 2008-03-06 Hansen Medical, Inc. Systems and methods for performing minimally invasive procedures
US20080082109A1 (en) * 2006-09-08 2008-04-03 Hansen Medical, Inc. Robotic surgical system with forward-oriented field of view guide instrument navigation
US20080119727A1 (en) * 2006-10-02 2008-05-22 Hansen Medical, Inc. Systems and methods for three-dimensional ultrasound mapping
US20080140087A1 (en) * 2006-05-17 2008-06-12 Hansen Medical Inc. Robotic instrument system
US20080167750A1 (en) * 2007-01-10 2008-07-10 Stahler Gregory J Robotic catheter system and methods
US20080262480A1 (en) * 2007-02-15 2008-10-23 Stahler Gregory J Instrument assembly for robotic instrument system
US20090088634A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Tool tracking systems and methods for image guided surgery
US20100125285A1 (en) * 2008-11-20 2010-05-20 Hansen Medical, Inc. Automated alignment
US7763035B2 (en) * 1997-12-12 2010-07-27 Medtronic Navigation, Inc. Image guided spinal surgery guide, system and method for use thereof
US20100228191A1 (en) * 2009-03-05 2010-09-09 Hansen Medical, Inc. Lockable support assembly and method
US20110015483A1 (en) * 2009-07-16 2011-01-20 Federico Barbagli Endoscopic robotic catheter system

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030073908A1 (en) * 1996-04-26 2003-04-17 2000 Injectx, Inc. Method and apparatus for delivery of genes, enzymes and biological agents to tissue cells
US5575810A (en) * 1993-10-15 1996-11-19 Ep Technologies, Inc. Composite structures and methods for ablating tissue to form complex lesion patterns in the treatment of cardiac conditions and the like
US5710870A (en) * 1995-09-07 1998-01-20 California Institute Of Technology Decoupled six degree-of-freedom robot manipulator
US5722959A (en) * 1995-10-24 1998-03-03 Venetec International, Inc. Catheter securement device
US5855583A (en) * 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
US5830224A (en) * 1996-03-15 1998-11-03 Beth Israel Deaconess Medical Center Catheter apparatus and methodology for generating a fistula on-demand between closely associated blood vessels at a pre-chosen anatomic site in-vivo
US5845646A (en) * 1996-11-05 1998-12-08 Lemelson; Jerome System and method for treating select tissue in a living being
US5876373A (en) * 1997-04-04 1999-03-02 Eclipse Surgical Technologies, Inc. Steerable catheter
US6061587A (en) * 1997-05-15 2000-05-09 Regents Of The University Of Minnesota Method and apparatus for use with MR imaging
US6200312B1 (en) * 1997-09-11 2001-03-13 Vnus Medical Technologies, Inc. Expandable vein ligator catheter having multiple electrode leads
US6086532A (en) * 1997-09-26 2000-07-11 Ep Technologies, Inc. Systems for recording use of structures deployed in association with heart tissue
US20020120200A1 (en) * 1997-10-14 2002-08-29 Brian Brockway Devices, systems and methods for endocardial pressure measurement
US7214230B2 (en) * 1998-02-24 2007-05-08 Hansen Medical, Inc. Flexible instrument
US20030135204A1 (en) * 2001-02-15 2003-07-17 Endo Via Medical, Inc. Robotically controlled medical instrument with a flexible section
US7169141B2 (en) * 1998-02-24 2007-01-30 Hansen Medical, Inc. Surgical instrument
US7699835B2 (en) * 2001-02-15 2010-04-20 Hansen Medical, Inc. Robotically controlled surgical instruments
US8414505B1 (en) * 2001-02-15 2013-04-09 Hansen Medical, Inc. Catheter driver system
US7766894B2 (en) * 2001-02-15 2010-08-03 Hansen Medical, Inc. Coaxial catheter system
JPH11267133A (en) * 1998-03-25 1999-10-05 Olympus Optical Co Ltd Therapeutic apparatus
DE69940792D1 (en) * 1998-03-31 2009-06-04 Medtronic Vascular Inc Tissue penetrating catheter with transducers for imaging
US6004271A (en) * 1998-05-07 1999-12-21 Boston Scientific Corporation Combined motor drive and automated longitudinal position translator for ultrasonic imaging system
US6409674B1 (en) * 1998-09-24 2002-06-25 Data Sciences International, Inc. Implantable sensor with wireless communication
US20030074011A1 (en) * 1998-09-24 2003-04-17 Super Dimension Ltd. System and method of recording and displaying in context of an image a location of at least one point-of-interest in a body during an intra-body medical procedure
US6331181B1 (en) * 1998-12-08 2001-12-18 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
AU4323701A (en) * 2000-02-25 2001-09-03 Univ Leland Stanford Junior Methods and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body
US6610007B2 (en) * 2000-04-03 2003-08-26 Neoguide Systems, Inc. Steerable segmented endoscope and method of insertion
US6551273B1 (en) * 2000-08-23 2003-04-22 Scimed Life Systems, Inc. Catheter having a shaft keeper
AU2002305341A1 (en) * 2001-05-06 2002-11-18 Stereotaxis, Inc. System and methods for advancing a catheter
US20040176751A1 (en) * 2002-08-14 2004-09-09 Endovia Medical, Inc. Robotic medical instrument system
US7404824B1 (en) * 2002-11-15 2008-07-29 Advanced Cardiovascular Systems, Inc. Valve aptation assist device
US7101387B2 (en) * 2003-04-30 2006-09-05 Scimed Life Systems, Inc. Radio frequency ablation cooling shield
US20040220588A1 (en) * 2003-05-01 2004-11-04 James Kermode Guide assembly
EP1691666B1 (en) * 2003-12-12 2012-05-30 University of Washington Catheterscope 3d guidance and interface system
US20050215888A1 (en) * 2004-03-05 2005-09-29 Grimm James E Universal support arm and tracking array
US8971597B2 (en) * 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US20070038181A1 (en) * 2005-08-09 2007-02-15 Alexander Melamud Method, system and device for delivering a substance to tissue
US8444631B2 (en) * 2007-06-14 2013-05-21 Macdonald Dettwiler & Associates Inc Surgical manipulator

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7763035B2 (en) * 1997-12-12 2010-07-27 Medtronic Navigation, Inc. Image guided spinal surgery guide, system and method for use thereof
US20030109780A1 (en) * 2001-06-07 2003-06-12 Inria Roquencourt Methods and apparatus for surgical planning
US20060095022A1 (en) * 2004-03-05 2006-05-04 Moll Frederic H Methods using a robotic catheter system
US20060100610A1 (en) * 2004-03-05 2006-05-11 Wallace Daniel T Methods using a robotic catheter system
US20060293643A1 (en) * 2004-03-05 2006-12-28 Wallace Daniel T Robotic catheter system
US20070043338A1 (en) * 2004-03-05 2007-02-22 Hansen Medical, Inc Robotic catheter system and methods
US20050222554A1 (en) * 2004-03-05 2005-10-06 Wallace Daniel T Robotic catheter system
US20060200026A1 (en) * 2005-01-13 2006-09-07 Hansen Medical, Inc. Robotic catheter system
US20070156123A1 (en) * 2005-12-09 2007-07-05 Hansen Medical, Inc Robotic catheter system and methods
US20070197896A1 (en) * 2005-12-09 2007-08-23 Hansen Medical, Inc Robotic catheter system and methods
US20070233044A1 (en) * 2006-02-22 2007-10-04 Hansen Medical, Inc. Apparatus for measuring distal forces on a working instrument
US20070265503A1 (en) * 2006-03-22 2007-11-15 Hansen Medical, Inc. Fiber optic instrument sensing system
US20080140087A1 (en) * 2006-05-17 2008-06-12 Hansen Medical Inc. Robotic instrument system
US20080004633A1 (en) * 2006-05-19 2008-01-03 Mako Surgical Corp. System and method for verifying calibration of a surgical device
US20080027464A1 (en) * 2006-07-26 2008-01-31 Moll Frederic H Systems and methods for performing minimally invasive surgical operations
US20080058836A1 (en) * 2006-08-03 2008-03-06 Hansen Medical, Inc. Systems and methods for performing minimally invasive procedures
US20080082109A1 (en) * 2006-09-08 2008-04-03 Hansen Medical, Inc. Robotic surgical system with forward-oriented field of view guide instrument navigation
US20080119727A1 (en) * 2006-10-02 2008-05-22 Hansen Medical, Inc. Systems and methods for three-dimensional ultrasound mapping
US20080167750A1 (en) * 2007-01-10 2008-07-10 Stahler Gregory J Robotic catheter system and methods
US20080262480A1 (en) * 2007-02-15 2008-10-23 Stahler Gregory J Instrument assembly for robotic instrument system
US20090088634A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Tool tracking systems and methods for image guided surgery
US20100125285A1 (en) * 2008-11-20 2010-05-20 Hansen Medical, Inc. Automated alignment
US20100228191A1 (en) * 2009-03-05 2010-09-09 Hansen Medical, Inc. Lockable support assembly and method
US20110015483A1 (en) * 2009-07-16 2011-01-20 Federico Barbagli Endoscopic robotic catheter system

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8657781B2 (en) 2008-11-20 2014-02-25 Hansen Medical, Inc. Automated alignment
US20100125285A1 (en) * 2008-11-20 2010-05-20 Hansen Medical, Inc. Automated alignment
US8317746B2 (en) 2008-11-20 2012-11-27 Hansen Medical, Inc. Automated alignment
US20110113852A1 (en) * 2009-11-13 2011-05-19 Intuitive Surgical, Inc. Optical fiber shape sensor calibration
WO2011059888A3 (en) * 2009-11-13 2011-09-29 Intuitive Surgical Operations, Inc. Optical fiber shape sensor calibration
US8183520B2 (en) 2009-11-13 2012-05-22 Intuitive Surgical Operations, Inc. Optical fiber shape sensor calibration
US10130427B2 (en) 2010-09-17 2018-11-20 Auris Health, Inc. Systems and methods for positioning an elongate member inside a body
WO2012082193A1 (en) * 2010-12-16 2012-06-21 St. Jude Medical, Atrial Fibrillation Division, Inc. Proximity sensor interface in a robotic catheter system
EP2881048A4 (en) * 2012-07-31 2016-04-06 Olympus Corp Medical manipulator
US10123755B2 (en) 2013-03-13 2018-11-13 Auris Health, Inc. Reducing incremental measurement sensor error
US10213264B2 (en) 2013-03-14 2019-02-26 Auris Health, Inc. Catheter tension sensing
US10206746B2 (en) 2013-03-15 2019-02-19 Auris Health, Inc. User interface for active drive apparatus with finite range of motion
US9710921B2 (en) 2013-03-15 2017-07-18 Hansen Medical, Inc. System and methods for tracking robotically controlled medical instruments
US20140275955A1 (en) * 2013-03-15 2014-09-18 Globus Medical, Inc. Surgical tool systems and method
US20140276938A1 (en) * 2013-03-15 2014-09-18 Hansen Medical, Inc. User input devices for controlling manipulation of guidewires and catheters
US10130345B2 (en) 2013-03-15 2018-11-20 Auris Health, Inc. System and methods for tracking robotically controlled medical instruments
FR3018444A1 (en) * 2014-03-12 2015-09-18 Paul Pittaluga Medical device comprising a hydrophilic hooked flexible tip for treating varicose veins
WO2015135647A1 (en) * 2014-03-12 2015-09-17 Pittaluga Paul Medical device comprising a hydrophilic curved flexible tip for the treatment of varicose veins
EP3119325A4 (en) * 2014-03-17 2017-11-22 Intuitive Surgical Operations, Inc. Systems and methods for control of imaging instrument orientation

Also Published As

Publication number Publication date
US8317746B2 (en) 2012-11-27
US8657781B2 (en) 2014-02-25
US20130072944A1 (en) 2013-03-21
US20100125285A1 (en) 2010-05-20

Similar Documents

Publication Publication Date Title
US8551076B2 (en) Retrograde instrument
US8900131B2 (en) Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
JP5285270B2 (en) Automatic guide wire operating system
US8974408B2 (en) Robotic catheter system
US8615288B2 (en) Robotically guided catheter
US8337397B2 (en) Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
EP2038712B1 (en) Control system configured to compensate for non-ideal actuator-to-joint linkage characteristics in a medical robotic system
EP2177174B1 (en) Robotic instrument system
US7963288B2 (en) Robotic catheter system
JP6293777B2 (en) Collision avoidance between the controlled motion of the image capture device and a steerable device movable arm
US10004387B2 (en) Method and system for assisting an operator in endoscopic navigation
US7961924B2 (en) Method of three-dimensional device localization using single-plane imaging
US10219874B2 (en) Instrument device manipulator with tension sensing apparatus
US20090076476A1 (en) Systems and methods employing force sensing for mapping intra-body tissue
US9737371B2 (en) Configurable robotic surgical system with virtual rail and flexible endoscope
US20100331856A1 (en) Multiple flexible and steerable elongate instruments for minimally invasive operations
US7974681B2 (en) Robotic catheter system
US9500472B2 (en) System and method for sensing shape of elongated instrument
US7466303B2 (en) Device and process for manipulating real and virtual objects in three-dimensional space
EP2411966B1 (en) System for providing visual guidance for steering a tip of an endoscopic device towards one or more landmarks and assisting an operator in endoscopic navigation
US9629595B2 (en) Systems and methods for localizing, tracking and/or controlling medical instruments
EP2542292A1 (en) Robotic catheter system
US9727963B2 (en) Navigation of tubular networks
US20170049524A1 (en) System and Method for Controlling a Remote Medical Device Guidance System in Three-Dimensions using Gestures
CN105208960B (en) Systems and methods for integration with external robotic medical imaging system

Legal Events

Date Code Title Description
AS Assignment

Owner name: HANSEN MEDICAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TANNER, NEAL A.;SEWELL, CHRISTOPHER M.;REEL/FRAME:023227/0871

Effective date: 20090914

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION