Laparoscopic ultrasound robotic surgical system

Publication number
US20070021738A1
Authority
US
Grant status
Application
Prior art keywords
ultrasound
probe
lus
robotic
surgical
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US11447668
Inventor
Christopher Hasser
Russell Taylor
Michael Choti
Joshua Leven
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Johns Hopkins University
Intuitive Surgical Operations Inc
Original Assignee
Johns Hopkins University
Intuitive Surgical Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2005-06-06

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/313: Instruments for introducing through surgical openings, e.g. laparoscopes
    • A61B1/3132: Instruments for introducing through surgical openings, for laparoscopy
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B8/42: Details of probe positioning or probe attachment to the patient
    • A61B8/4209: Probe positioning by using holders, e.g. positioning frames
    • A61B8/4218: Probe-positioning holders characterised by articulated arms
    • A61B8/4245: Probe positioning involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461: Displaying means of special interest
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017: Electrical control of surgical instruments
    • A61B2017/00203: Electrical control of surgical instruments with speech control or speech recognition
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107: Visualisation of planned trajectories or target regions
    • A61B34/25: User interfaces for surgical systems
    • A61B34/30: Surgical robots
    • A61B2034/305: Details of wrist mechanisms at distal ends of robotic arms
    • A61B34/37: Master-slave robots
    • A61B34/70: Manipulators specially adapted for use in surgery
    • A61B34/76: Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/03: Automatic limiting or abutting means, e.g. for safety
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361: Image-producing devices, e.g. surgical cameras
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365: Correlation of different images: augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/378: Surgical systems with images on a monitor during operation, using ultrasound

Abstract

A LUS robotic surgical system is trainable by a surgeon to automatically move a LUS probe in a desired fashion upon command so that the surgeon does not have to do so manually during a minimally invasive surgical procedure. A sequence of 2D ultrasound image slices captured by the LUS probe according to stored instructions is processable into a 3D ultrasound computer model of an anatomic structure, which may be displayed as a 3D or 2D overlay to a camera view or in a PIP, as selected by the surgeon or programmed to assist the surgeon in inspecting an anatomic structure for abnormalities. Virtual fixtures are definable so as to assist the surgeon in accurately guiding a tool to a target on the displayed ultrasound image.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application claims priority to U.S. provisional application Ser. No. 60/688,019 filed Jun. 6, 2005, which is incorporated herein by reference.
  • GOVERNMENT RIGHTS STATEMENT
  • [0002]
    This invention was made with Government support under Grant No. 1 R41 RR019159-01 awarded by the National Institutes of Health. The Government has certain rights in the invention.
  • FIELD OF THE INVENTION
  • [0003]
    The present invention generally relates to robotic surgical systems and in particular, to a laparoscopic ultrasound robotic surgical system useful for performing minimally invasive surgical procedures.
  • BACKGROUND OF THE INVENTION
  • [0004]
    Minimally invasive surgery offers many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue. Consequently, demand for minimally invasive surgery using robotic surgical systems is strong and growing.
  • [0005]
    Laparoscopy is a type of minimally invasive surgery in which a small incision is made in the abdominal wall through which an instrument called a laparoscope is inserted to permit anatomic structures within the abdomen and pelvis to be seen. The abdominal cavity is commonly distended and made visible by the instillation of absorbable gas such as carbon dioxide. Tubes may be pushed through the same or different incisions in the skin so that probes or other instruments can be introduced to a surgical site. In this way, a number of surgical procedures can be performed without the need for a large or open cavity surgical incision.
  • [0006]
    One disadvantage of laparoscopy, however, is the inability to manually palpate hidden or solid organs. Laparoscopic Ultrasound (“LUS”) allows the surgeon to overcome this limitation by providing visualization of deeper structures. In fact, even when open cavity operations are performed, intraoperative ultrasonography may be significantly more sensitive at detecting otherwise occult lesions within anatomic structures than bimanual palpation.
  • [0007]
    As an example, intraoperative ultrasonography of the liver is useful in a variety of clinical settings during laparoscopic surgery. These include: staging and assessment of the liver, including ultrasound-guided needle biopsy, liver tumor ablation, and evaluation of the liver prior to laparoscopic liver resection.
  • [0008]
    For resection procedures, surgeons should have the ability to perform accurate staging of the liver and other sites to rule out metastatic disease prior to resection. The addition of LUS to standard laparoscopy improves the diagnosis of metastases over conventional preoperative diagnostic methods.
  • [0009]
    Ultrasound-directed liver biopsy is an important component of hepatic staging and assessment. When a lesion is identified by ultrasound, needle biopsy is necessary to confirm the findings histologically. Current practice requires manual free-hand LUS in conjunction with free-hand positioning of the biopsy needle under ultrasound guidance.
  • [0010]
    For the treatment of unresectable metastases, increasing interest has been focused on ablative approaches such as radiofrequency (“RF”), cryotherapy, microwave, or chemical ablation. While interstitial ablation can be performed percutaneously or during open surgery, laparoscopic ablation has significant advantages. First, unlike percutaneous therapy, laparoscopy can identify both hepatic and extrahepatic metastases not visualized on preoperative imaging, which misses significant tumors in about 10% to 20% of patients with colorectal liver metastases. Second, laparoscopic or operative ultrasound (“US”) has been shown to be significantly more accurate than transabdominal US, CT, or MR at visualizing liver lesions. Further, operative approaches, including laparoscopy, permit mobilization of structures away from a surface tumor that may be thermally injured during RF ablation. Finally, although both percutaneous and laparoscopic ablation typically require general anesthesia and an overnight hospital stay, laparoscopy does not impose a significantly greater burden on the patient.
  • [0011]
    While ablation promises advantages compared to other approaches, the technical difficulty of manipulating the ultrasound probe, aligning the ultrasound probe with the ablation probe, and placement of the ablation probe demands considerable expertise. The surgeon must precisely place the ablation probe tip within the volumetric center of the tumor in order to achieve adequate destruction of the tumor and a 1 cm zone of surrounding normal parenchyma. Tumors are identified by preoperative imaging, primarily CT and MR, and then laparoscopically localized by LUS.
  • [0012]
    One major limitation of ablative approaches is the lack of accuracy in probe tip placement within the center of the tumor. This is particularly important, as histologic margins cannot be assessed after ablation as is done with hepatic resection. In addition, manual guidance often requires multiple passes and repositioning of the probe tip, further increasing the risk of bleeding and tumor dissemination. Intraoperative ultrasound provides excellent visualization of tumors and provides guidance for RF probe placement, but its 2D-nature and dependence on the sonographer's skill limit its effectiveness.
  • [0013]
    Although laparoscopic instrumentation and techniques are beginning to be extended to resection of the liver, loss of the surgeon's tactile sense makes it difficult to assess the safe margins of resection necessary for safe parenchymal transection. Lack of clear visualization and mapping of intrahepatic structures with current LUS techniques could result in catastrophic injury to major adjacent structures. The surgeon must carefully examine the liver by ultrasound prior to resection in order to rule out additional tumors which may preclude curative therapy. Surgeons also require ultrasound to determine and plan safe and complete resection with sufficient surgical margin clearance.
  • [0014]
    Despite its theoretical advantages, intraoperative LUS is not widely practiced for such uses as laparoscopic liver cancer surgery. To expand usage in this and other applications, advances in LUS robotic surgical systems that improve surgeon efficiency in performing minimally invasive surgical procedures, as well as the ease of using those systems, are desirable.
  • [0015]
    For example, optimization of LUS for hepatic surgery may significantly improve the clinical management of patients. In addition to minimizing morbidity and discomfort, an improved LUS robotic surgical system may significantly reduce costs. Faster, more accurate, and more complete assessment of the liver may be performed by experts, as well as potentially by surgeons who are not experts in intraoperative ultrasonography of the liver.
  • [0016]
    Image-guided biopsy of sometimes small and inaccessible liver lesions may be facilitated. Advanced LUS robotic tools could increase the use of resection as a definitive treatment for larger and less favorably placed tumors. Improved real-time guidance for planning, delivery and monitoring of ablative therapy may also provide the missing tool needed to allow accurate and effective application of this promising therapy.
  • OBJECTS AND SUMMARY OF THE INVENTION
  • [0017]
    Accordingly, one object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that are easy to use and promote surgeon efficiency.
  • [0018]
    Another object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that provide faster, more accurate and complete assessment of anatomic structures.
  • [0019]
    Another object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that provide robotically generated intra-operative 3D ultrasound images of an anatomic structure using surgeon trained trajectories.
  • [0020]
    Another object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that provide flexible display of ultrasound images on a display screen.
  • [0021]
    Still another object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that provide assistance in guiding a tool to a target on an anatomic structure.
  • [0022]
    These and additional objects are accomplished by the various aspects of the present invention, wherein briefly stated, one aspect is a laparoscopic ultrasound robotic surgical system comprising: a first robotic arm mechanically coupled to an ultrasound probe; a second robotic arm mechanically coupled to a surgery related device; a master manipulator; a control switch having user selectable first and second modes; and a processor configured to cause the second robotic arm to be locked in position and the first robotic arm to move the ultrasound probe according to user manipulation of the master manipulator when the control switch is in the first mode, and to cause the second robotic arm to manipulate the surgery related device according to manipulation of the master manipulator and the first robotic arm to move the ultrasound probe according to stored instructions upon detection of a user command associated with the stored instructions when the control switch is in the second mode.
  • [0023]
    Another aspect is a method for providing robotic assisted laparoscopic ultrasound, comprising: storing a current ultrasound probe position and orientation upon detection of a start of training indication; and periodically storing ultrasound probe positions and orientations to define a trajectory of positions and orientations until detection of an end of training indication.
  • [0024]
    Another aspect is a method for providing robotic assisted laparoscopic ultrasound, comprising: capturing an ultrasound image using an ultrasound probe disposed at a position and orientation; storing information of the position and orientation; generating a clickable thumbnail of the ultrasound image; associating the stored position and orientation with the clickable thumbnail; and displaying the clickable thumbnail on a display screen.
  • [0025]
    Still another aspect is a method for providing robotic assisted laparoscopic ultrasound, comprising: displaying an ultrasound view of an anatomic structure in a patient as a registered overlay to a camera view of the anatomic structure; receiving information of a target marked on the ultrasound view; determining a path for a tool to travel to the target within the patient; and generating a virtual fixture to assist in electronically constraining the tool to travel over the determined path.
  • [0026]
    Additional objects, features and advantages of the various aspects of the present invention will become apparent from the following description of its preferred embodiment, which description should be taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0027]
    FIG. 1 illustrates a top view of an operating room employing a laparoscopic ultrasound robotic surgical system utilizing aspects of the present invention.
  • [0028]
    FIG. 2 illustrates a block diagram of a laparoscopic ultrasound robotic surgical system utilizing aspects of the present invention.
  • [0029]
    FIG. 3 illustrates a laparoscopic ultrasound probe utilizing aspects of the present invention.
  • [0030]
    FIG. 4 illustrates a flow diagram of a method for training a LUS robotic surgical system to robotically move a LUS probe in a trained manner upon command, utilizing aspects of the present invention.
  • [0031]
    FIG. 5 illustrates a flow diagram of a method for generating a clickable thumbnail image that allows a user to command that a LUS probe be automatically moved to a position and orientation from which the image was captured, utilizing aspects of the present invention.
  • [0032]
    FIG. 6 illustrates a flow diagram of a method for automatically moving a LUS probe to a position and orientation associated with a clickable thumbnail image, utilizing aspects of the present invention.
  • [0033]
    FIG. 7 illustrates a flow diagram of a method for robotically assisted needle guidance to a marked lesion of a cancerous structure, utilizing aspects of the present invention.
  • [0034]
    FIG. 8 illustrates a perspective view of a 3D ultrasound image of an anatomic structure in a camera reference frame with selectable 2D image slices as used in a medical robotic system utilizing aspects of the present invention.
  • [0035]
    FIG. 9 illustrates a perspective view of a 3D camera view of an anatomic structure in a camera reference frame as used in a medical robotic system utilizing aspects of the present invention.
  • [0036]
    FIG. 10 illustrates a perspective view of a frontal 2D slice of a 3D ultrasound view of an anatomic structure that overlays a 3D camera view of the anatomic structure, as displayable in a medical robotic system utilizing aspects of the present invention.
  • [0037]
    FIG. 11 illustrates a perspective view of an inner 2D slice of a 3D ultrasound view of an anatomic structure that overlays a 3D camera view of the anatomic structure, as displayable in a medical robotic system utilizing aspects of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • [0038]
    FIG. 1 illustrates, as an example, a top view of an operating room employing a robotic surgical system. The robotic surgical system in this case is a Laparoscopic Ultrasound Robotic Surgical System 100 including a Console (“C”) utilized by a Surgeon (“S”) while performing a minimally invasive diagnostic or surgical procedure with assistance from one or more Assistants (“A”) on a Patient (“P”) who is reclining on an Operating table (“O”).
  • [0039]
    The Console includes a Master Display 104 (also referred to herein as a “Display Screen”) for displaying one or more images of a surgical site within the Patient as well as perhaps other information to the Surgeon. Also included are Master Input Devices 107 and 108 (also referred to herein as “Master Manipulators”), one or more Foot Pedals 105 and 106, a Microphone 103 for receiving voice commands from the Surgeon, and a Processor 102. The Master Input Devices 107 and 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or the like. The Processor 102 is preferably a personal computer that may be integrated into the Console or otherwise connected to it in a conventional manner.
  • [0040]
    The Surgeon performs a minimally invasive surgical procedure by manipulating the Master Input Devices 107 and 108 so that the Processor 102 causes their respectively associated Slave Arms 121 and 122 (also referred to herein as “Slave Manipulators”) to manipulate their respective removably coupled and held Surgical Instruments 138 and 139 (also referred to herein as “Tools”) accordingly, while the Surgeon views three-dimensional (“3D”) images of the surgical site on the Master Display 104.
  • [0041]
    The Tools 138 and 139 are preferably Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that when added to the motions of the robot arm holding the tool, they allow a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery. Additional details on such tools may be found in commonly owned U.S. Pat. No. 5,797,900 entitled “Wrist Mechanism for Surgical Instrument for Performing Minimally Invasive Surgery with Enhanced Dexterity and Sensitivity,” which is incorporated herein by this reference. At the operating end of each of the Tools 138 and 139 is a manipulatable end effector such as a clamp, grasper, scissor, stapler, blade, needle, or needle holder.
  • [0042]
    The Master Display 104 has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”). The system offers higher fidelity than polarization, shutter eyeglass, or other techniques. Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The Surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the Surgeon to display and manipulate 3-D intraoperative imagery.
  • [0043]
    A Stereoscopic Endoscope 140 (also referred to as a “Laparoscope”) provides right and left camera views to the Processor 102 so that it may process the information according to programmed instructions and cause it to be displayed on the Master Display 104. A Laparoscopic Ultrasound (“LUS”) Probe 150 provides two-dimensional (“2D”) ultrasound image slices of an anatomic structure to the Processor 102 so that the Processor 102 may generate a 3D ultrasound computer model of the anatomic structure and cause the 3D computer model (or alternatively, 2D “cuts” of it) to be displayed on the Master Display 104 as an overlay to the endoscope derived 3D images or within a Picture-in-Picture (“PIP”) in either 2D or 3D and from various angles and/or perspectives according to Surgeon or stored program instructions.
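For illustration only (the patent does not disclose the reconstruction algorithm), the sketch below shows one common way such a 3D volume could be compounded from tracked 2D slices, assuming each slice arrives with a calibrated 4x4 pose mapping image coordinates into the camera frame; the function name, pixel spacing, and arguments are hypothetical.

```python
import numpy as np

def compound_slices(slices, poses, pixel_mm, voxel_mm, vol_shape):
    """Pixel-nearest-neighbor compounding of tracked 2D ultrasound slices
    into a voxel volume (one standard scheme, not from the patent).
    slices: list of (h, w) grayscale images from the LUS probe
    poses:  list of 4x4 transforms mapping image coords (mm) to camera frame
    """
    vol = np.zeros(vol_shape, dtype=np.float32)
    for img, T in zip(slices, poses):
        h, w = img.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        # Image-plane points in mm, homogeneous, with z = 0 in the slice plane
        pts = np.stack([u.ravel() * pixel_mm, v.ravel() * pixel_mm,
                        np.zeros(u.size), np.ones(u.size)])
        idx = np.round((T @ pts)[:3] / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
        vol[tuple(idx[:, ok])] = img.ravel()[ok]  # nearest-voxel assignment
    return vol
```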
  • [0044]
    Each of the Tools 138 and 139, as well as the Endoscope 140 and LUS Probe 150, is preferably inserted through a cannula or trocar (not shown) or other tool guide into the Patient so as to extend down to the surgical site through a corresponding minimally invasive incision such as Incision 166. Each of the Slave Arms 121-124 is conventionally formed of linkages which are coupled together and manipulated through motor controlled joints (also referred to as “active joints”). Setup Arms (not shown) comprising linkages and setup joints are used to position the Slave Arms 121-124 vertically and horizontally so that their respective surgical related instruments may be coupled for insertion into the cannulae.
  • [0045]
    The number of surgical tools used at one time and consequently, the number of slave arms being used in the System 100 will generally depend on the diagnostic or surgical procedure and the space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, the Assistant may remove the tool no longer being used from its slave arm, and replace it with another tool, such as Tool 131, from a Tray (“T”) in the Operating Room.
  • [0046]
    Preferably, the Master Display 104 is positioned near the Surgeon's hands so that it will display a projected image that is oriented so that the Surgeon feels that he or she is actually looking directly down onto the surgical site. To that end, the images of the Tools 138 and 139 preferably appear to be located substantially where the Surgeon's hands are located, even though the observation points (i.e., those of the Endoscope 140 and LUS Probe 150) may not be from the point of view of the image.
  • [0047]
    In addition, the real-time image is preferably projected into a perspective image such that the Surgeon can manipulate the end effector of a Tool, 138 or 139, through its associated Master Input Device, 107 or 108, as if viewing the workspace in substantially true presence. By true presence, it is meant that the presentation of an image is a true perspective image simulating the viewpoint of an operator who is physically manipulating the Tools. Thus, the Processor 102 transforms the coordinates of the Tools to a perceived position so that the perspective image is the image that one would see if the Endoscope 140 were looking directly at the Tools from a Surgeon's eye-level during an open cavity procedure.
  • [0048]
    The Processor 102 performs various functions in the System 100. One important function that it performs is to translate and transfer the mechanical motion of Master Input Devices 107 and 108 to their associated Slave Arms 121 and 122 through control signals over Bus 110 so that the Surgeon can effectively manipulate their respective Tools 138 and 139. Another important function is to implement the various methods described herein providing a robotic assisted LUS capability.
  • [0049]
    Although described as a processor, it is to be appreciated that the Processor 102 may be implemented in practice by any combination of hardware, software and firmware. Also, its functions as described herein may be performed by one unit, or divided up among different components, each of which may be implemented in turn by any combination of hardware, software and firmware.
  • [0050]
    Prior to performing a minimally invasive surgical procedure, ultrasound images captured by the LUS Probe 150, right and left 2D camera images captured by the stereoscopic Endoscope 140, and end effector positions and orientations as determined using kinematics of the Slave Arms 121-124 and their sensed joint positions, are calibrated and registered with each other.
  • [0051]
    In order to associate the ultrasound image with the rest of the surgical environment, both need to be expressed in the same coordinate frame. Typically, the LUS Probe 150 is either labeled with markers and tracked by a tracking device such as the Optotrak® position sensing system manufactured by Northern Digital Inc. of Ontario, Canada, or held by a robot with precise joint encoders. Then the rigid transformation between the ultrasound image and the frame being tracked is determined (which is typically referred to as the ultrasound calibration).
  • [0052]
    For example, using the Optotrak® frame for the ultrasound calibration, the ultrasound image generated by the LUS Probe 150 is calibrated to an Optotrak® rigid body using an AX=XB formulation. “AX=XB” is a rubric for a class of calibration/registration problems commonly encountered in computer vision, surgical navigation, medical imaging, and robotics. The mathematical techniques are well known. See, e.g., E. Boctor, A. Viswanathan, M. Choti, R. Taylor, G. Fichtinger, and G. Hager, “A Novel Closed Form Solution for Ultrasound Calibration,” International Symposium on Biomedical Imaging, Arlington, Va., 2004, pp. 527-530.
  • [0053]
    “A” and “B” in this case, are transformations between poses of the Optotrak® rigid body (A) and the ultrasound image (B). Thus, “X” is the transformation from the ultrasound image to the rigid body.
  • [0054]
    To perform the ultrasound calibration, the LUS Probe 150 may be placed in three known orientations defined by the AX=XB calibration phantom. The ultrasound image frame may then be defined by three fiducials which appear in each of the three poses. The three poses allow three relative transformations based on Optotrak® readings (A) and three relative transformations based on the ultrasound images (B) for the AX=XB registration.
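As an aside for the interested reader (this code is not part of the patent), a compact numpy/scipy sketch of one well-known closed-form AX=XB solution, in the style of Park and Martin, is given below; the function and argument names are hypothetical.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.spatial.transform import Rotation

def solve_ax_xb(As, Bs):
    """Solve AX = XB for X (Park & Martin style closed form).
    As, Bs: matched lists of 4x4 relative motions of the tracked rigid
    body (A) and of the ultrasound image (B); at least two pairs with
    non-parallel rotation axes are required."""
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        alpha = Rotation.from_matrix(A[:3, :3]).as_rotvec()  # log of R_A
        beta = Rotation.from_matrix(B[:3, :3]).as_rotvec()   # log of R_B
        M += np.outer(beta, alpha)
    Rx = np.real(np.linalg.inv(sqrtm(M.T @ M)) @ M.T)        # best-fit rotation
    # Translation: stack (I - R_A) t_X = t_A - R_X t_B, solve least squares
    C = np.vstack([np.eye(3) - A[:3, :3] for A in As])
    d = np.hstack([A[:3, 3] - Rx @ B[:3, 3] for A, B in zip(As, Bs)])
    tX = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tX
    return X
```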
  • [0055]
    Camera calibration is a common procedure in computer vision applications. As an example, in order to determine the intrinsic and extrinsic parameters of the Endoscope 140, a checkerboard phantom with a multi-plane formulation provided by the Caltech Camera Calibration Toolbox may be used. To construct the phantom, Optotrak® markers are added to a typical checkerboard video calibration phantom, and each corner of the checkerboard is digitized using a calibrated Optotrak® pointer. Thus, the corner positions may be reported with respect to the Optotrak®.
  • [0056]
    The calibration may then be performed by placing the phantom in view of the Endoscope 140 in several dozen orientations, and recording both stereo image data and Optotrak® readings of the four checkerboard corners. The images may then be fed into the calibration toolbox, which determines the intrinsic and extrinsic camera parameters, as well as the 3D coordinates of the grid corners in the camera frame. These coordinates may then be used with the Optotrak® readings to perform a point-cloud to point-cloud registration between the Endoscope 140 rigid body and camera frame.
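For completeness (again, not part of the patent disclosure), the point-cloud to point-cloud registration step admits the classic closed-form SVD solution of Arun et al.; a minimal sketch with hypothetical names:

```python
import numpy as np

def register_point_clouds(P, Q):
    """Least-squares rigid registration Q ~ R @ P + t (Arun's SVD method).
    P, Q: (N, 3) arrays of matched points, e.g. checkerboard corners in the
    camera frame (P) and the same corners in the tracker frame (Q)."""
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p0).T @ (Q - q0)                   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = q0 - R @ p0
    return R, t
```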
  • [0057]
    The Processor 102 is configured to use the robot kinematics to report a coordinate frame for the LUS Probe 150 tip relative to the Endoscope 140. However, due to inaccuracies in the setup joint encoders, both of these coordinate frames may be offset from their correct values. Thus, it may be necessary to register the offsets between the real camera frame of the Endoscope 140 and the camera frame calculated from the kinematics, as well as between the real and kinematic LUS Probe 150 frames. With this complete, the kinematics may be used in place of the Optotrak® readings to determine ultrasound image overlay placement.
  • [0058]
    As long as the position of the Endoscope 140 does not change significantly, a constant transformation may be assumed between the kinematic tool tip and the laparoscopic Optotrak® rigid body. Using an AX=XB formulation, the LUS Probe 150 may be moved, for example, to several positions, and the static offset between the tool tip and Optotrak® rigid body registered. Knowing this offset, the Endoscope 140 offset may be calculated directly:
    $$C_{CD} = D_{LusD}\,(C_{LusUrb})^{-1}\,T_{OUrb}\,(T_{OErb})^{-1}\,F_{CErb} \qquad (1)$$
  • [0059]
    where $C_{CD}$ is the camera offset from the real Endoscope 140 (also referred to herein simply as the “camera”) frame to the camera frame calculated from the kinematics, $F_{CErb}$ is the transformation from the camera to the endoscope rigid body, $T_{OUrb}(T_{OErb})^{-1}$ is the transformation from the camera rigid body to the LUS rigid body, $C_{LusUrb}$ is the transformation from the LUS rigid body to the kinematic ultrasound tool tip, and $D_{LusD}$ is the reading from the Processor 102 giving the transformation from the kinematic ultrasound tool tip to a fixed reference point associated with the Slave Arms 121-124.
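Equation (1) is a pure chain of rigid-transform compositions; as an illustrative sketch (not from the patent), it maps directly onto 4x4 homogeneous matrices:

```python
import numpy as np

def camera_offset(D_LusD, C_LusUrb, T_OUrb, T_OErb, F_CErb):
    """Direct numpy rendering of equation (1). Every argument is a 4x4
    homogeneous transform, named exactly as in the surrounding text."""
    inv = np.linalg.inv
    return D_LusD @ inv(C_LusUrb) @ T_OUrb @ inv(T_OErb) @ F_CErb
```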
  • [0060]
    However, the aforedescribed registration should be redone each time the camera is moved, thus making it best suited for pre-operative calibration and registration. For intra-operative use, the registration may be better performed using video tracking of a visual marker on the LUS Probe 150 instead of the Optotrak® readings. Thus, if the camera were moved while using tool tracking, the registration can be corrected on the fly as the tool is tracked. For additional details on tool tracking, see, e.g., commonly owned U.S. patent application Ser. No. 11/130,471 entitled “Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive surgery,” filed May 16, 2005, which is incorporated herein by reference. In addition, or alternatively, manual registration of ultrasound and camera images may be performed using conventional grab, move and rotate actions on a 3D ultrasound computer model of an anatomic structure, so that the computer model is properly registered over a camera model of the anatomic structure in the Master Display 104.
  • [0061]
    Slave Arms 123 and 124 may manipulate the Endoscope 140 and LUS Probe 150 in similar manners as Slave Arms 121 and 122 manipulate Tools 138 and 139. When there are only two master input devices in the system, however, such as Master Input Devices 107 and 108 in the System 100, in order for the Surgeon to manually control movement of either the Endoscope 140 or LUS Probe 150, it may be required to temporarily associate one of the Master Input Devices 107 and 108 with the Endoscope 140 or the LUS Probe 150 that the Surgeon desires manual control over, while its previously associated Tool and Slave Manipulator are locked in position.
  • [0062]
    FIG. 2 illustrates, as an example, a block diagram of the LUS Robotic Surgical System 100. In this system, there are two Master Input Devices 107 and 108. Master Input Device 107 controls movement of either a Tool 138 or a stereoscopic Endoscope 140, depending upon which mode its Control Switch Mechanism 211 is in, and Master Input Device 108 controls movement of either a Tool 139 or a LUS Probe 150, depending upon which mode its Control Switch Mechanism 231 is in.
  • [0063]
    The Control Switch Mechanisms 211 and 231 may be placed in either a first or second mode by a Surgeon using voice commands, switches physically placed on or near the Master Input Devices 107 and 108, Foot Pedals 105 and 106 on the Console, or Surgeon selection of appropriate icons or other graphical user interface selection means displayed on the Master Display 104 or an auxiliary display (not shown).
  • [0064]
    When Control Switch Mechanism 211 is placed in the first mode, it causes Master Controller 202 to communicate with Slave Controller 203 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Tool 138 by Slave Arm 121, while the Endoscope 140 is locked in position. On the other hand, when Control Switch Mechanism 211 is placed in the second mode, it causes Master Controller 202 to communicate with Slave Controller 233 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Endoscope 140 by Slave Arm 123, while the Tool 138 is locked in position.
  • [0065]
    Similarly, when Control Switch Mechanism 231 is placed in the first mode, it causes Master Controller 222 to communicate with Slave Controller 223 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of Tool 139 by Slave Arm 122. In this case, however, the LUS Probe 150 is not necessarily locked in position. Its movement may be guided by an Auxiliary Controller 242 according to stored instructions in Memory 240. The Auxiliary Controller 242 also provides haptic feedback to the Surgeon through Master Input 108 that reflects readings of a LUS Probe Force Sensor 247. On the other hand, when Control Switch Mechanism 231 is placed in the second mode, it causes Master Controller 222 to communicate with Slave Controller 243 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of LUS Probe 150 by Slave Arm 124, while the Tool 139 is locked in position.
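The first/second mode routing described in the two preceding paragraphs amounts to a small dispatch switch. The sketch below is a hypothetical rendering; the `tool_arm` and `probe_arm` interfaces are invented for illustration and are not the patent's API:

```python
from enum import Enum

class Mode(Enum):
    TOOL = 1    # first or "normal" mode
    PROBE = 2   # second mode

def route_master_motion(mode, delta_pose, tool_arm, probe_arm):
    """Route a master-manipulator increment to the slave arm selected by
    the control switch mode, locking the arm that is not being driven."""
    if mode is Mode.TOOL:
        probe_arm.lock()                       # probe holds (or runs stored plan)
        tool_arm.move_incremental(delta_pose)
    else:
        tool_arm.lock()                        # tool locked in position
        probe_arm.move_incremental(delta_pose)
```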
  • [0066]
    Before switching back to the first or normal mode, the Master Input Device 107 or 108 is preferably repositioned to where it was before the switch to the second mode of Control Switch 211 or 231, as the case may be, or the kinematic relationship between the Master Input Device 107 or 108 and its respective Tool Slave Arm 121 or 122 is readjusted so that, upon switching back to the first or normal mode, abrupt movement of the Tool 138 or 139 does not occur. For additional details on control switching, see, e.g., commonly owned U.S. Pat. No. 6,659,939 “Cooperative Minimally Invasive Telesurgical System,” which is incorporated herein by this reference.
  • [0067]
    The Auxiliary Controller 242 also performs other functions related to the LUS Probe 150 and the Endoscope 140. It receives output from a LUS Probe Force Sensor 247, which senses forces being exerted against the LUS Probe 150, and feeds the force information back to the Master Input Device 108 through the Master Controller 222 so that the Surgeon may feel those forces even if he or she is not directly controlling movement of the LUS Probe 150 at the time. Thus, potential injury to the Patient is minimized since the Surgeon has the capability to immediately stop any movement of the LUS Probe 150 as well as the capability to take over manual control of its movement.
  • [0068]
    Another key function of the Auxiliary Controller 242 is to cause processed information from the Endoscope 140 and the LUS Probe 150 to be displayed on the Master Display 104 according to user selected display options. As will be described in more detail below, such processing includes generating a 3D ultrasound image from 2D ultrasound image slices received from the LUS Probe 150 through an Ultrasound Processor 246, causing either 3D or 2D ultrasound images corresponding to a selected position and orientation to be displayed in a picture-in-picture window of the Master Display 104, and causing either 3D or 2D ultrasound images of an anatomic structure to overlay a camera captured image of the anatomic structure being displayed on the Master Display 104.
  • [0069]
    Although shown as separate entities, the Master Controllers 202 and 222, Slave Controllers 203, 233, 223, and 243, and Auxiliary Controller 242 are preferably implemented as software modules executed by the Processor 102, as are certain mode switching aspects of the Control Switch Mechanisms 211 and 231. The Ultrasound Processor 246 and Video Processor 236, on the other hand, are separate boards or cards typically provided by the manufacturers of the LUS Probe 150 and Endoscope 140 that are inserted into appropriate slots coupled to or otherwise integrated with the Processor 102 to convert signals received from these image capturing devices into signals suitable for display on the Master Display 104 and/or for additional processing by the Auxiliary Controller 242 before being displayed on the Master Display 104.
  • [0070]
    FIG. 3 illustrates a side view of one embodiment of the LUS Probe 150. The LUS Probe 150 is a dexterous tool with preferably two distal degrees of freedom, permitting reorientation of LUS Sensor 301 through, for example, approximately ±80° in distal “pitch” and “yaw” about a ball-joint type pitch-yaw mechanism 311 (functioning as, and also referred to herein as, a “Wrist” mechanism), and ±240° in “roll”. Opposing pairs of Drive Rods or Cables (not shown) physically connected to a proximal end of the LUS Sensor 301 and extending through an internal passage of Elongated Shaft 312 mechanically control pitch and yaw movement of the LUS Sensor 301 using conventional push-pull type action. This flexibility of the LUS Probe 150 (provided by the pitch/yaw wrist mechanism) is especially useful in optimally orienting the LUS Probe 150 for performing ultrasonography on an anatomic structure during a minimally invasive surgical procedure.
  • [0071]
    The LUS Sensor 301 captures 2D ultrasound slices of a proximate anatomic structure, and transmits the information back to the Processor 102 through LUS Cable 304. Although shown as running outside of the Elongated Shaft 312, the LUS Cable 304 may also extend within it. A Clamshell Sheath 321 encloses the Elongated Shaft 312 and LUS Cable 304 to provide a good seal as they pass through a Cannula 331 (or trocar). Fiducial Marks 302 and 322 are placed on the LUS Sensor 301 and the Sheath 321 for video tracking purposes.
  • [0072]
    A force sensing capability is provided by Strain Gauges 303, which provide direct feedback of how hard the LUS Probe 150 is pushing on a structure being sonographed, supplementing whatever limited feedback is available from joint motor torques. Potential uses of this information include: providing a redundant safety threshold check warning the Surgeon or preventing motion into the structure if forces get too great; providing the Surgeon with an improved haptic appreciation of how hard he or she is pushing on a structure; and possibly permitting some measure of compensation for unmodeled deflections of the Pitch-Yaw or “Wrist” Mechanism 311 which are not detected for some reason by joint position sensors or encoders. The Strain Gauges 303 in this case serve the function of the LUS Probe Force Sensor 247 as previously described in reference to FIG. 2.
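As a toy illustration of the redundant safety threshold check mentioned above (the thresholds and names are invented, not from the patent):

```python
def check_probe_force(force_newtons, warn_at=5.0, inhibit_at=10.0):
    """Hypothetical redundant safety check on the strain-gauge reading."""
    if force_newtons >= inhibit_at:
        return "inhibit-motion"   # prevent further motion into the structure
    if force_newtons >= warn_at:
        return "warn-surgeon"
    return "ok"
```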
  • [0073]
    Robotic assisted LUS has the potential to reduce variability in the ultrasound images produced, compared to freehand scanning, and can reduce operator workload and difficulty. Behaviors as simple as rocking the LUS Probe 150 back and forth can maintain an updated 3D ultrasound image without operator intervention. More complicated behaviors can include movement of the LUS Probe 150 along the surface of a target anatomical structure in a methodical pattern to generate a full image of the target, or reliably returning to a previously scanned probe location and orientation.
  • [0074]
    FIG. 4 illustrates, as an example, a flow diagram of a method for training the Auxiliary Controller 242 (i.e., providing it with stored instructions) to cause the LUS Probe 150 to be robotically moved in the trained manner upon command, in order to capture a sequence of 2D ultrasound image slices of an anatomic structure, which are used by the Auxiliary Controller 242 to generate a 3D computer model of the structure. Prior to performing the training, the Control Switch Mechanism 231 is placed in its second mode so that the Surgeon may move the LUS Probe 150 for training purposes by manipulating the Master Input Device 108. After performing training, the Control Switch Mechanism 231 is then placed back into its first or normal mode so that the Surgeon may manipulate the Tool 139 to perform a minimally invasive surgical procedure using the Master Input Device 108.
  • [0075]
    In 401, the training module is initially idle (i.e., it is not being executed by the Processor 102). In 402, the Processor 102 (or a training module agent running in the background) may periodically check whether a start of training indication is received. Alternatively, the start of training indication may act as an interrupt which initiates running of the training module. The start of training indication may be initiated by a Surgeon through a recognized voice command, selection of a training option on a graphical user interface displayed on the Master Display 104, a switch mechanism that may physically be located on the corresponding Master Control Input 108 or other convenient location accessible to the Surgeon, or any other conventional means.
  • [0076]
    After the start of training indication is detected, in 403, the training module records or stores the current LUS Probe 150 position and orientation, and periodically (or upon Surgeon command) continues to do so by looping around 403 and 404 until a stop training indication is detected or received. The stop training indication in this case may also be initiated by the Surgeon in the same manner as the start of training indication, or it may be initiated in a different, but other conventional manner. After the stop training indication is detected or received, a last position and orientation of the LUS Probe 150 is recorded or stored.
  • [0077]
    Between the start and stop of training, the Surgeon moves the LUS Probe 150 and the Processor 102 stores its trajectory of points and orientations so that they may be retraced later upon command. In one type of training, the Surgeon moves the LUS Probe 150 back and forth near an anatomic structure in order to capture a sequence of 2D ultrasound image slices from which a 3D version (or computer model) of the anatomic structure may be rendered by the Processor 102. In another type of training, the Surgeon moves the LUS Probe 150 one or more times along the surface of the anatomic structure in order to capture a different sequence of 2D ultrasound image slices from which a 3D version (or computer model) of the anatomic structure may be rendered by the Processor 102.
  • [0078]
    Although described as recording the positions and orientations of the LUS Probe 150, in practice, the active joint positions of its Slave Arm 124 are stored instead since their measurements are directly obtainable through encoders attached to each of the joints and their positions correspond to the LUS Probe 150 positions and orientations.
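A minimal sketch of this record-and-replay loop is given below; the robot-interface callbacks (`read_joint_positions`, `command_joint_positions`) and the sampling period are hypothetical stand-ins for the actual Slave Arm 124 interface:

```python
import time

class TrajectoryRecorder:
    """Sketch of the training loop of FIG. 4: sample slave-arm joint
    positions at a fixed period between start and stop indications."""
    def __init__(self, read_joint_positions, period_s=0.1):
        self.read = read_joint_positions
        self.period = period_s
        self.trajectory = []

    def record(self, stop_requested):
        self.trajectory.clear()
        while not stop_requested():           # loop 403/404 of FIG. 4
            self.trajectory.append(self.read())
            time.sleep(self.period)
        self.trajectory.append(self.read())   # final pose at the stop indication

    def replay(self, command_joint_positions):
        for q in self.trajectory:             # retrace the stored trajectory
            command_joint_positions(q)
            time.sleep(self.period)
```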
  • [0079]
    After storing the trajectory of positions and orientations of the LUS Probe 150 in the Memory 240, the trajectory is then associated with a means for the Surgeon to command the Auxiliary Controller 242 to move the LUS Probe 150 in the desired fashion. For example, the trajectory may be associated with a voice command which upon its detection, the Auxiliary Controller 242 causes the Slave Arm 124 to move the LUS Probe 150 back and forth along the stored trajectory of positions and orientations. Likewise, the trajectory may also be associated with a user selectable option on a graphical user interface displayed on the Master Display 104, or it may be associated with a switch mechanism such as a button or unused control element on the Master Input Device 108. It may also be associated with the depression of the Foot Pedal 106, so that the Auxiliary Controller 242 causes the Slave Arm 124 to move the LUS Probe 150 back and forth along the stored trajectory of positions and orientations as long as the Foot Pedal 106 is being depressed, and stops such motion once the Surgeon takes his or her foot off the Foot Pedal 106.
  • [0080]
    FIG. 5 illustrates, as an example, a flow diagram of a method for generating clickable thumbnail images corresponding to LUS Probe 150 positions and orientations that are stored in Memory 240, so that when the Surgeon clicks on one of the thumbnail images, the Auxiliary Controller 242 causes the Slave Arm 124 to move the LUS Probe 150 to its stored position and orientation. This allows the Surgeon to move the LUS Probe 150 to see different views of an anatomic structure while the Control Switch Mechanism 231 is in its first or normal mode. Thus, the Surgeon can continue to perform a minimally invasive surgical procedure by manipulating Tool 139 using the Master Input Device 108. The method may then be combined with that described in reference to FIG. 4 in order to generate a sequence of 2D ultrasound image slices starting from that position and orientation, from which the Auxiliary Controller 242 may generate a 3D computer model rendition of the anatomic structure.
  • [0081]
    Prior to performing the method, however, the Control Switch Mechanism 231 is placed in its second mode so that the Surgeon may move the LUS Probe 150 into the desired positions and orientations by manipulating the Master Input Device 108. After generating the clickable thumbnail images, the Control Switch Mechanism 231 is then placed back into its first or normal mode so that the Surgeon may manipulate the Tool 139 to perform the minimally invasive surgical procedure using the Master Input Device 108.
  • [0082]
    In 501, the Auxiliary Controller 242 receives a snapshot command from the Surgeon. The snapshot command may be, for example, a voice command, graphical user interface selection, or switch position. In 502, the Auxiliary Controller 242 causes the LUS Probe 150 to capture a 2D ultrasound image slice, and in 503, a thumbnail of the image is generated. The thumbnail in this case may include a simple JPEG or GIF file of the captured image. In 504, the current position and orientation of the LUS Probe 150 is stored in Memory 240 along with information of its association with the thumbnail. In 505, a clickable version of the thumbnail is displayed on the Master Display 104, so that the Surgeon may command the Auxiliary Controller 242 to cause the LUS Probe to be positioned and oriented at the stored position and orientation at any time upon clicking with his or her mouse or other pointing device on the clickable thumbnail. The Surgeon may then move the LUS Probe 150 to other positions and/or orientations, and repeat 501-505 to generate additional thumbnail images.
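The thumbnail mechanism of 501-505 (and its use in FIG. 6 below) is essentially a bookmark table mapping each thumbnail to a stored probe pose; a hypothetical sketch:

```python
from dataclasses import dataclass, field

@dataclass
class ProbeBookmark:
    """One clickable thumbnail: the captured image plus the pose to return to."""
    thumbnail_png: bytes
    joint_positions: list

@dataclass
class BookmarkStore:
    bookmarks: dict = field(default_factory=dict)

    def snapshot(self, name, capture_image, read_joint_positions):
        # 502-504: capture a 2D slice, thumbnail it, store the pose alongside
        self.bookmarks[name] = ProbeBookmark(capture_image(), read_joint_positions())

    def goto(self, name, command_joint_positions):
        # FIG. 6, 601-602: on click, retrieve the stored pose and move there
        command_joint_positions(self.bookmarks[name].joint_positions)
```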
  • [0083]
    FIG. 6 illustrates, as an example, a flow diagram of a method for automatically moving the LUS Probe 150 to a position and orientation associated with a clickable thumbnail upon command to do so by a Surgeon while performing a minimally invasive surgical procedure using Tool 139. In 601, the clicking of a thumbnail generated by the method described in reference to FIG. 5 is detected by, for example, a conventional interrupt handling process. Upon such detection, in 602, the Auxiliary Controller 242 is instructed by, for example, stored instructions corresponding to the interrupt handling process, to retrieve the position and orientation stored in Memory 240 which is associated with the thumbnail. The Auxiliary Controller 242 then causes the LUS Probe 150 to move to that position and orientation by appropriately controlling Slave Arm 124. Thus, the Surgeon is able to move the LUS Probe 150 to a desired position without having to change modes of the Control Switch Mechanism 231 and halt operation of the Tool 139 until the LUS Probe 150 is moved.
  • [0084]
    FIG. 7 illustrates, as an example, a flow diagram of a method for robotically assisted needle guidance and penetration into a marked lesion of a cancerous structure, which demonstrates aspects of the robotic assisted LUS capability described herein. In 701, a selected 2D ultrasound image slice view of a cancerous structure such as a liver is displayed at the proper depth on the Master Display 104 as an overlay to a 3D camera view of the cancerous structure. The selected 2D ultrasound image slice view may be a frontal view or an inner slice view as taken from a previously generated 3D ultrasound computer model of the cancerous structure.
  • [0085]
    As an example clarifying the 701 process, FIG. 8 illustrates a simplified perspective view of a 3D ultrasound computer model 800 of the cancerous structure, which has been generated, for example, using the method described in reference to FIG. 4, and has been translated into the camera reference frame (EX, EY, EZ). FIG. 9, on the other hand, illustrates a simplified perspective view of a 3D camera view 900 of the cancerous structure as taken by the stereoscopic Endoscope 140. If the Surgeon selects a frontal slice 801 of the 3D ultrasound computer model 800 to be viewed as an overlay to the 3D camera view 900, then the overlay will appear as shown in FIG. 10. On the other hand, if the Surgeon selects one of the inner slices 802-804 of the 3D ultrasound computer model 800, such as inner slice 803, to be viewed as an overlay to the 3D camera view 900, then the overlay will appear as shown in FIG. 11 with the 2D ultrasound image slice 803 displayed at the proper depth. To avoid confusion, the portion of the 3D camera view above that depth is made transparent.
  • [0086]
    Alternatively, in 701, the surgeon may manually control movement of the LUS Probe 150 so that 2D ultrasound image slices captured by it appear as emanating in proper perspective and direction from the 3D camera image of the LUS Probe 150 in the Master Display 104. Preferably, the emanated 2D image slices being displayed in the Master Display 104 do not occlude the anatomic structure being probed. This manual approach may be particularly useful to the Surgeon for quickly spotting lesions in the anatomic structure.
  • [0087]
    In 702, the Surgeon marks lesions on the cancerous structure displayed as a result of 701. Each marked lesion is preferably marked using a designated color in order to clearly show that the Surgeon has already identified it, thereby avoiding double counting. The location in the camera reference frame (EX, EY, EZ) of each marked lesion is stored in Memory 240, and in 703, the Processor 102 determines an optimal needle tip path to that location.
  • [0088]
    In 704, the Processor 102 generates a virtual fixture to help guide the needle to the marked lesion. To generate the virtual fixture, local kinematic constraints on the Slave Arm manipulating the needle Tool may be specified by providing a table of constraints of the form:
    $(\vec{x}-\vec{x}_0)^T A_K (\vec{x}-\vec{x}_0) + \vec{b}_K \cdot (\vec{x}-\vec{x}_0) \le c_K$   (2)
  • [0089]
    where $\vec{x}$ represents, in simplified terms, the current 6 DOF kinematic pose of a master arm, or, in more general terms, a parameterization of a Cartesian pose $F$ linearized about some nominal pose $F_0$ so that $(\vec{x}-\vec{x}_0) \sim F_0^{-1} F$. The tables are to be updated periodically based on visual feedback, user interaction, etc.
  • [0090]
    As can be appreciated, equation (2) can be easily checked and enforced.
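    For instance, checking a table of such constraints amounts to evaluating one quadratic form per row; a minimal numpy sketch, assuming each row stores the triple (A_K, b_K, c_K):

        import numpy as np

        def constraint_violations(x, x0, constraint_table):
            """Evaluate each virtual-fixture constraint of equation (2),
            (x - x0)^T A_K (x - x0) + b_K . (x - x0) <= c_K,
            and return the indices (and amounts) of any violated rows."""
            dx = np.asarray(x, dtype=float) - np.asarray(x0, dtype=float)
            violated = []
            for k, (A_K, b_K, c_K) in enumerate(constraint_table):
                value = dx @ A_K @ dx + b_K @ dx
                if value > c_K:
                    violated.append((k, value - c_K))
            return violated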
  • [0091]
    Similarly, a simple table-driven interface for surgeon interaction forces can be implemented approximately as follows:
    $\vec{f} \leftarrow 0$;  $\vec{y} \leftarrow \vec{x} - \vec{x}_0$;   (3)
    for $k = 1$ to $N$ do
      $\varepsilon \leftarrow \vec{y}^T C_K \vec{y} + \vec{d}_K \cdot \vec{y} - e_K$;
      if $\varepsilon > 0$ then { $\vec{g} \leftarrow 2 C_K \vec{y} + \vec{d}_K$;  $\vec{f} \leftarrow \vec{f} + f(\varepsilon)\,\vec{g}/\lVert\vec{g}\rVert$; }
    end for
    output $\vec{f}$ (after limiting & spacing)
  • [0092]
    where $\varepsilon$ corresponds, roughly, to a distance from a surface in state space and the function $f(\varepsilon)$ corresponds to a (non-linear) stiffness.
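    Transcribed into numpy under the same definitions, the loop of (3) might look as follows; the linear gain used for f(ε) is a placeholder for whatever non-linear stiffness is actually chosen.

        import numpy as np

        def interaction_force(x, x0, force_table, stiffness=lambda eps: 10.0 * eps):
            """Table-driven surgeon interaction force per the pseudocode of (3).

            Each row of force_table holds (C_K, d_K, e_K); eps is roughly the
            distance from a surface in state space, and stiffness(eps) maps it
            to a force magnitude (here a simple linear gain, standing in for
            the non-linear f)."""
            y = np.asarray(x, dtype=float) - np.asarray(x0, dtype=float)
            f = np.zeros_like(y)
            for C_K, d_K, e_K in force_table:
                eps = y @ C_K @ y + d_K @ y - e_K
                if eps > 0:
                    g = 2.0 * C_K @ y + d_K            # gradient of the constraint surface
                    f += stiffness(eps) * g / np.linalg.norm(g)
            return f                                    # limit before output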
  • [0093]
    The above formulation suffices to support a variety of virtual chamfers, virtual springs, detents, etc. It is also easily extended to virtual dampers by adding velocity terms.
  • [0094]
    Now, more particularly, in the present case where it is desired to help aim an injection needle at a target in a live ultrasound image, let:
    $\vec{P}_{TROCAR}$ = position where needle enters patient = RCM point for needle insertion arm   (4)
    $R_{NEEDLE} = R_0 R(\vec{\alpha})$ = orientation of needle arm   (5)
    $\vec{\alpha}$ = vector representation for small rotation   (6)
    $F_{LUS} = [R_{LUS}, \vec{P}_{LUS}]$ = pose of LUS sensor   (7)
    $\vec{v}_{TARGET}$ = position of target wrt LUS sensor   (8)
  • [0095]
    Then the basic constraint is that the needle axis (which is assumed for this example to be the $\vec{z}$ axis of the needle driver) should be aimed at the target lesion, which will be given by $F_{LUS}\vec{v}_{TARGET}$. One metric for the aiming direction error will be:
    $\varepsilon_{AIMING}(\vec{\alpha}) = \lVert(R_{NEEDLE}\vec{z}) \times (F_{LUS}\vec{v}_{TARGET} - \vec{P}_{TROCAR})\rVert^2 = \lVert(R(\vec{\alpha})\vec{z}) \times R_0^{-1}(F_{LUS}\vec{v}_{TARGET} - \vec{P}_{TROCAR})\rVert^2$   (9)
  • [0096]
    which can be approximated as a quadratic form in $\vec{\alpha}$ and converted to a virtual fixture using the method described above. Similarly, if the position of the needle tip is $\vec{P}_{TIP}$, the penetration depth beyond the LUS target will be given by:
    $\varepsilon_{BEYOND} = (R_0 R(\vec{\alpha})\vec{z}) \cdot (F_{LUS}\vec{v}_{TARGET} - \vec{P}_{TIP})$   (10)
  • [0097]
    which can easily be transcribed into a virtual detent or barrier preventing over-penetration. Alternatively, a simple spherical attractor virtual fixture can be developed to minimize $\lVert F_{LUS}\vec{v}_{TARGET} - \vec{P}_{TIP}\rVert$.
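    Under the small-rotation assumption implied by (6), R(α) can be approximated by I + [α]× (the skew-symmetric cross-product matrix), which gives a direct numpy transcription of (9) and (10); the argument names below are illustrative, and F_lus_v_target denotes the target position already mapped through the LUS pose:

        import numpy as np

        def skew(a):
            """Cross-product (skew-symmetric) matrix of a 3-vector."""
            ax, ay, az = a
            return np.array([[0.0, -az, ay], [az, 0.0, -ax], [-ay, ax, 0.0]])

        def aiming_error(alpha, R0, F_lus_v_target, P_trocar, z=np.array([0.0, 0.0, 1.0])):
            """Equation (9): squared norm of the cross product between the needle
            axis and the trocar-to-target direction; zero when perfectly aimed."""
            R = np.eye(3) + skew(alpha)               # small-rotation approximation of R(alpha)
            axis = R0 @ R @ z                         # needle axis R_NEEDLE z
            return float(np.linalg.norm(np.cross(axis, F_lus_v_target - P_trocar)) ** 2)

        def penetration_beyond(alpha, R0, F_lus_v_target, P_tip, z=np.array([0.0, 0.0, 1.0])):
            """Equation (10): component of the tip-to-target offset along the
            needle axis; its sign change signals the tip crossing the target depth."""
            axis = R0 @ (np.eye(3) + skew(alpha)) @ z
            return float(axis @ (F_lus_v_target - P_tip))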
  • [0098]
    In 705, the Processor 102 determines the needle tip position as it moves towards the target lesion, and in 706, the Processor 102 determines the distance between the needle tip position and the target lesion. The needle tip position may be determined from the Slave Arm kinematics and/or through visual tracking in the camera image.
  • [0099]
    In 707, the color of the lesion or some other object in the display changes as the needle tip gets closer to the target. For example, the color may start off as blue when the needle tip is still far away from the target, and it may change through the color spectrum so that it becomes red as it nears the target. Alternatively, a bar graph or other visual indicator may be used to give a quick sense of the distance.
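    One plausible implementation of the blue-to-red cue maps distance linearly through hue; the 50 mm "far" threshold below is an arbitrary illustrative choice.

        import colorsys

        def distance_to_rgb(distance_mm, far_mm=50.0):
            """Map needle-to-target distance to a display color: blue at or
            beyond far_mm, sweeping through the spectrum to red at the target."""
            frac = min(max(distance_mm / far_mm, 0.0), 1.0)  # 0 at target, 1 when far
            hue = frac * (240.0 / 360.0)                     # 240 deg = blue, 0 deg = red
            r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
            return int(r * 255), int(g * 255), int(b * 255)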
  • [0100]
    In 708, a determination is made whether the distance has reached a threshold distance (usually specified as some distance close to or even at the surface of the target lesion). If the threshold has not been reached, then the method loops back to 705 and continually repeats 705-708 until the threshold is reached. Once the threshold is reached, in 709, a 90 degree view of the cancerous structure and the approaching needle is shown in a picture-in-picture window of the Master Display 104. The method may then go back to 705 and repeat 705-708 as the needle penetrates the cancerous structure or withdraws back to its start position.
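    Taken together, 705-709 form an ordinary monitoring loop; a minimal sketch, with get_needle_tip, update_color, and show_picture_in_picture as hypothetical stand-ins for the tracking and display layers:

        import numpy as np

        def guide_needle(get_needle_tip, target, update_color,
                         show_picture_in_picture, threshold_mm=5.0):
            """Repeat 705-708 until the tip is within threshold of the target,
            then trigger the 90 degree picture-in-picture view (709)."""
            target = np.asarray(target, dtype=float)
            while True:
                tip = np.asarray(get_needle_tip(), dtype=float)  # 705: kinematics and/or vision
                distance = np.linalg.norm(target - tip)          # 706
                update_color(distance)                           # 707: e.g. distance_to_rgb above
                if distance <= threshold_mm:                     # 708
                    show_picture_in_picture()                    # 709
                    break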
  • [0101]
    Although the various aspects of the present invention have been described with respect to a preferred embodiment, it will be understood that the invention is entitled to full protection within the full scope of the appended claims.

Claims (41)

1. A laparoscopic ultrasound robotic surgical system comprising:
a first robotic arm mechanically coupled to an ultrasound probe;
a second robotic arm mechanically coupled to a surgery related device;
a master manipulator;
a control switch having user selectable first and second modes; and
a processor configured to cause the second robotic arm to be locked in position and the first robotic arm to move the ultrasound probe according to user manipulation of the master manipulator when the control switch is in the first mode, and cause the second robotic arm to manipulate the surgery related device according to user manipulation of the master manipulator and the first robotic arm to move the ultrasound probe according to stored instructions upon detection of a user command associated with the stored instructions when the control switch is in the second mode.
2. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the surgery related device is a surgical tool.
3. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the user command is a voice command.
4. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the user command derives from user selection of an option provided in a graphical user interface.
5. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the user command derives from a selection indicated by a switch position.
6. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the stored instructions instruct the processor to cause the first robotic arm to move the ultrasound probe so that 2D ultrasound image slices captured by the ultrasound probe while moving provide information for the processor to generate a 3D computer model of at least part of an anatomic structure during a minimally invasive surgical procedure.
7. The laparoscopic ultrasound robotic surgical system according to claim 6, wherein the stored instructions instruct the processor to cause the first robotic arm to move the ultrasound probe along a trajectory of stored points.
8. The laparoscopic ultrasound robotic surgical system according to claim 7, wherein the stored instructions instruct the processor to cause the first robotic arm to repetitively move the ultrasound probe back and forth along the trajectory of stored points.
9. The laparoscopic ultrasound robotic surgical system according to claim 7, wherein the trajectory of stored points includes points on a surface of the anatomic structure.
10. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the stored instructions instruct the processor to cause the first robotic arm to move the ultrasound probe so as to be positioned and oriented at a stored position and orientation.
11. The laparoscopic ultrasound robotic surgical system according to claim 10, wherein the stored position and orientation correspond to a clickable thumbnail of an ultrasound image generated by the ultrasound probe at the position and orientation.
12. The laparoscopic ultrasound robotic surgical system according to claim 11, wherein the processor is configured to cause the first robotic arm to move the ultrasound probe to the stored position and orientation upon receiving an indication that a user has clicked the clickable thumbnail.
13. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the surgery related device is a surgical tool, and further comprising:
an endoscope for capturing images of a surgical site in a patient;
wherein the processor is configured to display the captured images in a picture section of a display screen and an ultrasound image captured by the ultrasound probe within a picture-in-picture section of the display screen.
14. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the surgery related device is a surgical tool, and further comprising:
an endoscope for capturing video images;
wherein the processor is configured to spatially register ultrasound images captured by the ultrasound probe with the video images captured by the endoscope.
15. The laparoscopic ultrasound robotic surgical system according to claim 14, wherein the processor is configured to determine positions and orientations of the ultrasound images relative to a frame of reference associated with the endoscope, and display the ultrasound images so as to overlay the video images at the determined positions and orientations on a display screen.
16. The laparoscopic ultrasound robotic surgical system according to claim 15, wherein the endoscope is a stereoscopic endoscope capturing right and left 2D camera views including captured images of the ultrasound probe, the display screen is a 3D display screen, and the processor is configured to: generate a 3D camera image from the captured right and left 2D camera views, and display the 3D camera image on the 3D display screen with 2D ultrasound image slices captured by the ultrasound probe overlaid in the 3D camera image so as to appear as emanating in proper perspective from the ultrasound probe.
17. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the control switch is placed in the first or second mode by a user voice command.
18. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the control switch is placed in the first or second mode by user selection of an option provided in a graphical user interface.
19. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the control switch is placed in the first or second mode by a user selected switch position.
20. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the ultrasound probe includes a force sensor to sense forces exerted against the ultrasound probe and provides information of the sensed forces to the processor, and the processor reflects information of the sensed forces back to the master manipulator so as to be felt by a user while manipulating the master manipulator.
21. The laparoscopic ultrasound robotic surgical system according to claim 1, wherein the ultrasound probe comprises:
an ultrasound sensor;
an elongated shaft; and
a wrist mechanism coupling the ultrasound sensor to the elongated shaft so as to allow pitch and yaw movement of the ultrasound sensor relative to an axis running along a length of the elongated shaft.
22. The laparoscopic ultrasound robotic surgical system according to claim 21, wherein the ultrasound probe further comprises:
a first pair of cables coupled to the ultrasound sensor such that the ultrasound sensor moves in a yaw direction by pulling on only one of the first pair of cables and moves in an opposite yaw direction when pulling on only the other of the first pair of cables.
23. The laparoscopic ultrasound robotic surgical system according to claim 22, wherein the ultrasound probe further comprises:
a second pair of cables coupled to the ultrasound sensor such that the ultrasound sensor moves in a pitch direction by pulling on only one of the second pair of cables and moves in an opposite pitch direction when pulling on only the other of the second pair of cables.
24. A method for providing robotic assisted laparoscopic ultrasound, comprising:
storing a current ultrasound probe position and orientation upon detection of a start of training indication; and
periodically storing ultrasound probe positions and orientations to define a trajectory of positions and orientations until detection of an end of training indication.
25. The method according to claim 24, wherein the start and end of training indications are generated by user voice commands.
26. The method according to claim 24, wherein the start and end of training indications are generated by user selections using a graphical user interface.
27. The method according to claim 24, wherein the start and end of training indications are generated by positions of one or more switch mechanisms.
28. The method according to claim 24, further comprising: associating the trajectory of positions and orientations with a first voice command.
29. The method according to claim 28, further comprising: causing the ultrasound probe to move along the trajectory of positions and orientations upon detection of the first voice command.
30. The method according to claim 28, further comprising: causing the ultrasound probe to move back and forth along the trajectory of positions and orientations upon detection of the first voice command and until detection of a second voice command.
31. The method according to claim 24, further comprising: associating the trajectory of positions and orientations with a clickable thumbnail displayed on a display screen.
32. The method according to claim 31, further comprising: causing the ultrasound probe to move along the trajectory of positions and orientations upon detection of the clickable thumbnail having been clicked upon and until a stop indication is received.
33. The method according to claim 24, further comprising: associating the trajectory of positions and orientations with a first switch position.
34. The method according to claim 33, further comprising: causing the ultrasound probe to move along the trajectory of positions and orientations upon detection of a switch being in the first switch position and until detection of the switch being in a second switch position.
35. A method for providing robotic assisted laparoscopic ultrasound, comprising:
capturing an ultrasound image using an ultrasound probe disposed at a position and orientation;
storing information of the position and orientation;
generating a clickable thumbnail of the ultrasound image;
associating the stored position and orientation with the clickable thumbnail; and
displaying the clickable thumbnail on a display screen.
36. The method according to claim 35, further comprising:
receiving information of a user having clicked on the clickable thumbnail; and
causing the ultrasound probe to be disposed at the position and orientation associated with the clicked upon clickable thumbnail.
37. A method for providing robotic assisted laparoscopic ultrasound, comprising:
displaying an ultrasound view of an anatomic structure in a patient as a registered overlay to a camera view of the anatomic structure;
receiving information of a target marked on the ultrasound view;
determining a path for a tool to travel to the target within the patient; and
generating a virtual fixture to assist in electronically constraining the tool to travel over the determined path.
38. The method according to claim 37, wherein the anatomic structure is cancerous, the target is a lesion marked by a surgeon on the displayed ultrasound view, and the tool is a needle.
39. The method according to claim 37, further comprising:
determining a distance of the tool from the target; and
changing a color of the target marked on the ultrasound view so as to indicate the distance.
40. The method according to claim 37, further comprising:
determining when a tip of the tool reaches a threshold distance from a surface of the anatomic structure; and
displaying, in a picture-in-picture window, a right angle view of the ultrasound image and the tool upon determining that the tip of the tool has reached the threshold distance.
41. The method according to claim 40, wherein the ultrasound image is a 3D image generated from a sequence of 2D ultrasound image slices of the anatomic structure.
Cited By (158)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9232984B2 (en) 1999-04-07 2016-01-12 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US9101397B2 (en) 1999-04-07 2015-08-11 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US20070032720A1 (en) * 2003-06-17 2007-02-08 Onesys Oy Method and system for navigating in real time in three-dimensional medical image model
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US9795446B2 (en) 2005-06-06 2017-10-24 Intuitive Surgical Operations, Inc. Systems and methods for interactive user interfaces for robotic minimally invasive surgical systems
US20070129626A1 (en) * 2005-11-23 2007-06-07 Prakash Mahesh Methods and systems for facilitating surgical procedures
US9526583B2 (en) 2005-12-30 2016-12-27 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber Bragg gratings
US20110224684A1 (en) * 2005-12-30 2011-09-15 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US20110050852A1 (en) * 2005-12-30 2011-03-03 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US9060793B2 (en) 2005-12-30 2015-06-23 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensor using fiber bragg gratings
US20080287963A1 (en) * 2005-12-30 2008-11-20 Rogers Theodore W Methods and apparatus to shape flexible entry guides for minimally invasive surgery
US9241769B2 (en) 2005-12-30 2016-01-26 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US20110224689A1 (en) * 2005-12-30 2011-09-15 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US9066739B2 (en) 2005-12-30 2015-06-30 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US9039685B2 (en) 2005-12-30 2015-05-26 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US9125679B2 (en) 2005-12-30 2015-09-08 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US9101380B2 (en) 2005-12-30 2015-08-11 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber Bragg gratings
US9084624B2 (en) 2005-12-30 2015-07-21 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US8219177B2 (en) * 2006-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US20070238985A1 (en) * 2006-02-16 2007-10-11 Catholic Healthcare West (D/B/A St. Joseph's Hospital And Medical Center) System utilizing radio frequency signals for tracking and improving navigation of slender instruments during insertion in the body
US8010181B2 (en) * 2006-02-16 2011-08-30 Catholic Healthcare West System utilizing radio frequency signals for tracking and improving navigation of slender instruments during insertion in the body
US20080154389A1 (en) * 2006-02-16 2008-06-26 Catholic Healthcare West (D/B/A St. Joseph's Hospital And Medical Center) Method and system for performing invasive medical procedures using a surgical robot
US20070239021A1 (en) * 2006-03-02 2007-10-11 Yutaka Oonuki Ultrasonic diagnostic apparatus and ultrasonic diagnostic processing method
US8870775B2 (en) * 2006-03-02 2014-10-28 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and ultrasonic diagnostic processing method
US20090149867A1 (en) * 2006-06-05 2009-06-11 Daniel Glozman Controlled steering of a flexible needle
US8348861B2 (en) * 2006-06-05 2013-01-08 Technion Research & Development Foundation Ltd. Controlled steering of a flexible needle
US20090036902A1 (en) * 2006-06-06 2009-02-05 Intuitive Surgical, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US8398541B2 (en) 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20130079794A9 (en) * 2006-06-13 2013-03-28 Intuitive Surgical Operations, Inc. Surgical system entry guide
US9345387B2 (en) 2006-06-13 2016-05-24 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US8784435B2 (en) * 2006-06-13 2014-07-22 Intuitive Surgical Operations, Inc. Surgical system entry guide
US9060678B2 (en) 2006-06-13 2015-06-23 Intuitive Surgical Operations, Inc. Minimally invasive surgical system
US20080065105A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Minimally invasive surgical system
US9757149B2 (en) 2006-06-13 2017-09-12 Intuitive Surgical Operations, Inc. Surgical system entry guide
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US9801690B2 (en) 2006-06-29 2017-10-31 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US20080146915A1 (en) * 2006-10-19 2008-06-19 Mcmorrow Gerald Systems and methods for visualizing a cannula trajectory
US9782229B2 (en) 2007-02-16 2017-10-10 Globus Medical, Inc. Surgical robot platform
US20080215181A1 (en) * 2007-02-16 2008-09-04 Catholic Healthcare West (D/B/A St. Joseph's Hospital And Medical Center) Method and system for performing invasive medical procedures using a surgical robot
US8219178B2 (en) * 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US9078685B2 (en) 2007-02-16 2015-07-14 Globus Medical, Inc. Method and system for performing invasive medical procedures using a surgical robot
US9421068B2 (en) * 2007-03-01 2016-08-23 Titan Medical Inc. Methods, systems and devices for three dimensional input and control methods and systems based thereon
US20140142593A1 (en) * 2007-04-16 2014-05-22 Tim Fielding Frame Mapping and Force Feedback Methods, Devices and Systems
US9044257B2 (en) * 2007-04-16 2015-06-02 Tim Fielding Frame mapping and force feedback methods, devices and systems
US7931182B2 (en) 2007-05-16 2011-04-26 The Invention Science Fund I, Llc Steerable surgical stapler
US20080283572A1 (en) * 2007-05-16 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Surgical stapling instrument with chemical sealant
US20080287987A1 (en) * 2007-05-16 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Dispensing system for tissue sealants
US8485411B2 (en) 2007-05-16 2013-07-16 The Invention Science Fund I, Llc Gentle touch surgical stapler
US20080283577A1 (en) * 2007-05-16 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Steerable surgical stapler
US20100294826A1 (en) * 2007-05-16 2010-11-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Gentle touch surgical stapler
US7832611B2 (en) 2007-05-16 2010-11-16 The Invention Science Fund I, Llc Steerable surgical stapler
US7823761B2 (en) 2007-05-16 2010-11-02 The Invention Science Fund I, Llc Maneuverable surgical stapler
US7810691B2 (en) 2007-05-16 2010-10-12 The Invention Science Fund I, Llc Gentle touch surgical stapler
US7798385B2 (en) * 2007-05-16 2010-09-21 The Invention Science Fund I, Llc Surgical stapling instrument with chemical sealant
US7922064B2 (en) 2007-05-16 2011-04-12 The Invention Science Fund I, LLC Surgical fastening device with cutter
US9445809B2 (en) 2007-05-16 2016-09-20 Deep Science, Llc Gentle touch surgical stapler
US7975894B2 (en) 2007-05-16 2011-07-12 The Invention Science Fund I, Llc Sensing surgical fastener
US20100294825A1 (en) * 2007-05-16 2010-11-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Steerable surgical stapler
US20080283574A1 (en) * 2007-05-16 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Maneuverable surgical stapler
US20080283570A1 (en) * 2007-05-16 2008-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Gentle touch surgical stapler
US9333042B2 (en) 2007-06-13 2016-05-10 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9629520B2 (en) 2007-06-13 2017-04-25 Intuitive Surgical Operations, Inc. Method and system for moving an articulated instrument back towards an entry guide while automatically reconfiguring the articulated instrument for retraction into the entry guide
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US20090069804A1 (en) * 2007-09-12 2009-03-12 Jensen Jeffrey L Apparatus for efficient power delivery
US8792963B2 (en) 2007-09-30 2014-07-29 Intuitive Surgical Operations, Inc. Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US20100234996A1 (en) * 2007-10-20 2010-09-16 Kuka Roboter Gmbh Manipulator, Particularly Industrial Robot, Having A Redundant Sensor Arrangement, And Method For The Control Thereof
US8594847B2 (en) * 2007-10-20 2013-11-26 Kuka Laboratories Gmbh Manipulator, particularly industrial robot, having a redundant sensor arrangement, and method for the control thereof
US20090112243A1 (en) * 2007-10-25 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Surgical cutter with dispensing system for tissue sealants
US20090112256A1 (en) * 2007-10-30 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Suturing device with tissue sealant dispenser
US20090143816A1 (en) * 2007-11-30 2009-06-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Grasper with surgical sealant dispenser
US8977342B2 (en) * 2007-12-11 2015-03-10 Siemens Aktiengesellschaft Medical intervention device
US20090149740A1 (en) * 2007-12-11 2009-06-11 Siemens Aktiengesellschaft A medical intervention device
US20100324733A1 (en) * 2007-12-28 2010-12-23 Kuka Roboter Gmbh Robot And Method For Monitoring The Torque On Such A Robot
US8649906B2 (en) * 2007-12-28 2014-02-11 Kuka Laboratories Gmbh Robot and method for monitoring the torque on such a robot
US8998797B2 (en) * 2008-01-29 2015-04-07 Karl Storz Gmbh & Co. Kg Surgical system
US20090192519A1 (en) * 2008-01-29 2009-07-30 Terumo Kabushiki Kaisha Surgical system
WO2009126955A3 (en) * 2008-04-11 2010-02-11 The Regents Of The University Of Michigan Minimal access tool
US9629689B2 (en) 2008-04-11 2017-04-25 Flexdex, Inc. Attachment apparatus for remote access tools
US8668702B2 (en) 2008-04-11 2014-03-11 The Regents Of The University Of Michigan Minimal access tool
US9869339B2 (en) 2008-04-11 2018-01-16 Flexdex, Inc. End-effector jaw closure transmission systems for remote access tools
US9675370B2 (en) 2008-04-11 2017-06-13 The Regents Of The University Of Michigan Minimal access tool
US8663130B2 (en) * 2008-05-28 2014-03-04 Technion Research & Development Foundation Ltd. Ultrasound guided robot for flexible needle steering
US20110112549A1 (en) * 2008-05-28 2011-05-12 Zipi Neubach Ultrasound guided robot for flexible needle steering
US20140142429A1 (en) * 2008-05-28 2014-05-22 Technion Research & Development Foundation Ltd. Ultrasound guided robot for flexible needle steering
US9420995B2 (en) * 2008-05-28 2016-08-23 Technion Research & Development Foundation Ltd. Ultrasound guided robot for flexible needle steering
US9717563B2 (en) 2008-06-27 2017-08-01 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9516996B2 (en) 2008-06-27 2016-12-13 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the position and orienting of its tip
US8892224B2 (en) 2008-09-26 2014-11-18 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20100082039A1 (en) * 2008-09-26 2010-04-01 Intuitive Surgical, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8315720B2 (en) 2008-09-26 2012-11-20 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8583274B2 (en) 2008-09-26 2013-11-12 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
KR100944412B1 (en) 2008-10-13 2010-02-25 (주)미래컴퍼니 Surgical slave robot
US20110234754A1 (en) * 2008-11-24 2011-09-29 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
US8639000B2 (en) 2008-12-31 2014-01-28 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100168918A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Obtaining force information in a minimally invasive surgical procedure
US8184880B2 (en) 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US20100169815A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Visual force feedback in a minimally invasive surgical procedure
US20100166323A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Robust sparse image matching for robotic surgery
US8374723B2 (en) 2008-12-31 2013-02-12 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
US8706301B2 (en) 2008-12-31 2014-04-22 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US8594841B2 (en) * 2008-12-31 2013-11-26 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US20100217279A1 (en) * 2009-02-20 2010-08-26 Tyco Healthcare Group Lp Marking Articulating Direction For Surgical Instrument
WO2010117684A1 (en) 2009-03-31 2010-10-14 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US9155592B2 (en) * 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
KR100956762B1 (en) * 2009-08-28 2010-05-12 주식회사 래보 Surgical robot system using history information and control method thereof
US8942453B2 (en) * 2009-09-18 2015-01-27 Konica Minolta, Inc. Ultrasonograph and method of diagnosis using same
US20120177276A1 (en) * 2009-09-18 2012-07-12 Manabu Migita Ultrasonograph and method of diagnosis using same
US20120226150A1 (en) * 2009-10-30 2012-09-06 The Johns Hopkins University Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions
US9814392B2 (en) * 2009-10-30 2017-11-14 The Johns Hopkins University Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions
US20130296883A1 (en) * 2009-11-27 2013-11-07 Mcmaster University Automated detection, diagnostic and therapeutic method and system
US9259271B2 (en) 2009-11-27 2016-02-16 Mehran Anvari Automated in-bore MR guided robotic diagnostic and therapeutic system
US9622826B2 (en) 2010-02-12 2017-04-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US20110238080A1 (en) * 2010-03-25 2011-09-29 Date Ranjit Robotic Surgical Instrument System
CN102821671A (en) * 2010-03-31 2012-12-12 Endoscope observation supporting system and method, and device and programme
US20110270443A1 (en) * 2010-04-28 2011-11-03 Kabushiki Kaisha Yaskawa Denki Apparatus and method for detecting contact position of robot
US8798790B2 (en) * 2010-04-28 2014-08-05 Kabushiki Kaisha Yaskawa Denki Apparatus and method for detecting contact position of robot
WO2012037257A2 (en) * 2010-09-14 2012-03-22 The Johns Hopkins University Robotic system to augment endoscopes
WO2012037257A3 (en) * 2010-09-14 2012-06-14 The Johns Hopkins University Robotic system to augment endoscopes
US9068820B2 (en) 2010-10-15 2015-06-30 Scopis Gmbh Method and device for calibrating an optical system, distance determining device, and optical system
US8537201B2 (en) * 2010-10-18 2013-09-17 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
US20120092450A1 (en) * 2010-10-18 2012-04-19 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
CN103200877A (en) * 2010-11-11 2013-07-10 约翰霍普金斯大学 Remote center of motion robot for medical image scanning and image-guided targeting
WO2012065058A3 (en) * 2010-11-11 2012-08-02 The Johns Hopkins University Remote center of motion robot for medical image scanning and image-guided targeting
US9486189B2 (en) 2010-12-02 2016-11-08 Hitachi Aloka Medical, Ltd. Assembly for use with surgery system
WO2013025613A1 (en) * 2011-08-12 2013-02-21 Jointvue, Llc 3-d ultrasound imaging device and methods
US20140212025A1 (en) * 2011-09-13 2014-07-31 Koninklijke Philips Electronics N.V. Automatic online registration between a robot and images
US9387048B2 (en) 2011-10-14 2016-07-12 Intuitive Surgical Operations, Inc. Catheter sensor systems
US9452276B2 (en) 2011-10-14 2016-09-27 Intuitive Surgical Operations, Inc. Catheter with removable vision probe
US9579800B2 (en) * 2012-03-29 2017-02-28 Reis Group Holding Gmbh & Co. Kg Device and method for operating an industrial robot
US20150045951A1 (en) * 2012-03-29 2015-02-12 Reis Group Holding Gmbh & Co. Kg Device and method for operating an industrial robot
US20150073436A1 (en) * 2012-05-18 2015-03-12 Olympus Corporation Operation support device
US9259282B2 (en) 2012-12-10 2016-02-16 Intuitive Surgical Operations, Inc. Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
WO2014093367A1 (en) 2012-12-10 2014-06-19 Intuitive Surgical Operations, Inc. Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
US20140336669A1 (en) * 2013-05-08 2014-11-13 Samsung Electronics Co., Ltd. Haptic gloves and surgical robot systems
WO2015081213A1 (en) * 2013-11-27 2015-06-04 Children's National Medical Center 3d corrected imaging
EP3073894A4 (en) * 2013-11-27 2017-08-30 Children's National Medical Center 3d corrected imaging
US20150145966A1 (en) * 2013-11-27 2015-05-28 Children's National Medical Center 3d corrected imaging
CN103869189A (en) * 2014-03-14 2014-06-18 南京东恒通信科技有限公司 Passive device debugging system
WO2015161297A1 (en) * 2014-04-17 2015-10-22 The Johns Hopkins University Robot assisted ultrasound system
US20150335315A1 (en) * 2014-05-15 2015-11-26 Samsung Medison Co., Ltd. Ultrasonic diagnosis device and method of diagnosing by using the same
WO2015175278A1 (en) * 2014-05-15 2015-11-19 Covidien Lp Systems and methods for controlling a camera position in a surgical robotic system
GB2550512A (en) * 2015-01-29 2017-11-22 Synaptive Medical (Barbados) Inc Physiological phantoms incorporating feedback sensors and sensing materials
WO2016119039A1 (en) * 2015-01-29 2016-08-04 Synaptive Medical (Barbados) Inc. Physiological phantoms incorporating feedback sensors and sensing materials
WO2016133633A1 (en) * 2015-02-19 2016-08-25 Covidien Lp Repositioning method of input device for robotic surgical system
US9883914B2 (en) 2015-07-30 2018-02-06 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US9814451B2 (en) 2015-10-02 2017-11-14 Flexdex, Inc. Handle mechanism providing unlimited roll
WO2017098506A1 (en) * 2015-12-07 2017-06-15 M.S.T. Medical Surgery Technologies Ltd. Autonomic goals-based training and assessment system for laparoscopic surgery

Also Published As

Publication number Publication date Type
JP2012055717A (en) 2012-03-22 application
JP4999012B2 (en) 2012-08-15 grant
EP2289454A2 (en) 2011-03-02 application
JP5455173B2 (en) 2014-03-26 grant
JP2012050888A (en) 2012-03-15 application
JP6138227B2 (en) 2017-05-31 grant
KR20080027224A (en) 2008-03-26 application
US20170128041A1 (en) 2017-05-11 application
JP2012050887A (en) 2012-03-15 application
JP2008541990A (en) 2008-11-27 application
EP1887961A1 (en) 2008-02-20 application
US20170128145A1 (en) 2017-05-11 application
KR101258912B1 (en) 2013-04-30 grant
EP2289453A2 (en) 2011-03-02 application
EP2289453A3 (en) 2014-03-19 application
CN101193603B (en) 2010-11-03 grant
EP2289453B1 (en) 2015-08-05 grant
EP2289452A3 (en) 2015-12-30 application
JP2014138901A (en) 2014-07-31 application
JP2013252452A (en) 2013-12-19 application
US20170128144A1 (en) 2017-05-11 application
WO2007030173A1 (en) 2007-03-15 application
EP2289454A3 (en) 2015-12-30 application
JP2016041377A (en) 2016-03-31 application
EP2289452A2 (en) 2011-03-02 application
EP1887961B1 (en) 2012-01-11 grant
CN101193603A (en) 2008-06-04 application

Similar Documents

Publication Publication Date Title
Sung et al. Robotic laparoscopic surgery: a comparison of the da Vinci and Zeus systems
US8620473B2 (en) Medical robotic system with coupled control modes
US6097994A (en) Apparatus and method for determining the correct insertion depth for a biopsy needle
US20100317965A1 (en) Virtual measurement tool for minimally invasive surgery
US20110137156A1 (en) Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US20080243142A1 (en) Videotactic and audiotactic assisted surgical methods and procedures
US20100318099A1 (en) Virtual measurement tool for minimally invasive surgery
US7008373B2 (en) System and method for robot targeting under fluoroscopy based on image servoing
US20020007108A1 (en) Anatomical visualization system
US20090248039A1 (en) Sterile Drape Interface for Robotic Surgical Instrument
US20120143029A1 (en) Systems and methods for guiding a medical instrument
US20100022874A1 (en) Image Guided Navigation System and Method Thereof
US20050085717A1 (en) Systems and methods for intraoperative targetting
US6676669B2 (en) Surgical manipulator
US20050085718A1 (en) Systems and methods for intraoperative targetting
US20100249507A1 (en) Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
US20100249506A1 (en) Method and system for assisting an operator in endoscopic navigation
US20090326318A1 (en) Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
Fichtinger et al. Image overlay guidance for needle insertion in CT scanner
US8170716B2 (en) Methods and apparatus for surgical planning
US20080004603A1 (en) Tool position and identification indicator displayed in a boundary area of a computer display screen
US7963288B2 (en) Robotic catheter system
US8560118B2 (en) Methods, devices, and systems for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis
US20070073137A1 (en) Virtual mouse for use in surgical navigation
US8190238B2 (en) Robotic catheter system and methods

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE JOHNS HOPKINS UNIVERSITY, MARYLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAYLOR, RUSSELL H.;CHOTI, MICHAEL;LEVEN, JOSHUA;AND OTHERS;REEL/FRAME:019245/0224;SIGNING DATES FROM 20070227 TO 20070409

Owner name: INTUITIVE SURGICAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAYLOR, RUSSELL H.;CHOTI, MICHAEL;LEVEN, JOSHUA;AND OTHERS;REEL/FRAME:019245/0224;SIGNING DATES FROM 20070227 TO 20070409

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:029804/0385

Effective date: 20100219