US20090192523A1 - Synthetic representation of a surgical instrument - Google Patents


Info

Publication number
US20090192523A1
Authority: US
Grant status: Application
Legal status: Abandoned (the legal status is an assumption by Google Patents, not a legal conclusion)
Prior art keywords: tool, position, end, image, effector
Application number: US12415332
Inventors: David Q. Larkin, Brian D. Hoffman, Paul W. Mohr
Current assignee: Intuitive Surgical Operations Inc (listed assignees may be inaccurate)
Original assignee: Intuitive Surgical Inc

Classifications

    • A61B1/00154 Holding or positioning arrangements using guide tubes
    • A61B1/00165 Optical arrangements with light-conductive means, e.g. fibre optics
    • A61B1/00193 Optical arrangements adapted for stereoscopic vision
    • A61B1/04 Instruments for examining body cavities or tubes by visual or photographical inspection, combined with photographic or television appliances
    • A61B17/00234 Surgical instruments, devices or methods for minimally invasive surgery
    • A61B19/2203
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2017/00314 Separate linked members (flexible, steerable means)
    • A61B2017/00323 Steering mechanisms using cables or rods
    • A61B2034/2059 Tracking techniques using mechanical position encoders
    • A61B2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B2034/305 Details of wrist mechanisms at distal ends of robotic arms
    • A61B2034/306 Wrists with multiple vertebrae
    • A61B2090/371 Surgical systems with images on a monitor during operation, with simultaneous use of two cameras
    • B25J9/1671 Programme controls characterised by simulation, CAD/CAM oriented, graphic oriented programming systems
    • B25J9/1689 Teleoperation
    • B25J9/1692 Calibration of manipulator
    • G05B2219/39449 Pendant, PDA displaying camera images overlayed with graphics, augmented reality
    • G05B2219/40607 Fixed camera to observe workspace, object, workpiece, global
    • G05B2219/45123 Electrogoniometer, neuronavigator, medical robot used by surgeon to operate
    • G05B2219/45169 Medical, rontgen, x ray

Abstract

A synthetic representation of a tool for display on a user interface of a robotic system. The synthetic representation may be used to show force on the tool, an actual position of the tool, or to show the location of the tool when out of a field of view. A three-dimensional pointer is also provided for a viewer in the surgeon console of a telesurgical system.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • [0001]
    This application is a continuation-in-part of U.S. patent application Ser. No. 11/478,531 (filed Jun. 29, 2006), and this application is a continuation-in-part of U.S. patent application Ser. No. 11/762,202 (filed Jun. 13, 2007), both of which are incorporated herein by reference.
  • BACKGROUND
  • [0002]
    Minimally invasive surgeries performed by robotic surgical systems are known and commonly used in clinical procedures where it is advantageous for a human not to perform surgery directly. One example of such a system is the minimally invasive robotic surgery system described in commonly owned U.S. Pat. No. 7,155,315 (filed Dec. 12, 2005). The da Vinci® Surgical Systems manufactured by Intuitive Surgical, Inc. of Sunnyvale, Calif. are illustrative implementations of minimally invasive robotic surgical systems (e.g., teleoperated; telesurgical).
  • [0003]
    A common form of minimally invasive surgery is endoscopy. Endoscopic surgical instruments in minimally invasive medical techniques generally include an endoscope for viewing the surgical field and working tools that include end effectors. Typical surgical end effectors include clamps, graspers, scissors, staplers, or needle holders, as examples. The working tools are similar to those used in conventional (open) surgery, except that the end effector of each tool is supported on the end of, for example, an approximately 12-inch-long extension tube.
  • [0004]
    To manipulate end effectors, a human operator, typically a surgeon, manipulates or otherwise commands a master manipulator. Commands from the master manipulator are translated as appropriate and sent to a slave manipulator. The slave manipulator then manipulates the end effectors according to the operator's commands.
  • [0005]
    Force feedback may be included in minimally invasive robotic surgical systems. To provide such feedback, the remote slave manipulators typically provide force information to the master manipulator, and that force information is utilized to provide force feedback to the surgeon so that the surgeon is given the perception of feeling forces acting on a slave manipulator. In some force feedback implementations, haptic feedback may provide an artificial feel to the surgeon of tissue reactive forces on a working tool and its end effector.
  • [0006]
    Often, the master controls, which are typically located at a surgeon console, will include a feature for releasing control of one of the working tools at the patient site. This feature may be used, for example, in a system where there are more than two working tools (and thus more surgical instruments than surgeon's hands). In such a system, the surgeon may release control of one working tool by one master and then establish control (grab) of another working tool with that master.
  • [0007]
    When reaching to grab another working tool, the master manipulator may provide haptic feel so that a surgeon receives feedback that the tool has been grabbed or released. Such feedback is sometimes referred to as a “haptic detent.” The haptic detent permits the surgeon to recognize when the master manipulator is in the correct location and orientation to grab a tool. An example of a haptic detent is described, for example, in U.S. Pat. App. Pub. No. US 2007/0021738 A1 (filed Jun. 6, 2006). While such haptic detents work well for their intended purpose, the hardware required to provide any haptic feedback to a surgeon's hands can be complicated and expensive.
  • [0008]
    Utilizing more than two working tools can present other issues. For example, when a surgeon releases one working tool and tries to grasp a new working tool, the new working tool may be out of the endoscopic field of view for the surgeon.
  • [0009]
    In general, in telesurgical systems, the surgeon is provided an “internal user interface.” This internal user interface is the screen that can be seen by the surgeon while looking into the viewer of the surgeon console. The items shown on this user interface typically include the field of view that is provided from the endoscope and often other critical information, such as system or tool status information. Special care is taken in the design of this internal user interface to ensure it is as natural as possible so as to not distract the surgeon from the surgery itself. In addition to this user interface, often a second “external” user interface is provided in which another operator may view some features of the telesurgical system and provide some noncritical adjustments, such as endoscopic illumination brightness, for example. In practice, however, the surgeon sometimes has to remove his or her head from the viewer to access and view the information available on the secondary interface, which interrupts the surgical work.
  • BRIEF SUMMARY
  • [0010]
    The following presents a simplified summary of some embodiments of the invention in order to provide a basic understanding of the invention. This summary is not an extensive overview of the invention. It is not intended to identify key/critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some embodiments of the invention in a simplified form as a prelude to the more detailed description that is presented later.
  • [0011]
    In an embodiment, a robotic surgical system is provided. The system includes a tool for performing surgery on a patient; data for providing a synthetic representation of the tool; an image capturing device for capturing a field of view including an image of the tool; a display; a first component coupling the image capturing device to the display so as to display the field of view in the display; and a second component coupling the data to the display so as to display the synthetic representation of the tool, including a graphical representation of an orientation of the tool.
  • [0012]
    A method is also provided, utilizing a tool for performing surgery on a patient; data for providing a synthetic representation of the tool; an image capturing device for capturing a field of view including an image of the tool; a display; a first component coupling the image capturing device to the display so as to display the field of view in the display; and a second component coupling the data to the display so as to display the synthetic representation of the tool, including a graphical representation of an orientation of the tool.
  • [0013]
    In another embodiment, a method is provided for displaying a visual representation of a position of a tool in a robotic system. The method includes displaying a first image comprising a video feed of a tool within a field of view, and superimposing on the first image a second image representing a position of the tool, an orientation of the tool, or both.
  • [0014]
    In yet another embodiment, a robotic system is provided. The system includes a tool for performing surgery on a patient; data for providing a synthetic representation of the tool; an image capturing device for capturing a field of view including an image of the tool; a display; a first component coupling the image capturing device to the display so as to display the field of view in the display; and a second component coupling the data to the display so as to superimpose over the field of view the synthetic representation of the tool including a graphical representation of a position of the tool, an orientation of the tool, or both.
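The superimposition described in this embodiment can be illustrated with a simple alpha blend of a rendered (synthetic) tool image over a camera frame. The sketch below uses grayscale pixel grids and hypothetical names; it shows only the compositing step, not the patent's implementation:

```python
def superimpose(frame, overlay, alpha=0.5):
    """Blend a synthetic tool image over a video frame.

    `frame` and `overlay` are 2-D grayscale pixel grids of equal size;
    `None` in the overlay marks a transparent pixel (no synthetic
    content there). Names and representation are illustrative only.
    """
    result = []
    for frame_row, overlay_row in zip(frame, overlay):
        row = []
        for pixel, synth in zip(frame_row, overlay_row):
            if synth is None:
                row.append(pixel)  # keep the camera pixel unchanged
            else:
                # Weighted blend of camera pixel and synthetic pixel.
                row.append(round((1 - alpha) * pixel + alpha * synth))
        result.append(row)
    return result
```

A higher `alpha` makes the synthetic representation more prominent relative to the endoscopic video beneath it.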
  • [0015]
    In still another embodiment, a robotic surgical system is provided. The system includes a tool for performing surgery on a patient; an image capturing device for capturing a field of view including an image of the tool; a master for inputting a movement; a display for displaying the field of view; the master being selectively operatively connectable to the tool by a first component so as to generate a following movement of the tool in response to the input movement; and the master being selectively operatively connectable to the display by a second component so as to generate a three-dimensional pointing image displayed on the display, and so as to follow movement of the master with the three-dimensional pointing image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0016]
    FIG. 1 shows a top view of an operating room which includes a minimally invasive telesurgical system in accordance with an embodiment;
  • [0017]
    FIG. 2 is a front view of a patient cart for the minimally invasive telesurgical system of FIG. 1;
  • [0018]
    FIG. 3 is a block diagram representing components of the minimally invasive telesurgical system of FIG. 1;
  • [0019]
    FIG. 4 is a block diagram representing components for a computer for use in the minimally invasive telesurgical system of FIG. 1 in accordance with an embodiment;
  • [0020]
    FIG. 5 is a flowchart representing steps for calculating force on an end effector in accordance with an embodiment;
  • [0021]
    FIG. 6 is a diagrammatic representation of movement of an end effector between positions A and B with force F resisting the movement;
  • [0022]
    FIG. 7 is a flowchart representing steps for displaying force in accordance with an embodiment;
  • [0023]
    FIG. 8 is a side perspective view of an end effector and synthetic representation of an end effector showing force in accordance with an embodiment;
  • [0024]
    FIG. 9 is a flowchart representing steps for displaying a synthetic model in accordance with an embodiment;
  • [0025]
    FIG. 10 is a representation of a view through a viewer of a surgeon console, with the view including a field of view and an outer view pane;
  • [0026]
    FIG. 11 is a flowchart showing steps for displaying a synthetic tool at the actual location of a tool in accordance with an embodiment;
  • [0027]
    FIG. 12 is a diagrammatic representation of a display utilizing a pointing device in accordance with an embodiment; and
  • [0028]
    FIG. 13 is a side perspective view of a master controller in accordance with an embodiment.
  • DETAILED DESCRIPTION
  • [0029]
    In the following description, various aspects and embodiments of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details. Furthermore, well-known features may be omitted from this description or simplified in order not to obscure the embodiment being described.
  • [0030]
    Referring now to the drawings, in which like reference numerals represent like parts throughout several views, FIG. 1 shows a minimally invasive telesurgical system 20 having an operator station or surgeon console 30 in accordance with an embodiment. The surgeon console 30 includes a viewer 32 where an image of a surgical site is displayed to a surgeon S. As is known, a support (not shown) is provided on which the surgeon S can rest his or her forearms while gripping two master controls 700 (FIG. 13), one in each hand. More controls may be provided if more end effectors are available, but typically a surgeon manipulates only two controls at a time and, if multiple end effectors are used, the surgeon releases one end effector with a master control 700 and grasps another with the same master control. When using the surgeon console 30, the surgeon S typically sits in a chair in front of the surgeon console, positions his or her eyes in front of the viewer 32, and grips the master controls 700, one in each hand, while resting his or her forearms on the support.
  • [0031]
    A patient side cart 40 of the telesurgical system 20 is positioned adjacent to a patient P. In use, the patient side cart 40 is positioned close to the patient P requiring surgery. The patient side cart 40 typically is stationary during a surgical procedure. The surgeon console 30 is typically positioned remote from the patient side cart 40, and it may be separated from the patient side cart by a great distance—even miles away—but will typically be used within the same operating room as the patient side cart.
  • [0032]
    The patient side cart 40, shown in more detail in FIG. 2, typically includes two or more robotic arm assemblies. In the embodiment shown in FIG. 2, the patient side cart 40 includes four robotic arm assemblies 42, 44, 46, 48, but more or fewer may be provided. Each robotic arm assembly 42, 44, 46, 48 is normally operatively connected to one of the master controls of the surgeon console 30. Thus, movement of the robotic arm assemblies 44, 46, 48 is controlled by manipulation of the master controls.
  • [0033]
    One of the robotic arm assemblies, indicated by the reference numeral 42, is arranged to hold an image capturing device 50, e.g., an endoscope, or the like. The endoscope or image capturing device 50 includes a viewing end 56 at a remote end of an elongate shaft 54. The elongate shaft 54 permits the viewing end 56 to be inserted through a surgery entry port of the patient P. The image capturing device 50 is operatively connected to the viewer 32 of the surgeon console 30 to display an image captured at its viewing end 56.
  • [0034]
    Each of the other robotic arm assemblies 44, 46, 48 includes a surgical instrument or tool 60, 62, 64, respectively. The tools 60, 62, 64 of the robotic arm assemblies 44, 46, 48 include end effectors 66, 68, 70, respectively. The end effectors 66, 68, 70 are mounted on wrist members which are pivotally mounted on distal ends of elongate shafts of the tools, as is known in the art. The tools 60, 62, 64 have elongate shafts to permit the end effectors 66, 68, 70 to be inserted through surgical entry ports of the patient P. Movement of the end effectors 66, 68, 70 relative to the ends of the shafts of the tools 60, 62, 64 is also controlled by the master controls of the surgeon console 30.
  • [0035]
    The depicted telesurgical system 20 includes a vision cart 80. In an embodiment, the vision cart 80 includes most of the computer equipment or other controls for operating the telesurgical system 20. As an example, signals sent by the master controllers of the surgeon console 30 may be sent to the vision cart 80, which in turn may interpret the signals and generate commands for the end effectors 66, 68, 70 and/or robotic arm assemblies 44, 46, 48. In addition, video sent from the image capturing device 50 to the viewer 32 may be processed by, or simply transferred by, the vision cart 80.
  • [0036]
    FIG. 3 is a diagrammatic representation of the telesurgical system 20. As can be seen, the system includes the surgeon console 30, the patient side cart 40, and the vision cart 80. In addition, in accordance with an embodiment, an additional computer 82 and display 84 are provided. These components may be incorporated in one or more of the surgeon console 30, the patient side cart 40, and/or the vision cart 80. For example, the features of the computer 82 may be incorporated into the vision cart 80. In addition, the features of the display 84 may be incorporated into the surgeon console 30, for example, in the viewer 32, or may be provided by a completely separate display at the surgeon console or at another location. In addition, in accordance with an embodiment, the computer 82 may generate information that may be utilized without a display, such as the display 84.
  • [0037]
    Although described as a “computer,” the computer 82 may be a component of a computer system or any other software or hardware that is capable of performing the functions described herein. Moreover, as described above, functions and features of the computer 82 may be distributed over several devices or software components. Thus, the computer 82 shown in the drawings is for the convenience of discussion, and it may be replaced by a controller, or its functions may be provided by one or more components.
  • [0038]
    FIG. 4 shows components of the computer 82 in accordance with an embodiment. In the embodiment shown in the drawing, the computer 82 includes a tool tracking component 90, a kinematic component 92, and a force component 94. Briefly described, the tool tracking component 90 and kinematic component 92 provide information to the force component 94, which in turn outputs a force output 96.
  • [0039]
    A positional component is included in or is otherwise associated with the computer 82. The positional component provides information about a position of an end effector, such as one of the end effectors 66, 68, 70. In the embodiment shown in the drawings, the tool tracking component 90 is the positional component, and it provides information about a position of an end effector, such as the end effectors 66, 68, 70. As used herein, “position” means the location and/or the orientation of the end effector. A variety of different technologies may be used to provide information about a position of an end effector, and such technologies may or may not be considered tool tracking devices. In a simple embodiment, the positional component utilizes video feed 98 from the image capturing device 50 to provide information about the position of an end effector, but other information may be used instead of, or in addition to, this visual information, including sensor information, kinematic information, any combination of these, or additional information that may provide the position and/or orientation of the end effectors 66, 68, 70. Examples of systems that may be used for the tool tracking component 90 are disclosed in U.S. Pat. No. 5,950,629 (filed Apr. 28, 1994), U.S. Pat. No. 6,468,265 (filed Nov. 9, 1999), U.S. Pat. App. Pub. No. US 2006/0258938 A1 (filed May 16, 2005), and U.S. Pat. App. Pub. No. US 2008/0004603 A1 (filed Jun. 29, 2006). In accordance with an embodiment, the tool tracking component 90 utilizes the systems and methods described in commonly owned U.S. Pat. App. No. 61/204,084 (filed Dec. 31, 2008). In general, the positional component maintains information about the actual position and orientation of end effectors. This information is updated depending upon when the information is available, and may be, for example, asynchronous information.
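A minimal sketch of the bookkeeping described above: the positional component keeps the latest actual pose of each end effector and tolerates asynchronous, possibly out-of-order updates. All class, method, and variable names here are hypothetical illustrations, not from the source:

```python
class PositionalComponent:
    """Caches the most recent actual pose of each end effector.

    Tracking results arrive asynchronously (e.g., from video-based tool
    tracking), so each update carries a timestamp and stale reports are
    discarded. A "pose" here is any position/orientation record.
    """

    def __init__(self):
        self._poses = {}  # effector id -> (timestamp, pose)

    def update(self, effector_id, timestamp, pose):
        # Keep only the newest report; updates may arrive out of order.
        current = self._poses.get(effector_id)
        if current is None or timestamp > current[0]:
            self._poses[effector_id] = (timestamp, pose)

    def latest(self, effector_id):
        # Return the last known pose, or None if never observed.
        entry = self._poses.get(effector_id)
        return None if entry is None else entry[1]
```

Consumers such as a force-estimation or display component would read `latest(...)` whenever they need the current actual position, regardless of when the tracker last reported.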
  • [0040]
    The kinematic component 92 is generally any device that estimates a position, herein a “kinematic position,” of an end effector utilizing information available through the telesurgical system 20. In an embodiment, the kinematic component 92 utilizes kinematic position information from joint states of a linkage to the end effector. For example, the kinematic component 92 may utilize the master/slave architecture for the telesurgical system 20 to calculate intended Cartesian positions of the end effectors 66, 68, 70 based upon encoder signals for the joints in the linkage for each of the tools 60, 62, 64. An example of a kinematic system is described in U.S. Pat. No. 7,155,315, although others may be utilized.
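The idea of estimating a Cartesian position from joint states can be illustrated with a planar two-link arm that chains link transforms driven by encoder angles. This is a deliberate simplification (real instruments have many more joints and three-dimensional kinematics), and the function and parameter names are hypothetical:

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Estimate the Cartesian position of an end effector from joint
    encoder readings by chaining planar link transforms.

    Returns (x, y, heading): the kinematically *intended* position, as
    the kinematic component computes it, rather than a measured one.
    """
    x = y = 0.0
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                 # accumulate joint rotations
        x += length * math.cos(theta)  # advance along the rotated link
        y += length * math.sin(theta)
    return x, y, theta
```

Because this estimate is derived from commanded joint states and an idealized linkage, it can differ from the tracked position by the offset discussed below.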
  • [0041]
    FIG. 5 is a flowchart representing steps for calculating force on an end effector in accordance with an embodiment. At step 500, the end effector begins at position A. At step 502, the actual position of the end effector is stored. This actual position is obtained by, for example, the tool tracking component 90. At step 504, the kinematic information for the end effector is stored. This information may be obtained, for example, via the kinematic component 92.
  • [0042]
    Because of the large number of joints and movable parts, current kinematics typically does not provide exact information for the location of a surgical end effector in space. A system with sufficient rigidity and sensing could theoretically provide near-exact kinematic location information, but in current minimally invasive surgery systems the kinematic information may be inaccurate by up to an inch in any direction. Thus, in accordance with an embodiment, but not necessarily used with the method disclosed in FIG. 5, an offset may be stored at step 506. This offset represents the difference between the kinematic information stored in step 504 and the actual position information stored in step 502. Utilizing the offset, the kinematic information and the actual position information may be registered to the same position.
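The offset of step 506 reduces to a vector subtraction. A minimal sketch follows; the helper names are hypothetical, and positions are treated as plain coordinate lists, whereas a real system would register full poses (location and orientation).

```python
def registration_offset(actual, kinematic):
    """Offset that registers a raw kinematic estimate to the tracked
    (actual) position, as in step 506 (illustrative, not patented code)."""
    return [a - k for a, k in zip(actual, kinematic)]

def register(kinematic, offset):
    """Apply a previously stored offset to a raw kinematic estimate."""
    return [k + o for k, o in zip(kinematic, offset)]

# Kinematics reports the tip at (10, 0, 0) mm; tracking sees it at (12, 1, 0) mm.
offset = registration_offset([12.0, 1.0, 0.0], [10.0, 0.0, 0.0])
print(register([10.0, 0.0, 0.0], offset))  # -> [12.0, 1.0, 0.0]
```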
  • [0043]
    At step 508, the end effector moves to position B. At step 510, the change in actual position of the end effector is calculated as the difference between the actual position of the tool at position B and the actual position of the tool at position A. At step 512, the change in position is calculated using kinematic information obtained via the kinematic component 92. If desired, although not required, another offset may be determined at position B. At step 514, the force on the tool is represented by the difference between the change in actual position between A and B and the change in kinematic position between A and B. This difference is a representation of the direction and amount of force applied to the end effector, for example, by contact of the end effector with body parts.
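The force representation of steps 510-514 amounts to comparing two displacement vectors. A minimal sketch, with hypothetical names and positions as coordinate lists:

```python
def force_discrepancy(actual_a, actual_b, kin_a, kin_b):
    """Difference between the tracked displacement (A -> B) and the
    kinematically estimated displacement; its direction and magnitude
    indicate the force resisting the end effector (steps 510-514)."""
    actual_delta = [b - a for a, b in zip(actual_a, actual_b)]
    kin_delta = [b - a for a, b in zip(kin_a, kin_b)]
    return [ad - kd for ad, kd in zip(actual_delta, kin_delta)]

# Kinematics expected 5 mm of travel in +x, but tracking saw only 3 mm:
d = force_discrepancy([0.0, 0.0, 0.0], [3.0, 0.0, 0.0],
                      [0.0, 0.0, 0.0], [5.0, 0.0, 0.0])
print(d)  # -> [-2.0, 0.0, 0.0]  (the tip fell 2 mm short along +x)
```

Mapping this discrepancy to an actual force magnitude would require a stiffness model of the tool and tissue; the vector here only conveys direction and relative amount, which is what the visual representation in FIG. 6 shows.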
  • [0044]
    As an example, FIG. 6 is a diagrammatic representation of movement of an end effector from position A to position B with force F resisting the movement. At position A, an image of an end effector 110 has an actual position shown by the solid outer line for the end effector. Kinematic information for the end effector is represented by the dotted line 112. In the diagram shown in the drawing, the kinematic position information matches the actual position information. In reality, however, as described above, the kinematic position information may vary to some degree, and it may not match the actual position unless the offset provided in step 506 is utilized. For this example, it is assumed that the offset is used or that the kinematic information matches the actual information exactly at position A. Thus, the dotted line 112, representing the kinematic position information provided by the kinematic component 92, matches the position of the image 110 of the end effector, representing actual position information provided by the tool tracking component 90.
  • [0045]
    At position B, the actual position of the end effector, represented by the image 113, is shown as having moved from position A. This actual position, as described above, is calculated by the tool tracking component 90. The kinematic position information estimates, however, that the tool, in moving from position A to position B, is now at the dotted line 114 shown at position B. The dotted line 114 represents the position where the end effector would be if it had moved without force being applied to the end effector 110. Absent force being applied to the end effector, this estimate is typically accurate. Although, as described above, kinematic position information is typically not accurate for determining a position of an end effector in space at the start of a process, it typically is accurate in determining a change in position of an end effector.
  • [0046]
    The position shown by the dotted line 114 assumes that the beginning point of movement for the end effector, with respect to the kinematic component 92, is the line 112. If the kinematic position information did not match the actual position information at position A, then the offset provided in step 506 may be utilized at position B to project the relative position of the dotted line 114 assuming a start at line 112.
  • [0047]
    The dotted line 114 is in a different location than the actual position of the end effector due to the difference between the kinematic position information and the actual position information. This difference is due to force applied to the end effector in the movement from position A to position B. In the example shown in FIG. 6, a force F is applied to the end effector during movement. This force prevents the end effector from moving fully as estimated by the kinematic component 92, shown by the dotted line 114. Instead, the combination of the movement of the linkage for the end effector 110 and the force F results in the end effector being positioned as shown by the image 113 in FIG. 6B.
  • [0048]
    The force output 96 provided by the change in kinematic position information versus actual position information may be useful for a variety of different applications. For example, the force output 96 may be forwarded to the vision cart 80, which in turn may generate instructions for the surgeon console 30 to create haptic feedback for the surgeon S, so that the surgeon is provided positive feedback of the existence of force. In addition, in accordance with an embodiment and as described above with reference to FIG. 6, the force output 96 may be utilized to generate an image representing force applied to the end effector. For example, by displaying the diagram at the B portion of FIG. 6, a representation of force applied to the end effector is provided. That is, by providing the visual image of where the end effector would be absent force (i.e., the dotted line 114) and simultaneously displaying the image 113 of the actual location of the end effector, a viewer is provided a visual representation of the force applied to the end effector and the force's effect on the end effector.
  • [0049]
    In an embodiment, the timing of the position A may be selected by the computer 82. As an example, the position A may be initiated by an event, such as closing of grippers or scissors. Alternatively, the position A may be selected by a user, such as the surgeon S. If desired, the position A may be determined by an event, by information that is available to the image capturing device 50, at regular intervals, or by any combination of these. The amount of time elapsed before establishing position B may also be determined by time or by available information, or position B may be requested by the surgeon S.
  • [0050]
    As an example, a surgeon may grasp an organ or other part of the patient's body and may initiate the position A measurement described with reference to FIG. 5. Position B may then be displayed after a particular amount of time, or it may be selected by the surgeon as desired.
  • [0051]
    The display provided herein, for example, as shown in FIG. 6B, may be useful in displaying visual information about force, regardless of the force input. That is, the display may be used to display force sensed or otherwise provided from sources other than the computer 82. For example, force sensors may be utilized to determine the force on an end effector. This force may be displayed on the display 84 without the need for kinematic information.
  • [0052]
    FIG. 7 is a flowchart representing steps for displaying force in accordance with an embodiment. At step 700, an end effector begins at position A. At step 702, the position of A is stored. At step 704, the end effector is moved to position B. At step 706, the force applied to the end effector in the movement between position A and B is determined. At step 708, an image representing the actual position of the end effector at position B is displayed. This image may be a video view of the actual end effector, or another suitable image, such as a representation of the end effector. At step 710, an image representing the force being applied is displayed. This may be the dotted line 114 shown in FIG. 6B, or any other appropriate image. As an example, the display in step 710 may display force in a particular direction. In either event, a user is provided a visual indication of force that is applied to the end effector.
  • [0053]
    The features described herein may be provided in stereoscopic vision so that a user may visualize force in apparent three-dimensional (3-D) form. As can be understood, in a stereoscopic view, force that is transverse to the direction of view is more visible in such a representation. Force that is parallel to the direction of view may not be displayed, and feedback for forces in these directions may be provided by other mechanisms, such as haptics or a different type of screen display.
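The distinction between transverse and view-parallel force components is ordinary vector decomposition. A sketch, assuming the viewing direction is given as a 3-D vector (the names are illustrative only):

```python
import math

def split_by_view(force, view_dir):
    """Split a force vector into its component along the viewing axis
    (hard to convey in a stereo display) and the transverse remainder
    (readily visible). Illustrative vector math only."""
    norm = math.sqrt(sum(v * v for v in view_dir))
    unit = [v / norm for v in view_dir]
    along = sum(f * u for f, u in zip(force, unit))
    parallel = [along * u for u in unit]
    transverse = [f - p for f, p in zip(force, parallel)]
    return parallel, transverse

# Viewing straight down +z: only the x/y part of the force is displayable.
par, trans = split_by_view([1.0, 2.0, 3.0], [0.0, 0.0, 1.0])
print(par)    # -> [0.0, 0.0, 3.0]
print(trans)  # -> [1.0, 2.0, 0.0]
```

The parallel component is the part that, per the text, would instead be conveyed by haptic feedback or another display mechanism.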
  • [0054]
    In addition, in accordance with an embodiment, the force information provided above may be provided with other force information, such as sensed force information, to provide a more detailed analysis of force being applied to an end effector.
  • Synthetic Model to Show Force
  • [0055]
    In accordance with an embodiment, instead of the dotted line 114, a synthetic image of an end effector may be displayed as a representation of the actual end effector without load. To this end, modeling data 150 (FIG. 3) may be provided that is associated with the patient side cart 40 and/or the computer 82. The modeling data 150 may be, for example, a two-dimensional (2-D) or 3-D image of the end effector. In an embodiment, this image is a 3-D model of the end effector, and thus may represent an actual solid model of the end effector. The modeling data 150 may be, for example, CAD data or other 3-D solid model data representing an end effector, such as the end effector 152 shown in FIG. 8. In an embodiment, the 3-D model is manipulatable at each joint so that movements of the end effector 152 may be mimicked by a synthetic model 154 of the end effector. As shown in FIG. 8, the synthetic model 154 may be the same size as the image of the actual end effector 152, but it may be larger or smaller.
  • [0056]
    The synthetic model 154 may be represented in a number of different ways. As an example, the synthetic model 154 may be a semi-transparent or translucent image of the end effector 152, or it may be a wire diagram image of the end effector. The synthetic model 154 may alternatively be an image that appears solid (i.e., not transparent/translucent), but such a model may make viewing of the actual end effector 152 difficult.
  • [0057]
    FIG. 9 is a flowchart representing steps for displaying the synthetic model 154 in accordance with an embodiment. In step 900, the end effector 152 begins at position A. In the embodiment shown in FIG. 9, the synthetic model is displayed in accordance with the actual position information (i.e., is displayed at the actual position of the end effector 152) at step 902. Thus, the synthetic model is superimposed over the image of the end effector 152, which may be a video image of the end effector. For example, as shown in FIG. 8, the synthetic model 154 is translucent and may be displayed over the video image of the actual end effector 152. As another option, the synthetic model 154 may start at a location other than the actual position of the end effector 152.
  • [0058]
    At step 904, the end effector moves to position B. Optionally, at step 906, kinematic position information is received for the end effector 152. An adjustment for offset is taken at step 908, and then the synthetic model 154 is displayed in step 910.
  • [0059]
    In accordance with the method in FIG. 9, the synthetic model 154 may continue to be updated so that force information is represented by the synthetic model 154 and its position relative to the end effector 152. In the display shown, the end effector 152 is a video image of the end effector. As such, steps 906-910 may be updated in real time, for both the video image and the synthetic model 154, so that the synthetic model 154 and its position are updated as the end effector 152 is moved. In such continual real time display of the synthetic model 154, step 902 may be substituted with the display of the model at the last location instead of the actual position. In addition, as described above, the offset and the original position A may be determined in accordance with an event or timing or in another manner.
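The update loop of steps 906-910 can be sketched as dead reckoning: register once against the tracked position, then place the synthetic model using kinematic updates, which (as noted above) are assumed accurate for *changes* in position even when absolute kinematics is not. The class below is a hypothetical illustration, not the patented implementation.

```python
class SyntheticOverlay:
    """Keep a synthetic model registered to the tool: record an offset
    when the actual (tracked) position is known, then dead-reckon from
    subsequent kinematic readings. (Hypothetical helper.)"""

    def __init__(self, tracked_pos, kinematic_pos):
        # Offset registering raw kinematics to the tracked position (step 908).
        self.offset = [t - k for t, k in zip(tracked_pos, kinematic_pos)]

    def model_position(self, kinematic_pos):
        # Where to draw the synthetic model for a new kinematic reading (step 910).
        return [k + o for k, o in zip(kinematic_pos, self.offset)]

overlay = SyntheticOverlay(tracked_pos=[12.0, 1.0, 0.0],
                           kinematic_pos=[10.0, 0.0, 0.0])
print(overlay.model_position([11.0, 0.0, 0.0]))  # -> [13.0, 1.0, 0.0]
```

In a real-time display, `model_position` would be called for every kinematic update, and the offset would be re-recorded whenever fresh tracking data arrives.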
  • [0000]
    Synthetic Tool to Show a Tool Hidden from a Field of View
  • [0060]
    As described in the Background section of this document, there are times when a tool may be out of a field of view for the viewer 32. In accordance with an embodiment, a synthetic tool may be utilized in a viewing pane that is outside the field of view. FIG. 10 shows such an embodiment, where the field of view 200 includes two tools 202, 204 that are currently linked to master controllers 700 of the surgeon console 30. These two tools 202, 204 are within the field of view 200 of the viewer 32. A third tool 206 is outside the field of view 200.
  • [0061]
    Although master controllers 700 are well known, a brief description is given here for the benefit of the reader. In the embodiment shown in FIG. 13, a hand held part or wrist gimbal of the master control device 700 is generally indicated by reference numeral 699. Part 699 has an articulated arm portion including a plurality of members or links 702 connected together by pivotal connections or joints 704. The surgeon grips the part 699 by positioning his or her thumb and index finger over a pincher formation 706. The surgeon's thumb and index finger are typically held on the pincher formation 706 by straps (not shown) threaded through slots 710. When the pincher formation 706 is squeezed between the thumb and index finger, the fingers or end effector elements of the end effector 66 close. When the thumb and index finger are moved apart, the fingers of the end effector 66 move apart in sympathy with the moving apart of the pincher formation 706. The joints of the part 699 are operatively connected to actuators, e.g., electric motors, or the like, to provide for, e.g., force feedback, gravity compensation, and/or the like. Such a system is described in greater detail in U.S. Pat. No. 7,155,315. Furthermore, appropriately positioned sensors, e.g., encoders, or potentiometers, or the like, are positioned on each joint 704 of the part 699, so as to enable joint positions of the part 699 to be determined by the control system.
  • [0062]
    In accordance with an embodiment, instruments (e.g., tools) that are outside the field of view 200, such as the tool 206 above, are displayed within a viewing pane 208 that is outside of and borders the field of view 200 image. Using a viewing pane in this manner is sometimes called image mosaicing, which is known and is taught by, for example, U.S. Pat. No. 7,194,118 (filed Nov. 10, 2000) and U.S. Pat. App. Pub. No. US 2008/0065109 A1 (filed Jun. 13, 2007) (e.g., FIG. 29F and associated text).
  • [0063]
    In accordance with an embodiment, a synthetic tool is utilized to display the location of the tool 206 that is outside the field of view 200 and in the viewing pane 208. The synthetic tool may be a 3-D model of the tool, and may be oriented consistent with the orientation of the tool. Thus, the tool 206 may be recognized by a surgeon S so that the surgeon may know that the tool is available or so that the surgeon may release one of the tools 202, 204 that are in the field of view 200 and grab the tool corresponding to the synthetic tool image 206 that is in the viewing pane 208. In an embodiment, the surgeon S may grab this additional tool utilizing the alignment features described above.
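One simple way to place an off-screen tool into a bordering viewing pane is to work in normalized image coordinates: positions inside the unit square render in the field of view, and positions outside it are clamped into a thin border pane. The layout constants and function name below are invented for illustration and are not taken from the cited mosaicing references.

```python
def pane_placement(x, y):
    """Place a tool in the display: inside the unit square -> main view;
    outside -> clamped into a 0.1-wide bordering viewing pane (a
    simplified layout for illustration only)."""
    if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
        return ("field_of_view", (x, y))
    # Clamp into the border pane so the synthetic tool hugs the edge
    # nearest its actual off-screen position.
    return ("viewing_pane", (min(max(x, -0.1), 1.1), min(max(y, -0.1), 1.1)))

print(pane_placement(0.5, 0.5))   # -> ('field_of_view', (0.5, 0.5))
print(pane_placement(1.4, 0.5))   # -> ('viewing_pane', (1.1, 0.5))
```

This preserves the direction from the field of view toward the hidden tool, which is what lets the surgeon know where to reach for it.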
  • Synthetic Tool Image at Actual Location of End Effector
  • [0064]
    In accordance with another embodiment, a synthetic tool image may be displayed over the actual location of a tool. As an illustrative example, this feature permits a surgeon S to follow the tool even when the tool is within the endoscopic field of view but is out of sight, for instance when the tool is behind an organ or is covered by blood.
  • [0065]
    FIG. 11 is a flowchart showing steps for displaying a synthetic tool at the actual location of a tool in accordance with an embodiment. Beginning at step 1100, the tool is at position A. At step 1102, the synthetic model is optionally displayed over the image of the actual tool, which may be, for example, live video of the tool. At step 1104, the tool moves to position B. Kinematic position information is received at step 1106, and an adjustment for offset, as described above, is made at step 1108. At step 1110, the synthetic tool is displayed at the kinematically adjusted position of the tool.
  • [0066]
    Utilizing the method of FIG. 11, the movements of the synthetic tool can match the movements of a tool, and the synthetic tool may be superimposed over the actual tool. Preferably, the movement of the synthetic tool is updated in real time so that it closely matches the movement of the actual tool. As described above, although kinematic position information typically does not provide an accurate position of a tool in space, a change in position is relatively accurate. Thus, by utilizing the synthetic tool described with reference to FIG. 11, the position of a tool can be followed fairly accurately, even when video or other position information for the tool is lost. For example, an end effector within an endoscope's field of view may be in a pool of blood, behind an organ, and/or obscured by cauterization smoke. In such instances, the synthetic tool may provide a surgeon with visual feedback information regarding the location and orientation of the tool. Further, if applied to two tools, the relative positions of the tools with reference to each other may be shown to the surgeon, even if both tools are obscured from endoscopic view.
  • [0067]
    If desired, as shown with optional step 1109, if the actual tool image is visible, then step 1109 can branch to step 1100, the process can start again, and no display of the synthetic tool need be provided (in this loop, position A is reset as the current position of the tool, and position B is the next position of the tool). In contrast, if the tool is not visible, then the synthetic tool may be displayed at step 1110. Utilizing this option, if image information is not available, the synthetic tool instead provides a visual representation of the tool, and thus the surgeon is provided visual information, either actual or synthetic, regarding the position of the tool at all times. The process may continue to loop back, alternating as needed between display of an image, such as video, of the tool and display of a synthetic representation of the tool. As with previous embodiments, the synthetic tool for use in the method of FIG. 11 may be a 3-D model of the tool, a line drawing of the tool, broken lines representing portions of the tool, or any other representation of the tool.
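The step-1109 branch, together with the offset re-registration that becomes possible whenever the tool is visible, can be sketched as a small state machine. All names below are hypothetical, and 2-D positions stand in for full poses.

```python
class ToolFollower:
    """One iteration per display update of a FIG.-11-style loop
    (hypothetical helper, not the patented implementation)."""

    def __init__(self):
        self.offset = None

    def update(self, kin_pos, tracked_pos=None):
        """When tracking succeeds, re-register the offset and show the
        real image (position A is reset); otherwise dead-reckon the
        synthetic tool from the kinematic reading."""
        if tracked_pos is not None:  # tool visible in the image
            self.offset = [t - k for t, k in zip(tracked_pos, kin_pos)]
            return ("video", list(tracked_pos))
        adjusted = [k + o for k, o in zip(kin_pos, self.offset)]
        return ("synthetic", adjusted)

f = ToolFollower()
print(f.update([10.0, 0.0], tracked_pos=[12.0, 1.0]))  # -> ('video', [12.0, 1.0])
print(f.update([11.0, 0.5]))                           # -> ('synthetic', [13.0, 1.5])
```

Each visible frame refreshes the registration, so the dead-reckoned synthetic pose only has to remain accurate over the short interval during which the tool is obscured.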
  • 3-D Pointer
  • [0068]
    As described in the Background section of this document, conventional telesurgical systems provide only limited information at the user interface provided to a surgeon via the viewer 32. Often, additional information may be provided on a secondary user interface, such as a display that is separate from the viewer.
  • [0069]
    In accordance with an embodiment, the surgeon S may release control of one of the end effectors and enable control of a 3-D pointing tool. The surgeon S, for example, may control the 3-D pointing tool with one of the master controls and may use the pointing tool to select user interface items. The 3-D pointing tool may be used, for example, to select items, other than an image in the field of view from the image capturing device 50, for view within a user interface window pane.
  • [0070]
    For example, in the viewing pane 300 shown in FIG. 12, the surgeon is utilizing two end effectors 302, 304. In accordance with an embodiment, when the surgeon releases control of one of the end effectors, for example, the end effector 302, the master manipulator that has been released is moved away from the end effector 302, and a pointing device 306 is shown in the viewing pane representing the location of the master manipulator. The pointing device 306 may be, for example, a 2-D or 3-D icon, but in an embodiment is 3-D. Also, in an embodiment, the pointing device 306 is generated from modeling information, such as the modeling data 150, so that it can represent a version of a synthetic tool. The synthetic tool may be much smaller than a view of an actual tool in the viewer. In an embodiment, one or more user interface selection devices, such as a drop-down menu bar 308, may be provided, permitting the surgeon S to select, using the pointing device 306, additional screens, additional features, or other items.
  • [0071]
    In an embodiment, when the surgeon is utilizing the pointing device 306, movement of the tools is disabled. A warning signal or other indicator may be provided to show that the pointing device is being utilized and that the tools are not movable.
  • [0072]
    Utilizing the pointing device 306, a surgeon may be provided a larger number of options within the viewing pane 300, and can access these options without having to remove his or her head from the viewer 32. As such, the need for the surgeon to access a secondary user interface is diminished.
  • [0073]
    Other variations are within the spirit of the present invention. Thus, while the invention is susceptible to various modifications and alternative constructions, a certain illustrated embodiment thereof is shown in the drawings and has been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific form or forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention, as defined in the appended claims.
  • [0074]
    All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
  • [0075]
    The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. The term “connected” is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
  • [0076]
    Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate, and the inventors intend for the invention to be practiced otherwise than as specifically described herein. Accordingly, this invention includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.

Claims (26)

  1. A robotic surgical system, comprising:
    a tool for performing surgery;
    data for providing a synthetic representation of the tool;
    an image capturing device that captures an image of the tool;
    a display;
    a first component coupling the image capture device to the display so as to display an image of the tool in the display; and
    a second component coupling the data to the display so as to display the synthetic representation of the tool simultaneously with the displayed image of the tool.
  2. The system of claim 1, wherein the display comprises a viewer of a surgeon console of the robotic surgical system.
  3. The system of claim 1, wherein the synthetic representation comprises a model of the tool.
  4. The system of claim 1, wherein the synthetic representation is displayed to appear overlaid at least in part on the image of the tool.
  5. The system of claim 1, wherein the synthetic representation represents a position of the tool, an orientation of the tool, or both.
  6. The system of claim 1, wherein the synthetic representation is displayed to appear outside a displayed image from a field of view of the image capture device.
  7. In a robotic surgical system comprising a surgical tool and an image capture device, a method comprising:
    capturing an image of the surgical tool within a field of view of the image capture device;
    displaying the captured image in a surgical display; and
    displaying a synthetic representation of the surgical tool in the surgical display.
  8. The method of claim 7, wherein the display comprises a viewer of a surgeon console of the robotic surgical system.
  9. The method of claim 7, wherein the synthetic representation comprises a model of the surgical tool.
  10. The method of claim 9, wherein displaying the synthetic representation of the surgical tool comprises displaying the synthetic representation of the surgical tool to appear at least in part overlaid on the displayed image of the surgical tool.
  11. The method of claim 7, wherein the synthetic representation represents a position of the surgical tool, an orientation of the surgical tool, or both.
  12. The method of claim 7, wherein the synthetic representation of the surgical tool is displayed to appear outside the displayed image of the surgical tool.
  13. A method of providing a visual representation of a position of a tool in a robotic system, the method comprising:
    displaying a first image comprising video feed of a field of view showing a region, with the region containing a tool; and
    displaying a second image representing a position of the tool, an orientation of the tool, or both, superimposed on the first image.
  14. The method of claim 13, wherein the second image comprises a synthetic representation of the tool.
  15. The method of claim 14, wherein the synthetic representation of the tool comprises a three-dimensional model of the tool.
  16. The method of claim 13, further comprising tracking movement of the tool and, responsive to a movement of the tool to a new location, changing the second image to match the new position of the tool, the orientation of the tool, or both.
  17. The method of claim 14, further comprising:
    displaying the second image as a non-three-dimensional model of the tool if at least a portion of the tool appears in the first image; and
    displaying the second image as a three-dimensional model of the tool if the tool does not appear in the first image.
  18. The method of claim 13, further comprising:
    displaying the second image only if the tool does not appear in the first image.
  19. The method of claim 13, further comprising altering the second image based upon the availability of the first image.
  20. A robotic system, comprising:
    a surgical tool;
    data for providing a synthetic representation of the tool;
    an image capturing device that captures an image within a field of view;
    a display comprising a first portion in which the captured image is displayed and a second portion in which a window that represents space adjacent the captured image and outside the field of view is displayed; and
    a first component coupling the data to the display so as to superimpose in the window at least part of the synthetic representation of the tool, wherein the synthetic representation comprises a graphical representation of a position of the tool, an orientation of the tool, or both.
  21. The system of claim 20, wherein the synthetic representation of the tool comprises a three-dimensional model of the tool.
  22. The system of claim 21, further comprising:
    a second component coupled to the tool;
    wherein the second component generates information about movement of the tool to a new position, a new orientation, or both; and
    wherein the second component is configured to change the position, the orientation, or both, of the displayed synthetic representation based upon the information.
  23. The system of claim 22, wherein the new position is in the field of view.
  24. A robotic surgical system, comprising:
    a surgical tool;
    a display;
    a synthetic rendering of the tool, wherein the synthetic rendering appears on the display;
    a synthetic rendering of a pointing device that appears in three dimensions on the display; and
    a master input device;
    wherein in a first mode the master input device controls movement of the surgical tool, in a second mode the master input device controls movement on the display of the synthetic rendering of the surgical tool, and in a third mode the master input device controls movement on the display of the synthetic rendering of the pointing device.
  25. The system of claim 24:
    wherein the pointing device comprises a synthetic rendering of a surgical tool that appears to be three-dimensional on the display; and
    wherein in the third mode the pointing device appears to move on the display in a plurality of degrees of freedom in response to corresponding movement of the master input device.
  26. The system of claim 24, further comprising:
    a menu that appears on the display;
    wherein the pointing device moves and selects an item on the menu.
US12415332 2006-06-13 2009-03-31 Synthetic representation of a surgical instrument Abandoned US20090192523A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11478531 US9718190B2 (en) 2006-06-29 2006-06-29 Tool position and identification indicator displayed in a boundary area of a computer display screen
US11762202 US9345387B2 (en) 2006-06-13 2007-06-13 Preventing instrument/tissue collisions
US12415332 US20090192523A1 (en) 2006-06-29 2009-03-31 Synthetic representation of a surgical instrument

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US12415332 US20090192523A1 (en) 2006-06-29 2009-03-31 Synthetic representation of a surgical instrument
PCT/US2010/028886 WO2010117684A1 (en) 2009-03-31 2010-03-26 Synthetic representation of a surgical instrument
US14076833 US9788909B2 (en) 2006-06-29 2013-11-11 Synthetic representation of a surgical instrument
US15136658 US9901408B2 (en) 2007-06-13 2016-04-22 Preventing instrument/tissue collisions
US15597558 US9801690B2 (en) 2006-06-29 2017-05-17 Synthetic representation of a surgical instrument

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11478531 Continuation-In-Part US9718190B2 (en) 2006-06-29 2006-06-29 Tool position and identification indicator displayed in a boundary area of a computer display screen

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14076833 Continuation US9788909B2 (en) 2006-06-13 2013-11-11 Synthetic representation of a surgical instrument

Publications (1)

Publication Number Publication Date
US20090192523A1 (en) 2009-07-30

Family

ID=42236583

Family Applications (4)

Application Number Title Priority Date Filing Date
US12415332 Abandoned US20090192523A1 (en) 2006-06-13 2009-03-31 Synthetic representation of a surgical instrument
US14076833 Active 2028-04-28 US9788909B2 (en) 2006-06-13 2013-11-11 Synthetic representation of a surgical instrument
US15136658 Active US9901408B2 (en) 2006-06-13 2016-04-22 Preventing instrument/tissue collisions
US15597558 Active US9801690B2 (en) 2006-06-13 2017-05-17 Synthetic representation of a surgical instrument

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14076833 Active 2028-04-28 US9788909B2 (en) 2006-06-13 2013-11-11 Synthetic representation of a surgical instrument
US15136658 Active US9901408B2 (en) 2006-06-13 2016-04-22 Preventing instrument/tissue collisions
US15597558 Active US9801690B2 (en) 2006-06-13 2017-05-17 Synthetic representation of a surgical instrument

Country Status (2)

Country Link
US (4) US20090192523A1 (en)
WO (1) WO2010117684A1 (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US20090326553A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US20100082039A1 (en) * 2008-09-26 2010-04-01 Intuitive Surgical, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20100168918A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Obtaining force information in a minimally invasive surgical procedure
US20100169815A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Visual force feedback in a minimally invasive surgical procedure
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
WO2011060139A2 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US20110118752A1 (en) * 2009-11-13 2011-05-19 Brandon Itkowitz Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
WO2011060187A1 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. A master finger tracking device and method of use in a minimally invasive surgical system
WO2011060185A1 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. Method and system for hand presence detection in a minimally invasive surgical system
WO2012044334A2 (en) 2009-11-13 2012-04-05 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US20120209288A1 (en) * 2011-02-15 2012-08-16 Intuitive Surgical Operations, Inc. Indicator for Knife Location in a Stapling or Vessel Sealing Instrument
US20120221145A1 (en) * 2011-02-24 2012-08-30 Olympus Corporation Master input device and master-slave manipulator
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9101397B2 (en) 1999-04-07 2015-08-11 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9333042B2 (en) 2007-06-13 2016-05-10 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9516996B2 (en) 2008-06-27 2016-12-13 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the position and orienting of its tip
WO2017044965A1 (en) * 2015-09-10 2017-03-16 Duke University Systems and methods for arbitrary viewpoint robotic manipulation and robotic surgical assistance
US9622826B2 (en) 2010-02-12 2017-04-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
EP3083160A4 (en) * 2013-12-17 2017-08-23 Syddansk Univ Device for dynamic switching of robot control points
JP6278747B2 (en) * 2014-02-28 2018-02-14 Olympus Corporation Manipulator calibration method, manipulator, and manipulator system
KR20150128049A (en) * 2014-05-08 2015-11-18 Samsung Electronics Co., Ltd. Surgical robot and control method thereof
WO2017122322A1 (en) * 2016-01-14 2017-07-20 Olympus Corporation Medical manipulator system

Citations (34)

Publication number Priority date Publication date Assignee Title
US4831549A (en) * 1987-07-28 1989-05-16 Brigham Young University Device and method for correction of robot inaccuracy
US5408409A (en) * 1990-05-11 1995-04-18 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
US5748767A (en) * 1988-02-01 1998-05-05 Faro Technology, Inc. Computer-aided surgery apparatus
US5755725A (en) * 1993-09-07 1998-05-26 Deemed International, S.A. Computer-assisted microsurgery methods and equipment
US5820545A (en) * 1995-08-14 1998-10-13 Deutsche Forschungsanstalt Fur Luft-Und Raumfahrt E.V. Method of tracking a surgical instrument with a mono or stereo laparoscope
US5950629A (en) * 1991-06-13 1999-09-14 International Business Machines Corporation System for assisting a surgeon during surgery
US5987591A (en) * 1995-12-27 1999-11-16 Fanuc Limited Multiple-sensor robot system for obtaining two-dimensional image and three-dimensional position information
US6224542B1 (en) * 1999-01-04 2001-05-01 Stryker Corporation Endoscopic camera system with non-mechanical zoom
US20010035871A1 (en) * 2000-03-30 2001-11-01 Johannes Bieger System and method for generating an image
US6398726B1 (en) * 1998-11-20 2002-06-04 Intuitive Surgical, Inc. Stabilizer for robotic beating-heart surgery
US6442417B1 (en) * 1999-11-29 2002-08-27 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for transforming view orientations in image-guided surgery
US6459926B1 (en) * 1998-11-20 2002-10-01 Intuitive Surgical, Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
US6493608B1 (en) * 1999-04-07 2002-12-10 Intuitive Surgical, Inc. Aspects of a control system of a minimally invasive surgical apparatus
US20030032878A1 (en) * 1996-06-28 2003-02-13 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for volumetric image navigation
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US20030109780A1 (en) * 2001-06-07 2003-06-12 Inria Roquencourt Methods and apparatus for surgical planning
US6659939B2 (en) * 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US20040106916A1 (en) * 2002-03-06 2004-06-03 Z-Kat, Inc. Guidance system and method for surgical procedures with improved feedback
US6770081B1 (en) * 2000-01-07 2004-08-03 Intuitive Surgical, Inc. In vivo accessories for minimally invasive robotic surgery and methods
US20050203380A1 (en) * 2004-02-17 2005-09-15 Frank Sauer System and method for augmented reality navigation in a medical intervention procedure
US20050251113A1 (en) * 2000-11-17 2005-11-10 Kienzle Thomas C Iii Computer assisted intramedullary rod surgery system with enhanced features
US20060142657A1 (en) * 2002-03-06 2006-06-29 Mako Surgical Corporation Haptic guidance system and method
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US7155315B2 (en) * 1999-04-07 2006-12-26 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US7181315B2 (en) * 2003-10-08 2007-02-20 Fanuc Ltd Manual-mode operating system for robot
US7194118B1 (en) * 2000-11-10 2007-03-20 Lucid, Inc. System for optically sectioning and mapping surgically excised tissue
US20070135803A1 (en) * 2005-09-14 2007-06-14 Amir Belson Methods and apparatus for performing transluminal and other procedures
US20070265491A1 (en) * 1998-05-14 2007-11-15 Calypso Medical Technologies, Inc. Systems and methods for stabilizing a target location within a human body
US20070270650A1 (en) * 2006-05-19 2007-11-22 Robert Eno Methods and apparatus for displaying three-dimensional orientation of a steerable distal tip of an endoscope
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20080118115A1 (en) * 2006-11-17 2008-05-22 General Electric Company Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use
US7491198B2 (en) * 2003-04-28 2009-02-17 Bracco Imaging S.P.A. Computer enhanced surgical navigation imaging system (camera probe)

Family Cites Families (340)

Publication number Priority date Publication date Assignee Title
US3628535A (en) 1969-11-12 1971-12-21 Nibot Corp Surgical instrument for implanting a prosthetic heart valve or the like
US3818284A (en) 1972-12-07 1974-06-18 Marotta Scientific Controls Valve control with pulse width modulation
US3923166A (en) 1973-10-11 1975-12-02 Nasa Remote manipulator system
US3905215A (en) 1974-06-26 1975-09-16 John R Wright Ultrasensitive force measuring instrument employing torsion balance
US4150326A (en) 1977-09-19 1979-04-17 Unimation, Inc. Trajectory correlation and error detection method and apparatus
US4349837A (en) 1979-07-03 1982-09-14 Spar Aerospace Limited Satellite servicing
US5493595A (en) 1982-02-24 1996-02-20 Schoolman Scientific Corp. Stereoscopically displayed three dimensional medical imaging
US4588348A (en) 1983-05-27 1986-05-13 At&T Bell Laboratories Robotic system utilizing a tactile sensor array
US4577621A (en) 1984-12-03 1986-03-25 Patel Jayendrakumar I Endoscope having novel proximate and distal portions
JPS61230895A (en) 1985-04-04 1986-10-15 Mitsubishi Heavy Ind Ltd Manipulator interference preventive device
US4672963A (en) 1985-06-07 1987-06-16 Israel Barken Apparatus and method for computer controlled laser surgery
US4644237A (en) 1985-10-17 1987-02-17 International Business Machines Corp. Collision avoidance system
US4722056A (en) 1986-02-18 1988-01-26 Trustees Of Dartmouth College Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope
JPH085018B2 (en) 1986-02-26 1996-01-24 Hitachi, Ltd. Remote manipulation method and apparatus
US4762456A (en) 1986-06-11 1988-08-09 Nelson Arthur J Accommodations to exchange containers between vessels
JPH0766290B2 (en) 1986-06-26 1995-07-19 Toshiba Machine Co., Ltd. Tool path generation method
US4791934A (en) 1986-08-07 1988-12-20 Picker International, Inc. Computer tomography assisted stereotactic surgery system and method
GB2194656B (en) 1986-09-03 1991-10-09 Ibm Method and system for solid modelling
US4759074A (en) 1986-10-28 1988-07-19 General Motors Corporation Method for automatically inspecting parts utilizing machine vision and system utilizing same
JPH0829509B2 (en) 1986-12-12 1996-03-27 Hitachi, Ltd. Manipulator control device
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
US4860215A (en) 1987-04-06 1989-08-22 California Institute Of Technology Method and apparatus for adaptive force and position control of manipulators
US4863133A (en) 1987-05-26 1989-09-05 Leonard Medical Arm device for adjustable positioning of a medical instrument or the like
US4762455A (en) 1987-06-01 1988-08-09 Remote Technology Corporation Remote manipulator
US4833383A (en) 1987-08-13 1989-05-23 Iowa State University Research Foundation, Inc. Means and method of camera space manipulation
US5079699A (en) 1987-11-27 1992-01-07 Picker International, Inc. Quick three-dimensional display
US5170347A (en) 1987-11-27 1992-12-08 Picker International, Inc. System to reformat images for three-dimensional display using unique spatial encoding and non-planar bisectioning
US4815450A (en) 1988-02-01 1989-03-28 Patel Jayendra I Endoscope having variable flexibility
EP0326768A3 (en) 1988-02-01 1991-01-23 Faro Medical Technologies Inc. Computer-aided surgery apparatus
US5046022A (en) 1988-03-10 1991-09-03 The Regents Of The University Of Michigan Tele-autonomous system and method employing time/position synchrony/desynchrony
US5187796A (en) 1988-03-29 1993-02-16 Computer Motion, Inc. Three-dimensional vector co-processor having I, J, and K register files and I, J, and K execution units
US4989253A (en) 1988-04-15 1991-01-29 The Montefiore Hospital Association Of Western Pennsylvania Voice activated microscope
US4979949A (en) 1988-04-26 1990-12-25 The Board Of Regents Of The University Of Washington Robot-aided system for surgery
US4984157A (en) 1988-09-21 1991-01-08 General Electric Company System and method for displaying oblique planar cross sections of a solid body using tri-linear interpolation to determine pixel position dataes
GB8826986D0 (en) 1988-11-18 1988-12-21 Crockard A Surgical device
US4942539A (en) 1988-12-21 1990-07-17 Gmf Robotics Corporation Method and system for automatically determining the position and orientation of an object in 3-D space
US5099846A (en) 1988-12-23 1992-03-31 Hardy Tyrone L Method and apparatus for video presentation from a variety of scanner imaging sources
US5098426A (en) 1989-02-06 1992-03-24 Phoenix Laser Systems, Inc. Method and apparatus for precision laser surgery
US5184009A (en) 1989-04-10 1993-02-02 Wright Scott M Optical attenuator movement detection system
US5053976A (en) 1989-05-22 1991-10-01 Honda Giken Kogyo Kabushiki Kaisha Method of teaching a robot
US5257203A (en) 1989-06-09 1993-10-26 Regents Of The University Of Minnesota Method and apparatus for manipulating computer-based representations of objects of complex and unique geometry
DE3935256C1 (en) 1989-10-23 1991-01-03 Bauerfeind, Peter, Dr., 8264 Waldkraiburg, De
US5181823A (en) 1989-10-27 1993-01-26 Grumman Aerospace Corporation Apparatus and method for producing a video display
DE69026196T2 (en) 1989-11-08 1996-09-05 George S Allen Mechanical arm for an interactive, image-controlled surgery system
US5176702A (en) 1991-04-04 1993-01-05 Symbiosis Corporation Ratchet locking mechanism for surgical instruments
EP0487110B1 (en) 1990-11-22 1999-10-06 Kabushiki Kaisha Toshiba Computer-aided diagnosis system for medical use
US5217003A (en) 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
US5217453A (en) 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
US5251611A (en) 1991-05-07 1993-10-12 Zehel Wendell E Method and apparatus for conducting exploratory procedures
US5313306A (en) 1991-05-13 1994-05-17 Telerobotics International, Inc. Omniview motionless camera endoscopy system
US5181514A (en) 1991-05-21 1993-01-26 Hewlett-Packard Company Transducer positioning system
US5266875A (en) 1991-05-23 1993-11-30 Massachusetts Institute Of Technology Telerobotic system
US5182641A (en) 1991-06-17 1993-01-26 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Composite video and graphics display for camera viewing systems in robotics and teleoperation
US5261404A (en) 1991-07-08 1993-11-16 Mick Peter R Three-dimensional mammal anatomy imaging system and method
US5184601A (en) 1991-08-05 1993-02-09 Putman John M Endoscope stabilizer
US5889670A (en) 1991-10-24 1999-03-30 Immersion Corporation Method and apparatus for tactilely responsive user interface
US5230623A (en) 1991-12-10 1993-07-27 Radionics, Inc. Operating pointer with interactive computergraphics
US5531742A (en) 1992-01-15 1996-07-02 Barken; Israel Apparatus and method for computer controlled cryosurgery
US6963792B1 (en) 1992-01-21 2005-11-08 Sri International Surgical method
DE69312053T2 (en) 1992-01-21 1997-10-30 Stanford Res Inst Int Teleoperator system and process with telepresence
DE4204397C2 (en) 1992-02-14 2001-08-30 Sinz Dirk Peter Transport container
US5430643A (en) 1992-03-11 1995-07-04 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Configuration control of seven degree of freedom arms
US5737500A (en) 1992-03-11 1998-04-07 California Institute Of Technology Mobile dexterous siren degree of freedom robot arm with real-time control system
US5321353A (en) 1992-05-13 1994-06-14 Storage Technolgy Corporation System and method for precisely positioning a robotic tool
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5482029A (en) 1992-06-26 1996-01-09 Kabushiki Kaisha Toshiba Variable flexibility endoscope system
US5361768A (en) 1992-06-30 1994-11-08 Cardiovascular Imaging Systems, Inc. Automated longitudinal position translator for ultrasonic imaging probes, and methods of using same
US5239246A (en) 1992-07-08 1993-08-24 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Force reflection with compliance control
US5704897A (en) 1992-07-31 1998-01-06 Truppe; Michael J. Apparatus and method for registration of points of a data field with respective points of an optical image
US5515478A (en) 1992-08-10 1996-05-07 Computer Motion, Inc. Automated endoscope system for optimal positioning
US5657429A (en) 1992-08-10 1997-08-12 Computer Motion, Inc. Automated endoscope system optimal positioning
US5524180A (en) 1992-08-10 1996-06-04 Computer Motion, Inc. Automated endoscope system for optimal positioning
US5754741A (en) 1992-08-10 1998-05-19 Computer Motion, Inc. Automated endoscope for optimal positioning
US5397323A (en) 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
US5788688A (en) 1992-11-05 1998-08-04 Bauer Laboratories, Inc. Surgeon's command and control
US5629594A (en) 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
US5842473A (en) 1993-11-29 1998-12-01 Life Imaging Systems Three-dimensional imaging system
DE9302650U1 (en) 1993-02-24 1993-04-15 Karl Storz Gmbh & Co, 7200 Tuttlingen, De
JP3578457B2 (en) 1993-03-31 2004-10-20 Luma Corporation Management of information in an endoscopy system
WO1994026167A1 (en) 1993-05-14 1994-11-24 Sri International Remote center positioner
US5791231A (en) 1993-05-17 1998-08-11 Endorobotics Corporation Surgical robotic system and hydraulic actuator therefor
WO1995001757A1 (en) 1993-07-07 1995-01-19 Cornelius Borst Robotic system for close inspection and remote treatment of moving parts
US5382885A (en) 1993-08-09 1995-01-17 The University Of British Columbia Motion scaling tele-operating system with force feedback suitable for microsurgery
US5343385A (en) 1993-08-17 1994-08-30 International Business Machines Corporation Interference-free insertion of a solid body into a cavity
US5503320A (en) 1993-08-19 1996-04-02 United States Surgical Corporation Surgical apparatus with indicator
EP0646358A1 (en) 1993-10-05 1995-04-05 Pacesetter AB Instrument for laparoscopy
US6059718A (en) 1993-10-18 2000-05-09 Olympus Optical Co., Ltd. Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
US5876325A (en) 1993-11-02 1999-03-02 Olympus Optical Co., Ltd. Surgical manipulation system
US6241725B1 (en) 1993-12-15 2001-06-05 Sherwood Services Ag High frequency thermal ablation of cancerous tumors and functional targets with image data assistance
WO1995016396A1 (en) 1993-12-15 1995-06-22 Computer Motion, Inc. Automated endoscope system for optimal positioning
JPH07184923A (en) 1993-12-28 1995-07-25 Hitachi Ltd Remote precision surgery support device
US5631973A (en) 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
US5454827A (en) 1994-05-24 1995-10-03 Aust; Gilbert M. Surgical instrument
US5835693A (en) 1994-07-22 1998-11-10 Lynch; James D. Interactive system for simulation and display of multi-body systems in three dimensions
US6115053A (en) 1994-08-02 2000-09-05 New York University Computer animation method and system for synthesizing human-like gestures and actions
JPH08107875A (en) 1994-08-18 1996-04-30 Olympus Optical Co Ltd Endoscope shape detector
DE19527245A1 (en) 1994-08-30 1996-05-23 Vingmed Sound As Device for endoscopic or gastroscopic investigations
US6120433A (en) 1994-09-01 2000-09-19 Olympus Optical Co., Ltd. Surgical manipulator system
US5528955A (en) 1994-09-08 1996-06-25 Hannaford; Blake Five axis direct-drive mini-robot having fifth actuator located at non-adjacent joint
JP3695779B2 (en) 1994-09-27 2005-09-14 Olympus Corporation Manipulator system
US5765561A (en) 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
JP3642812B2 (en) 1994-11-17 2005-04-27 Machida Seisakusho Co., Ltd. Medical observation device
JPH08154321A (en) 1994-11-29 1996-06-11 The Tokyo Electric Power Co., Inc. Remote control robot
JP3640087B2 (en) 1994-11-29 2005-04-20 Toyoda Machine Works, Ltd. Machine tool
JPH08164148A (en) 1994-12-13 1996-06-25 Olympus Optical Co Ltd Surgical operation device under endoscope
JP3539645B2 (en) 1995-02-16 2004-07-07 Hitachi, Ltd. Remote surgery support system
US6019724A (en) 1995-02-22 2000-02-01 Gronningsaeter; Aage Method for ultrasound guidance during clinical procedures
US5836880A (en) 1995-02-27 1998-11-17 Micro Chemical, Inc. Automated system for measuring internal tissue characteristics in feed animals
US5817022A (en) 1995-03-28 1998-10-06 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
US5797849A (en) 1995-03-28 1998-08-25 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
JPH08275958A (en) 1995-04-07 1996-10-22 Olympus Optical Co Ltd Manipulator device for operation
US5887121A (en) 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
JP3986099B2 (en) 1995-05-02 2007-10-03 Olympus Corporation Manipulator system for surgery
US5649956A (en) 1995-06-07 1997-07-22 Sri International System and method for releasably holding a surgical instrument
US5759151A (en) 1995-06-07 1998-06-02 Carnegie Mellon University Flexible steerable device for conducting exploratory procedures
US5814038A (en) 1995-06-07 1998-09-29 Sri International Surgical manipulator for a telerobotic system
US20040243147A1 (en) 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
US5551432A (en) 1995-06-19 1996-09-03 New York Eye & Ear Infirmary Scanning control system for ultrasound biomicroscopy
JPH10505286A (en) 1995-06-20 1998-05-26 Ng, Wan Sing Articulated arms for medical treatment
US6702736B2 (en) 1995-07-24 2004-03-09 David T. Chen Anatomical visualization system
US6129670A (en) 1997-11-24 2000-10-10 Burdette Medical Systems Real time brachytherapy spatial registration and visualization system
US6256529B1 (en) 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US5638819A (en) 1995-08-29 1997-06-17 Manwaring; Kim H. Method and apparatus for guiding an instrument to a target
US5784542A (en) 1995-09-07 1998-07-21 California Institute Of Technology Decoupled six degree-of-freedom teleoperated robot system
US5825982A (en) 1995-09-15 1998-10-20 Wright; James Head cursor control interface for an automated endoscope system for optimal positioning
US5601085A (en) 1995-10-02 1997-02-11 Nycomed Imaging As Ultrasound imaging
JPH09141580A (en) 1995-11-22 1997-06-03 Yaskawa Electric Corp Operating range limiting device for direct teaching robot
US5624398A (en) 1996-02-08 1997-04-29 Symbiosis Corporation Endoscopic robotic surgical tools and methods
US6436107B1 (en) 1996-02-20 2002-08-20 Computer Motion, Inc. Method and apparatus for performing minimally invasive surgical procedures
US6699177B1 (en) 1996-02-20 2004-03-02 Computer Motion, Inc. Method and apparatus for performing minimally invasive surgical procedures
US5971976A (en) 1996-02-20 1999-10-26 Computer Motion, Inc. Motion minimization and compensation system for use in surgical procedures
US6063095A (en) 1996-02-20 2000-05-16 Computer Motion, Inc. Method and apparatus for performing minimally invasive surgical procedures
US5855583A (en) 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
US5762458A (en) 1996-02-20 1998-06-09 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
JP4018147B2 (en) 1996-05-17 2007-12-05 Biosense, Inc. Self-aligning catheter
US5792135A (en) 1996-05-20 1998-08-11 Intuitive Surgical, Inc. Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5807377A (en) 1996-05-20 1998-09-15 Intuitive Surgical, Inc. Force-reflecting surgical instrument and positioning mechanism for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5797900A (en) 1996-05-20 1998-08-25 Intuitive Surgical, Inc. Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US6642836B1 (en) 1996-08-06 2003-11-04 Computer Motion, Inc. General purpose distributed operating room control system
GB9616261D0 (en) * 1996-08-02 1996-09-11 Philips Electronics Nv Virtual environment manipulation device modelling and control
JP3550966B2 (en) 1996-09-18 2004-08-04 Hitachi, Ltd. Surgical device
US7302288B1 (en) * 1996-11-25 2007-11-27 Z-Kat, Inc. Tool position indicator
US5810008A (en) 1996-12-03 1998-09-22 Isg Technologies Inc. Apparatus and method for visualizing ultrasonic images
US7963913B2 (en) 1996-12-12 2011-06-21 Intuitive Surgical Operations, Inc. Instrument interface of a robotic surgical system
US5853367A (en) 1997-03-17 1998-12-29 General Electric Company Task-interface and communications system and method for ultrasound imager control
US5938678A (en) 1997-06-11 1999-08-17 Endius Incorporated Surgical instrument
JPH11309A (en) 1997-06-12 1999-01-06 Hitachi Ltd Image processor
US6330837B1 (en) 1997-08-28 2001-12-18 Microdexterity Systems, Inc. Parallel mechanism
US6002184A (en) 1997-09-17 1999-12-14 Coactive Drive Corporation Actuator with opposing repulsive magnetic forces
US6786896B1 (en) 1997-09-19 2004-09-07 Massachusetts Institute Of Technology Robotic apparatus
US5993391A (en) 1997-09-25 1999-11-30 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus
EP1419757A3 (en) 1997-11-07 2004-07-21 Hill-Rom Services, Inc. Medical equipment controller
WO1999028725A1 (en) 1997-12-02 1999-06-10 Ozo Diversified Automation, Inc. Automated system for chromosome microdissection and method of using same
US5842993A (en) 1997-12-10 1998-12-01 The Whitaker Corporation Navigable ultrasonic imaging probe assembly
US6292712B1 (en) 1998-01-29 2001-09-18 Northrop Grumman Corporation Computer interface system for a robotic system
EP1053071A1 (en) 1998-02-03 2000-11-22 Hexel Corporation Systems and methods employing a rotary track for machining and manufacturing
EP1056388B1 (en) 1998-02-19 2004-12-22 California Institute Of Technology Apparatus for providing spherical viewing during endoscopic procedures
US7766894B2 (en) 2001-02-15 2010-08-03 Hansen Medical, Inc. Coaxial catheter system
US6810281B2 (en) 2000-12-21 2004-10-26 Endovia Medical, Inc. Medical mapping system
JP3582348B2 (en) 1998-03-19 2004-10-27 Hitachi, Ltd. Surgical device
US5980461A (en) 1998-05-01 1999-11-09 Rajan; Subramaniam D. Ultrasound imaging apparatus for medical diagnostics
US6425865B1 (en) 1998-06-12 2002-07-30 The University Of British Columbia Robotically assisted medical ultrasound
EP1109497B1 (en) 1998-08-04 2009-05-06 Intuitive Surgical, Inc. Manipulator positioning linkage for robotic surgery
US6383951B1 (en) 1998-09-03 2002-05-07 Micron Technology, Inc. Low dielectric constant material for integrated circuit fabrication
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US5993390A (en) 1998-09-18 1999-11-30 Hewlett- Packard Company Segmented 3-D cardiac ultrasound imaging method and apparatus
US6665554B1 (en) * 1998-11-18 2003-12-16 Steve T. Charles Medical manipulator for use with an imaging device
US8527094B2 (en) * 1998-11-20 2013-09-03 Intuitive Surgical Operations, Inc. Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures
US6342889B1 (en) 1998-11-27 2002-01-29 Dicomit Dicom Information Technologies Corp. Method and system for selecting at least one optimal view of a three dimensional image
US6620173B2 (en) 1998-12-08 2003-09-16 Intuitive Surgical, Inc. Method for introducing an end effector to a surgical site in minimally invasive surgery
US6799065B1 (en) 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US6331181B1 (en) 1998-12-08 2001-12-18 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
US8944070B2 (en) 1999-04-07 2015-02-03 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US6714839B2 (en) 1998-12-08 2004-03-30 Intuitive Surgical, Inc. Master having redundant degrees of freedom
US6325808B1 (en) 1998-12-08 2001-12-04 Advanced Realtime Control Systems, Inc. Robotic system, docking station, and surgical tool for collaborative control in minimally invasive surgery
JP2000193893A (en) 1998-12-28 2000-07-14 Suzuki Motor Corp Bending device of insertion tube for inspection
US6394998B1 (en) 1999-01-22 2002-05-28 Intuitive Surgical, Inc. Surgical tools for use in minimally invasive telesurgical applications
US6602185B1 (en) 1999-02-18 2003-08-05 Olympus Optical Co., Ltd. Remote surgery support system
US6084371A (en) 1999-02-19 2000-07-04 Lockheed Martin Energy Research Corporation Apparatus and methods for a human de-amplifier system
EP1075851A4 (en) 1999-02-25 2009-01-07 Tetsuya Korenaga Electric therapeutic device
US7324081B2 (en) 1999-03-02 2008-01-29 Siemens Aktiengesellschaft Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus
US6243624B1 (en) 1999-03-19 2001-06-05 Northwestern University Non-Linear muscle-like compliant controller
US6569084B1 (en) 1999-03-31 2003-05-27 Olympus Optical Co., Ltd. Endoscope holder and endoscope device
US6594552B1 (en) 1999-04-07 2003-07-15 Intuitive Surgical, Inc. Grip strength with tactile feedback for robotic surgery
JP2000300579A (en) 1999-04-26 2000-10-31 Olympus Optical Co Ltd Multifunctional manipulator
JP3668865B2 (en) 1999-06-21 2005-07-06 株式会社日立製作所 Surgical device
US7955340B2 (en) 1999-06-25 2011-06-07 Usgi Medical, Inc. Apparatus and methods for forming and securing gastrointestinal tissue folds
JP4302246B2 (en) 1999-08-25 2009-07-22 住友ベークライト株式会社 Medical treatment tool insertion tool
JP2001104333A (en) 1999-10-07 2001-04-17 Hitachi Ltd Surgery support device
US6312435B1 (en) 1999-10-08 2001-11-06 Intuitive Surgical, Inc. Surgical instrument with extended reach for use in minimally invasive surgery
US6654031B1 (en) 1999-10-15 2003-11-25 Hitachi Kokusai Electric Inc. Method of editing a video program with variable view point of picked-up image and computer program product for displaying video program
JP2001202531A (en) 1999-10-15 2001-07-27 Hitachi Kokusai Electric Inc Method for editing moving image
US6204620B1 (en) 1999-12-10 2001-03-20 Fanuc Robotics North America Method of controlling an intelligent assist device
DE19961971B4 (en) 1999-12-22 2009-10-22 Forschungszentrum Karlsruhe Gmbh Apparatus for the secure, automatic tracking of an endoscope and tracking of an instrument
US6847922B1 (en) 2000-01-06 2005-01-25 General Motors Corporation Method for computer-aided layout of manufacturing cells
JP2001287183A (en) 2000-01-31 2001-10-16 Matsushita Electric Works Ltd Automatic conveyance robot
DE10004264C2 (en) 2000-02-01 2002-06-13 Storz Karl Gmbh & Co Kg Device for the intracorporeal, minimally invasive treatment of a patient
US6817973B2 (en) 2000-03-16 2004-11-16 Immersion Medical, Inc. Apparatus for controlling force for manipulation of medical instruments
US7819799B2 (en) 2000-03-16 2010-10-26 Immersion Medical, Inc. System and method for controlling force applied to and manipulation of medical instruments
US6984203B2 (en) 2000-04-03 2006-01-10 Neoguide Systems, Inc. Endoscope with adjacently positioned guiding apparatus
US20010055062A1 (en) 2000-04-20 2001-12-27 Keiji Shioda Operation microscope
DE10025285A1 (en) 2000-05-22 2001-12-06 Siemens Ag Fully automated, robot-based camera using position sensors for laparoscopic surgeries
US6887245B2 (en) 2001-06-11 2005-05-03 Ge Medical Systems Global Technology Company, Llc Surgical drill for use with a computer assisted surgery system
US6645196B1 (en) 2000-06-16 2003-11-11 Intuitive Surgical, Inc. Guided tool change
US6599247B1 (en) 2000-07-07 2003-07-29 University Of Pittsburgh System and method for location-merging of real-time tomographic slice images with human vision
EP1182541A3 (en) 2000-08-22 2005-11-30 Siemens Aktiengesellschaft System and method for combined use of different display/apparatus types with system-controlled, context-dependent information representation
JP4765155B2 (en) 2000-09-28 2011-09-07 ソニー株式会社 Authoring system and authoring method, and storage medium
DE10063089C1 (en) 2000-12-18 2002-07-25 Siemens Ag User controlled linking of information within an augmented reality system
US6676669B2 (en) 2001-01-16 2004-01-13 Microdexterity Systems, Inc. Surgical manipulator
US6765569B2 (en) * 2001-03-07 2004-07-20 University Of Southern California Augmented-reality tool employing scene-feature autocalibration during camera motion
JP3769469B2 (en) 2001-03-28 2006-04-26 株式会社東芝 Operation training device
US6456901B1 (en) 2001-04-20 2002-09-24 Univ Michigan Hybrid robot motion task level control system
US6862561B2 (en) 2001-05-29 2005-03-01 Entelos, Inc. Method and apparatus for computer modeling a joint
ES2292593T3 (en) * 2001-06-13 2008-03-16 Volume Interactions Pte. Ltd. Guidance system.
CA2486525C (en) * 2001-06-13 2009-02-24 Volume Interactions Pte. Ltd. A guide system and a probe therefor
WO2003007129A3 (en) 2001-07-13 2003-11-13 Brooks Automation Inc Trajectory planning and motion control strategies for a planar three-degree-of-freedom robotic arm
US6550757B2 (en) 2001-08-07 2003-04-22 Hewlett-Packard Company Stapler having selectable staple size
JP3579379B2 (en) 2001-08-10 2004-10-20 株式会社東芝 Medical manipulator system
US6587750B2 (en) * 2001-09-25 2003-07-01 Intuitive Surgical, Inc. Removable infinite roll master grip handle and touch sensor for robotic surgery
WO2003034705A3 (en) 2001-10-19 2003-11-20 Jeremy D Ackerman Methods and systems for dynamic virtual convergence and head mountable display
JP3529373B2 (en) 2001-11-09 2004-05-24 ファナック株式会社 Simulation apparatus for a work machine
US6663559B2 (en) 2001-12-14 2003-12-16 Endactive, Inc. Interface for a variable direction of view endoscope
US6852107B2 (en) 2002-01-16 2005-02-08 Computer Motion, Inc. Minimally invasive surgical training using robotics and tele-collaboration
US6951535B2 (en) 2002-01-16 2005-10-04 Intuitive Surgical, Inc. Tele-medicine system that transmits an entire state of a subsystem
EP1474271A1 (en) 2002-01-31 2004-11-10 Abb Research Ltd. Robot machining tool position and orientation calibration
US7747311B2 (en) 2002-03-06 2010-06-29 Mako Surgical Corp. System and method for interactive haptic positioning of a medical device
JP2003300444A (en) 2002-04-11 2003-10-21 Hitachi Ltd Drive support device for moving body
JP4056791B2 (en) 2002-05-22 2008-03-05 策雄 米延 Fracture reduction guidance system
US6678582B2 (en) 2002-05-30 2004-01-13 Kuka Roboter Gmbh Method and control device for avoiding collisions between cooperating robots
US6783491B2 (en) 2002-06-13 2004-08-31 Vahid Saadat Shape lockable apparatus and method for advancing an instrument through unsupported anatomy
EP2070487B1 (en) 2002-08-13 2014-03-05 NeuroArm Surgical, Ltd. Microsurgical robot system
JP4169549B2 (en) 2002-09-06 2008-10-22 オリンパス株式会社 Endoscope
JP2004105638A (en) 2002-09-20 2004-04-08 Shimadzu Corp Ultrasonic diagnostic apparatus
US20040077940A1 (en) 2002-10-11 2004-04-22 Kienzle Thomas C. Instrument guide for use with a tracking system
JP2004174662A (en) 2002-11-27 2004-06-24 Fanuc Ltd Operation state analysis device for robot
EP1435737A1 (en) 2002-12-30 2004-07-07 Abb Research Ltd. An augmented reality system and method
US7637905B2 (en) 2003-01-15 2009-12-29 Usgi Medical, Inc. Endoluminal tool deployment system
JP2004223128A (en) 2003-01-27 2004-08-12 Hitachi Ltd Medical practice supporting apparatus and method
FR2850775B1 (en) 2003-01-30 2005-07-22 Ge Med Sys Global Tech Co Llc Medical imaging device with semi-automatic reorientation of the X-ray object
JP3972854B2 (en) 2003-04-10 2007-09-05 ソニー株式会社 Robot motion control device
JP3975959B2 (en) 2003-04-23 2007-09-12 トヨタ自動車株式会社 Robot equipped with a movement limiting device, and robot movement limiting method
EP1628632B1 (en) 2003-05-21 2013-10-09 The Johns Hopkins University Devices and systems for minimally invasive surgery of the throat and other portions of mammalian body
US20050054895A1 (en) 2003-09-09 2005-03-10 Hoeg Hans David Method for using variable direction of view endoscopy in conjunction with image guided surgical systems
DE202004014857U1 (en) 2003-09-29 2005-04-21 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device for virtually viewing the position of at least one medical instrument introduced intracorporeally into a body
JP2005110878A (en) 2003-10-06 2005-04-28 Olympus Corp Operation supporting system
EP1689290A2 (en) 2003-10-21 2006-08-16 The Board of Trustees of The Leland Stanford Junior University Systems and methods for intraoperative targeting
US20050096502A1 (en) 2003-10-29 2005-05-05 Khalili Theodore M. Robotic surgical device
JP3732494B2 (en) 2003-10-31 2006-01-05 ファナック株式会社 Simulation device
US20050267359A1 (en) 2004-05-27 2005-12-01 General Electric Company System, method, and article of manufacture for guiding an end effector to a target position within a person
CA2513202C (en) 2004-07-23 2015-03-31 Mehran Anvari Multi-purpose robotic operating system and method
US7594912B2 (en) 2004-09-30 2009-09-29 Intuitive Surgical, Inc. Offset remote center manipulator for robotic surgery
US7904202B2 (en) 2004-10-25 2011-03-08 University Of Dayton Method and system to provide improved accuracies in multi-jointed robots through kinematic robot model parameters determination
US20060149129A1 (en) 2005-01-05 2006-07-06 Watts H D Catheter with multiple visual elements
US8872906B2 (en) 2005-01-05 2014-10-28 Avantis Medical Systems, Inc. Endoscope assembly with a polarizing filter
US7763015B2 (en) 2005-01-24 2010-07-27 Intuitive Surgical Operations, Inc. Modular manipulator support for robotic surgery
CA2598627C (en) 2005-02-22 2013-11-26 Mako Surgical Corp. Haptic guidance system and method
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US8004229B2 (en) 2005-05-19 2011-08-23 Intuitive Surgical Operations, Inc. Software center and highly configurable robotic systems for surgery and other uses
US7391177B2 (en) 2005-05-20 2008-06-24 Hitachi, Ltd. Master-slave manipulator system and its operation input device
US8398541B2 (en) 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
EP2289452A3 (en) 2005-06-06 2015-12-30 Intuitive Surgical Operations, Inc. Laparoscopic ultrasound robotic surgical system
KR101298492B1 (en) 2005-06-30 2013-08-21 인튜어티브 서지컬 인코포레이티드 Indicator for tool state and communication in multiarm robotic telesurgery
US20070005002A1 (en) 2005-06-30 2007-01-04 Intuitive Surgical Inc. Robotic surgical instruments for irrigation, aspiration, and blowing
JP2007029232A (en) 2005-07-25 2007-02-08 Hitachi Medical Corp System for supporting endoscopic operation
JP2007090481A (en) 2005-09-28 2007-04-12 Fanuc Ltd Robot simulation device
JP4728075B2 (en) 2005-09-28 2011-07-20 オリンパスメディカルシステムズ株式会社 Endoscope system
US20070106307A1 (en) 2005-09-30 2007-05-10 Restoration Robotics, Inc. Methods for implanting follicular units using an automated system
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
KR101320379B1 (en) 2005-10-20 2013-10-22 인튜어티브 서지컬 인코포레이티드 Auxiliary image display and manipulation on a computer display in a medical robotic system
US8190238B2 (en) 2005-12-09 2012-05-29 Hansen Medical, Inc. Robotic catheter system and methods
US7819859B2 (en) 2005-12-20 2010-10-26 Intuitive Surgical Operations, Inc. Control system for reducing internally generated frictional and inertial resistance to manual positioning of a surgical manipulator
US7689320B2 (en) 2005-12-20 2010-03-30 Intuitive Surgical Operations, Inc. Robotic surgical system with joint motion controller adapted to reduce instrument tip vibrations
US7453227B2 (en) 2005-12-20 2008-11-18 Intuitive Surgical, Inc. Medical robotic system with sliding mode control
US9266239B2 (en) 2005-12-27 2016-02-23 Intuitive Surgical Operations, Inc. Constraint based control in a minimally invasive surgical apparatus
US9962066B2 (en) 2005-12-30 2018-05-08 Intuitive Surgical Operations, Inc. Methods and apparatus to shape flexible entry guides for minimally invasive surgery
US20110290856A1 (en) 2006-01-31 2011-12-01 Ethicon Endo-Surgery, Inc. Robotically-controlled surgical instrument with force-feedback capabilities
EP1815949A1 (en) 2006-02-03 2007-08-08 The European Atomic Energy Community (EURATOM), represented by the European Commission Medical robotic system with manipulator arm of the cylindrical coordinate type
EP1815950A1 (en) 2006-02-03 2007-08-08 The European Atomic Energy Community (EURATOM), represented by the European Commission Robotic surgical system for performing minimally invasive medical procedures
US20100036198A1 (en) 2006-03-13 2010-02-11 Roberto Tacchino Device for the manipulation of body tissue
US8924021B2 (en) 2006-04-27 2014-12-30 Honda Motor Co., Ltd. Control of robots from human motion descriptors
WO2007136803A3 (en) 2006-05-17 2008-04-17 Hansen Medical Inc Robotic instrument system
US9724165B2 (en) 2006-05-19 2017-08-08 Mako Surgical Corp. System and method for verifying calibration of a surgical device
CA2651780C (en) 2006-05-19 2015-03-10 Mako Surgical Corp. A method and apparatus for controlling a haptic device
US20090192523A1 (en) 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical instrument
US8620473B2 (en) 2007-06-13 2013-12-31 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
WO2007146984A3 (en) 2006-06-13 2008-04-24 Intuitive Surgical Inc Control system configured to compensate for non-ideal actuator-to-joint linkage characteristics in a medical robotic system
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US8029516B2 (en) 2006-06-13 2011-10-04 Intuitive Surgical Operations, Inc. Bracing of bundled medical devices for single port entry, robotically assisted medical procedures
US8377045B2 (en) 2006-06-13 2013-02-19 Intuitive Surgical Operations, Inc. Extendable suction surface for bracing medical devices during robotically assisted medical procedures
US20140055489A1 (en) 2006-06-29 2014-02-27 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
DE102006046689A1 (en) 2006-09-29 2008-04-10 Siemens Ag Medical-technical treatment system
DE102006061178A1 (en) 2006-12-22 2008-06-26 Siemens Ag Medical system for carrying out and monitoring a minimal invasive intrusion, especially for treating electro-physiological diseases, has X-ray equipment and a control/evaluation unit
WO2008103383A1 (en) 2007-02-20 2008-08-28 Gildenberg Philip L Videotactic and audiotactic assisted surgical methods and procedures
JP4891823B2 (en) 2007-03-29 2012-03-07 オリンパスメディカルシステムズ株式会社 Endoscope apparatus
CA2684459C (en) 2007-04-16 2016-10-04 Neuroarm Surgical Ltd. Methods, devices, and systems for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis
WO2009034477A3 (en) 2007-04-16 2010-02-04 Neuroarm Surgical Ltd. Frame mapping and force feedback methods, devices and systems
CA2684472C (en) 2007-04-16 2015-11-24 Neuroarm Surgical Ltd. Methods, devices, and systems for automated movements involving medical robots
US8931682B2 (en) 2007-06-04 2015-01-13 Ethicon Endo-Surgery, Inc. Robotically-controlled shaft based rotary drive systems for surgical instruments
JP2009006410A (en) 2007-06-26 2009-01-15 Fuji Electric Systems Co Ltd Remote operation support device and remote operation support program
DE102007029884A1 (en) 2007-06-28 2009-01-15 Siemens Ag Method and apparatus for generating an endoscopic overall image of an inner surface of a body cavity, composed of a plurality of frames
JP2009012106A (en) 2007-07-03 2009-01-22 Fuji Electric Systems Co Ltd Remote operation supporting device and program
JP2009039814A (en) 2007-08-08 2009-02-26 Institute Of Physical & Chemical Research Power assist device and its control method
US8073528B2 (en) 2007-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Tool tracking systems, methods and computer products for image guided surgery
US8108072B2 (en) 2007-09-30 2012-01-31 Intuitive Surgical Operations, Inc. Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
EP2217157A2 (en) 2007-10-05 2010-08-18 Ethicon Endo-Surgery, Inc. Ergonomic surgical instruments
US9037295B2 (en) 2008-03-07 2015-05-19 Perception Raisonnement Action En Medecine Dynamic physical constraint for hard surface emulation
US8155479B2 (en) 2008-03-28 2012-04-10 Intuitive Surgical Operations, Inc. Automated panning and digital zooming for robotic surgical systems
US8808164B2 (en) 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
US20090259105A1 (en) 2008-04-10 2009-10-15 Miyano Hiromichi Medical treatment system and suturing method
JP5384178B2 (en) * 2008-04-21 2014-01-08 株式会社森精機製作所 Machining simulation method and machining simulation apparatus
US8315738B2 (en) 2008-05-21 2012-11-20 Fanuc Robotics America, Inc. Multi-arm robot system interference check via three dimensional automatic zones
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US8864652B2 (en) 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US9179832B2 (en) 2008-06-27 2015-11-10 Intuitive Surgical Operations, Inc. Medical robotic system with image referenced camera control using partitionable orientational and translational modes
US20090326553A1 (en) 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US8414469B2 (en) 2008-06-27 2013-04-09 Intuitive Surgical Operations, Inc. Medical robotic system having entry guide controller with instrument tip velocity limiting
CN102149321A (en) 2008-09-12 2011-08-10 艾可瑞公司 Controlling X-ray imaging based on target motion
US8315720B2 (en) 2008-09-26 2012-11-20 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8126642B2 (en) 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
US20100331856A1 (en) 2008-12-12 2010-12-30 Hansen Medical Inc. Multiple flexible and steerable elongate instruments for minimally invasive operations
US8774969B2 (en) 2008-12-17 2014-07-08 Kuka Laboratories Gmbh Method for allowing a manipulator to cover a predetermined trajectory, and control device for carrying out said method
US8335590B2 (en) 2008-12-23 2012-12-18 Intuitive Surgical Operations, Inc. System and method for adjusting an image capturing device attribute using an unused degree-of-freedom of a master control device
US8594841B2 (en) * 2008-12-31 2013-11-26 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US8306656B1 (en) 2009-01-12 2012-11-06 Titan Medical Inc. Method and system for performing medical procedure
US8120301B2 (en) 2009-03-09 2012-02-21 Intuitive Surgical Operations, Inc. Ergonomic surgeon control console in robotic surgical systems
US8423182B2 (en) 2009-03-09 2013-04-16 Intuitive Surgical Operations, Inc. Adaptable integrated energy control system for electrosurgical tools in robotic surgical systems
US8167823B2 (en) 2009-03-24 2012-05-01 Biomet Manufacturing Corp. Method and apparatus for aligning and securing an implant relative to a patient
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US8903546B2 (en) 2009-08-15 2014-12-02 Intuitive Surgical Operations, Inc. Smooth control of an articulated instrument across areas with different work space conditions
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US8244402B2 (en) 2009-09-22 2012-08-14 GM Global Technology Operations LLC Visual perception system and method for a humanoid robot
CN102711586B (en) 2010-02-11 2015-06-17 直观外科手术操作公司 Method and system for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope
US8918211B2 (en) 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
KR101840312B1 (en) 2010-08-02 2018-03-20 더 존스 홉킨스 유니버시티 Method for presenting force sensor information using cooperative robot control and audio feedback
KR20130080909A (en) 2012-01-06 2013-07-16 삼성전자주식회사 Surgical robot and method for controlling the same
KR101800189B1 (en) 2012-04-30 2017-11-23 삼성전자주식회사 Apparatus and method for controlling power of surgical robot
US20140232824A1 (en) 2013-02-15 2014-08-21 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools

Patent Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4831549A (en) * 1987-07-28 1989-05-16 Brigham Young University Device and method for correction of robot inaccuracy
US5748767A (en) * 1988-02-01 1998-05-05 Faro Technology, Inc. Computer-aided surgery apparatus
US5408409A (en) * 1990-05-11 1995-04-18 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
US5950629A (en) * 1991-06-13 1999-09-14 International Business Machines Corporation System for assisting a surgeon during surgery
US5755725A (en) * 1993-09-07 1998-05-26 Deemed International, S.A. Computer-assisted microsurgery methods and equipment
US5820545A (en) * 1995-08-14 1998-10-13 Deutsche Forschungsanstalt Fur Luft-Und Raumfahrt E.V. Method of tracking a surgical instrument with a mono or stereo laparoscope
US5987591A (en) * 1995-12-27 1999-11-16 Fanuc Limited Multiple-sensor robot system for obtaining two-dimensional image and three-dimensional position information
US20030032878A1 (en) * 1996-06-28 2003-02-13 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for volumetric image navigation
US20070265491A1 (en) * 1998-05-14 2007-11-15 Calypso Medical Technologies, Inc. Systems and methods for stabilizing a target location within a human body
US6398726B1 (en) * 1998-11-20 2002-06-04 Intuitive Surgical, Inc. Stabilizer for robotic beating-heart surgery
US20060241414A1 (en) * 1998-11-20 2006-10-26 Intuitive Surgical Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US6459926B1 (en) * 1998-11-20 2002-10-01 Intuitive Surgical, Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
US6837883B2 (en) * 1998-11-20 2005-01-04 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US6659939B2 (en) * 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US20070038080A1 (en) * 1998-12-08 2007-02-15 Intuitive Surgical Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6224542B1 (en) * 1999-01-04 2001-05-01 Stryker Corporation Endoscopic camera system with non-mechanical zoom
US6493608B1 (en) * 1999-04-07 2002-12-10 Intuitive Surgical, Inc. Aspects of a control system of a minimally invasive surgical apparatus
US7155315B2 (en) * 1999-04-07 2006-12-26 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US6442417B1 (en) * 1999-11-29 2002-08-27 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for transforming view orientations in image-guided surgery
US6770081B1 (en) * 2000-01-07 2004-08-03 Intuitive Surgical, Inc. In vivo accessories for minimally invasive robotic surgery and methods
US20010035871A1 (en) * 2000-03-30 2001-11-01 Johannes Bieger System and method for generating an image
US7194118B1 (en) * 2000-11-10 2007-03-20 Lucid, Inc. System for optically sectioning and mapping surgically excised tissue
US20050251113A1 (en) * 2000-11-17 2005-11-10 Kienzle Thomas C Iii Computer assisted intramedullary rod surgery system with enhanced features
US20030109780A1 (en) * 2001-06-07 2003-06-12 Inria Roquencourt Methods and apparatus for surgical planning
US20060142657A1 (en) * 2002-03-06 2006-06-29 Mako Surgical Corporation Haptic guidance system and method
US20040106916A1 (en) * 2002-03-06 2004-06-03 Z-Kat, Inc. Guidance system and method for surgical procedures with improved feedback
US7491198B2 (en) * 2003-04-28 2009-02-17 Bracco Imaging S.P.A. Computer enhanced surgical navigation imaging system (camera probe)
US7181315B2 (en) * 2003-10-08 2007-02-20 Fanuc Ltd Manual-mode operating system for robot
US20050203380A1 (en) * 2004-02-17 2005-09-15 Frank Sauer System and method for augmented reality navigation in a medical intervention procedure
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20070135803A1 (en) * 2005-09-14 2007-06-14 Amir Belson Methods and apparatus for performing transluminal and other procedures
US20070270650A1 (en) * 2006-05-19 2007-11-22 Robert Eno Methods and apparatus for displaying three-dimensional orientation of a steerable distal tip of an endoscope
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20080118115A1 (en) * 2006-11-17 2008-05-22 General Electric Company Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9232984B2 (en) 1999-04-07 2016-01-12 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US9101397B2 (en) 1999-04-07 2015-08-11 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US9345387B2 (en) 2006-06-13 2016-05-24 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US9801690B2 (en) 2006-06-29 2017-10-31 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc Synthetic representation of a surgical instrument
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US9901408B2 (en) 2007-06-13 2018-02-27 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US9629520B2 (en) 2007-06-13 2017-04-25 Intuitive Surgical Operations, Inc. Method and system for moving an articulated instrument back towards an entry guide while automatically reconfiguring the articulated instrument for retraction into the entry guide
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US9333042B2 (en) 2007-06-13 2016-05-10 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9717563B2 (en) 2008-06-27 2017-08-01 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US20090326553A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US9516996B2 (en) 2008-06-27 2016-12-13 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US20100082039A1 (en) * 2008-09-26 2010-04-01 Intuitive Surgical, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8583274B2 (en) 2008-09-26 2013-11-12 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of medical robotic system
US8315720B2 (en) * 2008-09-26 2012-11-20 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8892224B2 (en) 2008-09-26 2014-11-18 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8374723B2 (en) 2008-12-31 2013-02-12 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
US20100168918A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Obtaining force information in a minimally invasive surgical procedure
US8706301B2 (en) 2008-12-31 2014-04-22 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
WO2010078011A1 (en) * 2008-12-31 2010-07-08 Intuitive Surgical, Inc. Visual force feedback in a minimally invasive surgical procedure
US8594841B2 (en) 2008-12-31 2013-11-26 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US20140058564A1 (en) * 2008-12-31 2014-02-27 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US20100169815A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Visual force feedback in a minimally invasive surgical procedure
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9155592B2 (en) * 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9956044B2 (en) 2009-08-15 2018-05-01 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
WO2011060187A1 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. A master finger tracking device and method of use in a minimally invasive surgical system
WO2011060139A2 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
CN104856764A (en) * 2009-11-13 2015-08-26 Intuitive Surgical Operations, Inc. System for hand control of a teleoperated minimally invasive slave surgical instrument
US8831782B2 (en) 2009-11-13 2014-09-09 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a teleoperated surgical instrument
US8682489B2 (en) 2009-11-13 2014-03-25 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US20110118752A1 (en) * 2009-11-13 2011-05-19 Brandon Itkowitz Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US8543240B2 (en) 2009-11-13 2013-09-24 Intuitive Surgical Operations, Inc. Master finger tracking device and method of use in a minimally invasive surgical system
US8521331B2 (en) * 2009-11-13 2013-08-27 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
CN102665589A (en) * 2009-11-13 2012-09-12 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
KR101825712B1 (en) * 2009-11-13 2018-02-06 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
CN102596085A (en) * 2009-11-13 2012-07-18 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
WO2011060139A3 (en) * 2009-11-13 2012-04-19 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
WO2012044334A2 (en) 2009-11-13 2012-04-05 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
EP3092969A2 (en) 2009-11-13 2016-11-16 Intuitive Surgical Operations, Inc. A master finger tracking device and method of use in a minimally invasive surgical system
EP3092968A2 (en) 2009-11-13 2016-11-16 Intuitive Surgical Operations, Inc. System for hand presence detection in a minimally invasive surgical system
EP3097883A1 (en) 2009-11-13 2016-11-30 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US20110118753A1 (en) * 2009-11-13 2011-05-19 Brandon Itkowitz Master finger tracking device and method of use in a minimally invasive surgical system
EP3092968A3 (en) * 2009-11-13 2017-02-22 Intuitive Surgical Operations, Inc. System for hand presence detection in a minimally invasive surgical system
EP3092969A3 (en) * 2009-11-13 2017-03-01 Intuitive Surgical Operations, Inc. A master finger tracking device and method of use in a minimally invasive surgical system
WO2011060185A1 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. Method and system for hand presence detection in a minimally invasive surgical system
WO2011060171A1 (en) 2009-11-13 2011-05-19 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US20110118748A1 (en) * 2009-11-13 2011-05-19 Intuitive Surgical, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
KR101762631B1 (en) * 2009-11-13 2017-07-28 Intuitive Surgical Operations, Inc. A master finger tracking device and method of use in a minimally invasive surgical system
KR101772958B1 (en) * 2009-11-13 2017-08-31 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US9622826B2 (en) 2010-02-12 2017-04-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9743989B2 (en) 2010-09-21 2017-08-29 Intuitive Surgical Operations, Inc. Method and system for hand presence detection in a minimally invasive surgical system
US8935003B2 (en) 2010-09-21 2015-01-13 Intuitive Surgical Operations, Inc. Method and system for hand presence detection in a minimally invasive surgical system
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US9901402B2 (en) 2010-09-21 2018-02-27 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US20120209288A1 (en) * 2011-02-15 2012-08-16 Intuitive Surgical Operations, Inc. Indicator for Knife Location in a Stapling or Vessel Sealing Instrument
CN103370014A (en) * 2011-02-15 2013-10-23 直观外科手术操作公司 Indicator for knife location in a stapling or vessel sealing instrument
US20120221145A1 (en) * 2011-02-24 2012-08-30 Olympus Corporation Master input device and master-slave manipulator
US9186796B2 (en) * 2011-02-24 2015-11-17 Olympus Corporation Master input device and master-slave manipulator
WO2017044965A1 (en) * 2015-09-10 2017-03-16 Duke University Systems and methods for arbitrary viewpoint robotic manipulation and robotic surgical assistance

Also Published As

Publication number Publication date Type
US9901408B2 (en) 2018-02-27 grant
WO2010117684A1 (en) 2010-10-14 application
US20160235486A1 (en) 2016-08-18 application
US20140135792A1 (en) 2014-05-15 application
US9801690B2 (en) 2017-10-31 grant
US9788909B2 (en) 2017-10-17 grant
US20170245948A1 (en) 2017-08-31 application

Similar Documents

Publication Publication Date Title
Simaan et al. Design and integration of a telerobotic system for minimally invasive surgery of the throat
Hagn et al. DLR MiroSurge: a versatile system for research in endoscopic telesurgery
US20110118748A1 (en) Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US20090326552A1 (en) Medical robotic system having entry guide controller with instrument tip velocity limiting
US20080221591A1 (en) Methods, systems, and devices for surgical visualization and device manipulation
Guthart et al. The Intuitive™ telesurgery system: overview and application
US20080033240A1 (en) Auxiliary image display and manipulation on a computer display in a medical robotic system
US20080234866A1 (en) Master-slave manipulator system
US20030195664A1 (en) Grip strength with tactile feedback for robotic surgery
US7574250B2 (en) Image shifting apparatus and method for a telerobotic system
US8079950B2 (en) Autofocus and/or autoscaling in telesurgery
US7843158B2 (en) Medical robotic system adapted to inhibit motions resulting in excessive end effector forces
Ruurda et al. Robot-assisted surgical systems: a new era in laparoscopic surgery.
US20070142825A1 (en) Method for handling an operator command exceeding a medical device state limitation in a medical robotic system
US20080058776A1 (en) Robotic surgical system for laparoscopic surgery
US8335590B2 (en) System and method for adjusting an image capturing device attribute using an unused degree-of-freedom of a master control device
US20090326322A1 (en) Medical robotic system with image referenced camera control using partitionable orientational and translational modes
US8398541B2 (en) Interactive user interfaces for robotic minimally invasive surgical systems
US8315720B2 (en) Method for graphically providing continuous change of state directions to a user of a medical robotic system
WO2000033726A1 (en) Image shifting telerobotic system
US20110202068A1 (en) Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US20100318099A1 (en) Virtual measurement tool for minimally invasive surgery
US20100317965A1 (en) Virtual measurement tool for minimally invasive surgery
US20130041368A1 (en) Apparatus and Method for Using a Remote Control System in Surgical Procedures
US7107090B2 (en) Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTUITIVE SURGICAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LARKIN, DAVID Q.;HOFFMAN, BRIAN D.;MOHR, PAUL W.;REEL/FRAME:022483/0414

Effective date: 20090326

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:042413/0768

Effective date: 20100101

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNOR EXECUTION DATE PREVIOUSLY RECORDED AT REEL: 042413 FRAME: 0768. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:043107/0063

Effective date: 20100219