WO2019050829A1 - Collision handling algorithms for robotic surgical systems - Google Patents

Collision handling algorithms for robotic surgical systems

Info

Publication number
WO2019050829A1
Authority
WO
WIPO (PCT)
Prior art keywords
tool
input handle
pose
input
threshold
Prior art date
Application number
PCT/US2018/049334
Other languages
French (fr)
Inventor
William Peine
Original Assignee
Covidien Lp
Priority date
Filing date
Publication date
Application filed by Covidien Lp filed Critical Covidien Lp
Priority to JP2020534809A priority Critical patent/JP7349992B2/en
Priority to EP18853152.9A priority patent/EP3678572A4/en
Priority to US16/643,306 priority patent/US11628022B2/en
Priority to CN201880006846.2A priority patent/CN110177516B/en
Priority to CA3074443A priority patent/CA3074443A1/en
Priority to AU2018328098A priority patent/AU2018328098A1/en
Publication of WO2019050829A1 publication Critical patent/WO2019050829A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/77 Manipulators with motion or force scaling
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1633 Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A61B2034/742 Joysticks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A61B2090/065 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 Avoiding collision or forbidden zones

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Robotics (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Methods of collision handling for robotic surgical systems include slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction that corresponds to moving the tool towards the obstruction. The input handle has an offset relative to a desired pose of the tool after the input handle is slipped.

Description

COLLISION HANDLING ALGORITHMS FOR ROBOTIC SURGICAL SYSTEMS
BACKGROUND
[0001] Robotic surgical systems have been used in minimally invasive medical procedures. During a medical procedure, the robotic surgical system is controlled by a surgeon interfacing with a user interface. The user interface allows the surgeon to manipulate an end effector that acts on a patient.
[0002] The end effector is inserted into a small incision (via a cannula) or a natural orifice of a patient to position the end effector at a work site within the body of the patient. Some robotic surgical systems include a robotic console supporting a robot arm, and at least one end effector such as a scalpel, a forceps, or a grasping tool that is mounted to the robot arm.
[0003] In general, the user interface includes an input controller or handle that is moveable by the surgeon to control the robotic surgical system. Robotic surgical systems typically use a scaling factor to scale down the motions of the surgeon's hands to determine the desired position of the robotic instruments within the patient. Often this scaling factor requires the motion of the handles to be larger than the available range of motion of the input handles. The handles therefore reach a boundary limit of the workspace and prevent the surgeon from completing the desired motion. Current robotic surgical systems on the market use a feature called "clutching" to decouple the motion of the input handles from the robotic instruments. The surgeon is then free to move the handles to a new position within the workspace of the user interface while the instruments remain stationary. Once the input handle is away from the workspace boundary, the surgeon can "reclutch" to recouple the motion of the input handle to complete the desired motion with the robotic instrument.
[0004] During a robotic surgical procedure, the robot arm or end effector may collide with tissue, an organ, or another surgical implement (e.g., another robot arm or end effector, access port, or camera). Such collisions can create a positional mismatch between the position of the input handles and the robot arm or end effector associated with the input handle. This positional mismatch can create undesired motions of the robot arm or the end effector during the surgical procedure.
[0005] Accordingly, there is a need for collision handling algorithms for robotic surgical systems.
SUMMARY
[0006] In an aspect of the present disclosure, a method of collision handling for a robotic surgical system includes slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction that corresponds to moving the tool towards the obstruction. The input handle has an offset relative to a desired pose of the tool after the input handle is slipped.
[0007] In aspects, the method includes moving the input handle in a direction to move the portion of the surgical robot away from the obstruction after the slipping of the input handle. The input handle may move a distance corresponding to the offset before the tool moves in a direction away from the obstruction. Alternatively, the tool may move in a direction away from the obstruction while maintaining a trim between a position of the input handle and a pose of the tool. The trim may be equal to the offset or the method may include dynamically scaling movement of the input handle relative to the pose of the tool in a direction parallel to the offset until the trim reaches a predetermined value. The predetermined value may be zero or nonzero.
[0008] In some aspects, slipping the handle relative to the pose of the tool occurs after the surgical robot reaches a predetermined force threshold to move the tool towards a desired pose. The method may further include a processing unit of the robotic surgical system defining the offset between a threshold position of the input handle when the tool reaches the predetermined force threshold and a position of the input handle after the input handle is pushed beyond the threshold position. The method may include the robotic surgical system providing force feedback to a clinician to resist slipping of the input handle beyond the threshold position.
[0009] In another aspect of the present disclosure, a method of collision handling of a robotic surgical system with a processing unit of the robotic surgical system includes receiving a first input signal from a user interface of the robotic surgical system to move a tool of a surgical robot of the robotic surgical system to a desired pose of the tool, transmitting an input control signal to the surgical robot to move the tool towards the desired pose, receiving a feedback signal from the surgical robot that a force to move the tool towards the desired pose is greater than a predetermined threshold, maintaining the tool at a threshold pose when the predetermined threshold is reached, and slipping a position of the input handle relative to the threshold pose to a second position of the input handle to define an offset between the second position of the input handle and a desired pose of the tool corresponding to the second position of the input handle.
[0010] In aspects, the method includes transmitting a feedback control signal to the user interface to resist movement of the input handle beyond a threshold position corresponding to the threshold pose of the tool.
[0011] In some aspects, the method includes receiving a second input signal from the user interface after slipping the position of the input handle indicative of the input handle moving towards a threshold position corresponding to the threshold pose of the tool. The method may include maintaining the tool in the threshold pose in response to receiving the second input signal. Alternatively, the method may include transmitting a second control signal to the surgical robot to move the tool away from the desired pose with a trim defined between the input handle and the pose of the tool. Transmitting the second control signal may include the trim being equal to the offset between the second position of the input handle and the desired pose of the tool corresponding to the second position of the input handle. The method may include dynamically scaling movement of the input handle to the pose of the tool to reduce the trim between the position of the input handle and the pose of the tool until the trim reaches a predetermined value. The predetermined value may be zero or nonzero.
[0012] Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:
[0014] FIG. 1 is a schematic illustration of a user interface and a robotic system in accordance with the present disclosure;
[0015] FIG. 2 is a plan view, schematic illustration, of a workspace of the user interface of FIG. 1; [0016] FIG. 3 is a view of a display device of the user interface of FIG. 1 illustrating a tool of a surgical robot within a surgical site;
[0017] FIG. 4 is a flowchart of a method of collision handling and collision recovery in accordance with the present disclosure; and
[0018] FIG. 5 is a flowchart of another method of collision handling and collision recovery in accordance with the present disclosure.
DETAILED DESCRIPTION
[0019] Embodiments of the present disclosure are now described in detail with reference to the drawings in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term "clinician" refers to a doctor, a nurse, or any other care provider and may include support personnel. Throughout this description, the term "proximal" refers to the portion of the device or component thereof that is closest to the clinician and the term "distal" refers to the portion of the device or component thereof that is farthest from the clinician. In addition, as used herein the term "neutral" is understood to mean non-scaled.
[0020] This disclosure generally relates to collision handling and collision recovery algorithms or methods for robotic surgical systems. Specifically, for collision handling a processing unit of a robotic surgical system may allow an input handle of a user interface to slip beyond a position corresponding to a pose of a tool of a surgical robot when a portion of the surgical robot collides with an obstruction. Slipping the input handle relative to the pose of the tool defines an offset between the position of the input handle and a pose of the tool.
[0021] To recover from the collision, the input handle may move through the entire offset before the tool moves from the pose it held when the surgical robot collided with the obstruction. Alternatively, any movement of the input handle to move the surgical robot away from the obstruction would move the surgical robot away from the obstruction such that a trim is defined between the position of the input handle and a pose of the tool. The trim may be equal to the offset, or the robotic surgical system may dynamically scale movement of the surgical robot to reduce or remove the trim in a manner imperceptible to a clinician.
[0022] Referring to FIG. 1, a robotic surgical system 1 in accordance with the present disclosure is shown generally as a surgical robot 10, a processing unit 30, and a user interface 40. The surgical robot 10 generally includes linkages 12 and a robot base 18. The linkages 12 moveably support an end effector or tool 20 which is configured to act on tissue. The linkages 12 may be in the form of arms each having an end 14 that supports an end effector or tool 20 which is configured to act on tissue. In addition, the ends 14 of the linkages 12 may include an imaging device 16 for imaging a surgical site "S". The user interface 40 is in communication with robot base 18 through the processing unit 30.
[0023] The user interface 40 includes a display device 44 which is configured to display three-dimensional images. The display device 44 displays three-dimensional images of the surgical site "S" which may include data captured by imaging devices 16 positioned on the ends 14 of the linkages 12 and/or include data captured by imaging devices that are positioned about the surgical theater (e.g., an imaging device positioned within the surgical site "S", an imaging device positioned adjacent the patient "P", imaging device 56 positioned at a distal end of an imaging arm 52). The imaging devices (e.g., imaging devices 16, 56) may capture visual images, infra-red images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site "S". The imaging devices transmit captured imaging data to the processing unit 30 which creates three-dimensional images of the surgical site "S" in real-time from the imaging data and transmits the three-dimensional images to the display device 44 for display.
[0024] The user interface 40 also includes input handles 42 which are supported on control arms 43 which allow a clinician to manipulate the surgical robot 10 (e.g., move the arms 12, the ends 14 of the linkages 12, and/or the tools 20). Each of the input handles 42 is in communication with the processing unit 30 to transmit control signals thereto and to receive feedback signals therefrom. Additionally or alternatively, each of the input handles 42 may include input devices 46 (FIG. 2) which allow the surgeon to manipulate (e.g., clamp, grasp, fire, open, close, rotate, thrust, slice, etc.) the tools 20 supported at the ends 14 of the linkages 12.
[0025] With additional reference to FIG. 2, each of the input handles 42 is moveable through a predefined workspace to move the ends 14 of the linkages 12, e.g., tools 20 (FIG. 1), within a surgical site "S". The three-dimensional images on the display device 44 are orientated such that the movement of the input handles 42 moves the ends 14 of the linkages 12 as viewed on the display device 44. The three-dimensional images remain stationary while movement of the input handles 42 is scaled to movement of the ends 14 of the linkages 12 within the three-dimensional images. To maintain an orientation of the three-dimensional images, kinematic mapping of the input handles 42 is based on a camera orientation relative to an orientation of the ends 14 of the linkages 12. The orientation of the three-dimensional images on the display device 44 may be mirrored or rotated relative to the view from above the patient "P". In addition, the size of the three-dimensional images on the display device 44 may be scaled to be larger or smaller than the actual structures of the surgical site permitting a clinician to have a better view of structures within the surgical site "S". As the input handles 42 are moved, the tools 20 are moved within the surgical site "S" as detailed below. Movement of the tools 20 may also include movement of the ends 14 of the linkages 12 which support the tools 20.
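The camera-relative kinematic mapping described in paragraph [0025] can be illustrated with a short sketch. The function name, the NumPy representation, and the example rotation below are illustrative assumptions rather than part of the disclosure; they only show how a handle displacement expressed in the camera/display frame could be re-expressed in the robot base frame before being scaled and commanded.

```python
import numpy as np

def map_handle_delta_to_base_frame(handle_delta, camera_rotation):
    """Re-express a handle displacement (given in the camera/display frame)
    in the robot base frame, so that motion seen on the display matches the
    motion of the input handle.

    handle_delta:    length-3 displacement of the input handle.
    camera_rotation: 3x3 rotation of the camera frame expressed in the
                     robot base frame.
    """
    return camera_rotation @ np.asarray(handle_delta, dtype=float)

# Example: a camera rotated 90 degrees about the vertical axis. A handle
# motion along the display's +X axis becomes a tool motion along base +Y.
R_cam = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
print(map_handle_delta_to_base_frame([1.0, 0.0, 0.0], R_cam))  # [0. 1. 0.]
```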
[0026] For a detailed discussion of the construction and operation of a robotic surgical system 1, reference may be made to U. S. Patent No. 8,828,023, the entire contents of which are incorporated herein by reference.
[0027] The movement of the tools 20 is scaled relative to the movement of the input handles 42. When the input handles 42 are moved within a predefined workspace, the input handles 42 send input signals to the processing unit 30. The processing unit 30 analyzes the input signals to move the tools 20 in response to the input signals. The processing unit 30 transmits scaled control signals to the robot base 18 to move the tools 20 in response to the movement of the input handles 42. The processing unit 30 scales the input signals by dividing an Input_distance (e.g., the distance moved by one of the input handles 42) by a scaling factor SF to arrive at a scaled Output_distance (e.g., the distance that one of the ends 14 is moved). The scaling factor SF is in a range between about 1 and about 10 (e.g., 3). This scaling is represented by the following equation:

Output_distance = Input_distance / SF
It will be appreciated that the larger the scaling factor SF, the smaller the movement of the tools 20 relative to the movement of the input handles 42.
[0028] For a detailed description of scaling movement of the input handle 42 along the X, Y, and Z coordinate axes to movement of the tool 20, reference may be made to commonly owned International Patent Application Serial No. PCT/US2015/051130, filed September 21, 2015, and International Patent Application No. PCT/US2016/14031, filed January 20, 2016, the entire contents of each of these disclosures are herein incorporated by reference.
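As an illustration only, a minimal sketch of the motion scaling in paragraph [0027] follows; the function name and default value are assumptions made for the example and are not taken from the disclosure.

```python
def scale_handle_motion(input_distance, scaling_factor=3.0):
    """Scale a handle displacement down to a tool displacement.

    Implements Output_distance = Input_distance / SF: the larger the
    scaling factor SF, the smaller the tool motion for a given handle
    motion.
    """
    if scaling_factor <= 0:
        raise ValueError("scaling factor SF must be positive")
    return input_distance / scaling_factor

# Example: a 30 mm handle motion with SF = 3 commands a 10 mm tool motion.
print(scale_handle_motion(30.0, 3.0))  # 10.0
```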
[0029] Referring to FIGS. 1-3, during a robotic surgical procedure, a clinician interfaces with the input handle 42 to manipulate the tool 20 within the surgical site "S". As the tool 20 is moved within the surgical site "S", a clinician can visualize movement of the tool 20 within the surgical site "S" on the display 44.
[0030] To manipulate the tool 20, a clinician moves an input handle 42 from a first position "P1" to a second position "P2", shown in dashed lines (FIG. 2). The processing unit 30 receives an input signal sent from the user interface 40 and transmits a control signal to the surgical robot 10 to move the tool 20 from a first pose to a second pose. For example, the input handle 42 is moved a distance along a control X axis in a direction illustrated by arrow "M1" and the tool 20 is moved in a direction along a robotic X axis illustrated by arrow "R1" representing movement of the tool 20 from a first pose "T1" towards a second pose "T2".
[0031] During movement of the tool 20 from the first pose "T1" towards the second pose "T2", the tool 20 may collide with an obstruction within the surgical site "S", e.g., tissue "T", another tool 20, an organ, or other surgical implement. When the tool 20 collides with the obstruction, the processing unit 30 receives a feedback signal from the surgical robot 10 and transmits a feedback control signal to the user interface 40. In response to receiving the feedback control signal, the user interface 40 provides force feedback to the clinician indicative of the tool 20 colliding with the obstruction. For example, the clinician may feel resistance to continued movement along the control X axis in the direction of the arrow "M1".
[0032] When the clinician feels the force feedback, the clinician may push the input handle 42 against the force feedback (e.g., in a direction opposite to the direction of the force feedback) and continue to move the input handle 42 along the control X axis in the direction of arrow "M1". In response, the processing unit 30 continues to send control signals to the surgical robot 10 to move the tool 20 along the robotic X axis in the direction of arrow "R1" until the force of the surgical robot 10, to continue movement of the tool 20 along the robotic X axis, exceeds a predetermined threshold. The predetermined threshold may be determined by a deflection of a portion of the surgical robot 10 or by a torque at one or more joints of the surgical robot 10. When the force of the surgical robot 10 exceeds the predetermined threshold, the surgical robot 10 "clutches" the movement of the input handle 42 from movement of the surgical robot 10, scales down movement of the input handle 42 from movement of the surgical robot 10, and/or applies any other known means of collision handling. For a detailed discussion of systems and methods for detecting and handling of a collision of a tool or linkage of a robotic system and an obstruction, reference may be made to U.S. Provisional Patent Application Serial No. 62/613,543, filed January 4, 2018, and entitled "SURGICAL ROBOT INCLUDING TORQUE SENSORS" [Atty. Docket No. C00014971.USP1 (203-11527)], the entire contents of which are hereby incorporated by reference.
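Paragraph [0032] triggers collision handling when a joint torque or a deflection exceeds a predetermined threshold. The short sketch below is a hypothetical illustration of that test; the function name, argument names, and default values are assumptions and do not reflect the actual control software.

```python
def collision_threshold_reached(joint_torques, torque_limits,
                                deflection=0.0, deflection_limit=float("inf")):
    """Return True when the commanded motion should stop and the input
    handle should be allowed to slip (or be clutched).

    The check mirrors the two criteria named in the text: a torque at any
    joint exceeding its limit, or a measured deflection of a portion of
    the arm exceeding a deflection limit.
    """
    torque_exceeded = any(abs(t) > limit
                          for t, limit in zip(joint_torques, torque_limits))
    return torque_exceeded or deflection > deflection_limit

# Example: the second joint exceeds its limit, so the motion is halted.
print(collision_threshold_reached([0.5, 2.4, 0.1], [2.0, 2.0, 2.0]))  # True
```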
[0033] With particular reference to FIG. 2, the force to move the tool 20 along the robotic X axis reached the predetermined threshold when the input handle 42 was positioned at a threshold position "PT". As shown, the input handle 42 was pushed through the threshold position "PT" to the second position "P2". As the input handle 42 is moved between the threshold position "PT" and the second position "P2", the tool 20 is substantially stationary within the surgical site "S", e.g., the tool 20 remains in the first pose "T1" as shown in FIG. 3, such that the input handle 42 "slips" relative to the tool 20. This "slipping" of the input handle 42 relative to the tool 20 results in a position mismatch between a desired pose "T2" of the tool 20 based on the position of the input handle 42 and the actual pose of the tool 20 which remains at the first pose "T1".
[0034] With the input handle 42 in the second position "P2", the surgical robot 10 maintains the tool 20 at the first pose "T1", the pose at which the predetermined threshold was reached, until the input handle 42 is moved along the control X axis in a direction that requires a force below the predetermined threshold to reposition the tool 20 along the robotic X axis, e.g., in a direction opposite the arrow "R1".
[0035] This position mismatch can create undesired motions of the tool 20 within the surgical site "S" during a surgical procedure. For example, when the input handle 42 is in the second position "P2", the tool 20 may be maintained in the first pose "T1" with the predetermined threshold force being directed towards an obstruction, e.g., tissue "T", such that, were the tool 20 to free itself from the obstruction, the tool 20 may move towards the desired pose "T2" unexpectedly and/or at an undesired high velocity.
[0036] With reference to FIG. 4, a method 200 for slipping the input handle 42 relative to the tool 20 in the event of a collision with an obstruction and a method for collision recovery are disclosed, in accordance with the present disclosure, with reference to the robotic surgical system 1 of FIGS. 1-3. As detailed below, a collision between a tool 20 and tissue "T" of a patient is described; however, such a collision may be between any portion of the surgical robot 10 and an obstruction. For example, a collision may occur between a linkage 12 of the surgical robot 10 and another linkage 12.
[0037] Initially, a clinician moves the input handle 42 in a first direction along the control X axis towards the second position "P2", and the user interface 40 transmits an input signal indicative of the movement (Step 210). The processing unit 30 receives the input signal (Step 240) and transmits an input control signal to move the tool 20 towards the desired pose of the surgical robot 10 (Step 242). The surgical robot 10 receives the control signal and moves the tool 20, and thus the surgical robot 10, towards the desired pose "T2" (Step 260).
[0038] As the tool 20 is moved towards the desired pose "T2", a portion of the surgical robot 10, e.g., tool 20, may collide with tissue "T" such that the surgical robot 10 would require a force greater than a predetermined threshold to continue to move the surgical robot 10 towards the desired pose "T2" (Step 262); this pose is defined as the threshold pose "T1". When the predetermined threshold is reached or exceeded, the surgical robot 10 transmits a feedback signal to the processing unit 30.
[0039] The processing unit 30 receives the feedback signal (Step 244) from the surgical robot 10 and transmits a control signal to the surgical robot 10 (Step 246) to maintain the surgical robot at the threshold pose "T1" (Step 264). In addition, the processing unit 30 transmits a feedback control signal to the user interface 40 (Step 246). In response to the feedback control signal, a clinician experiences force feedback against moving the input handle beyond a threshold position "PT" that corresponds to the threshold pose "T1" of the surgical robot 10 (Step 212).
[0040] The clinician may push the input handle 42 in the first direction through the force feedback of the user interface 40 to a second position "P2" (Step 214). The processing unit 30 receives an input signal in response to movement of the input handle 42 in the first direction and slips the position of the input handle 42 relative to the pose of the surgical robot 10 (Step 248). As the input handle 42 is moved beyond the threshold position "PT", an offset is generated along the control X axis as the input handle 42 is "slipped" between the threshold position "PT" and the second position "P2". The offset represents the distance between the point at which the position of the input handle 42 last corresponded to the pose of the surgical robot 10, e.g., the threshold position "PT", and the current position of the input handle 42, e.g., the second position "P2".
[0041] With the input handle 42 at the second position "P2", the input handle 42 can be moved along the control X axis in a second direction away from the obstruction, e.g., the direction opposite the arrow "M1", (Step 216) such that the input handle 42 moves through a dead zone equal to the offset between the second position "P2" and the threshold position "PT" before the tool 20 of the surgical robot 10 moves along the robot X axis in a direction opposite the arrow "R1". Once the input handle 42 returns to the threshold position "PT" along the control X axis, the surgical robot 10 is recovered from the collision such that the surgical robot 10 moves the tool 20 along the robot X axis in response to additional movement of the input handle 42 in the second direction (Steps 220, 254, 256, 266). It will be appreciated that movement of the input handle 42 along the control X axis towards the threshold position "PT" will be allowed with little or no resistance, e.g., force feedback, while additional movement of the input handle 42 along the control X axis away from the threshold position "PT" will be resisted with additional force feedback.
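A compact, hypothetical sketch of the slip and dead-zone behavior of paragraphs [0040] and [0041] follows, tracking a single control axis. The class name, sign conventions, and threshold handling are assumptions made for illustration and are not taken from the disclosure.

```python
class SlipAndDeadZoneAxis:
    """Track slip of the input handle along one control axis.

    Handle motion toward the obstruction while the force threshold is
    exceeded accumulates as an offset (the handle "slips"); on reversal,
    the handle must travel back through a dead zone equal to that offset
    before the tool is commanded to move again.
    """

    def __init__(self, force_threshold):
        self.force_threshold = force_threshold
        self.offset = 0.0         # slip accumulated beyond the threshold position
        self.tool_position = 0.0  # commanded tool position along the robotic X axis

    def update(self, handle_delta, measured_force):
        """Process one handle increment; return the tool increment to command."""
        blocked = measured_force >= self.force_threshold
        if handle_delta > 0.0 and blocked:
            # Pushing toward the obstruction while blocked: slip, do not move the tool.
            self.offset += handle_delta
            return 0.0
        if handle_delta < 0.0 and self.offset > 0.0:
            # Moving away: consume the dead zone before the tool follows.
            consumed = min(self.offset, -handle_delta)
            self.offset -= consumed
            handle_delta += consumed
        self.tool_position += handle_delta
        return handle_delta

# Example: the handle is pushed 5 mm past the threshold position while
# blocked, then pulled back 8 mm; only the last 3 mm moves the tool.
axis = SlipAndDeadZoneAxis(force_threshold=10.0)
axis.update(5.0, measured_force=12.0)         # slips; tool increment is 0.0
print(axis.update(-8.0, measured_force=2.0))  # -3.0
```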
[0042] With additional reference to FIG. 5, another method 300 of collision recovery is disclosed in accordance with the present disclosure. After the processing unit 30 slips the position of the input handle 42 relative to the threshold pose of the surgical robot 10 to define an offset (Step 248), the input handle 42 is moved in the second direction along the control X axis (Step 302). The processing unit 30 receives an input signal indicative of the movement of the input handle 42 in the second direction (Step 350) and transmits a second control signal to the surgical robot 10 to move away from the threshold pose "T1" with a trim between the input handle and the pose of the surgical robot (Step 352). It will be appreciated that the trim is substantially equal to the offset between the threshold position "PT" and the second position "P2". The surgical robot 10 receives the second control signal and moves away from the threshold pose (Step 366). The robotic surgical system 1 may continue to manipulate the surgical robot 10 in response to movements of the input handle 42 with the trim maintained between the position of the input handle 42 and the pose of the surgical robot 10.
[0043] Alternatively, in some embodiments, the robotic surgical system 1 may dynamically scale the movement of the input handle 42 and the tool 20 to reduce or eliminate the trim in a manner imperceptible to a clinician. For example, the input handle 42 can be moved in the first and second directions along the control X axis such that input signals are transmitted to the processing unit 30 (Step 304). The processing unit 30 receives the input signals (Step 354) and dynamically scales movements of the input handle 42 to reduce the trim between the input handle 42 and the pose of the surgical robot 10 (Step 356). The processing unit 30 transmits scaled control signals to the surgical robot 10 (Step 358), which moves the surgical robot 10 in response to the scaled control signals (Step 368). The trim may be reduced to a predetermined value, and the robotic surgical system 1 may continue to move the surgical robot 10 in response to movement of the input handle 42. In particular embodiments, the predetermined value of the trim is nonzero, and in other embodiments the trim is reduced to zero such that the position of the input handle 42 corresponds to the pose of the surgical robot 10.

[0044] For a detailed discussion of a robotic surgical system functioning with an offset and/or dynamic scaling to eliminate an offset, reference can be made to commonly owned U.S. Provisional Patent Application No. 62/554,292, filed September 5, 2017 and entitled "ROBOTIC SURGICAL SYSTEMS WITH ROLL, PITCH, AND YAW REALIGNMENT INCLUDING TRIM AND FLIP ALGORITHMS" [Atty. Docket C00014973.USP1 (203-11525)], the entire contents of which are hereby incorporated by reference.
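One possible form of the dynamic scaling described in paragraph [0043] is sketched below (hypothetical names and an assumed catch-up gain; not the specific scaling used by this system): a small fraction of each handle increment is used to shrink the trim toward a predetermined value, so the tool gradually catches up without a step a clinician would perceive.

```python
# Illustrative sketch only; the gain and the trim convention (trim = handle position
# minus tool pose) are assumptions for the example.
CATCH_UP_GAIN = 0.05  # fraction of each handle increment used to reduce the trim


def scaled_step(handle_delta, trim, target_trim=0.0):
    """Return (tool_delta, new_trim) for one control cycle of handle motion."""
    correction = min(CATCH_UP_GAIN * abs(handle_delta), abs(trim - target_trim))
    sign = 1.0 if trim > target_trim else -1.0
    tool_delta = handle_delta + sign * correction   # tool moves a little extra
    new_trim = trim - sign * correction             # so the trim decays toward target
    return tool_delta, new_trim


# Example: over repeated 1 mm handle moves, a 15 mm trim is worked off 0.05 mm at a time.
trim = 0.015
for _ in range(3):
    tool_delta, trim = scaled_step(0.001, trim)
    print(round(tool_delta, 6), round(trim, 6))
```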
[0045] Slipping a position of the input handle 42 relative to a pose of the tool 20 allows for movement or repositioning of the input handle 42 within the workspace of the user interface 40 without movement of the tool 20 within the surgical site "S". The methods of collision recovery detailed above, e.g., moving the input handle 42 through a dead zone, operating with an offset, and dynamically scaling to eliminate an offset, allow for predictable movement of a tool, e.g., tool 20, of a surgical robot after a collision. Such predictable movement may improve surgical outcomes, reduce surgical time, reduce recovery time, and/or reduce the cost of surgery.
[0046] As detailed above, the user interface 40 is in operable communication with the surgical robot 10 to perform a surgical procedure on a patient; however, it is envisioned that the user interface 40 may be in operable communication with a surgical simulator (not shown) to virtually actuate a robotic system and/or tool in a simulated environment. For example, the robotic surgical system 1 may have a first mode in which the user interface 40 is coupled to actuate the surgical robot 10 and a second mode in which the user interface 40 is coupled to the surgical simulator to virtually actuate a robotic system. The surgical simulator may be a standalone unit or may be integrated into the processing unit 30. The surgical simulator virtually responds to a clinician interfacing with the user interface 40 by providing visual, audible, force, and/or haptic feedback through the user interface 40. For example, as a clinician interfaces with the input handles 42, the surgical simulator moves representative tools that virtually act on tissue. It is envisioned that the surgical simulator may allow a clinician to practice a surgical procedure before performing the surgical procedure on a patient. In addition, the surgical simulator may be used to train a clinician on a surgical procedure. Further, the surgical simulator may simulate "complications" during a proposed surgical procedure to permit a clinician to plan the surgical procedure.
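As one possible arrangement of the two modes described in this paragraph, the small sketch below (hypothetical names; not an implementation of this system) routes the same input-handle commands either to the physical surgical robot or to a surgical simulator.

```python
# Illustrative sketch only; class and function names are hypothetical.
class RobotTarget:
    def apply(self, command):
        print(f"surgical robot executes {command}")


class SimulatorTarget:
    def apply(self, command):
        print(f"simulator renders {command} and returns virtual visual/haptic feedback")


def route_handle_command(command, mode, robot, simulator):
    """Dispatch a user-interface command to the robot (first mode) or simulator (second mode)."""
    target = robot if mode == "robot" else simulator
    target.apply(command)


route_handle_command({"axis": "X", "delta": 0.001}, "simulate", RobotTarget(), SimulatorTarget())
```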
[0047] While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the appended claims. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope of the claims appended hereto.

Claims

WHAT IS CLAIMED:
1. A method of collision handling for a robotic surgical system, the method comprising: slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction corresponding to moving the tool towards the obstruction, the input handle having an offset relative to a desired pose of the tool in response to slipping of the input handle.
2. The method according to claim 1, further comprising moving the input handle in a direction to move the portion of the surgical robot away from the obstruction after the slipping of the input handle, the input handle moving a distance corresponding to the offset before the tool moves in a direction away from the obstruction.
3. The method according to claim 1, further comprising moving the input handle in a direction away from the obstruction after slipping the input handle such that the tool moves in a direction away from the obstruction maintaining a trim between a position of the input handle and a pose of the tool.
4. The method according to claim 3, wherein the trim is equal to the offset.
5. The method according to claim 3, further comprising dynamically scaling movement of the input handle relative to the pose of the tool in a direction parallel to the offset until the trim reaches a predetermined value.
6. The method according to claim 5, wherein the predetermined value is nonzero.
7. The method according to claim 1, wherein slipping the input handle relative to the pose of the tool occurs after the surgical robot reaches a predetermined force threshold to move the tool towards a desired pose.
8. The method according to claim 7, further comprising a processing unit of the robotic surgical system defining the offset between a threshold position of the input handle when the tool reaches the predetermined force threshold and a position of the input handle after the input handle is pushed beyond the threshold position.
9. The method according to claim 8, further comprising the robotic surgical system providing force feedback to a clinician to resist slipping of the input handle beyond the threshold position.
10. A method of collision handling of a robotic surgical system with a processing unit of the robotic surgical system, the method comprising:
receiving a first input signal from a user interface of the robotic surgical system to move a tool of a surgical robot of the robotic surgical system to a desired pose of the tool;
transmitting an input control signal to the surgical robot to move the tool towards the desired pose;
receiving a feedback signal from the surgical robot that a force to move the tool towards the desired pose is greater than a predetermined threshold;
maintaining the tool at a threshold pose when the predetermined threshold is reached; and
slipping a position of the input handle relative to the threshold pose to a second position of the input handle to define an offset between the second position of the input handle and a desired pose of the tool corresponding to the second position of the input handle.
11. The method according to claim 10, further comprising transmitting a feedback control signal to the user interface to resist movement of the input handle beyond a threshold position corresponding to the threshold pose of the tool.
12. The method according to claim 10, further comprising receiving a second input signal from the user interface after slipping the position of the input handle indicative of the input handle moving towards a threshold position corresponding to the threshold pose of the tool.
13. The method according to claim 12, further comprising maintaining the tool in the threshold pose in response to receiving the second input signal.
14. The method according to claim 12, further comprising transmitting a second control signal to the surgical robot to move the tool away from the desired pose with a trim defined between the input handle and the pose of the tool.
15. The method according to claim 14, wherein transmitting the second control signal includes the trim being equal to the offset between the second position of the input handle and the desired pose of the tool corresponding to the second position of the input handle.
16. The method according to claim 14, further comprising dynamically scaling movement of the input handle to the pose of the tool to reduce the trim between the position of the input handle and the pose of the tool until the trim reaches a predetermined value.
17. The method according to claim 16, wherein the predetermined value is nonzero.
PCT/US2018/049334 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems WO2019050829A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2020534809A JP7349992B2 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems
EP18853152.9A EP3678572A4 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems
US16/643,306 US11628022B2 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems
CN201880006846.2A CN110177516B (en) 2017-09-05 2018-09-04 Collision handling algorithm for robotic surgical systems
CA3074443A CA3074443A1 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems
AU2018328098A AU2018328098A1 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762554331P 2017-09-05 2017-09-05
US62/554,331 2017-09-05

Publications (1)

Publication Number Publication Date
WO2019050829A1 true WO2019050829A1 (en) 2019-03-14

Family

ID=65634540

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/049334 WO2019050829A1 (en) 2017-09-05 2018-09-04 Collision handling algorithms for robotic surgical systems

Country Status (7)

Country Link
US (1) US11628022B2 (en)
EP (1) EP3678572A4 (en)
JP (1) JP7349992B2 (en)
CN (1) CN110177516B (en)
AU (1) AU2018328098A1 (en)
CA (1) CA3074443A1 (en)
WO (1) WO2019050829A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11398047B2 (en) * 2019-05-30 2022-07-26 Nvidia Corporation Virtual reality simulations using surface tracking


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070293734A1 (en) * 2001-06-07 2007-12-20 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US20110054686A1 (en) * 2009-08-25 2011-03-03 Samsung Electronics Co., Ltd. Apparatus and method detecting a robot slip
WO2016053657A1 (en) * 2014-09-29 2016-04-07 Covidien Lp Dynamic input scaling for controls of robotic surgical system
WO2017075121A1 (en) * 2015-10-30 2017-05-04 Covidien Lp Haptic fedback controls for a robotic surgical system interface

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11135014B2 (en) 2011-09-02 2021-10-05 Stryker Corporation Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing
US11896314B2 (en) 2011-09-02 2024-02-13 Stryker Corporation Surgical instrument including housing, a cutting accessory that extends from the housing and actuators that establish the position of the cutting accessory relative to the housing
GB2625572A (en) * 2022-12-20 2024-06-26 Cmr Surgical Ltd Control of a surgical robot arm

Also Published As

Publication number Publication date
EP3678572A4 (en) 2021-09-29
US20200345433A1 (en) 2020-11-05
EP3678572A1 (en) 2020-07-15
CN110177516A (en) 2019-08-27
JP2020532406A (en) 2020-11-12
US11628022B2 (en) 2023-04-18
CN110177516B (en) 2023-10-24
AU2018328098A1 (en) 2020-03-19
CA3074443A1 (en) 2019-03-14
JP7349992B2 (en) 2023-09-25

Similar Documents

Publication Publication Date Title
US11950870B2 (en) Computer-assisted tele-operated surgery systems and methods
JP7434246B2 (en) System and method for positioning a manipulator arm by clutching in zero orthogonal space simultaneously with zero space movement
US10251716B2 (en) Robotic surgical system with selective motion control decoupling
CN109788994B (en) Computer-assisted teleoperated surgical system and method
JP6421171B2 (en) System and method for following a path using zero space
EP2884933B1 (en) User initiated break-away clutching of a surgical mounting platform
KR102145236B1 (en) Manipulator arm-to-patient collision avoidance using a null-space
US11583358B2 (en) Boundary scaling of surgical robots
WO2019136039A1 (en) Robotic surgical systems including torque sensors
US20230372040A1 (en) Robotic surgical systems with roll, pitch, and yaw realignment including trim and flip algorithms
US11628022B2 (en) Collision handling algorithms for robotic surgical systems
WO2019222211A1 (en) Method and apparatus for manipulating tissue

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18853152; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 3074443; Country of ref document: CA)
ENP Entry into the national phase (Ref document number: 2020534809; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2018328098; Country of ref document: AU; Date of ref document: 20180904; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2018853152; Country of ref document: EP; Effective date: 20200406)