WO2017114860A1 - Decoupled spatial positioning and spatial orienting control of a remote center of motion - Google Patents

Decoupled spatial positioning and spatial orienting control of a remote center of motion

Info

Publication number
WO2017114860A1
WO2017114860A1 (PCT/EP2016/082773)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
motion
manipulator
intervention
actuation signal
Application number
PCT/EP2016/082773
Other languages
English (en)
Inventor
Aleksandra Popovic
David Paul NOONAN
Original Assignee
Koninklijke Philips N.V.
Priority date: 2015-12-30
Filing date: 2016-12-28
Publication date: 2017-07-06
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2017114860A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/02 Hand grip control means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J18/00 Arms
    • B25J18/007 Arms the end effector rotating around a fixed point
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39349 RCC remote center compliance device inserted between wrist and gripper

Definitions

  • the present disclosure generally relates to robots utilized during various interventional procedures (e.g., laparoscopic surgery, neurosurgery, spinal surgery, natural orifice transluminal surgery, pulmonary/bronchoscopy surgery, biopsy, ablation, and diagnostic interventions).
  • the present disclosure specifically relates to a decoupled spatial positioning and spatial orienting control of a remote-center-of-motion of an intervention robot.
  • Minimally invasive surgery is performed using elongated instruments inserted into the patient's body through small ports. More particularly, the small ports that are placed on the patient's body are the only incision points through which the instruments may pass to access the inside of the patient. As such, the instruments may be operated to rotate around these fulcrum points, but the instruments should not be operated in a manner that imposes translational forces on the ports, to prevent any potential injury and harm to the patient. This is especially important for robotic guided surgery.
  • some known robots implement what is known as a remote-center-of-motion (RCM) at the fulcrum point whereby a robot enforces an operating principle that only rotation of an instrument can be performed at a port and all translational forces of the instrument at that port are eliminated.
  • This can be achieved by implementing a mechanical design which has the RCM at a specific location in space, and then aligning that point in space with the port.
  • the RCM can be implemented virtually within the software of a robotic system, provided sufficient degrees of freedom exist to ensure the constraints of the RCM can be met.
  • Constraint robots, such as RCM robots, are challenging to control.
  • Such robots usually implement at least five (5) joints, of which three (3) joints are used to position the RCM and at least two (2) joints are used to orient the RCM. Due to kinematic constraints, the mapping between the joints and spatial degrees of freedom is not intuitive. Furthermore, the safety of such robots can be compromised if the user accidentally moves the RCM after the instrument is inserted into the patient's body. Computationally constrained systems for such robots are even more difficult to operate, as those constraints are less intuitive.
  • the present disclosure provides a control of a robotic apparatus employing a manipulator robot and an intervention robot whereby the control utilizes the same user input device to independently control the manipulator robot and the intervention robot for a spatial positioning and a spatial orienting, respectively, of a remote-center-of-motion (RCM) of the intervention robot.
  • the robotic apparatus is controlled by the same user input device for an independent spatial positioning of the RCM by the manipulator robot to coincide with an insertion point into a body as supported by an operating table serving as a reference plane, and for an independent spatial orienting of the RCM by the intervention robot to orient a surgical instrument supported by the intervention robot in an intuitive view of the operating table again serving as the reference plane.
  • One form of the inventions of the present disclosure is a robotic system employing a robotic apparatus, a user input device and a robot controller for executing an interventional procedure.
  • the robotic apparatus includes an intervention robot mounted onto a manipulator robot.
  • a structural configuration of the intervention robot defines a remote-center-of-motion (RCM).
  • the user input device is structurally configured to generate a manipulator actuation signal representative of a spatial positioning of the remote-center-of-motion within a kinematic space of the manipulator robot, and responsive to the manipulator actuation signal, the robot controller is structurally configured to drive a translational motion and/or a rotational motion of the manipulator robot to thereby spatially position the remote-center-of-motion within the kinematic space of the manipulator robot.
  • the user input device is structurally configured to generate an interventional actuation signal representative of a spatial orienting of the remote-center-of-motion within a kinematic space of the intervention robot, and responsive to the interventional actuation signal, the robot controller is structurally configured to drive a pitch motion and/or a yaw motion of the intervention robot to thereby spatially orient the remote-center-of-motion within the kinematic space of the intervention robot.
  • a second form of the inventions of the present disclosure is the robot controller employing a mapping module, a spatial positioning module and a spatial orienting module.
  • the mapping module is structurally configured to generate a spatial positioning signal derived from a mapping of the manipulator actuation signal to the kinematic space of the manipulator robot, and responsive to the spatial positioning signal, the spatial positioning module is structurally configured to drive the translational motion and/or the rotational motion of the manipulator robot to thereby spatially position the remote-center-of-motion within a kinematic space of the manipulator robot.
  • the mapping module is structurally configured to generate a spatial orienting signal derived from a mapping of the intervention actuation signal to the kinematic space of the intervention robot, and responsive to the spatial orienting signal, the spatial orienting module is structurally configured to drive the pitch motion and/or the yaw motion of the intervention robot to thereby spatially orient the remote-center-of-motion within the kinematic space of the intervention robot.
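
By way of illustration only, the following Python sketch renders this decoupled dispatch between a spatial positioning path and a spatial orienting path. It is a minimal sketch, not the disclosed implementation: the class names, the `drive` interface and the scale factors are assumptions introduced here for illustration.

```python
from dataclasses import dataclass

@dataclass
class ActuationSignal:
    vector: tuple             # 3-DOF deflection (X F/B, Y L/R, Z U/D)
    manipulator_locked: bool  # lock state distinguishing MAS from IAS

class RobotController:
    """Hypothetical sketch of the mapping/positioning/orienting modules."""

    def __init__(self, manipulator_robot, intervention_robot,
                 position_scale=0.5, orient_scale=0.02):
        self.manipulator = manipulator_robot    # spatially positions the RCM
        self.intervention = intervention_robot  # spatially orients the RCM
        self.position_scale = position_scale    # assumed mm per unit deflection
        self.orient_scale = orient_scale        # assumed rad per unit deflection

    def on_actuation(self, signal: ActuationSignal):
        x, y, z = signal.vector
        if not signal.manipulator_locked:
            # Manipulation mode: the mapping module yields a spatial
            # positioning signal (SPS) driving translation/rotation.
            sps = (self.position_scale * x,
                   self.position_scale * y,
                   self.position_scale * z)
            self.manipulator.drive(sps)     # manipulator drive signal MDS
        else:
            # Intervention mode: the mapping module yields a spatial
            # orienting signal (SOS) driving pitch/yaw about the RCM.
            sos = (self.orient_scale * x,   # pitch command
                   self.orient_scale * y)   # yaw command
            self.intervention.drive(sos)    # interventional drive signal IDS
```

The single `on_actuation` entry point mirrors the decoupling described above: one input device, two mutually exclusive control paths.
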
  • a third form of the inventions of the present disclosure is a method for operating the robotic system.
  • the method involves the user input device generating the manipulator actuation signal representative of the spatial positioning of the remote-center-of-motion within the kinematic space of the manipulator robot, and responsive to the manipulator actuation signal, the robot controller driving the translational motion and/or the rotational motion of the manipulator robot to thereby spatially position the remote-center-of-motion within the kinematic space of the manipulator robot.
  • the method further involves the user input device generating the interventional actuation signal representative of the spatial orienting of the remote-center-of-motion within the kinematic space of the intervention robot, and responsive to the interventional actuation signal, the robot controller driving the pitch motion and/or the yaw motion of the intervention robot to thereby spatially orient the remote-center-of-motion within the kinematic space of the intervention robot.
  • the term "manipulator robot" broadly encompasses any robot having a structural configuration as understood in the art of the present disclosure and as exemplary described herein to include one or more articulated joints (e.g., prismatic joints and/or revolute joints) capable of translational motion and/or rotational motion of segments/links in one or more degrees of freedom;
  • the phrase "kinematic space of the manipulator robot” broadly encompasses, as understood in the art of the present disclosure and as exemplary described herein, a spatial area traversable by the intervention robot over a range of translational motion and/or a range of rotational motion of the manipulator robot;
  • intervention robot broadly encompasses any robot having a structural configuration as understood in the art of the present disclosure and as exemplary described herein including two or more revolute joints and an end-effector whereby an intersection of axes thereof defines a remote-center-of-motion at a fulcrum point in space whereby an instrument held by the end-effector may be pitched, yawed and/or rolled at the remote-center-of-motion;
  • kinematic space of the intervention robot broadly encompasses, as understood in the art of the present disclosure and as exemplary described herein, a spatial area enclosing a range of pitch motion and/or a range of yaw motion of the intervention robot;
  • the term "remote-center-of-motion” broadly encompasses, as understood in the art of the present disclosure and as exemplary described herein, a fulcrum point within the kinematic space of the intervention robot;
  • spatialal positioning of the remote-center-of-motion broadly encompasses, as understood in the art of the present disclosure and as exemplary described herein, a translational motion and/or rotational motion of the remote- center- of motion to a point the kinematic space of the manipulator robot;
  • spatial orienting of the remote-center-of-motion broadly encompasses, as understood in the art of the present disclosure and as exemplary described herein, a pitch motion and/or a yaw motion of the remote-center-of motion about a point within a space defined by the kinematic space of the intervention robot;
  • the term "user input device” broadly encompasses any device having a structural configuration as understood in the art of the present disclosure and as exemplary described herein to facilitate a human interface with a device, an apparatus or a system of any type.
  • Examples of a user input device include, but are not limited to, a keyboard, a mouse, a joy-stick, push buttons and an operator panel;
  • the term "robot controller” broadly encompasses all structural configurations as understood in the art of the present disclosure and as exemplary described herein of an application specific main board or an application specific integrated circuit for controlling an application of various inventive principles of the present disclosure as subsequently described herein.
  • the structural configuration of the controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
  • the robot controller may be housed or linked to a workstation or the user input device.
  • Examples of a workstation include, but are not limited to, an assembly of one or more computing devices (e.g., a client computer, a desktop and a tablet), a display/monitor, and one or more input devices (e.g., a keyboard, joysticks and mouse).
  • the term "module" broadly encompasses a component of the robot controller consisting of an electronic circuit and/or an executable program (e.g., executable software and/or firmware) for executing a specific application;
  • the term "signal” broadly encompasses all forms of a detectable physical quantity or impulse (e.g., voltage, current, or magnetic field strength) as understood in the art of the present disclosure and as exemplary described herein for transmitting data for applying various inventive principles of the present disclosure as subsequently described herein; and
  • FIG. 1 illustrates a block diagram of an exemplary embodiment of a robotic system in accordance with the inventive principles of the present disclosure.
  • FIG. 2 illustrates a schematic diagram of an exemplary embodiment of a user input device in accordance with the inventive principles of the present disclosure.
  • FIGS. 3A and 3B illustrate block diagrams of a first exemplary embodiment of a robot controller in accordance with the inventive principles of the present disclosure.
  • FIGS. 4A and 4B illustrate block diagrams of a second exemplary embodiment of a robot controller in accordance with the inventive principles of the present disclosure.
  • FIGS. 5A and 5B illustrate a side view and a top view, respectively, of a schematic diagram of an exemplary embodiment of a manipulator robot in accordance with the inventive principles of the present disclosure.
  • FIG. 6 illustrates an exemplary embodiment of an intervention robot as known in the art.
  • FIG. 7 illustrates an exemplary surgical intervention in accordance with the inventive principles of the present disclosure.
  • FIG. 8 illustrates a flowchart representative of an exemplary embodiment of a robot control method in accordance with the inventive principles of the present disclosure.
  • FIG. 9A illustrates a flowchart representative of an exemplary embodiment of a manipulator robot control method in accordance with the inventive principles of the present disclosure.
  • FIG. 9B illustrates a flowchart representative of an exemplary embodiment of an intervention robot control method in accordance with the inventive principles of the present disclosure.
  • FIG. 1 teaches basic inventive principles of a robotic apparatus employing a manipulator robot and an intervention robot, and of a robotic control method implemented by a user via the same user input device to independently control the manipulator robot and the intervention robot for a spatial positioning and a spatial orienting, respectively, of a remote-center-of-motion (RCM) of the intervention robot.
  • a robotic system of the present disclosure employs a user input device 10 and a robot controller 20, and a robotic apparatus including an intervention robot 40 removably or permanently mounted to a manipulator robot 30.
  • User input device 10 facilitates a human interface with robot controller 20 (e.g., a keyboard, a mouse, a joy-stick, push buttons and an operator panel).
  • user input device 10 includes one or more motion actuator(s) 11 as known in the art for generating actuation signals as user input device 10 is being utilized by an operator thereof to control independent actuations by robot controller 20 of manipulator robot 30 and intervention robot 40.
  • user input device 10 provides one or more degrees of freedom of motion actuator(s) 11 for controlling the independent actuations of manipulator robot 30 and intervention robot 40.
  • user input device 10 provides three degrees of freedom of motion actuator(s) 11 within a coordinate system 13 as shown, involving a forward/backward actuation along an XF/B-axis, a left/right actuation along a YL/R-axis, and an up/down actuation along a ZU/D-axis.
  • motion actuator(s) 11 may be designed to generate a single positioning actuation signal PAS that is designated as a manipulator actuation signal MAS for controlled actuation by robot controller 20 of manipulator robot 30 or designated as an interventional actuation signal IAS for controlled actuation by robot controller 20 of intervention robot 40.
  • motion actuator(s) 11 may be designed to independently generate a manipulator actuation signal MAS for controlled actuation by robot controller 20 of manipulator robot 30 and to independently generate an interventional actuation signal IAS for controlled actuation by robot controller 20 of intervention robot 40.
  • user input device 10 may provide a header within an actuation signal to inform robot controller 20 of which actuation signal is being designated or generated by user input device 10.
  • user input device 10 may further include a mode actuator 12 for generating a mode signal MS informative to robot controller 20 of a current operating mode of user input device 10 between a manipulation mode for controlled actuation by robot controller 20 of manipulator robot 30 and an intervention mode for controlled actuation by robot controller 20 of intervention robot 40.
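
A minimal sketch of how such a device might tag its output with a header or mode, assuming a simple dictionary payload; the field names and enum values are illustrative, not taken from the disclosure.

```python
from enum import Enum

class Mode(Enum):
    MANIPULATION = "MAS"  # target manipulator robot 30 (position the RCM)
    INTERVENTION = "IAS"  # target intervention robot 40 (orient the RCM)

def make_actuation_signal(deflection, mode):
    """Package a 3-DOF deflection (X F/B, Y L/R, Z U/D) with a header
    telling the robot controller which robot the signal targets."""
    x_fb, y_lr, z_ud = deflection
    return {"header": mode.value, "vector": (x_fb, y_lr, z_ud)}

# Example: a forward push while the device is in intervention mode.
signal = make_actuation_signal((0.4, 0.0, 0.0), Mode.INTERVENTION)
```
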
  • Robot controller 20 processes the actuation signal transmitted by user input device 10 thereto to independently control actuation of manipulator robot 30 via a manipulator drive signal MDS as will be further described herein, or to independently control actuation of intervention robot 40 via an interventional drive signal IDS as will be further described herein.
  • robot controller 20 may be a stand-alone device wired to or wirelessly connected to user input device 10, or housed within or linked to a workstation.
  • robot controller 20 processes a header of the actuation signal if applicable or processes mode signal MS if transmitted by user input device 10 thereto.
  • robot controller 20 may be designed with an interface facilitating an operator to inform robot controller 20 of an operating mode of user input device 10 desired by the operator, or may receive information from a workstation interface of an operating mode of user input device 10 desired by the operator.
  • manipulator actuation signal MAS generated by user input device 10 may be in the form of a linear vector specifying a linear direction and displacement magnitude within coordinate system 13.
  • robot controller 20 transforms the linear vector to a kinematic space of manipulator robot 30 as will be further described herein. Such a transformation may involve a scaling of the vector by robot controller 20. From the transformation, robot controller 20 generates manipulator drive signal MDS in the form of electric current having an amplitude and/or frequency suitable for driving manipulator robot 30 to effect translational motion and/or rotational motion of manipulator robot 30 in accordance with the linear vector.
  • intervention actuation signal IAS generated by user input device 10 may be in the form of an angular vector specifying an angular direction and arc magnitude within coordinate system 13.
  • robot controller 20 transforms the angular vector of user input device 10 to a kinematic space of intervention robot 40 as will be further described herein. Such a transformation may involve a scaling of the vector by robot controller 20. From the transformation, robot controller 20 generates intervention drive signal IDS in the form of electric current having an amplitude and/or frequency suitable for driving intervention robot 40 to effect a pitch motion and/or a yaw motion of intervention robot 40 in accordance with the angular vector.
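
The transformation and scaling of an input vector into a robot's kinematic space might be sketched as follows. The rotation matrix relating the input frame to the robot base frame and the scale factor are assumptions for illustration; the disclosure only states that the transformation may involve a scaling of the vector.

```python
import numpy as np

def transform_to_robot_space(vector, rotation, scale):
    """Rotate a joystick vector (coordinate system 13) into a robot's
    base frame, then scale its magnitude. Works for both the linear
    (positioning) and angular (orienting) cases."""
    v = np.asarray(vector, dtype=float)
    return scale * (np.asarray(rotation, dtype=float) @ v)

# Example: a pure forward push, mapped through an assumed 90-degree
# rotation about Z between the input frame and the manipulator base frame.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
linear_cmd = transform_to_robot_space((1.0, 0.0, 0.0), R, scale=0.5)
# linear_cmd -> array([0. , 0.5, 0. ]): forward push becomes +Y travel
```
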
  • Manipulator robot 30 includes one (1) or more articulated joints (not shown) (e.g., prismatic joint(s) and/or revolute joint(s)) providing one (1) or more degrees of freedom for translational motion and/or rotational motion of segments/links and an end-effector (not shown) of manipulator robot 30 as driven by manipulator drive signal MDS.
  • a range of translational motion and/or a range of rotational motion of the segments/links and end-effector define a kinematic space of manipulator robot 30 as will be further described herein.
  • one or more articulated joints extend between a base segment/link and an end-effector for mounting intervention robot 40 upon manipulator robot 30.
  • any translational motion and/or rotational motion of the segments/links and end-effector of manipulator robot 30 is based on the base segment/link serving as a point of origin of the kinematic space of manipulator robot 30.
  • Intervention robot 40 includes one (1) or more arms/arcs (not shown) supporting two (2) or more actuators (not shown) in a structural configuration defining a remote-center-of-motion RCM at a fulcrum point within a kinematic space of intervention robot 40 as will be further described herein. Intervention robot 40 further includes an end-effector (not shown) for holding an interventional instrument whereby the remote-center-of-motion RCM is positioned along an axis of the interventional instrument to establish a workspace defined by motion of the interventional instrument.
  • intervention robot 40 holds an intervention instrument 60 whereby the remote-center-of-motion RCM is positioned along a longitudinal axis of intervention instrument 60 having a workspace 61.
  • Examples of an interventional instrument include, but are not limited to, surgical instruments and viewing/imaging instruments (e.g., an endoscope).
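
As an aside, the constraint that the instrument pivots only about the remote-center-of-motion admits a simple geometric sketch: the instrument tip is the fulcrum point plus the insertion depth along the pitched/yawed direction. The angle conventions below are assumptions chosen for illustration.

```python
import numpy as np

def instrument_tip(rcm, pitch, yaw, insertion_depth):
    """Tip position of an instrument pivoting about the fulcrum `rcm`:
    only rotation occurs at the port, so the tip sweeps a workspace
    while the RCM itself never translates."""
    direction = np.array([
        np.sin(yaw) * np.cos(pitch),
        np.sin(pitch),
        -np.cos(yaw) * np.cos(pitch),  # pointing into the body
    ])
    return np.asarray(rcm, dtype=float) + insertion_depth * direction

# Example: 10 degrees of pitch at 80 mm insertion about an RCM at the origin.
tip = instrument_tip((0.0, 0.0, 0.0), np.radians(10.0), 0.0, 80.0)
```
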
  • a spatial positioning operation involves an operator interfacing with user input device 10 to control an actuation by robot controller 20 of manipulator robot 30 as needed to spatially position the remote-center-of-motion RCM at a coordinate point within the kinematic space of manipulator robot 30 as directed by manipulator actuation signal MAS.
  • Intervention robot 40 is removably or permanently mounted in any suitable manner to manipulator robot 30 to move in unison with any controlled actuation of manipulator robot 30 by robot controller 20 via user input device 10.
  • a spatial orienting operation involves the operator interfacing with user input device 10 to control an actuation by robot controller 20 of intervention robot 40 as needed to spatially orient the remote-center-of-motion RCM at an orientation within a kinematic space of intervention robot 40 as directed by interventional actuation signal IAS.
  • Referring to FIG. 1, an exemplary operation of the robotic system is shown.
  • Note that intervention robot 40 mounted to manipulator robot 30 is not shown, for visual clarity in the description of a spatial positioning operation and a spatial orienting operation of the robotic system. Nonetheless, those skilled in the art will appreciate the remote-center-of-motion RCM is symbolic of robots 30 and 40 for purposes of the exemplary operation.
  • a spatial positioning by the operator of the remote-center-of-motion RCM of intervention robot 40 at a coordinate point within a kinematic space 50 of manipulator robot 30, symbolized by a coordinate system XMR-YMR-ZMR having a point of origin 51, involves: (1) user input device 10 transmitting manipulator actuation signal MAS to robot controller 20, whereby manipulator actuation signal MAS is a linear vector informative of a spatial positioning of the remote-center-of-motion RCM from a coordinate point RCMP1 to a coordinate point RCMP2 within kinematic space 50 of manipulator robot 30, and (2) robot controller 20 transmitting a manipulator drive signal MDS to manipulator robot 30 for driving a translational motion and/or a rotational motion of manipulator robot 30 and intervention robot 40 in unison to spatially position the remote-center-of-motion RCM at coordinate point RCMP2 within kinematic space 50 of manipulator robot 30.
  • a spatial orienting by the operator of the remote-center-of-motion RCM of intervention robot 40 at an orientation about the coordinate point RCMP2 within a kinematic space of intervention robot 40, symbolized by a coordinate system having an XYAW-axis, a YPITCH-axis and a ZROLL-axis, involves: (1) user input device 10 transmitting interventional actuation signal IAS to robot controller 20, whereby interventional actuation signal IAS is an angular vector informative of a spatial orienting of the remote-center-of-motion RCM from an orientation RCMA1 to an orientation RCMA2 about the coordinate point RCMP2 within the kinematic space of intervention robot 40, and (2) robot controller 20 transmitting an interventional drive signal IDS to intervention robot 40 for driving an angular motion of intervention robot 40 to spatially orient the remote-center-of-motion RCM at orientation RCMA2 about coordinate point RCMP2.
  • The following description of FIGS. 2-6 describes exemplary embodiments of user input device 10, robot controller 20, manipulator robot 30 and intervention robot 40 for practicing the basic inventive principles of FIG. 1. From this description, those having ordinary skill in the art will appreciate how to apply the inventive principles of the present disclosure to numerous and various embodiments of user input device 10, robot controller 20, manipulator robot 30 and intervention robot 40.
  • an embodiment 110 of user input device 10 employs a platform 111 housing actuators (not shown), a joystick 112 for engaging the actuators, a lock 114 symbolically shown as a key, and a command button 115.
  • joystick 112 may be moved in an up and down direction via a rotation of joystick 112, a left and right direction via a sideways tilt of joystick 112, and/or a back and forth direction via a rear and frontal tilt of joystick 112.
  • The movement of joystick 112 determines a vector having a direction and magnitude for positioning actuation signal PAS.
  • An unlocking of lock 114 generates a manipulator unlocked signal MUS for designating positioning actuation signal PAS as a manipulator actuation signal MAS, and a locking of lock 114 generates a manipulator locked signal MLS for designating positioning actuation signal PAS as an interventional actuation signal IAS.
  • Command button 115 may be utilized for any commands suitable for the positioning/orienting aspects of the interventional procedure as will be further described herein.
  • an embodiment 120a of robot controller 20 employs a mapping module 121, a spatial positioning module 124 and a spatial orienting module 125.
  • An operational state of mapping module 121 is dependent upon a generation of manipulator unlocked signal MUS or manipulator locked signal MLS by user input device 110.
  • For a generation of manipulator unlocked signal MUS as shown in FIG. 3A, mapping module 121 includes a spatial positioning map 122 for processing positioning actuation signal PAS as a manipulator actuation signal to thereby generate a spatial positioning signal SPS as a linear vector transformation of positioning actuation signal PAS as will be further described herein.
  • spatial positioning module 124 generates a manipulator drive signal MDS for driving a translational motion and/or a rotational motion of a manipulator robot 130 (FIG. 5) as will be further described herein.
  • For a generation of manipulator locked signal MLS as shown in FIG. 3B, mapping module 121 further includes a spatial orienting map 123 for processing positioning actuation signal PAS as an interventional actuation signal IAS to thereby generate a spatial orienting signal SOS as an angular vector transformation of positioning actuation signal PAS as will be further described herein.
  • spatial orienting module 125 generates an interventional drive signal IDS for driving a pitch motion and/or a yaw motion of an intervention robot 140 (FIG. 6) as will be further described herein.
  • an embodiment 120b of robot controller 20 employs a manipulator control 126 including a mapping module 121a and spatial positioning module 124, and further employs an interventional control 127 including a mapping module 121b and spatial orienting module 125.
  • mapping modules 121a and 121b receive either manipulator unlocked signal MUS or manipulator locked signal MLS as generated by user input device 110.
  • For a generation of manipulator unlocked signal MUS as shown in FIG. 4A, mapping module 121a includes spatial positioning map 122 for processing positioning actuation signal PAS as a manipulator actuation signal to thereby generate a spatial positioning signal SPS as a linear vector transformation of positioning actuation signal PAS as will be further described herein.
  • spatial positioning module 124 generates a manipulator drive signal MDS for driving a translational motion and/or a rotational motion of manipulator robot 130 (FIG. 5) as will be further described herein.
  • For a generation of manipulator locked signal MLS as shown in FIG. 4B, mapping module 121b includes spatial orienting map 123 for processing positioning actuation signal PAS as an interventional actuation signal IAS to thereby generate a spatial orienting signal SOS as an angular vector transformation of positioning actuation signal PAS as will be further described herein.
  • spatial orienting module 125 generates an interventional drive signal IDS for driving a pitch motion and/or a yaw motion of intervention robot 140 (FIG. 6) as will be further described herein.
  • an embodiment 130 of manipulator robot 30 (FIG. 1) employs a prismatic joint 131a connecting rigid links 134a and 134b, a revolute joint 132 connecting rigid links 134b and 134c, a prismatic joint 131b connecting rigid links 134c and 134d, and an end-effector 135 for removably or permanently mounting of intervention robot 140 (FIG. 6) thereon.
  • link 134a serves as a base link for a point of origin of a kinematic space 150 of manipulator robot 130.
  • prismatic joint 131a translationally moves links 134b, 134c and 134d and end-effector 135 in unison along the Z-axis of kinematic space 150 of manipulator robot 130 as best shown in FIG. 5A.
  • revolute joint 132 rotationally moves links 134c and 134d and end-effector 135 in unison about the Z-axis of kinematic space 150 of manipulator robot 130 as best shown in FIG. 5B.
  • prismatic joint 131b translationally moves link 134d and end-effector 135 in unison along the X-axis and/or the Y-axis of kinematic space 150 of manipulator robot 130 as shown in FIGS. 5A and 5B.
  • the drive signals generated by robot controllers 120a (FIG. 3) and 120b (FIG. 4) are motor signals for prismatic joints 131a and 131b, and revolute joint 132.
  • spatial positioning map 122 includes a scaled mapping of prismatic joint 131a for a translational motion of manipulator robot 130 along the ZMR-axis as derived from a transformation S*ZU/D of the linear vector provided by user input device 110 (FIG. 2) into kinematic space 150 of manipulator robot 130.
  • Spatial positioning map 122 further includes a non-scaled mapping of revolute joint 132 for a rotational motion of manipulator robot 130 about the ZMR-axis as derived from a transformation atan2(YL/R, XF/B) of the linear vector provided by user input device 110 (FIG. 2) into kinematic space 150 of manipulator robot 130.
  • Spatial positioning map 122 further includes a scaled mapping of prismatic joint 131b for a translational motion of manipulator robot 130 along the XMR-axis and/or YMR-axis as derived from a transformation S*√(XF/B² + YL/R²) of the linear vector provided by user input device 110 (FIG. 2) into kinematic space 150 of manipulator robot 130.
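
Taken together, these three mappings amount to a cylindrical decomposition of the joystick deflection: vertical travel, heading, and radial travel. A minimal sketch, assuming the formulas as reconstructed above (the scale factor value is an assumption):

```python
import math

def map_joystick_to_joints(x_fb, y_lr, z_ud, s=0.5):
    """Hypothetical mapping of a 3-DOF joystick deflection onto the three
    joints of manipulator robot 130:
      - prismatic joint 131a: scaled vertical travel along the ZMR-axis
      - revolute joint 132:   heading about the ZMR-axis
      - prismatic joint 131b: scaled radial travel in the XMR/YMR plane
    """
    z_travel = s * z_ud                   # S * Z U/D
    heading = math.atan2(y_lr, x_fb)      # atan2(Y L/R, X F/B)
    radial = s * math.hypot(x_fb, y_lr)   # S * sqrt(X F/B^2 + Y L/R^2)
    return z_travel, heading, radial
```
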
  • an embodiment 140 of intervention robot 40 employs a revolute joint 141 having a primary axis PA2, a revolute joint 142 having a secondary axis SA2, a support arc 143, and an instrument arc 144 including an end-effector 145 for holding an endoscope 160 having a longitudinal axis LA2.
  • Support arc 143 is concentrically connected to revolute joint 141 and revolute joint 142, and instrument arc 144 is concentrically connected to revolute joint 142.
  • a range of pitch motion and a range of yaw motion of end-effector 145 about remote-center-of-motion 146 defines a kinematic space of intervention robot 140.
  • a workspace 161 relative to remote-center-of-motion 146 has surface and base dimensions derived from a base arc length of support arc 143 and an extension arc length of instrument arc 144.
  • revolute joint 141 and revolute joint 142 may be driven by the robot controller as directed by interventional drive signal IDS to pitch and/or yaw end-effector 145 about remote-center-of-motion 146.
  • end-effector 145 has a capability, manual or controlled by the robot controller, of rolling endoscope 160 about longitudinal axis LA2.
  • joystick 110 (FIG. 2) is operated along the XF/B-axis and the YL/R-axis, which are mapped and scaled to an XPITCH-axis and a YYAW-axis of intervention robot 140.
  • the XPITCH-axis of intervention robot 140 is aligned with the XMR-axis of manipulator robot 130 (FIG. 5) and the YYAW-axis of intervention robot 140 is aligned with the YMR-axis of manipulator robot 130.
  • the operator may want the ZROLL-axis of intervention robot 140 along an axis of the intervention instrument (e.g., an endoscope) whereby the up-down motion of joystick 110 is mapped to the up-down axis of the intervention instrument (e.g., an up-down on an endoscope image).
  • the instrument axis can be saved using a command button on joystick 110.
  • Robot controller 120 thereafter controls rotation of the intervention instrument via a pitch motion and/or a yaw motion relative to the saved instrument axis.
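
A hedged sketch of this orienting map, assuming a single saved roll angle stands in for the saved instrument axis; this convention is introduced here for illustration and is not specified by the disclosure.

```python
import math

def orient_command(x_fb, y_lr, roll_angle=0.0, scale=math.radians(1.0)):
    """Scale joystick deflections to pitch/yaw commands, rotated by the
    saved instrument roll angle so joystick 'up-down' tracks the up-down
    axis of the endoscope image (assumed convention)."""
    u = scale * x_fb   # raw pitch command (XPITCH-axis)
    v = scale * y_lr   # raw yaw command (YYAW-axis)
    c, s = math.cos(roll_angle), math.sin(roll_angle)
    pitch = c * u - s * v
    yaw = s * u + c * v
    return pitch, yaw
```
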
  • An exemplary operation of user input device 110 (FIG. 2), robot controller 120 (FIGS. 3 and 4), manipulator robot 130 (FIG. 5) and intervention robot 140 (FIG. 6) within a surgical environment will now be described herein in connection with FIGS. 7-9. From the description, those having ordinary skill in the art will appreciate how to operate numerous and various embodiments of a user input device, a robot controller, a manipulator robot and an intervention robot within any type of operational environment in accordance with the inventive principles of the present disclosure.
  • the surgical environment includes user input device 110.
  • the surgical procedure involves a spatial positioning and a spatial orienting of the remote-center-of-motion 146 of intervention robot 140 to coincide with an insertion port 102 of a patient 101 resting on an operating table 100 with the surface of operating table 100 serving as a reference plane 103.
  • joystick 110 is plugged into computer 173 and various controls 174 are installed on computer 173, including robot controller 120.
  • any type of tracking system may be utilized for tracking the remote-center-of-motion 146 of intervention robot 140 within the surgical environment.
  • a robot tracker control 175 may be installed on computer 173 whereby robot tracker control 175 is an electromagnetic, optical and/or image-based control with sensors and/or markers attached to robots 130 and 140, patient 101 and/or operating table 100 as needed. If incorporated, robot tracker control 175 controls a display of a virtual intervention robot 140 over a pre-operative or intra-operative image of patient 101 as exemplary shown in FIG. 7.
  • Alternatively, a special instrument (not shown) may be held by the end-effector of intervention robot 140, the special instrument having a length whereby its distal tip coincides with the remote-center-of-motion 146 of intervention robot 140. Consequently, the distal tip of the special instrument may be visually tracked while being driven to touch insertion point 102 as an indication that the remote-center-of-motion 146 of intervention robot 140 coincides with insertion point 102.
  • manipulator robot 130 is affixed to operating table 100 with intervention robot 140 being spaced from operating table 100 to enable patient 101 to rest thereon.
  • the robotic apparatus is registered to a pre-operative image of patient 101 if applicable, and a communication 176, wired or wireless, is established between workstation 170 and robots 130 and 140.
  • intervention robot 140 is positioned at a starting coordinate position within the kinematic space of manipulator robot 130. Additionally, an intra-operative image of patient 101 if applicable is registered to the robotic apparatus if applicable and tracking of robots 130 and 140 if applicable is initiated by robot tracker 175.
  • FIG. 8 illustrates a flowchart 180 representative of a robotic control method implemented by user input device 110 and robot controller 120.
  • stage S182 of flowchart 180 encompasses the operator of joystick 110 unlocking manipulator robot 130.
  • stage S182 involves the operator unlocking the lock device of joystick 110 to thereby unlock manipulator robot 130.
  • Alternatively, stage S182 involves a graphical user interface displayed on monitor 171 that facilitates an unlocking of manipulator robot 130.
  • a stage S184 of flowchart 180 encompasses spatial positioning by the operator of the remote-center-of-motion RCM of intervention robot 140 in a direction of insertion point 102 of patient 101.
  • user input device 110 and robot controller 120 are operated to execute a flowchart 210 representative of a manipulator robot control method of the inventions of the present disclosure.
  • a stage S212 of flowchart 210 encompasses joystick 110 transmitting positioning actuation signal PAS and manipulator unlocked signal MUS to robot controller 120 as an indication of a spatial positioning of the remote-center-of-motion RCM at insertion point 102.
  • positioning actuation signal PAS is transformed into spatial positioning signal SPS in accordance with a mapping 122 by mapping module 121 of the linear vector within kinematic space 150 of manipulator robot 130, and communicated to spatial positioning module 124.
  • a stage S214 of flowchart 210 encompasses spatial positioning module 124 transmitting manipulator drive signal MDS to the joints of manipulator robot 130 to thereby translationally and/or rotationally move manipulator robot 130 as affixed to reference plane 103.
  • Stages S212 and S214 are repeated until the remote-center-of-motion 146 coincides with insertion point 102 as shown in FIG. 7.
  • mapping 122 by mapping module 121 of the linear vector within kinematic space 150 of manipulator robot 130 is updated as manipulator robot 130 is being repeatedly actuated.
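
The repeated stage S212/S214 cycle behaves like an incremental servo loop. The sketch below is a hypothetical rendering: the joystick, controller and tracker interfaces and the tolerance are assumptions for illustration, not taken from the disclosure.

```python
import numpy as np

def position_rcm(joystick, controller, tracker, insertion_point, tol_mm=1.0):
    """Repeat stages S212/S214: map joystick input to manipulator drive
    commands until the tracked RCM coincides with the insertion point."""
    target = np.asarray(insertion_point, dtype=float)
    while np.linalg.norm(tracker.rcm_position() - target) > tol_mm:
        pas = joystick.read()                  # positioning actuation signal PAS
        sps = controller.map_positioning(pas)  # mapping 122 yields SPS
        controller.drive_manipulator(sps)      # MDS to joints 131a/132/131b
        controller.update_mapping()            # remap as the robot moves
```
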
  • stage S186 of flowchart 180 encompasses the operator of joystick 110 locking manipulator robot 130.
  • stage S186 involves the operator locking the lock device of joystick 110 to thereby lock manipulator robot 130.
  • Alternatively, stage S186 involves a graphical user interface displayed on monitor 171 that facilitates a locking of manipulator robot 130.
  • a stage S188 of flowchart 180 encompasses an insertion of endoscope 160 (FIG. 6) within the end-effector of intervention robot 140, and a selection of a principal axis of intervention robot 140 as previously described herein.
  • a stage S190 of flowchart 180 encompasses a spatial orienting by the operator of the remote-center-of-motion RCM of intervention robot 140 about the insertion point 102 of patient 101.
  • user input device 110 and robot controller 120 are operated to execute a flowchart 220 representative of an intervention robot control method of the inventions of the present disclosure.
  • a stage S222 of flowchart 220 encompasses joystick 110 transmitting positioning actuation signal PAS and manipulator locked signal MLS to robot controller 120 as an indication of a spatial orienting of the remote-center-of-motion RCM about insertion point 102.
  • positioning actuation signal PAS is transformed into spatial orienting signal SOS in accordance with a mapping 123 by mapping module 121 of the angular vector within the kinematic space of intervention robot 140, and communicated to spatial orienting module 125.
  • a stage S224 of flowchart 220 encompasses spatial orienting module 125 transmitting interventional drive signal IDS to the actuators of intervention robot 140 to thereby pitch and/or yaw endoscope 160 as held by intervention robot 140.
  • Stages S222 and S224 are repeated until the remote-center-of-motion 146 is positioned at one or more orientations for executing the surgical procedure.
  • mapping 123 by mapping module 121 of the angular vector within the kinematic space of intervention robot 140 is updated as intervention robot 140 is being repeatedly actuated.
  • Flowchart 180 is terminated upon completion of stage S190.
  • The features, elements, components, etc. described in FIGS. 1-9 may be implemented in various combinations of electronic components/circuitry, hardware, executable software and executable firmware, and provide functions which may be combined in a single element or multiple elements.
  • the functions of the various features, elements, components, etc. shown/illustrated/depicted in FIGS. 1-9 can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared and/or multiplexed.
  • explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, memory (e.g., read only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.) and virtually any means and/or machine (including hardware, software, firmware, circuitry, combinations thereof, etc.) which is capable of (and/or configurable) to perform and/or control a process.
  • any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.
  • exemplary embodiments of the present disclosure can take the form of a computer program product or application module accessible from a computer-usable and/or computer-readable storage medium providing program code and/or instructions for use by or in connection with, e.g., a computer or any instruction execution system.
  • a computer-usable or computer readable storage medium can be any apparatus that can, e.g., include, store, communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus or device.
  • Such exemplary medium can be, e.g., an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include, e.g., a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), flash (drive), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
  • Corresponding and/or related systems incorporating and/or implementing the device, or such as may be used/implemented in a device, in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.
  • Corresponding and/or related methods for manufacturing and/or using a device and/or system in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the control of a robotic apparatus employing an intervention robot (40) defining a remote center of motion and mounted on a manipulator robot (30). In a spatial positioning operation, a user input device (10) generates a manipulator actuation signal representative of a spatial positioning of the remote center of motion within a kinematic space of the manipulator robot (30), and a robot controller (20) drives a translational motion and/or a rotational motion of the manipulator robot (30) to thereby spatially position the remote center of motion within the kinematic space of the manipulator robot (30). In a distinct, decoupled spatial orienting operation, the user input device (10) generates an interventional actuation signal representative of a spatial orienting of the remote center of motion within a kinematic space of the intervention robot (40), and the robot controller (20) drives a pitch motion and/or a yaw motion of the intervention robot (40) to thereby spatially orient the remote center of motion within the kinematic space of the intervention robot (40).
PCT/EP2016/082773 2015-12-30 2016-12-28 Decoupled spatial positioning and spatial orienting control of a remote center of motion WO2017114860A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562272747P 2015-12-30 2015-12-30
US62/272,747 2015-12-30

Publications (1)

Publication Number Publication Date
WO2017114860A1 (fr) 2017-07-06

Family

ID=57851035

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/082773 WO2017114860A1 (fr) 2015-12-30 2016-12-28 Decoupled spatial positioning and spatial orienting control of a remote center of motion

Country Status (1)

Country Link
WO (1) WO2017114860A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100994931B1 (ko) * 2008-05-27 2010-11-17 (주)미래컴퍼니 Link structure of a surgical robot arm and setting method thereof
WO2010044536A1 (fr) * 2008-10-13 2010-04-22 Meerecompany Surgical slave robot
US20140194699A1 (en) * 2013-01-08 2014-07-10 Samsung Electronics Co., Ltd. Single port surgical robot and control method thereof
WO2015127078A1 (fr) * 2014-02-20 2015-08-27 Intuitive Surgical Operations, Inc. Limited movement of a surgical mounting platform controlled by manual motion of robot arms

Similar Documents

Publication Publication Date Title
US11819301B2 (en) Systems and methods for onscreen menus in a teleoperational medical system
US10874467B2 (en) Methods and devices for tele-surgical table registration
EP3119326B1 Command shaping to dampen vibrations in mode transitions
CN110769773A Master/slave registration and control for teleoperation
US9283049B2 (en) Control system configured to compensate for non-ideal actuator-to-joint linkage characteristics in a medical robotic system
CN111670011B Control modes and processes for positioning of a robotic manipulator
JP3540362B2 Control system for a surgical manipulator and control method thereof
JP7257559B2 Auxiliary instrument control in a computer-assisted teleoperation system
CN113795215A Systems and methods for magnetic sensing and docking with a trocar
US20220061936A1 (en) Control of an endoscope by a surgical robot
EP4301269A1 Robot-assisted setup for a surgical robotic system
Abdurahiman et al. Human-computer interfacing for control of angulated scopes in robotic scope assistant systems
EP3474763A1 Image guidance for decoupled kinematic control of a remote center of motion
WO2017114860A1 Decoupled spatial positioning and spatial orienting control of a remote center of motion
US20220087748A1 (en) Surgical robotic system having grip-dependent control
WO2023192204A1 Setting and using software remote centers of motion for computer-assisted systems
CN117651535A Projection operator for inverse kinematics of a surgical robot for low-degree-of-freedom tools

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16828952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16828952

Country of ref document: EP

Kind code of ref document: A1