WO2020232406A1 - Confidence-based robotically-assisted surgery system - Google Patents

Confidence-based robotically-assisted surgery system

Info

Publication number
WO2020232406A1
Authority
WO
WIPO (PCT)
Prior art keywords
path
point cloud
confidence indicator
image data
autonomous
Application number
PCT/US2020/033270
Other languages
French (fr)
Inventor
Michael Kam
Hamed SAEIDI
Axel Krieger
Simon LEONARD
Justin OPFERMANN
Original Assignee
University Of Maryland, College Park
Johns Hopkins University
Childrens Research Institute, Childrens National Medical Center
Application filed by University Of Maryland, College Park, Johns Hopkins University, Childrens Research Institute, Childrens National Medical Center
Priority to US17/098,990 (published as US20210077195A1)
Publication of WO2020232406A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/252 User interfaces for surgical systems indicating steps of a surgical procedure
    • A61B 34/30 Surgical robots
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A61B 34/32 Surgical robots operating autonomously
    • A61B 34/37 Master-slave robots
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/74 Manipulators with manual electric input means
    • A61B 34/75 Manipulators having means for prevention or compensation of hand tremors
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 18/04 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
    • A61B 18/12 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B 18/14 Probes or electrodes therefor
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3937 Visible markers
    • A61B 2090/3941 Photoluminescent markers

Definitions

  • the present disclosure relates to a system. More particularly, the present disclosure relates to a robotically-assisted surgery (RAS) system.
  • a RAS system can reduce human errors and improve patient outcomes by leveraging robotic accuracy and repeatability during certain surgical procedures.
  • the degree of interaction between RAS systems and human operators has not been found to be optimal.
  • a completely autonomous RAS system has not been feasible for many surgical situations, procedures and environments. Therefore, a need exists for an RAS system that optimizes the amount of interaction between the RAS system and the human operator.
  • a system may include a camera system that includes a first camera and a second camera, an articulating member that includes a tool, and a computer.
  • the computer may include at least one processor and a non-transitory memory configured to store computer-readable instructions which, when executed, cause the at least one processor to, receive image data from the first camera, receive point cloud image data from the second camera, wherein the image data and the point cloud image data correspond to a tissue on which markers are disposed, identify marker positions of the markers based on the image data and the point cloud image data, generate a path between a first point on the point cloud and a second point on the point cloud based at least on the marker positions, filter the path, receive real-time position data corresponding to the articulating member, generate a three-dimensional (3D) trajectory based on the filtered path and the real-time position data, generate control commands based on the 3D trajectory, and control the articulating member and the tool to follow the 3D trajectory based on the control commands.
  • the tool may include an electrocautery tool.
  • the computer-readable instructions which cause the at least one processor to control the articulating member and the tool may further cause the electrocautery tool to cut the tissue along the path.
  • the first camera may include a near-infrared (NIR) camera
  • the second camera may include a red-green-blue-depth (RGBD) camera
  • the image data may include NIR image data
  • the markers may include NIR markers.
  • the computer-readable instructions which cause the at least one processor to generate the path may further cause the at least one processor to identify projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data, and generate reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separated from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
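  • For illustration only, the following sketch shows one way the offsetting and waypoint-generation step described above could be implemented; the choice of the marker centroid as the offset origin, the fixed margin value, and the nearest-point snapping onto the point cloud are assumptions, not the claimed technique itself.

```python
import numpy as np

def project_markers_outward(marker_positions, margin):
    """Offset marker positions away from their centroid by a fixed margin (assumed offsetting technique)."""
    markers = np.asarray(marker_positions, dtype=float)          # (N, 3) marker positions
    centroid = markers.mean(axis=0)
    directions = markers - centroid
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    return markers + margin * directions                          # projected marker positions

def reference_waypoints(point_cloud, start_marker, end_marker, n_waypoints=20):
    """Interpolate between two projected markers and snap each sample to the nearest point-cloud point."""
    cloud = np.asarray(point_cloud, dtype=float)                  # (M, 3) point cloud
    samples = np.linspace(start_marker, end_marker, n_waypoints)
    waypoints = []
    for s in samples:
        idx = np.argmin(np.linalg.norm(cloud - s, axis=1))        # nearest surface point
        waypoints.append(cloud[idx])
    return np.asarray(waypoints)

# Example with synthetic data
if __name__ == "__main__":
    cloud = np.random.rand(5000, 3)
    markers = np.array([[0.2, 0.2, 0.5], [0.8, 0.2, 0.5], [0.8, 0.8, 0.5], [0.2, 0.8, 0.5]])
    projected = project_markers_outward(markers, margin=0.05)
    path = reference_waypoints(cloud, projected[0], projected[1])
    print(path.shape)
```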
  • the computer-readable instructions which cause the at least one processor to filter the path may further cause the at least one processor to select tracked waypoints as a subset of the reference waypoints, and generate filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
  • the filtering algorithm may be selected from the group consisting of a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
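  • As a hedged example of one listed option, a simple constant-position Kalman filter applied to a single tracked waypoint is sketched below; the random-walk motion model and the noise covariances are illustrative assumptions, and any of the other listed filters could be substituted.

```python
import numpy as np

class WaypointKalmanFilter:
    """Tracks one 3D waypoint with an identity (random-walk) motion model."""

    def __init__(self, initial_position, process_var=1e-4, measurement_var=1e-3):
        self.x = np.asarray(initial_position, dtype=float)   # state estimate (3,)
        self.P = np.eye(3) * 1e-2                             # state covariance
        self.Q = np.eye(3) * process_var                      # process noise
        self.R = np.eye(3) * measurement_var                  # measurement noise

    def predict(self):
        # Identity dynamics: the waypoint is assumed (nearly) stationary between frames.
        self.P = self.P + self.Q
        return self.x

    def update(self, measurement):
        z = np.asarray(measurement, dtype=float)
        S = self.P + self.R                                   # innovation covariance
        K = self.P @ np.linalg.inv(S)                         # Kalman gain
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x

# Example: smooth a noisy stream of measurements of a single waypoint
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_point = np.array([0.1, 0.2, 0.05])
    kf = WaypointKalmanFilter(true_point + rng.normal(0, 0.01, 3))
    for _ in range(50):
        kf.predict()
        filtered = kf.update(true_point + rng.normal(0, 0.01, 3))
    print(filtered)
```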
  • the computer-readable instructions when executed, may further cause the at least one processor to calculate at least one autonomous confidence indicator based on autonomous incision error, calculate a manual confidence indicator based on manual incision error, generate at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator, and generate the control commands based on the at least one allocation function.
  • the at least one autonomous confidence indicator may be selected from the group consisting of a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error.
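  • Purely as an illustration of how error-based indicators of this kind might be formed, the sketch below maps each error term to a confidence value in [0, 1]; the linear error-to-confidence mapping, the tolerance constants, and the minimum-based combination are assumptions, since the disclosure derives its indicators from empirically fitted error models (see FIGS. 22-26).

```python
def confidence_from_error(error, max_tolerable_error):
    """Map an error magnitude to a confidence value in [0, 1] (assumed linear model)."""
    c = 1.0 - abs(error) / max_tolerable_error
    return max(0.0, min(1.0, c))

def autonomous_confidence(roll_err_deg, pitch_err_deg, distance_err_mm, density_err):
    # Hypothetical tolerances; the disclosure fits these relationships experimentally.
    indicators = {
        "roll": confidence_from_error(roll_err_deg, 30.0),
        "pitch": confidence_from_error(pitch_err_deg, 30.0),
        "distance": confidence_from_error(distance_err_mm, 10.0),
        "density": confidence_from_error(density_err, 1.0),
    }
    # One simple way to combine per-factor indicators is to take the most pessimistic one.
    return min(indicators.values()), indicators

if __name__ == "__main__":
    overall, per_factor = autonomous_confidence(5.0, 2.0, 1.5, 0.1)
    print(overall, per_factor)
```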
  • the at least one allocation function may include a plurality of allocation functions corresponding to movement of the articulating member in three-dimensional directions, and to roll, pitch, and yaw of the articulating member.
  • a method may include steps for generating image data and point cloud image data corresponding to a region of interest on which markers are disposed, identifying marker positions of the markers based on the image data and the point cloud image data, generating a path between a first point of the point cloud image data and a second point of the point cloud image data, based at least on the marker positions, receiving real-time position data corresponding to an articulating member, generating a three-dimensional (3D) trajectory for the articulating member based on the path and the real-time position data, generating control commands based on the 3D trajectory, and controlling the articulating member to follow the 3D trajectory based on the control commands.
  • the articulating member may include a robotic arm, and controlling the articulating member may include causing the robotic arm to cut tissue in the region of interest along the path.
  • the step of generating the path may include identifying projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data, and generating reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separated from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
  • the step of filtering the path may include selecting tracked waypoints as a subset of the reference waypoints, and generating filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
  • the filtering algorithm may be selected from the group consisting of: a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
  • the method may further include steps for calculating at least one autonomous confidence indicator based on autonomous incision error, calculating a manual confidence indicator based on manual incision error, generating at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator, and generating the control commands based on the at least one allocation function.
  • the at least one autonomous confidence indicator may include at least one confidence indicator selected from a group consisting of a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error.
  • the at least one allocation function comprises a plurality of allocation functions corresponding to movement of the articulating member in three-dimensional directions, and to roll, pitch, and yaw of the articulating member.
  • the image data may include near-infrared (NIR) image data
  • the markers may include NIR markers.
  • FIG. 1 depicts a schematic diagram of a RAS system, in accordance with an embodiment of the present disclosure.
  • FIG. 2 depicts a block diagram of the RAS system depicted in FIG. 1 , in accordance with an embodiment of the present disclosure.
  • FIG. 3A depicts a block diagram of a shared control system, in accordance with an embodiment of the present disclosure.
  • FIG. 3B depicts a block diagram of a shared control subsystem, in accordance with an embodiment of the present disclosure.
  • FIG. 3C depicts a block diagram of a manual control subsystem, in accordance with an embodiment of the present disclosure.
  • FIG. 3D depicts a block diagram of an autonomous control subsystem, in accordance with an embodiment of the present disclosure.
  • FIG. 4 depicts a graphical user interface for a shared control system, in accordance with an embodiment of the present disclosure.
  • FIG. 5 illustrates a series of tissue samples, in accordance with an embodiment of the present disclosure.
  • FIG. 6 depicts average tracking error graphs for tissue samples, in accordance with an embodiment of the present disclosure.
  • FIG. 7 depicts normalized tracking error graphs, in accordance with an embodiment of the present disclosure.
  • FIG. 8 depicts a confidence indicator graph and an allocation function graph, in accordance with an embodiment of the present disclosure.
  • FIG. 9 depicts several allocation functions, in accordance with embodiments of the present disclosure.
  • FIGS. 10A and 10B present flow diagrams depicting at least some of the functionality of the shared control module depicted in FIG. 2, in accordance with embodiments of the present disclosure.
  • FIG. 11 shows an illustrative RAS system having a dual camera system, in accordance with embodiments of the present disclosure.
  • FIG. 12A shows a perspective view of a testbed of the RAS system of FIG. 11 , in accordance with embodiments of the present disclosure.
  • FIG. 12B shows an illustrative image of sample tissue and near-infrared (NIR) markers captured by an NIR camera of the RAS system, in accordance with embodiments of the present disclosure.
  • FIG. 12C shows an illustrative point cloud image of the tissue sample captured by a RGBD camera of the RAS system with positions of NIR markers overlaid on the point cloud image, in accordance with embodiments of the present disclosure.
  • FIG. 13 shows illustrative system components which may be used in connection with a manual control mode of the RAS system, in accordance with embodiments of the present disclosure.
  • FIG. 14A shows illustrative overlays 1400 and 1410 corresponding to an exemplary manual cutting task that may be performed using the RAS system of FIG. 11 , in accordance with embodiments of the present disclosure.
  • FIG. 14B shows an illustrative comparison between a desired incision path and an actual incision path, which may be used to evaluate error following the cutting task, in accordance with embodiments of the present disclosure.
  • FIG. 15A shows an illustrative side-view of a tissue sample following the exemplary manual cutting task, in accordance with embodiments of the present disclosure.
  • FIG. 15B shows an illustrative comparison of upper and lower edges of the cut portion of the tissue shown in FIG. 15A, in accordance with embodiments of the present disclosure.
  • FIG. 16 shows an illustrative block diagram corresponding to a portion of the RAS system of FIG. 11 , including a supervised autonomous control subsystem and low level controller, in accordance with embodiments of the present disclosure.
  • FIG. 17 shows an illustrative graph of a point cloud that may be captured with the RGBD camera of the RAS system of FIG. 11 that includes a path generated for cutting between a start point and an end point on the point cloud, in accordance with embodiments of the present disclosure.
  • FIG. 18A shows an illustrative sequence of frames that include a raw, unfiltered path that may be generated by a path planner of a supervised autonomous control subsystem, in accordance with embodiments of the present disclosure.
  • FIG. 18B shows an illustrative frame that includes tracked waypoints of the raw, unfiltered path, in accordance with embodiments of the present disclosure.
  • FIG. 18C shows an illustrative sequence of frames that include the tracked waypoints and filtered waypoints that may be output by a filter of the supervised autonomous control system, in accordance with embodiments of the present disclosure.
  • FIG. 19 shows an illustrative example of a series of paths, waypoints, and corresponding NIR markers overlaid on a point cloud, in accordance with embodiments of the present disclosure.
  • FIG. 20 shows an illustrative identification pattern that may be used to assess accuracy of a 3D NIR marker projection method that may be performed by the RAS system, in accordance with an embodiment.
  • FIG. 21 shows an illustrative graph that provides an example of evaluating marker projection errors, in accordance with embodiments of the present disclosure.
  • FIG. 22 shows an illustrative graph that provides an example of the effects of changes in roll angle on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
  • FIG. 23 shows an illustrative graph that provides an example of the effects of changes in pitch angle on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
  • FIG. 24 shows an illustrative graph that provides an example of the effects of changes in distance on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
  • FIG. 25A shows an illustrative desired path planning pattern overlaid on an identification pattern, in accordance with embodiments of the present disclosure.
  • FIG. 25B shows an illustrative 3D graph demonstrating effects of local noise on point cloud density and path planning accuracy, and an illustrative 3D graph illustrating path planning under no external noise, in accordance with embodiments of the present disclosure.
  • FIG. 26 shows an illustrative graph depicting a path planning error model and corresponding effects of changes in point cloud density on path planning error, and an illustrative graph depicting the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
  • FIG. 27 shows an illustrative system by which confidence indicators may be generated for roll, pitch, distance, and point cloud density, in accordance with embodiments of the present disclosure.
  • FIG. 28 shows an illustrative graph depicting an allocation function that is calculated based on manual and automatic control confidence indicators, in accordance with embodiments of the present disclosure.
  • FIG. 29 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a high level of skill, in accordance with embodiments of the present disclosure.
  • FIG. 30 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a moderate level of skill, in accordance with embodiments of the present disclosure.
  • FIG. 31 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a low level of skill, in accordance with embodiments of the present disclosure.
  • FIG. 32A shows an illustrative graph depicting an allocation function compared to automatic and manual confidence indicators, in accordance with some embodiments of the present disclosure.
  • FIG. 32B shows an illustrative graph depicting a shared output for controlling a second joint of a robot of an RAS system compared to autonomous and manual control signals based on which the shared output is generated, in accordance with embodiments of the present disclosure.
  • FIG. 32C shows an illustrative graph depicting a shared output for controlling a sixth joint of a robot of an RAS system compared to autonomous and manual control signals based on which the shared output is generated, in accordance with embodiments of the present disclosure.
  • FIG. 33 illustrates a graphical user interface that includes an indicator of a level of shared control of the articulated member of a robot, in accordance with embodiments of the present disclosure.
  • FIG. 34 illustrates a graphical user interface that includes manual control indicators for a RAS system, in accordance with embodiments of the present disclosure.
  • FIG. 35 illustrates a graphical user interface that includes regular and NIR video of a task space along with a procedure and control mode indicator corresponding to an RAS system, in accordance with embodiments of the present disclosure.
  • Embodiments of the present disclosure advantageously improve both RAS system efficiency and patient outcomes by combining the best features of automation with the complementary skills of the surgeon operating the RAS system. While automation of the RAS system may provide greater accuracy and repeatability in certain surgical situations, automation is not infallible and safe operation requires surgeon supervision and possible intervention. Accordingly, the present disclosure provides a control system that allows surgical procedures to be performed collaboratively by the surgeon and the RAS system.
  • embodiments of the present disclosure provide a confidence-based shared control system that provides an automated control allocation during a surgical task, situation, procedure, etc.
  • the confidence-based shared control system improves the surgical performance of any surgeon by reducing not only the overall error committed by the surgeon, but also the workload of the surgeon during the task.
  • FIG. 1 depicts a schematic diagram of RAS system 10, in accordance with an embodiment of the present disclosure.
  • RAS system 10 includes computer 100 coupled to robot 20, input device 30, camera 40 and display 50.
  • Tissue 4 may include one or more tissue samples, a region of interest of a patient, etc.
  • Robot 20 includes articulated member or arm 22 and tool 24.
  • tool 24 is an extension of arm 22, and may be, for example, a surgical tool, an electro-surgical tool, a laser, etc. The movement of tool 24 is controlled by commands to robot 20.
  • Input device 30 includes stylus 32 and one or more switches or buttons 34.
  • Computer 100 may also be coupled to network 60, which may include one or more local area networks, wide area networks, the Internet, etc.
  • robot 20 is a Smart Tissue Autonomous Robot (STAR) that includes a KUKA LBR iiwa robot with a 7-DOF (degree of freedom) lightweight arm 22 and a surgical tool 24.
  • Robot 20 receives control commands or signals from computer 100, and sends positional information for arm 22 to computer 100.
  • the control commands or signals may include one or more of the following types of data: position, velocity, acceleration, force, torque, etc.
  • surgical tool 24 is an electro-cautery tool that is based on a 2-DOF laparoscopic grasper Radius T manufactured by Tuebingen Scientific.
  • Electro-cautery tool 24 includes a shaft, a quick release interface that is electrically isolated from the shaft, and two conductors, disposed within the center of electro-cautery tool 24, that are electrically coupled to an electro-surgical generator (ESG) (not depicted for clarity).
  • a needle electrode is inserted into the quick-release interface, and a cutting waveform is selected on the ESG.
  • when an input control for the ESG, such as, for example, a foot pedal, a button or switch, etc., is activated, the ESG receives a control signal.
  • In response, the ESG generates an electrical signal representing the cutting waveform, and then sends the electrical signal to the needle electrode.
  • a grounding pad disposed underneath the tissue sample, patient, etc. in task space 2, is coupled to the ESG to complete the electrical circuit. The electrical signal vaporizes tissue in contact with the electrode, thereby cutting the tissue.
  • computer 100 may receive the ESG control signal from input device 30, and then send the ESG control signal to the ESG.
  • input device 30 may include a button or switch that is mapped to the ESG control signal.
  • input device 30 may be coupled to the ESG and provide the ESG control signal directly thereto.
  • other embodiments of robot 20, including different arms 22 and tools 24, are also contemplated, such as, for example, a motorized suturing device, etc.
  • input device 30 is a 6-DOF Sensable haptic device that allows the surgeon to manually control robot 20.
  • haptic device 30 sends positional information for stylus 32 and commands received through buttons 34 to computer 100, and may receive haptic feedback from computer 100. If haptic feedback is provided, haptic device 30 includes one or more haptic actuators that render the haptic feedback to the surgeon. Haptic feedback may include force, vibration, motion, texture, etc. Other embodiments of input device 30 are also contemplated.
  • camera 40 is a Point Grey Chameleon RGB (red green blue) camera. Camera 40 sends image data to computer 100 that provide visual feedback to the surgeon and input data for the autonomous control mode discussed below. Other embodiments of camera 40 are also contemplated.
  • FIG. 2 depicts a block diagram of RAS system 10 depicted in FIG. 1 , in accordance with an embodiment of the present disclosure.
  • Computer 100 includes bus 110, processor 120, memory 130, I/O interfaces 140, display interface 150, and one or more communication interfaces 160.
  • I/O interfaces 140 are coupled to I/O devices 142 using a wired or wireless connection
  • display interface 150 is coupled to display 50, and
  • communication interface 160 is connected to network 60 using a wired or wireless connection.
  • Bus 110 is a communication system that transfers data between processor 120, memory 130, I/O interfaces 140, display interface 150, and communication interface 160.
  • Power connector 112 is coupled to bus 110 and a power supply (not shown).
  • Processor 120 includes one or more general-purpose or application-specific microprocessors to perform computation and control functions for computer 100.
  • Processor 120 may include a single integrated circuit, such as a micro-processing device, or multiple integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of processor 120.
  • processor 120 may execute computer programs or modules, such as operating system 132, shared control module 134, other software modules 136, etc., stored within memory 130.
  • Memory 130 stores information and instructions for execution by processor 120.
  • memory 130 may include a variety of non-transitory computer-readable medium that may be accessed by processor 120.
  • memory 130 may include volatile and nonvolatile medium, non- removable medium and/or removable medium.
  • memory 130 may include any combination of random access memory (“RAM”), dynamic RAM (DRAM), static RAM (SRAM), read only memory (“ROM”), flash memory, cache memory, and/or any other type of non-transitory computer-readable medium.
  • Memory 130 contains various components for retrieving, presenting, modifying, and storing data.
  • memory 130 stores software modules that provide functionality when executed by processor 120.
  • the software modules include an operating system 132 that provides operating system functionality for computer 100.
  • the software modules also include shared control module 134 that provides functionality for controlling robot 20.
  • shared control module 134 may include a plurality of modules, each module providing specific individual functionality for controlling robot 20.
  • Other software modules 136 may cooperate with shared control module 134 to provide functionality for controlling robot 20, such as planning algorithms, robot controllers, computer vision, control allocation strategies, etc.
  • other software modules 136 may include a Robot Operating System (ROS), which provides a flexible collection of tools, libraries, device drivers, such as robot device drivers, sensor device drivers, etc., conventions, etc.
  • other software modules 136 may include an Open Source Computer Vision (OpenCV) library, the Kinematics and Dynamics Library (KDL) from the Open Robot Control Systems (OROCOS) project, etc.
  • Data 138 may include data associated with operating system 132, shared control module 134, other software modules 136, etc.
  • I/O interfaces 140 are configured to transmit and/or receive data from I/O devices 142.
  • I/O interfaces 140 enable connectivity between processor 120 and I/O devices 142 by encoding data to be sent from processor 120 to I/O devices 142, and decoding data received from I/O devices 142 for processor 120.
  • data may be sent over wired and/or wireless connections.
  • I/O interfaces 140 may include one or more wired communications interfaces, such as USB, Ethernet, etc., and/or one or more wireless communications interfaces, coupled to one or more antennas, such as WiFi, Bluetooth, cellular, etc.
  • I/O devices 142 provide input to computer 100 and/or output from computer 100.
  • I/O devices 142 are operably connected to computer 100 using either a wireless connection or a wired connection.
  • I/O devices 142 may include a local processor coupled to a communication interface that is configured to communicate with computer 100 using the wired or wireless connection.
  • I/O devices 142 include robot 20, input device 30, camera 40, and may include other devices, such as a joystick, keyboard, mouse, touch pad, etc.
  • Display interface 150 is configured to transmit image data from computer 100 to monitor or display 50.
  • Communication interface 160 is configured to transmit data to and from network 60 using one or more wired or wireless connections.
  • Network 60 may include one or more local area networks, wide area networks, the Internet, etc., which may execute various network protocols, such as, for example, wired and wireless Ethernet, Bluetooth, etc.
  • Network 60 may also include wired and/or wireless physical layers such as, for example, copper wire or coaxial cable networks, fiber optic networks, Bluetooth wireless networks, WiFi wireless networks, CDMA, FDMA and TDMA cellular wireless networks, etc.
  • FIG. 3A depicts a block diagram of shared control system 200, in accordance with an embodiment of the present disclosure.
  • the functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
  • shared control system 200 performs complex surgical procedures collaboratively between robot 20 and the surgeon with the highest possible degree of autonomy, while ensuring safe operation at all times. In one sense, shared control system 200 is "self-aware" of the limitations of its automation capabilities.
  • Shared control system 200 includes manual control subsystem 210, autonomous control subsystem 220, a shared control subsystem 230, and a supervised autonomous control subsystem 250 (e.g., described below in connection with FIG. 16). Also depicted in FIG. 3A is task space 2 including robot 20 and tissue 4. Tissue 4 may be one or more tissue samples, a region of interest of a patient, etc.
  • Manual control subsystem 210 generates manual control command 212, which is input to shared control subsystem 230.
  • Autonomous control subsystem 220 generates autonomous control command 222, which is input to shared control subsystem 230.
  • the supervised autonomous control subsystem 250 may generate a supervised autonomous control command 252.
  • Shared control subsystem 230 generates shared control command 232.
  • shared control command 232 is input to low level controller 240, which converts shared control command 232 to robot-specific control signal 231.
  • Robot-specific control signal 231 is then sent to robot 20.
  • low level controller 240 is a software module that is specific to this robot, such as the IIWA (Intelligent Industrial Work Assistant) Stack.
  • shared control command 232 may be sent directly to robot 20, which converts shared control command 232 to the appropriate robot-specific control signal.
  • Shared control subsystem 230 generates shared control command 232 according to Equation 1: U(t) = a(t)·M(t) + (1 - a(t))·A(t).
  • In Equation 1, manual control commands from the surgeon, M(t), are combined with autonomous control commands, A(t), using complementary scales a(t) ∈ [0, 1] and 1 - a(t), respectively, to form the shared control command to the robot, U(t).
  • the allocation function a(t) defines the respective percentages of the manual control command M(t) and the autonomous control command A(t) that are combined to form the shared control command U(t).
  • the allocation function a(t) defines these percentages with respect to an independent variable x that reflects or indicates certain performance criteria for the shared control subsystem 230.
  • manual control command 212 represents M(t)
  • either the autonomous control command 222 or the supervised autonomous control command 252 may represent A(t)
  • shared control command 232 represents U(t).
  • the allocation function selects the autonomous control command as the shared control command. In other words, the shared control command is not influenced by the manual control command when a(t) is 0. Conversely, when a(t) is 1 , the allocation function selects the manual control command as the shared control command. In other words, the shared control command is not influenced by the autonomous control command when a(t) is 1.
  • the allocation function blends or combines the manual control command and the autonomous control command, based on the value of the allocation function, to generate the shared control command.
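  • A minimal sketch of the blending defined by Equation 1 is shown below; the 7-DOF joint-space command vectors and the clamping of a(t) to [0, 1] are illustrative assumptions.

```python
import numpy as np

def shared_control_command(alpha, manual_cmd, autonomous_cmd):
    """Blend manual and autonomous commands: U(t) = a(t)*M(t) + (1 - a(t))*A(t)."""
    alpha = float(np.clip(alpha, 0.0, 1.0))
    M = np.asarray(manual_cmd, dtype=float)
    A = np.asarray(autonomous_cmd, dtype=float)
    return alpha * M + (1.0 - alpha) * A

# alpha = 0 selects autonomous control, alpha = 1 selects manual control, in-between values blend the two.
if __name__ == "__main__":
    M = np.array([0.10, -0.05, 0.00, 0.02, 0.00, 0.01, 0.00])   # example 7-DOF joint command (manual)
    A = np.array([0.08, -0.02, 0.01, 0.00, 0.00, 0.00, 0.00])   # example 7-DOF joint command (autonomous)
    print(shared_control_command(0.25, M, A))
```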
  • allocation function a(t) changes dynamically during the task and is a function of the independent variable x.
  • allocation function 802 is a function of tracking accuracy.
  • Allocation function 804 is a function of proximity to obstacles and/or desired locations.
  • Allocation function 806 is a function of the accuracy of predicting human intentions in controlling the robot.
  • Allocation function 808 is a function of the level of manipulation precision.
  • Allocation function 810 is a fixed function and does not change based on the performance criteria. Generally, performance criteria determine the confidence and hence the allocation function, which is task-dependent.
  • Allocation function 812 is a function of trust in the manual and/or autonomous control subsystems, and, more particularly, allocation function 812 is a function of the confidence in the manual and/or autonomous control subsystems and their dynamic uncertainties.
  • FIG. 3B depicts a block diagram of shared control subsystem 230, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3B is low level controller 240 which converts shared control command 232 into shared control signal 231. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
  • Scale function 233 applies the allocation function a(t) to manual control command 212
  • scale function 234 applies the allocation function a(t) to autonomous control command 222 or the supervised autonomous control command 252.
  • the scaled commands are then combined to form shared control command 232.
  • Generation of the allocation function a(t) is performed by an adaptive confidence-based autonomy allocation module 239, based on manual confidence indicator 237 and autonomous confidence indicator 238.
  • Manual confidence indicator 237 is determined based on manual tracking error data 235 that is acquired when processor 120 is operating in a manual control mode during performance of a predetermined task using tool 24.
  • Manual tracking error data 235 are associated with the trajectory of tool 24 during performance of the predetermined task.
  • autonomous confidence indicator 238 is determined based on autonomous tracking error data 236 that are acquired when processor 120 is operating in an autonomous control mode during performance of the predetermined task using tool 24.
  • the autonomous tracking error data 236 are associated with the trajectory of tool 24 during performance of the predetermined task.
  • Performance of the predetermined task in manual control mode and autonomous control mode, in order to determine the manual and autonomous confidence indicators 237, 238, respectively, represents the identification tests noted above. This process is discussed in more detail below.
  • FIG. 3C depicts a block diagram of manual control subsystem 210, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3C are task space 2 including robot 20 and tissue 4, and low level controller 240 which converts manual control command 212 into manual control signal 211.
  • the functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
  • Inverse kinematics are applied to generate manual control command 212 in joint-space, and low level controller 240 then converts manual control command 212 to manual control signal 211.
  • the manual control signal 211 is then sent to robot 20 over the appropriate I/O interface 140.
  • the manual control command 212 is sent to robot 20 over the appropriate I/O interface 140, which processes the command as necessary.
  • FIG. 3D depicts a block diagram of autonomous control subsystem 220, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3D are task space 2 including robot 20 and tissue 4, and low level controller 240 which converts autonomous control command 222 into autonomous control signal 221. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
  • real-time video frames from camera 40 are processed to detect a reference trajectory inscribed on tissue 4, such as, for example, a circular pattern cut.
  • Edge and contour detection algorithms in OpenCV are used to detect the reference cutting trajectory.
  • the reference trajectory is converted from the image frame to the Cartesian robot frame using a homography transformation.
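  • For example, with OpenCV a planar homography between image pixels and the robot's Cartesian X-Y plane could be estimated from a few calibration correspondences and then applied to the detected reference trajectory; the calibration point values below are placeholders, not values from the disclosure.

```python
import numpy as np
import cv2

# Pixel coordinates of known fiducials in the camera image (placeholder values).
image_pts = np.array([[100, 120], [520, 110], [530, 400], [90, 410]], dtype=np.float32)
# The same fiducials expressed in the robot's Cartesian X-Y plane, in millimetres (placeholder values).
robot_pts = np.array([[0, 0], [100, 0], [100, 80], [0, 80]], dtype=np.float32)

H, _ = cv2.findHomography(image_pts, robot_pts)

def image_to_robot(trajectory_px):
    """Map an N x 2 array of pixel coordinates onto the robot frame using the homography."""
    pts = np.asarray(trajectory_px, dtype=np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

if __name__ == "__main__":
    reference_px = np.array([[300, 250], [310, 260], [320, 270]])
    print(image_to_robot(reference_px))
```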
  • the resulting reference and the real-time positions of robot 20 are used in the trajectory generator and planner to produce multiple equidistant waypoints for the desired trajectory starting from the closest point on the desired trajectory to robot 20. Smooth, time-based desired trajectory segments are produced between the waypoints using, for example, Reflexxes Motion Libraries.
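  • A simplified sketch of the waypoint step is shown below: the reference contour is re-sampled into equidistant waypoints beginning at the point closest to the robot's current position. The smooth time parameterization between waypoints (e.g., via the Reflexxes Motion Libraries) is omitted, and the 2 mm spacing is an assumption.

```python
import numpy as np

def equidistant_waypoints(reference_xy, robot_xy, spacing_mm=2.0):
    """Re-sample a reference contour into equidistant waypoints, starting nearest the robot."""
    ref = np.asarray(reference_xy, dtype=float)
    start = np.argmin(np.linalg.norm(ref - robot_xy, axis=1))        # closest reference point to the robot
    ref = np.roll(ref, -start, axis=0)                               # start the path there
    seg = np.linalg.norm(np.diff(ref, axis=0), axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg)))                    # cumulative arc length
    targets = np.arange(0.0, arc[-1], spacing_mm)
    x = np.interp(targets, arc, ref[:, 0])
    y = np.interp(targets, arc, ref[:, 1])
    return np.column_stack([x, y])

if __name__ == "__main__":
    theta = np.linspace(0, 2 * np.pi, 500)
    circle = np.column_stack([25 * np.cos(theta), 25 * np.sin(theta)])   # 25 mm radius reference
    print(equidistant_waypoints(circle, robot_xy=np.array([30.0, 0.0]))[:5])
```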
  • Low level controller 240 then converts autonomous control command 222 to autonomous control signal 221.
  • FIG. 4 depicts a graphical user interface 300 for shared control system 200, in accordance with an embodiment of the present disclosure.
  • GUI 300 depicts a video image of tissue 4 within task space 2, with reference trajectory 310 for the predetermined task inscribed thereon. GUI 300 also overlays a computer-generated image depicting desired trajectory 320 for the autonomous control mode, one or more suggested autonomous control mode regions 330, one or more suggested manual control mode regions 340, and control mode indicator 350. Suggested autonomous control mode regions 330 and suggested manual control mode regions 340 are determined based on the allocation function a(t). In certain embodiments, the shared control mode automatically switches between autonomous control mode and manual control mode based on the allocation function a(t) during the performance of the predetermined task. In other embodiments, the surgeon manually switches between the control modes, using haptic device 30, during the performance of the predetermined task.
  • FIG. 5 illustrates a series of tissue samples 400, in accordance with an embodiment of the present disclosure.
  • a predetermined task is first performed on different tissue samples in both manual control mode and autonomous control mode.
  • the predetermined task is a circular pattern cut; other surgical procedures are also contemplated.
  • Tissue samples 400 include tissue sample 410 without pseudo-blood occlusions and with reference trajectory 412, tissue sample 420 with a small pseudo-blood occlusion and reference trajectory 422, tissue sample 430 with a medium pseudo-blood occlusion and reference trajectory 432, tissue sample 440 with a large pseudo-blood occlusion and reference trajectory 442, tissue sample 450 with different size pseudo-blood occlusions and reference trajectory 452, and tissue sample 460 with symmetric, medium pseudo-blood occlusions and reference trajectory 462.
  • a laser pointer is attached to tool 24 and used to project a laser dot on tissue samples 400. Performance of the circular cut pattern on tissue samples 400 using a laser pointer attached to tool 24 sufficiently identifies the tracking accuracy of the autonomous and manual control modes. Tool 24 and attached laser pointer follow the desired cutting trajectory for each control mode for each tissue sample 400. In one embodiment, the motion of robot 20 was
  • the first identification test performs the circular cut pattern on the tissue sample 400 under manual control mode
  • the second identification test performs the circular cut pattern on the tissue sample under autonomous control mode.
  • the actual trajectory of the laser dot is captured by camera 40, and the image data are processed to determine the tracking error of tool 24 by comparing the actual trajectory of the laser dot to the reference trajectory.
  • the laser dot and the location and size of any pseudo-blood occlusions are detected using functionality provided by the OpenCV library.
  • Perspective transformations are applied to the image data to generate a top view of the laser dot trajectory, and then the image data is mapped to a new image frame that is a square 500 x 500 pixel plane.
  • each pixel represents 0.2 mm on the trajectory plane.
  • the location of the laser dot is then tracked using color thresholding and blob detection, and the locations of any pseudo-blood occlusions in that tissue sample are similarly determined.
  • the position of the laser dot is compared to the reference trajectory for that tissue sample, and the tracking error for that identification test is determined.
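  • A hedged sketch of the color-thresholding and error-evaluation steps is shown below; the 0.2 mm-per-pixel scale follows the description above, while the HSV threshold values and the moment-based centroid are illustrative assumptions.

```python
import numpy as np
import cv2

MM_PER_PIXEL = 0.2  # each pixel of the 500 x 500 top-view frame represents 0.2 mm

def detect_laser_dot(frame_bgr, lower_hsv=(0, 120, 200), upper_hsv=(10, 255, 255)):
    """Return the (x, y) pixel centroid of the laser dot, or None (threshold values are assumptions)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, dtype=np.uint8), np.array(upper_hsv, dtype=np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def tracking_error_mm(dot_px, reference_px):
    """Distance from the detected dot to the closest point on the reference trajectory, in mm."""
    ref = np.asarray(reference_px, dtype=float)
    return MM_PER_PIXEL * np.min(np.linalg.norm(ref - dot_px, axis=1))

if __name__ == "__main__":
    frame = np.zeros((500, 500, 3), dtype=np.uint8)
    cv2.circle(frame, (250, 200), 3, (0, 0, 255), -1)                 # synthetic red "laser dot"
    theta = np.linspace(0, 2 * np.pi, 720)
    reference = np.column_stack([250 + 100 * np.cos(theta), 250 + 100 * np.sin(theta)])
    dot = detect_laser_dot(frame)
    print(tracking_error_mm(dot, reference))
```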
  • FIG. 6 depicts average tracking error graph 500 for tissue samples 400, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 6 are tool trajectory and tracking error graphs 510 for tissue sample 410, and tool trajectory and tracking error graphs 550 for tissue sample 450.
  • Average tracking error graph 500 depicts average tracking error for manual control mode 502 and average tracking error for autonomous control mode 504 for identification tests performed on tissue sample 410, i.e., "none," tissue sample 420, i.e., "small," tissue sample 430, i.e., "medium," tissue sample 440, i.e., "large," and tissue samples 450, 460, i.e., "multiple."
  • tissue sample 410 does not have pseudo-blood occlusions.
  • Average tracking error graph 500 for tissue sample 410 indicates that the autonomous control mode outperforms the manual control mode - the average tracking error for the autonomous control mode was about 0.5 mm, while the average tracking error for the manual control mode was about 0.8 mm.
  • the average tracking error of the autonomous control mode increases from about 0.5 mm to about 1.6 mm, while the average tracking error of the manual control mode essentially remains within the same range for all of the samples, i.e., from about 0.6 mm to about 0.8 mm. More particularly, when pseudo-blood occlusions on the desired trajectory interfere with the detection algorithms of the autonomous control mode, the tracking error for the autonomous control mode locally increases near the pseudo-blood occlusions.
  • Tool trajectory and tracking error graphs 510 present more detailed data for tissue sample 410, including plots of the reference trajectory and the actual trajectory in the X-Y plane, and graphs of the trajectory tracking errors, for the manual control mode and the autonomous control mode.
  • Tool trajectory and tracking error graphs 550 present more detailed data for tissue sample 450, including plots of the reference trajectory and the actual trajectory in the X-Y plane, and graphs of the trajectory tracking errors, for the manual control mode and the autonomous control mode.
  • FIG. 7 depicts normalized tracking error graphs 600, in accordance with an embodiment of the present disclosure.
  • FIG. 7 also depicts annotated tissue sample 640.
  • the tracking error data are normalized using a metric related to the size of the occlusion in each tissue sample 400. Other normalization metrics may also be used.
  • the normalization metric, d, identifies the intersection of the reference trajectory with the pseudo-blood occlusion.
  • Annotated tissue sample 640 depicts a portion of tissue sample 440 with reference trajectory 442, and several values for d.
  • d = 1 marks the end of the pseudo-blood occlusion.
  • OpenCV blob detection algorithms are used to find the location and size of the pseudo-blood occlusions or blobs on the reference trajectory, and to normalize their intersections. The tracking error along d for each identification test was determined and normalized based on the blob sizes. Other blob detection algorithms are also contemplated.
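  • As a rough illustration of this normalization, the sketch below computes a signed coordinate d for each trajectory sample by dividing its arc-length offset from the blob center by the blob's half-extent along the reference trajectory; the convention that d is approximately 0 at the middle of the occlusion follows the description, while the remaining details are assumptions.

```python
import numpy as np

def normalized_proximity(sample_arc, blob_center_arc, blob_half_extent_arc):
    """Normalized distance d to a pseudo-blood occlusion measured along the reference trajectory."""
    return (sample_arc - blob_center_arc) / blob_half_extent_arc

if __name__ == "__main__":
    # Arc-length positions (mm) along the reference trajectory for a few tracking-error samples.
    sample_positions = np.array([40.0, 50.0, 55.0, 60.0, 70.0])
    blob_center, blob_half_extent = 55.0, 10.0     # occlusion centred at 55 mm, spanning +/- 10 mm
    d = normalized_proximity(sample_positions, blob_center, blob_half_extent)
    print(d)   # d = 0 at the middle of the occlusion, |d| = 1 at its edges
```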
  • the performances of autonomous control mode and the manual control mode, over all of the identification tests, are then analyzed based on the normalized proximity to the pseudo-blood occlusions.
  • a curve is fitted to each normalized control mode tracking error data set.
  • the fitted curve for the manual control mode is a linear function, i.e., manual control mode curve 602
  • the fitted curve for the autonomous control mode is a skewed Gaussian function, i.e., autonomous control mode curve 604.
  • the fitted function for the manual control mode is governed by Equation 2
  • the fitted function for the autonomous control mode is governed by Equation 3.
  • Normalized tracking error graphs 600 include manual control mode normalized tracking error data, autonomous control mode normalized tracking error data, and the fitted curves for each data set.
  • The confidence indicators are derived from the fitted normalized tracking error functions: CM = 1 - yM for the manual control mode and CA = 1 - yA for the autonomous control mode.
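  • A hedged sketch of fitting the two error models and deriving the confidence indicators CM = 1 - yM and CA = 1 - yA is shown below; the specific parameterizations of the linear and skewed-Gaussian functions stand in for Equations 2 and 3, which are not reproduced here, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def manual_error_model(d, p0, p1):
    """Assumed linear model for the normalized manual tracking error (stand-in for Equation 2)."""
    return p0 + p1 * d

def autonomous_error_model(d, amp, mu, sigma, skew, offset):
    """Assumed skewed-Gaussian model for the normalized autonomous tracking error (stand-in for Equation 3)."""
    gauss = np.exp(-((d - mu) ** 2) / (2.0 * sigma ** 2))
    return offset + amp * gauss * (1.0 + erf(skew * (d - mu) / (sigma * np.sqrt(2.0))))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d = np.linspace(-3, 3, 200)
    y_manual = 0.1 + 0.01 * d + rng.normal(0, 0.01, d.size)                 # synthetic identification data
    y_auto = autonomous_error_model(d, 0.3, 0.0, 0.8, 1.5, 0.05) + rng.normal(0, 0.01, d.size)

    pm, _ = curve_fit(manual_error_model, d, y_manual)
    pa, _ = curve_fit(autonomous_error_model, d, y_auto, p0=[0.2, 0.0, 1.0, 1.0, 0.0])

    c_manual = 1.0 - manual_error_model(d, *pm)          # CM = 1 - yM
    c_auto = 1.0 - autonomous_error_model(d, *pa)        # CA = 1 - yA
    print(c_manual.min(), c_auto.min())
```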
  • FIG. 8 depicts confidence indicator graph 700 and allocation function graph 710, in accordance with an embodiment of the present disclosure.
  • Confidence indicator graph 700 depicts manual control mode confidence indicator 702 and autonomous control mode confidence indicator 704 for the experimental tests described above. After confidence indicators 702, 704 are determined, the allocation function a(t) is generated based on these confidence indicators. In one embodiment, confidence indicators 702, 704 are used to locally select the most reliable control mode as the predetermined task is performed.
  • Because confidence indicator 702 is more or less constant, the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode are determined based on confidence indicator 704.
  • confidence indicator 704 is greater than confidence indicator 702. In other words, confidence in the autonomous control mode is greater than confidence in the manual control mode.
  • As the middle of the pseudo-blood occlusion is approached (d ≈ 0), confidence in the autonomous control mode reaches a minimum level (Tminimum = 0.79).
  • In this region, confidence indicator 702 is greater than confidence indicator 704. In other words, confidence in the manual control mode is greater than confidence in the autonomous control mode. As d approaches positive values after Tupper 724, confidence indicator 704 is greater than confidence indicator 702 and gradually increases back to 1. In other words, confidence in the autonomous control mode is again greater than confidence in the manual control mode.
  • Allocation function graph 710 depicts allocation function 712, which is a function of the confidence in the autonomous control mode, i.e., confidence indicator 704.
  • allocation function 712 returns a value of 0 or 1 based on the value of confidence indicator 704.
  • the value 0 indicates that the autonomous control mode has been selected for the shared control mode
  • the value 1 indicates that the manual control mode has been selected for the shared control mode.
  • the shared control mode is initially set to the autonomous control mode
  • allocation function 712 has an initial setting of 0.
  • the normalized distance d approaches lower decision threshold Tlower 722.
  • allocation function 712 returns the value 1 , which changes the shared control mode to the manual control mode.
  • the normalized distance d approaches upper decision threshold Tupper 724.
  • allocation function 712 returns the value 0, which changes the shared control mode back to the autonomous control mode.
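  • One hedged way to realize this threshold-based switching is sketched below, treating the allocation as a function of the autonomous confidence indicator with hysteresis between the two decision thresholds; the numeric threshold values are hypothetical, since the disclosure derives Tlower 722 and Tupper 724 from the fitted confidence models.

```python
def allocation(confidence_auto, t_lower=0.83, t_upper=0.90, current_alpha=0.0):
    """Return a(t) in {0, 1}: 1 hands control to the surgeon, 0 keeps autonomous control.

    Hypothetical threshold values; the disclosure derives its decision thresholds from the confidence models.
    """
    if confidence_auto <= t_lower:
        return 1.0                      # confidence too low: switch to manual control mode
    if confidence_auto >= t_upper:
        return 0.0                      # confidence recovered: switch back to autonomous control mode
    return current_alpha                # between the thresholds, keep the current mode (hysteresis)

if __name__ == "__main__":
    alpha = 0.0
    for c in [0.98, 0.92, 0.85, 0.80, 0.84, 0.88, 0.93]:   # confidence dipping near an occlusion
        alpha = allocation(c, current_alpha=alpha)
        print(c, alpha)
```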
  • FIGS. 10A and 10B present flow diagrams depicting at least some of the functionality of shared control module 134 depicted in FIG. 2, in accordance with embodiments of the present disclosure
  • FIG. 10A presents a flow diagram for controlling an articulating member including a tool, in accordance with an embodiment of the present disclosure.
  • a manual control mode confidence indicator is determined based on a manual control mode for the articulating member of the robot.
  • tracking error data are acquired during the performance of a predetermined task under the manual control mode.
  • the tracking error data represent the deviations between a reference trajectory and the actual trajectory of the tool.
  • the manual control mode confidence indicator is determined based on this tracking error data.
  • the tracking error data may be normalized using a metric associated with the predetermined task, such as, for example, occlusion size, and then a curve may be fitted to the data to produce a normalized tracking error function.
  • the manual control mode confidence indicator is then derived from the normalized tracking error function.
  • an autonomous control mode confidence indicator is determined based on an autonomous control mode for the articulating member of the robot.
  • tracking error data are acquired during the performance of a predetermined task under the autonomous control mode.
  • the tracking error data represent the deviations between a reference trajectory and the actual trajectory of the tool.
  • the autonomous control mode confidence indicator is determined based on this tracking error data.
  • the tracking error data may be normalized using a metric associated with the predetermined task, such as, for example, occlusion size, and then a curve may be fitted to the data to produce a normalized tracking error function.
  • the autonomous control mode confidence indicator is then derived from the normalized tracking error function.
  • an allocation function is generated based on the manual control mode confidence indicator and the autonomous control mode confidence indicator. As discussed above, the manual and autonomous control mode confidence indicators are determined from identification tests performed under the respective control modes.
  • the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode may be determined based on the autonomous control mode confidence indicator.
  • the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode may be determined based on the manual control mode confidence indicator.
  • the manual and autonomous control mode confidence indicators are blended to yield an allocation function a(t) that combines control commands from the manual control mode and control commands from the autonomous control mode.
  • a control command is generated for the articulating member of the robot based on the allocation function.
  • the autonomous control command A(t) is selected as the control command.
  • the control command is not influenced by the manual control command when a(t) is 0.
  • the manual control command M(t) is selected as the control command.
  • the control command is not influenced by the autonomous control command when a(t) is 1.
  • the allocation function a(t) changes as a function of the independent variable x.
  • the independent variable x may be the confidence in the autonomous control mode, as discussed above.
  • the control command is sent to the articulating member.
  • the control command is input to a low level controller, which converts the control command to a robot-specific control signal.
  • the robot-specific control signal is then sent to the robot over the appropriate I/O Interface.
  • the control command is sent directly to the robot, which converts the control command to the appropriate robot-specific control signal.
  • autonomous control mode may correspond to a fully autonomous control mode (e.g., in connection with subsystem 220 of FIG. 3D) or to a supervised autonomous control mode (e.g., in connection with subsystem 250 of FIG. 16).
  • FIG. 10B presents a flow diagram for controlling an articulating member including a tool, in accordance with an embodiment of the present disclosure.
  • control command is converted to a robot-specific control signal, as discussed above.
  • FIGS. 11 -18 depict various features of a RAS system, in accordance with the present disclosure.
  • Narrow-band imaging (NBI), which uses only green and blue light to highlight malignant lesions, improves the identification of disease-free resection margins and is an alternative to white light (WL) endoscopy.
  • biocompatible near-infrared (NIR) markers may be used for robot guidance in these and other surgical situations, and provide strong penetration of the NIR light, durability, and bio-compatibility. More specifically, because NIR light has a longer wavelength than visible light, the NIR markers can be seen intra-operatively with a high signal to noise ratio (SNR), even when obstructed by blood and tissue. In long-term multimodality tumor treatment scenarios, several rounds of chemotherapy are performed before the surgery and the tumor dimension shrinks over time. In one embodiment, the location of the tumor is marked before chemotherapy, which provides surgeons with the original tumor region intra-operatively rather than the shrunken post-chemotherapy tumor.
  • NIR markers described herein may be made from the FDA-approved NIR fluorophore Indocyanine Green (ICG) and cyanoacrylate.
  • NIR markers may be used on target tissue locations for suture planning via linear interpolation as well as 2D pattern cutting for pseudo-tumor resection. Additionally, NIR markers may be used on soft and unstructured 3D tissues in combination with more complex control methods compared to the 2D scenario.
  • a single point cloud of a tissue surface may be acquired (e.g., using a NIR camera, a RGBD camera, or a combination of the two), and a straight-line, 3D incision path for the robot may be determined. The start and end points may be manually selected in some embodiments.
  • FIG. 11 shows an illustrative RAS system 1100 that is included in a testbed.
  • the RAS system 1100 may include a robot 20 having a robotic arm 22 (e.g., a 7-DOF KUKA lightweight robotic arm), a near-infrared (NIR) camera 1102 (e.g., which may be an 845 nm ± 55 nm NIR camera), a Red-Green-Blue-Depth (RGBD) camera 1104 (e.g., which may be a Realsense D415 RGBD camera), a light source 1106 (e.g., which may be an infrared or NIR light source, and which may include a 760 nm high power light emitting diode), and an electrocautery tool 24. Examples of various functions of the RAS system 1100 will be illustrated with respect to a tissue sample 4.
  • the RAS system 1100 may correspond to the RAS system 10 of FIGS. 1 and 2, but with a dual camera imaging system that includes the cameras 1102 and 1104 instead of the camera 40, which may allow for 3D and NIR imaging.
  • Components of the RAS system 1100 having counterparts in the RAS system 10 may be referred to with the same reference numerals.
  • the NIR camera 1102 and the RGBD camera 1104 may be included in a supervised autonomous control subsystem 250 (e.g., which may correspond to the supervised control subsystem 250 shown in FIG. 3), shown in FIG. 16, of a shared control system that includes the robot 20.
  • the subsystem 250 may control the robot 20 and electrocautery tool 24 to produce precise and consistent incisions on complex three-dimensional (3D) soft tissues, such as the tissue sample 4.
  • the supervised autonomous control subsystem 250 may provide a supervised autonomous control mode in which an operator (e.g., a surgeon) may identify key points on a tissue of interest, such as, for example, a tumor, by selecting the NIR markers outlining the tumor using a GUI.
  • the operator may validate the electrocautery path before autonomous control is initiated.
  • the subsystem may autonomously generate and filter a complete 3D electro-surgery pattern between multiple key points marking the tumor bed. Since the path planning and filtering is done via continuous and multiple measurements of the 3D tissue surface information, the resulting executed incision may be more accurate than a conventional single-step, offline path planning method.
  • the supervised autonomous control mode may provide a more accurate 3D incision on real tissues, including a more accurate depth of cut.
  • Electrocautery tool 24 may be added to robot 20 for performing incisions on the tissue samples. Electrocautery tool 24 may use a needle electrode to send a cutting waveform, which may be generated via an electro-surgical generator (e.g., a DRE ASG-300 electro-surgical generator), to the target tissue. The cutting waveform may vaporize tissues in contact with the electrode.
  • FIG. 12A shows a perspective view of the testbed of FIG. 11 and the dual camera imaging system of the RAS system 1100 that includes the NIR camera 1102 and the RGBD camera 1104.
  • the dual camera system may detect the NIR markers 1220 disposed in or on the tissue sample 4 and their 3D positions, and may capture the tissue surface (e.g., of tissue sample 4); a manual control interface may also be provided for result comparisons.
  • the RGBD camera 1104 may obtain 3D tissue surface information.
  • the NIR camera 1102 may detect the NIR markers 1220 when they are illuminated by the light source 1106.
  • other 3D cameras such as plenoptic cameras and structured illumination cameras, may be used instead of or in addition to the RGBD camera 1104.
  • in order to prevent interference of the projector of the RGBD camera 1104 with the readings captured by the NIR camera 1102, the projector may be periodically switched back and forth between on and off states (e.g., with a state transition occurring every 0.22 seconds) via software triggers that control the RGBD camera 1104.
  • the NIR camera 1102 may be configured to capture images only when the projector of the RGBD camera 1104 is turned off.
  • a real-time imaging system (e.g., which may be included in subsystem 250 of FIG. 16) may coordinate this projector switching and image capture, as sketched below.
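A sketch of the alternating capture schedule, assuming the 0.22-second switching period described above; `set_projector`, `grab_rgbd`, and `grab_nir` are placeholder callables standing in for the actual camera-driver software triggers.

```python
import time

def alternate_capture(set_projector, grab_rgbd, grab_nir, period_s=0.22, cycles=10):
    """Toggle the RGBD projector on a fixed period and grab NIR frames only while
    the projector is off, so the projected pattern does not corrupt the NIR image."""
    frames = []
    for _ in range(cycles):
        set_projector(True)            # depth / point cloud acquisition window
        time.sleep(period_s)
        cloud = grab_rgbd()
        set_projector(False)           # NIR acquisition window (projector off)
        time.sleep(period_s)
        nir = grab_nir()
        frames.append((cloud, nir))
    return frames
```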
  • the positions of the RGBD camera 1104 and the NIR camera 1102 may be compared to a checkerboard, and relative positions of the cameras with respect to each other may be determined (e.g., using a standard stereo camera calibration procedure).
  • a hand-eye calibration may be performed by finding the position of the checkerboard in the robot coordinates. The 3D position and orientation of the cameras 1102, 1104 relative to the robot 20 are then determined.
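A sketch of the transform composition behind such a hand-eye calibration, assuming 4×4 homogeneous transforms: if the checkerboard pose is known in both the camera frame and the robot frame, the camera pose in the robot frame follows by composition.

```python
import numpy as np

def camera_in_robot_frame(T_robot_board, T_cam_board):
    """Camera pose in the robot frame from the checkerboard pose measured in the
    robot frame (T_robot_board) and in the camera frame (T_cam_board):
        T_robot_cam = T_robot_board @ inv(T_cam_board)."""
    return T_robot_board @ np.linalg.inv(T_cam_board)

def transform_points(T, points_xyz):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points
    (e.g., to express marker positions in the robot frame)."""
    pts_h = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    return (T @ pts_h.T).T[:, :3]
```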
  • a visual servoing platform may be used to track portions of the NIR images captured by the NIR camera 1102 corresponding to the NIR markers 1220 between NIR image frames captured by the NIR camera. The operator may select the markers via mouse clicks (e.g., with a mouse of the I/O devices 142 of the computer 100 shown in FIG. 2).
  • While NIR cameras, NIR images, and NIR markers are described herein, these elements are intended to be illustrative and not limiting. In alternate embodiments, other suitable camera types, image types, and/or marker types may be used in place of the NIR camera 1102, the NIR image data, and the NIR markers 1220 to provide landmark/feature detection as a basis for path planning.
  • FIG. 12B shows an illustrative image captured by the NIR camera 1102, showing the NIR markers 1220 disposed on the tissue sample 4.
  • FIG. 12C shows an illustrative point cloud image of the tissue sample 4 captured by the RGBD camera 1104.
  • the point cloud image has been overlaid with the 3D positions of the NIR markers 1220 (e.g., by the subsystem 250).
  • a manual control subsystem 210 may provide a manual control mode, some aspects (e.g., system components) of which are shown in FIG. 13.
  • a surgeon manually controls the 3D motion of the tool-tip of the robot 20 using the input device 30.
  • the coordinate frame transformations between the input device 30, the camera view frame, and the robot frame are performed in real-time via the ROS transform package, so that the manual control mode can match all of the motions that the supervised autonomous control subsystem 250 may perform.
  • Camera 40 provides high-resolution real-time visual feedback to the surgeon, and the NIR marker positions are overlaid on this view (e.g., green dots shown in FIG. 13.c) as a reference for the surgeon.
  • a third camera (not shown), which may be an RGB camera (e.g., camera 40 of FIG. 1), may be included in the RAS system 10, and may capture high-resolution video 1306, which is displayed on the monitor 1304 (e.g., which may correspond to the display 50 of FIG. 1) to provide real-time visual feedback to the operator.
  • the positions of the NIR markers 1220 (e.g., having been previously identified from the NIR image(s) and point cloud image(s) captured by the cameras 1102 and 1104, respectively) may be overlaid over the video 1306.
  • FIG. 14A depicts illustrative overlays 1400 and 1410 corresponding to an exemplary cutting task that uses 4 NIR markers 1420.
  • Overlay 1400 depicts the desired incision pattern 1402 and an incision path 1404 on a tissue sample 1412 (e.g., which may correspond to tissue sample 4 of FIG. 11 ).
  • the tissue sample 1412 may have been cut using subsystem 210, 220, 250, or a combination of these.
  • Overlay 1410 depicts an approximation 1406 of corresponding edges of the incision path 1404, which may be compared to the desired incision pattern 1402 for surface error measurement.
  • FIG. 14B shows an example comparison between the desired incision pattern 1402 and the approximation 1406. Two regions are shown in higher resolution to illustrate surface error 1422 and 1424. The comparison may be performed by a post-processing system (not shown) in order to estimate error.
  • FIG. 15A depicts a side view of the tissue sample 1412 of FIG. 14A.
  • a post-processing system may extract an estimated top edge 1502 and an estimated bottom edge 1504 of a cut portion 1506 of the tissue sample 1412 for depth error measurement.
  • the estimated top edge 1502 and the estimated bottom edge 1504 may be identified automatically by the subsystem 250 in some embodiments.
  • FIG. 15B shows an illustrative comparison of the estimated top edge 1502 and bottom edge 1504. Distances (e.g., d1, d2) between corresponding pixels of the top edge 1502 and the bottom edge 1504 may be calculated by the subsystem 250 and may be compared to a desired depth to calculate error. While only one side of the sample 1412 is shown here, it should be understood that the depths of all four sides of the incision may be measured and corresponding error values may be calculated in this way.
  • While the examples of FIGS. 14A-15B are provided in the context of a manual cutting task, it should be understood that the results of automated cutting tasks and/or automation-assisted cutting tasks may be similarly analyzed to determine error.
  • FIG. 16 depicts a block diagram of a portion of the RAS system 1100, which includes the supervised autonomous control subsystem 250 and a low-level controller 240.
  • the subsystem 250 may include the NIR camera 1102, the RGBD camera 1104, a 3D marker position module 1606, a path planner module 1608, a filter 1610, a trajectory generator and planner module 1612, and an inverse kinematics module 1614.
  • the supervised autonomous control subsystem 250 may be included in a shared control system (e.g., system 230 of FIGS. 3A and 3B).
  • supervised autonomous control commands 252 that may be generated by the subsystem 250 may be analyzed by such a control system to estimate corresponding error and/or confidence indicators, and may, in combination with separate manual control commands, be used as the basis for generating an allocation function and shared control commands, as will be described.
  • Real-time video frames from the RGBD camera 1104 and the NIR camera 1102 are collected and processed by the 3D marker position module 1606 to obtain the 3D coordinates of the NIR markers (e.g., markers 1220) in the robot frame.
  • An offsetting technique is applied by the path planner module 1608 to project the NIR marker positions outwards on the point cloud and allow planning an incision path with specified margins around the NIR markers.
  • the offsetting technique executed by the path planner module 1608 uses the 3D vectors formed from the previous and next markers to the current marker, calculates a 5 mm offset along the superposition of these vectors, and projects the offset point onto the tissue surface by finding the closest point on the point cloud (see the sketch below).
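A sketch of this offsetting step, assuming marker and point cloud coordinates in millimetres; the direction is the normalized superposition of the vectors from the previous and next markers to the current marker, and the offset point is snapped back to the measured surface by a nearest-neighbour search.

```python
import numpy as np

def offset_marker(prev_m, cur_m, next_m, cloud, offset_mm=5.0):
    """Project the current marker outward by `offset_mm` along the superposition
    of the vectors (cur - prev) and (cur - next), then return the closest point
    on the (N, 3) point cloud as the projected marker position."""
    v = (cur_m - prev_m) + (cur_m - next_m)
    direction = v / (np.linalg.norm(v) + 1e-9)   # avoid division by zero
    target = cur_m + offset_mm * direction
    nearest = np.argmin(np.linalg.norm(cloud - target, axis=1))
    return cloud[nearest]
```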
  • a path planning algorithm executed by the path planner module 1608 calculates a 3D path on the point cloud model of the tissue surface between each two consecutive projected NIR marker positions (e.g., the corners of the desired pattern 1402 in overlay 1400 of FIG. 14A).
  • the path planner module may thereby generate and output reference waypoints.
  • the filter 1610 eliminates the dynamic inter-frame noise of the resulting path so that it is usable in the robot controllers.
  • Real-time position feedback may be sent from the robot to the subsystem 250, to be processed by the trajectory generator and planner module 1612.
  • the reference waypoints output by the path planner 1608 and filtered by the filter 1610 and the real-time robot positions may be received and used by the trajectory generator and planner module 1612 to obtain smooth time-based trajectories using, for example, Reflexxes Motion Libraries in the robot frame.
  • the task-space trajectories of the robot 20 may be converted to the joint-space trajectories by the inverse kinematics module 1614 using, for example, Kinematics and Dynamics Library (KDL) of Open Robot Control Systems (OROCOS).
  • Low-level closed-loop robot controllers may be implemented so that the robot 20 follows the desired joint space trajectories and hence the 3D path waypoints on the tissue 4.
  • the subsystem 250 may output a supervised autonomous control command 252 to the low level controller 240, which then converts autonomous control command 252 to supervised autonomous control signal 241 and sends the autonomous control signal 241 to the robot 20.
  • the autonomous control command 252 may instead be sent to a shared control system 230, which may process the autonomous control command 252 and a separate manual control command 212 and apply an allocation function a(t) that defines respective percentages of the manual control command 212 and the supervised autonomous control command 252 that are combined to form a shared control command 232, which is then sent to the low level controller 240, which converts the shared control command 232 to a shared control signal 231 , which is sent to the robot 20 to control the robot 20.
  • the 3D path planning algorithm implemented by the path planner module 1608 may determine a 3D path between a start point and an end point on a point cloud using, for example, PCL in C++.
  • FIG. 17 shows an illustrative 3D graph 1700 of a point cloud 1702, which may be generated by the RGBD camera 1104.
  • the 3D path planning algorithm may generate a path 1704 that connects a defined start point on the point cloud to a defined end point on the point cloud.
  • the point cloud 1702 may be captured by the RGBD camera 1104 and a pass-through filter may be applied to extract the point cloud from a region of interest near the tissue sample 4. Applying the region of interest on the point cloud 1702 may avoid the need for processing the entire raw point cloud and hence may reduce the computation time.
  • a statistical outlier removal (SOR) filter may be applied by the planner module 1608 to reduce the noise in the current point cloud 1702.
  • the SOR filter measures the mean distance μ and standard deviation σ from each point to its k nearest neighbors and rejects points that lie beyond the distance μ + ασ, where α is a multiplier that controls the strictness of the filter.
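A brute-force sketch of a standard statistical outlier removal filter (in the style of the PCL implementation): the per-point mean k-nearest-neighbour distance is thresholded against the global mean plus a multiple of the global standard deviation. The exact parameterization used in the disclosure is not specified.

```python
import numpy as np

def statistical_outlier_removal(cloud, k=8, alpha=1.0):
    """Keep only points whose mean distance to their k nearest neighbours lies
    within (global mean) + alpha * (global standard deviation).
    `cloud` is an (N, 3) array; O(N^2) memory, adequate for a small region of interest."""
    dists = np.linalg.norm(cloud[:, None, :] - cloud[None, :, :], axis=2)
    np.fill_diagonal(dists, np.inf)                      # exclude self-distance
    knn_mean = np.sort(dists, axis=1)[:, :k].mean(axis=1)
    mu, sigma = knn_mean.mean(), knn_mean.std()
    return cloud[knn_mean <= mu + alpha * sigma]
```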
  • a moving least squares (MLS) filter may be applied by the path planner module 1608 to create a smooth distribution of the point cloud 1702 by fitting, for each point, a surface to the original points within a sphere of radius r using a higher-order polynomial, and resampling missing points based on the fitted surface.
  • a mesh is then created by the path planner module 1608 using, for example, Delaunay triangulation among the point cloud.
  • the shortest path between a start point and an end point is then computed using, for example, the Dijkstra algorithm, which determines an optimal path (i.e., shortest distance) if it exists.
  • This process is repeated for each two consecutive projected markers as start and end points of each segment of incision (e.g., as illustrated in FIG. 19).
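A sketch of shortest-path planning between two points of a cloud, using a k-nearest-neighbour graph as a simple stand-in for the Delaunay/mesh connectivity described above, and Dijkstra's algorithm for the shortest path; it assumes the graph is connected between the chosen start and end indices.

```python
import heapq
import numpy as np

def knn_graph(cloud, k=8):
    """Adjacency list over an (N, 3) point cloud: each point is connected to its
    k nearest neighbours, with edge weights equal to Euclidean distance."""
    d = np.linalg.norm(cloud[:, None, :] - cloud[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nbrs = np.argsort(d, axis=1)[:, :k]
    return [[(int(j), float(d[i, j])) for j in nbrs[i]] for i in range(len(cloud))]

def dijkstra_path(adj, start, goal):
    """Indices of the shortest path (by summed edge length) from start to goal."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        cost, u = heapq.heappop(pq)
        if u == goal:
            break
        if cost > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = cost + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:                      # assumes goal is reachable
        node = prev[node]
        path.append(node)
    return path[::-1]

# One segment per pair of consecutive projected marker indices m[0], m[1], ...:
# adj = knn_graph(cloud)
# segments = [dijkstra_path(adj, m[i], m[i + 1]) for i in range(len(m) - 1)]
```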
  • In-frame noise may distort the surface of an object of interest, such as, for example, causing a flat surface to appear bumpy.
  • template matching may reduce and/or eliminate the effect of in-frame noise.
  • Other methods for in-frame noise reduction include smoothing and removing outliers for reducing surface or volume noise. Such techniques may be applied (e.g., by the filter 1610 in conjunction with the path planner 1608) to each measurement of point cloud data.
  • Inter-frame noise occurs in real-time measurements and is related to the slight noisy motion of the point cloud from the previous camera frame to the current one.
  • inter-frame noise may cause a time-varying number of way-points at the output of path planning algorithm (e.g., the output of the path planner module 1608), and/or a noisy motion of these points between the frames.
  • Inter-frame noise may affect autonomous control when performing delicate and precise tasks such as tumor resection with small margins of error.
  • FIG. 18A shows a sequence of frames 1802 that include raw/unfiltered paths generated over time by the path planner module 1608. Each path includes a start point 1804, an end point 1806, and several waypoints 1808. As shown, the waypoints 1808 may experience noisy motion between frames due to inter-frame noise.
  • FIG. 18B shows an individual frame 1810 showing tracked waypoints 1812 that are a subset of the waypoints 1808 that are selected for tracking between frames to counter in-frame noise.
  • a fixed number n (e.g., 4) of waypoints may be selected from the raw path for tracking between frames.
  • a filtering or estimation algorithm may then be applied to the fixed number of way-points n (e.g., by the filter 1610) over time to obtain a filtered path as additional measurements are acquired.
  • a recursive least square (RLS) estimation method may be used to track the waypoints 1812 on the path.
  • FIG. 18C shows, over time, tracked waypoints 1812 and filtered waypoints 1814 generated by the filter 1610 based on the tracked waypoints 1812.
  • a fixed number of candidate waypoints 1812 and their positions on the noisy path (defined as w_i) are first determined, and then a filtering method is applied by the filter 1610.
  • the candidate waypoints 1812 are determined using a waypoint extraction method, and then the candidate waypoints are filtered using a recursive least squares (RLS) method.
  • s ∈ R^3 and e ∈ R^3 are the start and end points of the desired path segment on the point cloud.
  • P_se,k is the current calculated path at the time instant k, with n_k path points in R^3 between s and e, and length l_k.
  • the elements n_k (i.e., the number of waypoints 1808 in FIG. 18A) and l_k will change dynamically depending on the current reading of the noisy point cloud data and how the path planning algorithm detects the trajectory at that time instant.
  • the path will include at least n_min > 0 waypoints (i.e., n_k ≥ n_min for all k ≥ 0).
  • a fixed number of waypoints w_i ∈ R^3, i ∈ {1, ..., n}, are selected from this path, and then tracked and filtered as more measurements of the noisy path P_se,k are collected. If n ≤ n_min waypoints are selected for tracking, a fixed number of points is always used in the filtering algorithm to track their dynamics over time, and n_min may be determined dynamically over time based on the resolution and density of the point cloud obtained from the 3D sensor/camera in the neighborhood of s and e.
  • in order to find the positions of the tracked waypoints w_i, they are equally distributed along P_se,k using the total length l_k (i.e., breaking l_k into n + 1 equal sections). The position of each w_i is determined as the point on the current path P_se,k that is closest to the corresponding location measured from the start point (e.g., at a distance of i·l_k/(n + 1) along the path), as sketched below.
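A sketch of the waypoint extraction step: the noisy path is treated as a polyline, its total length is split into n + 1 equal sections, and the path point closest to each target arc length is taken as a tracked waypoint.

```python
import numpy as np

def extract_waypoints(path_pts, n):
    """Select n tracked waypoints from an (N, 3) polyline path by equal
    distribution along its total arc length l_k."""
    seg_len = np.linalg.norm(np.diff(path_pts, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])   # arc length at each path point
    targets = [(i + 1) * arc[-1] / (n + 1) for i in range(n)]
    idx = [int(np.argmin(np.abs(arc - t))) for t in targets]
    return path_pts[idx]
```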
  • the RLS method, which may be applied after the waypoint extraction method, filters the positions of the waypoints w_i to produce filtered waypoints (i.e., the filtered waypoints 1814 in FIG. 18C) using the noisy measurements of the path.
  • H_k ∈ R^{3n×3n} is the output/measurement matrix
  • y_k ∈ R^{3n×1} is the current measurement of w and is obtained by augmenting the positions of the waypoints w_i detailed above
  • v_k ∈ R^{3n×1} is the measurement noise.
  • the augmented vector of the estimates of w_i at time step k is denoted ŵ_k, and the estimation error is e_k = w_k − ŵ_k.
  • the cost function is the aggregated variance of the estimation errors (i.e., the expected value of e_k^T e_k).
  • K_k ∈ R^{3n×3n} is the estimation gain matrix
  • R_k ∈ R^{3n×3n} is the measurement noise covariance matrix
  • P_k ∈ R^{3n×3n} is the estimation-error covariance matrix.
  • the estimate (e.g., which may correspond to the filtered waypoints 1814) is updated recursively from the previous estimate, the estimation gain K_k, and the current measurement y_k, as sketched below.
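A sketch of the RLS tracking of the stacked waypoint vector, assuming the simplest standard formulation with an identity measurement matrix H_k and constant noise covariance; the disclosure's exact choices for H_k, R_k, and the initialization are not specified here.

```python
import numpy as np

class WaypointRLS:
    """Recursive least-squares filter for the stacked 3n-dimensional waypoint
    vector w; each noisy waypoint measurement y_k refines the estimate w_hat."""

    def __init__(self, w0, p0=1.0, r=0.01):
        self.w_hat = np.asarray(w0, dtype=float).ravel()   # initial estimate (3n,)
        n = self.w_hat.size
        self.P = p0 * np.eye(n)                            # estimation-error covariance P_k
        self.R = r * np.eye(n)                             # measurement noise covariance R_k

    def update(self, y_k):
        y = np.asarray(y_k, dtype=float).ravel()
        H = np.eye(self.w_hat.size)                        # measurement matrix H_k (identity here)
        K = self.P @ H.T @ np.linalg.inv(H @ self.P @ H.T + self.R)   # gain K_k
        self.w_hat = self.w_hat + K @ (y - H @ self.w_hat)            # filtered waypoints
        self.P = (np.eye(self.w_hat.size) - K @ H) @ self.P
        return self.w_hat.reshape(-1, 3)

# Usage (hypothetical): rls = WaypointRLS(waypoints_from_first_frame)
#                       filtered = rls.update(waypoints_from_frame_k)   # one call per camera frame
```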
  • the noisy (i.e. , unfiltered) path 1904 is also shown.
  • the planned incision path may be shifted by about 5 mm below the tissue surface along the z axis of the robot tool direction, which is perpendicular to the tissue, and hence the robot 20 may perform the cut with the desired depth according to Equation 5.
  • it may be assumed that the contact forces with the tissue during electrocautery are negligible and that no disturbances interfere with the robot controllers.
  • FIGS. 20-26 depict various additional features of a RAS system (e.g., the RAS system 1100 of FIG. 11 ), in accordance with another embodiment of the present disclosure.
  • Different confidence indicator identification methods for supervised autonomous control subsystem 250 may be used. More specifically, the accuracies of the NIR marker position estimation and path planning algorithms may be evaluated via an identification pattern that is positioned at different configurations with respect to the camera system and is also subjected to different noises. These criteria may affect the accuracy of the incision paths performed by the autonomous robot controller.
  • FIG. 20 depicts an identification pattern 2002 mounted on arm 22 of the robot 20.
  • a pattern with a known geometry may be used, such as, for example, the identification pattern 2002.
  • This pattern 2002 shown in the present example includes 36 marker wells that are equally spaced at 1-cm horizontal and vertical intervals to form a symmetric grid about the center of the identification pattern.
  • This known geometry is used as the ground truth or baseline to evaluate how the accuracy of the camera system (e.g., cameras 1102 and 1104 of FIGS. 11 and 16) for 3D marker position projection varies as different parameters such as distance to the camera, angular positions, etc., are varied.
  • FIG. 21 shows a graph 2100, which provides an example of evaluating marker projection errors at a 31 cm distance with a -40 degree roll angle and a 0 degree pitch angle. Examples of baseline data and projected marker positions via the camera system are shown. The projection error is calculated using the average 3D distances between the baseline and the corresponding projections.
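A one-function sketch of this error metric, assuming the baseline and projected marker positions are given as matched (N, 3) arrays.

```python
import numpy as np

def marker_projection_error(baseline_xyz, projected_xyz):
    """Average 3D distance between corresponding ground-truth (baseline) marker
    positions and their projections estimated via the camera system."""
    diffs = np.asarray(baseline_xyz, dtype=float) - np.asarray(projected_xyz, dtype=float)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))
```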
  • FIG. 22 depicts a graph 2202 illustrating the effects of changes in roll angle on marker projection error, and a graph 2204 illustrating the corresponding identified confidence indicator.
  • FIG. 23 depicts a graph 2302 illustrating the effects of changes in pitch angle on marker projection error, and a graph 2304 illustrating the corresponding identified confidence indicator.
  • FIG. 24 depicts a graph 2402 illustrating the effects of changes in distance on marker projection error, and a graph 2404 illustrating the corresponding identified confidence indicator.
  • as these parameters deviate from their optimal values, the marker projection error increases.
  • the angles at which the minimum error occurs can take the generic form of r_min for roll and p_min for pitch.
  • the confidence indicators are calculated by inverting and shifting the curve fitted to the error models so that lower errors are associated with higher confidence values.
  • the identification pattern 2002 can also be used for testing the accuracy of the path planning algorithm when the 3D point cloud is locally subjected to noise resulting in low density data.
  • Clinical sources of noise include external or additional light sources used for illuminating the surgical scene for the surgeon. These light sources may cause local or global reflections and point cloud density degradations in real-time data coming from the camera system.
  • FIG. 25A depicts an illustrative desired path planning pattern 2502 overlaid on the identification pattern 2002 of FIG. 20.
  • a symmetric set of 12 paths on the identification pattern between 4 different markers is first determined (i.e. , the desired path planning pattern 2502).
  • a white light source is used to project random external noises on the pattern to disturb the 3D point cloud obtained from the camera system. Data are collected at different angles and distances similar to the pattern configurations described for marker projection error.
  • FIG. 25B shows a 3D graph 2504 that illustrates effects of local noise on the point cloud density and path planning accuracy, and a 3D graph 2506 that illustrates path planning under no external noise.
  • the path planner determines a less-than-optimal path between the markers within the low-density point cloud regions in the presence of local noise.
  • the path planner may determine a more accurate path when noise is minimally present on the identification pattern point cloud.
  • FIG. 26 depicts a graph 2602 illustrating a path planning error model and corresponding effects of changes in point cloud density on path planning error, and a graph 2604 illustrating the corresponding identified confidence indicator.
  • the path planning error exponentially decreases as the point cloud density increases because the path planning algorithm relies on the density of the point cloud to produce accurate paths between the markers.
  • the error model (e.g., the density error model) may be fitted as err = 10.55e^(−899s), where s is the point cloud density, and the confidence indicator C_A4 is obtained by scaling and inverting this error model so that higher density (lower error) yields higher confidence, as sketched below.
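A sketch of the density error model and the derived confidence indicator, using the coefficients printed above; the exact scaling applied to obtain C_A4 in the disclosure is not given, so a simple clipped inversion is assumed.

```python
import numpy as np

def density_error(s, a=10.55, b=899.0):
    """Exponential path planning error model err = a * exp(-b * s),
    where s is the local point cloud density (coefficients as printed above)."""
    return a * np.exp(-b * s)

def density_confidence(s, err_max=10.55):
    """Confidence indicator C_A4: the error model scaled and inverted so that
    higher point cloud density (lower error) yields higher confidence in [0, 1]."""
    return float(np.clip(1.0 - density_error(s) / err_max, 0.0, 1.0))
```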
  • FIG. 27 shows an illustrative block diagram of a multi-criteria confidence-based allocation system 2700 for 3D incision tasks, according to an embodiment of the present disclosure.
  • the system 2700 may include a camera system 2702 (e.g., which may include some or all of the subsystem 250 of FIG. 16), a roll estimator 2704, a pitch estimator 2706, a distance estimator 2708, a density and noise estimator 2710, the confidence indicators (e.g., "confidence models") C_A1, C_A2, C_A3, and C_A4, and the allocation functions 2720.
  • image data captured by the camera system 2702 may be processed (e.g., in parallel) by the estimators 2704-2710 to generate roll, pitch, distance, and point cloud density estimates, respectively, along with corresponding errors.
  • the confidence indicators C_A1, C_A2, C_A3, and C_A4 may be generated based on the roll, pitch, distance, and point cloud density estimates, respectively, along with the corresponding errors.
  • C_A1, C_A2, C_A3, and C_A4 may be used to calculate confidence values based on the errors from the estimators 2704-2710.
  • the confidence indicators C_A1, C_A2, C_A3, and C_A4 may, in combination, correspond to either of the confidence indicators 237 and 238 of FIG. 3B, and the allocation functions 2720 may correspond to at least a portion of the adaptive confidence-based autonomy allocation module 239 of FIG. 3B.
  • the allocation functions 2720 may correspond to the allocation functions 800 shown in FIG. 9. As described in connection with FIG. 9, the allocation functions may include a_x, a_y, a_z, a_roll, a_pitch, and a_yaw (e.g., one allocation function for each translational and rotational degree of freedom of the robot tool).
  • x = w_1·C_A1 + w_2·C_A2 + w_3·C_A3 + w_4·C_A4 is the weighted combination of the confidence indicator outputs.
  • Each translational (x-y-z) and rotational (roll-pitch-yaw) motion of the robot tool can be considered separately and can be assigned an allocation function of the allocation functions 2720 that uses the weighted confidence indicators as input (i.e., weighted by w_1-w_4).
  • a_x, a_y, a_z, a_roll, a_pitch, and a_yaw can be selected from the different forms shown and described in connection with FIG. 9 and may be used (e.g., when controlling the robot 20, the arm 22, and/or the tool 24) to generate shared control commands and/or shared control signals (e.g., shared control command 232 and/or shared control signal 231 of FIGS. 3A and 3B) to minimize the incision error.
  • the allocation functions do not necessarily need to be different and in some cases, similar allocation functions can be used for controlling different degrees of freedom of the robot 20 (e.g., and the corresponding arm 22 and/or tool 24).
  • the allocation functions 2720 may additionally or alternatively include the allocation functions described below in connection with FIG. 28 and Equations 7 and 8.
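A sketch of the per-axis weighting of the four autonomous confidence indicators; the weight values below are purely illustrative, not values from the disclosure.

```python
import numpy as np

def weighted_confidence(c_auto, weights):
    """x = w1*C_A1 + w2*C_A2 + w3*C_A3 + w4*C_A4 for one motion axis."""
    return float(np.dot(weights, c_auto))

c_auto = [0.9, 0.8, 0.7, 0.6]            # hypothetical C_A1..C_A4 values
weights_per_axis = {                      # hypothetical per-axis weights w1..w4
    "x": [0.25, 0.25, 0.25, 0.25],
    "z": [0.10, 0.10, 0.40, 0.40],        # e.g., emphasize distance/density for depth
}
x_per_axis = {axis: weighted_confidence(c_auto, w) for axis, w in weights_per_axis.items()}
```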
  • FIG. 28 shows an illustrative graph 2800 of an allocation function 2802 that is calculated based on both the manual control confidence indicator C_M, described above, and automatic control confidence indicators (e.g., which may include or be calculated based on C_A1-C_A4).
  • the total error for the manual control may be normalized as err_M(t) ∈ [0, 1] and for the autonomous control as err_A(t) ∈ [0, 1].
  • the optimal solution to minimizing err(t) by a choice of a(t) can be found according to Equation 7, which, at each time instant, allocates control entirely to whichever mode has the lower normalized error.
  • Equation 7 may cause noisy/jittery allocations of autonomy, which are not necessarily easy for a human to follow due to sudden and frequent changes.
  • an alternate solution is provided in Equation 8, a smooth, sigmoid-shaped allocation function a(t) of an independent variable x ∈ [x_min, x_max], where x_min and x_max are the lower and upper bounds of the independent variable.
  • b is a bias that determines the value of x at which a(t) transitions, and s is a steepness control parameter.
  • as s becomes large, Equation 8 turns into a step (non-smooth) function.
  • x = C_A − C_M ∈ [−1, 1] may be selected, that is, the difference between the overall confidence in the autonomous control, C_A, and the overall confidence in the manual control, C_M.
  • the upper and lower bounds may be selected as x_min = −1 and x_max = 1, corresponding to the extremes of the confidence difference.
  • the smoothness of the transition between manual and autonomous control is largely affected by s.
  • s may be set to 5 or around 5 for both smooth and fast allocation of autonomy between the manual and autonomous robot controllers.
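A sketch of a smooth allocation of autonomy as a logistic function of the confidence difference x = C_A − C_M, with bias b and steepness s as described above; the logistic form is an assumption consistent with that description (large s approaches a step function, s ≈ 5 gives a smooth but fast transition), not necessarily the exact Equation 8.

```python
import numpy as np

def allocation(c_auto, c_manual, s=5.0, b=0.0):
    """Allocation a(t) in [0, 1] from the confidence difference x = C_A - C_M:
    a(t) -> 0 (mostly autonomous) when autonomous confidence dominates,
    a(t) -> 1 (mostly manual) when manual confidence dominates."""
    x = float(np.clip(c_auto - c_manual, -1.0, 1.0))
    return 1.0 / (1.0 + np.exp(s * (x - b)))

a_auto_strong = allocation(0.9, 0.3)    # ~0.05 -> control mostly autonomous
a_manual_strong = allocation(0.3, 0.9)  # ~0.95 -> control mostly manual
```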
  • FIG. 29 shows a graph 2900 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a high level of skill, resulting in little allocation of autonomous control.
  • FIG. 30 shows a graph 3000 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a moderate level of skill, resulting in moderate allocation of autonomous control.
  • FIG. 31 shows a graph 3100 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a low level of skill, resulting in high allocation of autonomous control.
  • FIG. 32A shows an example of autonomy allocation using the RAS system 1100 of FIG. 11. As shown in the illustrative graph 3200, the shared control strategy smoothly produces a smaller a(t) when the autonomous control is superior to manual control and vice versa.
  • FIG. 32B shows a graph 3202 illustrating autonomous and manual commands sent to the 2nd joint of the robot 20 and the overall shared output applied to the 2nd joint.
  • FIG. 32C shows a graph 3204 illustrating autonomous and manual commands sent to the 6th joint of the robot 20 and the overall shared output applied to the 6th joint.
  • FIG. 33 shows an illustrative user interface 3300 that may be displayed (e.g., via the monitor 1304 of FIG. 13) to the operator and that includes an indicator 3302 depicting, in real-time or near-real-time, how much control the operator has over the robot (e.g., compared to the percentage of control that is being handled autonomously by the system 250).
  • FIGS. 34 and 35 depict graphical user interfaces for the RAS system 1100 (e.g., which may be displayed on the monitor 1304 of FIG. 13), in accordance with embodiments of the present disclosure. These GUIs are depicted for a suturing procedure.
  • GUI 3402 depicts a video image of a task space (e.g., task space 2 of FIG. 16) and unidimensional control mode indicator 3404.
  • a graphical user interface (GUI) 3406 depicts a video image of a task space (e.g., task space 2 of FIG. 16) and multidimensional control mode indicator 3408.
  • In FIG. 35, a graphical user interface (GUI) 3500 depicts a visual video image 3504 of a task space (e.g., task space 2, FIG. 16), a NIR video image 3506 of the task space, and a procedure and control mode indicator 3502.
  • Ranges of values and/or numeric values are provided herein as examples only, and do not constitute a limitation on the scope of the described embodiments.
  • the use of any and all examples, or exemplary language (“e.g.,”“such as,”“for example,” or the like) provided herein, is intended merely to better illuminate the embodiments and does not pose a limitation on the scope of the embodiments. No language in the specification should be construed as indicating any unclaimed element as essential to the practice of the embodiments.

Abstract

The present disclosure provides a system and method for controlling an articulating member including a tool. The system may include a dual camera system that captures near-infrared (NIR) images and point cloud images of a tissue or other substance that includes NIR markers. The system may generate a three-dimensional (3D) path based on identified positions of the NIR markers, may filter the generated path, and may generate a 3D trajectory for controlling the articulated arm of a robot having a tool to create an incision along the filtered path. In a shared control mode, an operator may generate manual control commands for the robot to guide the tool along such a path, while automated control commands are generated in parallel. One or more allocation functions may be calculated based on calculated manual and automated error models, and shared control signals may be generated based on the allocation functions.

Description

CONFIDENCE-BASED ROBOTICALLY-ASSISTED SURGERY SYSTEM
PRIORITY CLAIM
[001] The present application claims priority to U.S. Provisional Application No. 62/848979, filed May 16, 2019 and U.S. Provisional Application No. 62/907872, filed September 30, 2019, the contents of which are incorporated herein by reference in their entirety.
GOVERNMENT SUPPORT
[002] This invention was made with government support under
R01 EB020610 and R21 EB024707 awarded by NIH. The government has certain rights in the invention.
TECHNICAL FIELD
[003] The present disclosure relates to a system. More particularly, the present disclosure relates to a robotically-assisted surgery (RAS) system.
BACKGROUND
[004] The field of medical robotics has dramatically evolved over the past two decades due to advances in robotic and camera technology. Many RAS systems are based on tele-operation (i.e. , remote operation or operation at a distance), and include robotic arms or similar equipment, cameras, highly dexterous surgical tools, etc. Many RAS systems provide a minimally invasive surgery (MIS) approach, which can be faster, safer and require less patient recovery time.
Additionally, a RAS system can reduce human errors and improve patient outcomes by leveraging robotic accuracy and repeatability during certain surgical procedures. However, the degree of interaction between RAS systems and human operators has not been found to be optimal. For example, a completely autonomous RAS system has not been feasible for many surgical situations, procedures and environments. Therefore, a need exists for an RAS system that optimizes the amount of
autonomous and manual interaction with an operator.
SUMMARY
[005] In an example embodiment, a system may include a camera system that includes a first camera and a second camera, an articulating member that includes a tool, and a computer. The computer may include at least one processor and a non-transitory memory configured to store computer-readable instructions which, when executed, cause the at least one processor to, receive image data from the first camera, receive point cloud image data from the second camera, wherein the image data and the point cloud image data correspond to a tissue on which markers are disposed, identify marker positions of the markers based on the image data and the point cloud image data, generate a path between a first point on the point cloud and a second point on the point cloud based at least on the marker positions, filter the path, receive real-time position data corresponding to the articulating member, generate a three-dimensional (3D) trajectory based on the filtered path and the real-time position data, generate control commands based on the 3D trajectory, and control the articulating member and the tool to follow the 3D trajectory based on the control commands.
[006] In some embodiments, the tool may include an electrocautery tool. The computer-readable instructions which cause the at least one processor to control the articulating member and the tool may further cause the electrocautery tool to cut the tissue along the path.
[007] In some embodiments, the first camera may include a near-infrared (NIR) camera, the second camera may include a red-blue-green-depth (RGBD) camera, the image data may include NIR image data, and the markers may include NIR markers.
[008] In some embodiments, the computer-readable instructions which cause the at least one processor to generate the path may further cause the at least one processor to identify projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data, and generate reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separate from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
[009] In some embodiments, the computer-readable instructions which cause the at least one processor to filter the path may further cause the at least one processor to select tracked waypoints as a subset of the reference waypoints, and generate filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
[010] In some embodiments, the filtering algorithm may be selected from the group consisting of a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
[011] In some embodiments, the computer-readable instructions, when executed, may further cause the at least one processor to calculate at least one autonomous confidence indicator based on autonomous incision error, calculate a manual confidence indicator based on manual incision error, generate at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator, and generate the control commands based on the at least one allocation function.
[012] In some embodiments, the at least one autonomous confidence indicator may be selected from the group consisting of a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error. The at least one allocation function may include a plurality of allocation functions corresponding to movement of the articulating member in three-dimensional directions, and roll, pitch, and yaw of the articulating member.
[013] In an example embodiment, a method may include steps for generating image data and point cloud image data corresponding to a region of interest on which markers are disposed, identifying marker positions of the markers based on the image data and the point cloud image data, generating a path between a first point of the point cloud image data and a second point of the point cloud image data, based at least on the marker positions, receiving real-time position data corresponding to an articulating member, generating a three-dimensional (3D) trajectory for the articulating member based on the path and the real-time position data, generating control commands based on the 3D trajectory, and controlling the articulating member to follow the 3D trajectory based on the control commands.
[014] In some embodiments, the articulating member may include a robotic arm, and controlling the articulating member may include causing the robotic arm to cut tissue in the region of interest along the path.
[015] In some embodiments, the step of generating the path may include identifying projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data, and generating reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separate from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
[016] In some embodiments, the step of filtering the path may include selecting tracked waypoints as a subset of the reference waypoints, and generating filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
[017] In some embodiments, the filtering algorithm may be selected from the group consisting of: a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
[018] In some embodiments, the method may further include steps for calculating at least one autonomous confidence indicator based on autonomous incision error, calculating a manual confidence indicator based on manual incision error, generating at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator, and generating the control commands based on the at least one allocation function.
[019] In some embodiments, the at least one autonomous confidence indicator may include at least one confidence indicator selected from a group consisting of a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error. In some embodiments, the at least one allocation function comprises a plurality of allocation functions corresponding to movement of the articulating member in three-dimensional directions, and roll, pitch, and yaw of the articulating member.
[020] In some embodiments, the image data may include near-infrared (NIR) image data, and the markers may include NIR markers.
BRIEF DESCRIPTION OF THE DRAWINGS
[021] FIG. 1 depicts a schematic diagram of a RAS system, in accordance with an embodiment of the present disclosure.
[022] FIG. 2 depicts a block diagram of the RAS system depicted in FIG. 1 , in accordance with an embodiment of the present disclosure.
[023] FIG. 3A depicts a block diagram of a shared control system, in accordance with an embodiment of the present disclosure.
[024] FIG. 3B depicts a block diagram of a shared control subsystem, in accordance with an embodiment of the present disclosure.
[025] FIG. 3C depicts a block diagram of a manual control subsystem, in accordance with an embodiment of the present disclosure.
[026] FIG. 3D depicts a block diagram of an autonomous control subsystem, in accordance with an embodiment of the present disclosure.
[027] FIG. 4 depicts a graphical user interface for a shared control system, in accordance with an embodiment of the present disclosure.
[028] FIG. 5 illustrates a series of tissue samples, in accordance with an embodiment of the present disclosure.
[029] FIG. 6 depicts average tracking error graphs for tissue samples, in accordance with an embodiment of the present disclosure.
[030] FIG. 7 depicts normalized tracking error graphs, in accordance with an embodiment of the present disclosure.
[031] FIG. 8 depicts a confidence indicator graph and an allocation function graph, in accordance with an embodiment of the present disclosure.
[032] FIG. 9 depicts several allocation functions, in accordance with embodiments of the present disclosure.
[033] FIGS. 10A and 10B present flow diagrams depicting at least some of the functionality of the shared control module depicted in FIG. 2, in accordance with embodiments of the present disclosure.
[034] FIG. 11 shows an illustrative RAS system having a dual camera system, in accordance with embodiments of the present disclosure.
[035] FIG. 12A shows a perspective view of a testbed of the RAS system of FIG. 11 , in accordance with embodiments of the present disclosure.
[036] FIG. 12B shows an illustrative image of sample tissue and near- infrared (NIR) markers captured by an NIR camera of the RAS system, in
accordance with embodiments of the present disclosure.
[037] FIG. 12C shows an illustrative point cloud image of the tissue sample captured by a RGBD camera of the RAS system with positions of NIR markers overlaid on the point cloud image, in accordance with embodiments of the present disclosure.
[038] FIG. 13 shows illustrative system components which may be used in connection with a manual control mode of the RAS system, in accordance with embodiments of the present disclosure.
[039] FIG. 14A shows illustrative overlays 1400 and 1410 corresponding to an exemplary manual cutting task that may be performed using the RAS system of FIG. 11 , in accordance with embodiments of the present disclosure.
[040] FIG. 14B shows an illustrative comparison between a desired incision path and an actual incision path, which may be used to evaluate error following the cutting task, in accordance with embodiments of the present disclosure.
[041] FIG. 15A shows an illustrative side-view of a tissue sample following the exemplary manual cutting task, in accordance with embodiments of the present disclosure.
[042] FIG. 15B shows an illustrative comparison of upper and lower edges of the cut portion of the tissue shown in FIG. 15A, in accordance with embodiments of the present disclosure.
[043] FIG. 16 shows an illustrative block diagram corresponding to a portion of the RAS system of FIG. 11, including a supervised autonomous control subsystem and low level controller, in accordance with embodiments of the present disclosure.
[044] FIG. 17 shows an illustrative graph of a point cloud that may be captured with the RGBD camera of the RAS system of FIG. 11 that includes a path generated for cutting between a start point and an end point on the point cloud, in accordance with embodiments of the present disclosure.
[045] FIG. 18A shows an illustrative sequence of frames that include a raw, unfiltered path that may be generated by a path planner of a supervised autonomous control subsystem, in accordance with embodiments of the present disclosure.
[046] FIG. 18B shows an illustrative frame that includes tracked waypoints of the raw, unfiltered path, in accordance with embodiments of the present disclosure.
[047] FIG. 18C shows an illustrative sequence of frames that include the tracked waypoints and filtered waypoints that may be output by a filter of the supervised autonomous control system, in accordance with embodiments of the present disclosure.
[048] FIG. 19 shows an illustrative example of a series of paths, waypoints, and corresponding NIR markers overlaid on a point cloud, in accordance with embodiments of the present disclosure.
[049] FIG. 20 shows an illustrative identification pattern that may be used to assess accuracy of a 3D NIR marker projection method that may be performed by the RAS system, in accordance with an embodiment.
[050] FIG. 21 shows an illustrative graph that provides an example of evaluating marker projection errors, in accordance with embodiments of the present disclosure.
[051] FIG. 22 shows an illustrative graph that provides an example of the effects of changes in roll angle on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
[052] FIG. 23 shows an illustrative graph that provides an example of the effects of changes in pitch angle on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
[053] FIG. 24 shows an illustrative graph that provides an example of the effects of changes in distance on marker projection error and an illustrative graph of the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
[054] FIG. 25A shows an illustrative desired path planning pattern overlaid on an identification pattern, in accordance with embodiments of the present disclosure.
[055] FIG. 25B shows an illustrative 3D graph demonstrating effects of local noise on point cloud density and path planning accuracy, and an illustrative 3D graph illustrating path planning under no external noise, in accordance with embodiments of the present disclosure.
[056] FIG. 26 shows an illustrative graph depicting a path planning error model and corresponding effects of changes in point cloud density on path planning error, and an illustrative graph depicting the corresponding confidence indicator (model), in accordance with embodiments of the present disclosure.
[057] FIG. 27 shows an illustrative system by which confidence indicators may be generated for roll, pitch, distance, and point cloud density, and
corresponding allocation functions may be generated based thereon, in accordance with embodiments of the present disclosure.
[058] FIG. 28 shows an illustrative graph depicting an allocation function that is calculated based on manual and automatic control confidence indicators, in accordance with embodiments of the present disclosure.
[059] FIG. 29 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a high level of skill, in accordance with embodiments of the present disclosure. [060] FIG. 30 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a moderate level of skill, in accordance with embodiments of the present disclosure.
[061] FIG. 31 shows an illustrative graph depicting autonomy allocation for time-varying confidence in autonomous and manual control, where the manually controlling operator has a low level of skill, in accordance with embodiments of the present disclosure.
[062] FIG. 32A shows an illustrative graph depicting an allocation function compared to automatic and manual confidence indicators, in accordance with some embodiments of the present disclosure.
[063] FIG. 32B shows an illustrative graph depicting a shared output for controlling a second joint of a robot of an RAS system compared to autonomous and manual control signals based on which the shared output is generated, in
accordance with embodiments of the present disclosure.
[064] FIG. 32C shows an illustrative graph depicting a shared output for controlling a sixth joint of a robot of an RAS system compared to autonomous and manual control signals based on which the shared output is generated, in
accordance with embodiments of the present disclosure.
[065] FIG. 33 illustrates a graphical user interface that includes an indicator of a level of shared control of the articulated member of a robot, in accordance with embodiments of the present disclosure.
[066] FIG. 34 illustrates a graphical user interface that includes manual control indicators for a RAS system, in accordance with embodiments of the present disclosure.
[067] FIG. 35 illustrates a graphical user interface that includes regular and NIR video of a task space along with a procedure and control mode indicator corresponding to an RAS system, in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[068] Embodiments of the present disclosure will now be described with reference to the drawing figures, in which like reference numerals refer to like parts throughout.
[069] Embodiments of the present disclosure advantageously improve both RAS system efficiency and patient outcomes by combining the best features of automation with the complementary skills of the surgeon operating the RAS system. While automation of the RAS system may provide greater accuracy and repeatability in certain surgical situations, automation is not infallible and safe operation requires surgeon supervision and possible intervention. Accordingly, the present disclosure provides a control system that allows surgical procedures to be performed
collaboratively between robot and surgeon with the highest possible degree of autonomy, while ensuring safe operation at all times.
[070] More particularly, embodiments of the present disclosure provide a confidence-based shared control system that provides an automated control allocation during a surgical task, situation, procedure, etc. Importantly, the confidence-based shared control system improves the surgical performance of any surgeon by reducing not only the overall error committed by the surgeon, but also the workload of the surgeon during the task.
[071] FIG. 1 depicts a schematic diagram of RAS system 10, in accordance with an embodiment of the present disclosure.
[072] RAS system 10 includes computer 100 coupled to robot 20, input device 30, camera 40 and display 50. Tissue 4 may include one or more tissue samples, a region of interest of a patient, etc. Robot 20 includes articulated member or arm 22 and tool 24. Generally, tool 24 is an extension of arm 22, and may be, for example, a surgical tool, an electro-surgical tool, a laser, etc. The movement of tool 24 is controlled by commands to robot 20. Input device 30 includes stylus 32 and one or more switches or buttons 34. Computer 100 may also be coupled to network 60, which may include one or more local area networks, wide area networks, the Internet, etc.
[073] In one embodiment, robot 20 is a Smart Tissue Autonomous Robot (STAR) that includes a KUKA LBR iiwa robot with a 7-DOF (degree of freedom) lightweight arm 22 and a surgical tool 24. Robot 20 receives control commands or signals from computer 100, and sends positional information for arm 22 to computer 100. The control commands or signals may include one or more of the following types of data: position, velocity, acceleration, force, torque, etc.
[074] In one embodiment, surgical tool 24 is an electro-cautery tool that is based on a 2-DOF laparoscopic grasper Radius T manufactured by Tuebingen Scientific. Electro-cautery tool 24 includes a shaft, a quick release interface that is electrically isolated from the shaft, and two conductors, disposed within the center of electro-cautery tool 24, that are electrically coupled to an electro-surgical generator (ESG) (not depicted for clarity). In operation, a needle electrode is inserted into the quick-release interface, and a cutting waveform is selected on the ESG. When the surgeon activates an input control for the ESG, such as, for example, a foot pedal, a button or switch, etc., the ESG receives a control signal. In response, the ESG generates an electrical signal representing the cutting waveform, and then sends the electrical signal to the needle electrode. A grounding pad, disposed underneath the tissue sample, patient, etc. in task space 2, is coupled to the ESG to complete the electrical circuit. The electrical signal vaporizes tissue in contact with the electrode, thereby cutting the tissue. Alternatively, computer 100 may receive the ESG control signal from input device 30, and then send the ESG control signal to the ESG. For example, input device 30 may include a button or switch that is mapped to the ESG control signal. Alternatively, input device 30 may be coupled to the ESG and provide the ESG control signal directly thereto.
[075] Other embodiments of robot 20, including different arms 22 and tools 24, are also contemplated, such as, for example, a motorized suturing device, etc.
[076] In one embodiment, input device 30 is a 6-DOF Sensable
Technologies Phantom Omni haptic device 30 that allows the surgeon to manually control robot 20. In this embodiment, haptic device 30 sends positional information for stylus 32 and commands received through buttons 34 to computer 100, and may receive haptic feedback from computer 100. If haptic feedback is provided, haptic device 30 includes one or more haptic actuators that render the haptic feedback to the surgeon. Haptic feedback may include force, vibration, motion, texture, etc. Other embodiments of input device 30 are also contemplated.
[077] In one embodiment, camera 40 is a Point Grey Chameleon RGB (red green blue) camera. Camera 40 sends image data to computer 100 that provide visual feedback to the surgeon and input data for the autonomous control mode discussed below. Other embodiments of camera 40 are also contemplated.
[078] FIG. 2 depicts a block diagram of RAS system 10 depicted in FIG. 1 , in accordance with an embodiment of the present disclosure.
[079] Computer 100 includes bus 110, processor 120, memory 130, I/O interfaces 140, display interface 150, and one or more communication interfaces 160. Generally, I/O interfaces 140 are coupled to I/O devices 142 using a wired or wireless connection, display interface 150 is coupled to display 50, and
communication interface 160 is connected to network 60 using a wired or wireless connection.
[080] Bus 110 is a communication system that transfers data between processor 120, memory 130, I/O interfaces 140, display interface 150, and
communication interface 160, as well as other components not depicted in FIG. 1. Power connector 112 is coupled to bus 110 and a power supply (not shown).
[081] Processor 120 includes one or more general-purpose or application- specific microprocessors to perform computation and control functions for
computer 100. Processor 120 may include a single integrated circuit, such as a micro-processing device, or multiple integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of processor 120. In addition, processor 120 may execute computer programs or modules, such as operating system 132, shared control module 134, other software modules 136, etc., stored within memory 130.
[082] Memory 130 stores information and instructions for execution by processor 120. Generally, memory 130 may include a variety of non-transitory computer-readable media that may be accessed by processor 120. In various embodiments, memory 130 may include volatile and nonvolatile media, non-removable media and/or removable media. For example, memory 130 may include any combination of random access memory (“RAM”), dynamic RAM (DRAM), static RAM (SRAM), read only memory (“ROM”), flash memory, cache memory, and/or any other type of non-transitory computer-readable medium.
[083] Memory 130 contains various components for retrieving, presenting, modifying, and storing data. For example, memory 130 stores software modules that provide functionality when executed by processor 120. The software modules include an operating system 132 that provides operating system functionality for computer 100. The software modules also include shared control module 134 that provides functionality for controlling robot 20. In certain embodiments, shared control module 134 may include a plurality of modules, each module providing specific individual functionality for controlling robot 20. Other software modules 136 may cooperate with shared control module 134 to provide functionality for controlling robot 20, such as planning algorithms, robot controllers, computer vision, control allocation strategies, etc.
[084] In certain embodiments, other software modules 136 may include a Robot Operating System (ROS), which provides a flexible collection of tools, libraries, device drivers, such as robot device drivers, sensor device drivers, etc., conventions, etc. For example, other software modules 136 may include an
OpenCV (Open Source Computer Vision) library that provides a common
infrastructure for computer vision applications, one or more Reflexxes Motion
Libraries that provide instantaneous trajectory generation capabilities for motion control systems, a Kinematics and Dynamics Library (KDL) in Open Robot Control Systems (OROCOS) that provides an application independent framework for modelling and computation of kinematic chains for robots, etc.
[085] Data 138 may include data associated with operating system 132, shared control module 134, other software modules 136, etc.
[086] I/O interfaces 140 are configured to transmit and/or receive data from I/O devices 142. I/O interfaces 140 enable connectivity between processor 120 and I/O devices 142 by encoding data to be sent from processor 120 to I/O devices 142, and decoding data received from I/O devices 142 for processor 120. Generally, data may be sent over wired and/or wireless connections. For example, I/O
interfaces 140 may include one or more wired communications interfaces, such as USB, Ethernet, etc., and/or one or more wireless communications interfaces, coupled to one or more antennas, such as WiFi, Bluetooth, cellular, etc.
[087] Generally, I/O devices 142 provide input to computer 100 and/or output from computer 100. As discussed above, I/O devices 142 are operably connected to computer 100 using either a wireless connection or a wired connection. I/O devices 142 may include a local processor coupled to a communication interface that is configured to communicate with computer 100 using the wired or wireless connection. For example, I/O devices 142 include robot 20, input device 30, camera 40, and may include other devices, such as a joystick, keyboard, mouse, touch pad, etc.
[088] Display interface 150 is configured to transmit image data from computer 100 to monitor or display 50.
[089] Communication interface 160 is configured to transmit data to and from network 60 using one or more wired or wireless connections. Network 60 may include one or more local area networks, wide area networks, the Internet, etc., which may execute various network protocols, such as, for example, wired and wireless Ethernet, Bluetooth, etc. Network 60 may also include various
combinations of wired and/or wireless physical layers, such as, for example, copper wire or coaxial cable networks, fiber optic networks, Bluetooth wireless networks, WiFi wireless networks, CDMA, FDMA and TDMA cellular wireless networks, etc.
[090] FIG. 3A depicts a block diagram of shared control system 200, in accordance with an embodiment of the present disclosure. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
[091] Generally, shared control system 200 performs complex surgical procedures collaboratively between robot 20 and the surgeon with the highest possible degree of autonomy, while ensuring safe operation at all times. In one sense, shared control system 200 is “self-aware” of the limitations of its automation capabilities.
[092] Shared control system 200 includes manual control subsystem 210, autonomous control subsystem 220, a shared control subsystem 230, and a supervised autonomous control subsystem 250 (e.g., described below in connection with FIG. 16). Also depicted in FIG. 3A is task space 2 including robot 20 and tissue 4. Tissue 4 may be one or more tissue samples, a region of interest of a patient, etc. Manual control subsystem 210 generates manual control command 212, which is input to shared control subsystem 230. Autonomous control subsystem 220 generates autonomous control command 222, which is input to shared control subsystem 230. Additionally or alternatively, the supervised autonomous control subsystem 250 may generate a supervised autonomous control command 252. Shared control subsystem 230 generates shared control command 232.
[093] In the embodiment depicted in FIG. 3A, shared control command 232 is input to low level controller 240, which converts shared control command 232 to robot-specific control signal 231. Robot-specific control signal 231 is then sent to robot 20. For the embodiment including the KUKA LBR iiwa robot described above, low level controller 240 is a software module that is specific to this robot, such as the IIWA (Intelligent Industrial Work Assistant) Stack. In other embodiments, shared control command 232 may be sent directly to robot 20, which converts shared control command 232 to the appropriate robot-specific control signal.
[094] Shared control subsystem 230 generates shared control command 232 according to the Equation 1 :
U(t) = a(t)·M(t) + (1 - a(t))·A(t) (1)
[095] In Equation 1, manual control commands from the surgeon, M(t), are combined with autonomous control commands, A(t), using complementary scales a(t) ∈ [0, 1] and 1 - a(t), respectively, to form the shared control command to the robot, U(t). The allocation function a(t) defines the respective percentages of the manual control command M(t) and the autonomous control command A(t) that are combined to form the shared control command U(t). The allocation function a(t) defines these percentages with respect to an independent variable x that reflects or indicates certain performance criteria for the shared control subsystem 230. With respect to FIG. 3A, manual control command 212 represents M(t), either the autonomous control command 222 or the supervised autonomous control command 252 may represent A(t), and shared control command 232 represents U(t).
[096] When a(t) is 0, the allocation function selects the autonomous control command as the shared control command. In other words, the shared control command is not influenced by the manual control command when a(t) is 0. Conversely, when a(t) is 1, the allocation function selects the manual control command as the shared control command. In other words, the shared control command is not influenced by the autonomous control command when a(t) is 1. When a(t) is a number between 0 and 1, the allocation function blends or combines the manual control command and the autonomous control command, based on the value of the allocation function, to generate the shared control command.
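For illustration only, the blending of Equation 1 can be sketched in a few lines of C++. The Command structure, the three-axis command representation, and the clamping of a(t) to [0, 1] are assumptions made for this sketch and are not part of the disclosed embodiments.

```cpp
// Minimal sketch (not the patented implementation) of Equation 1:
// U(t) = a(t) * M(t) + (1 - a(t)) * A(t).
#include <algorithm>
#include <array>
#include <cstddef>

struct Command {
    std::array<double, 3> v;  // Cartesian velocity command (x, y, z); illustrative representation
};

Command blendCommands(const Command& manual, const Command& autonomous, double a)
{
    a = std::clamp(a, 0.0, 1.0);  // keep the allocation within [0, 1]
    Command shared{};
    for (std::size_t i = 0; i < shared.v.size(); ++i) {
        shared.v[i] = a * manual.v[i] + (1.0 - a) * autonomous.v[i];
    }
    return shared;
}
```

With a = 0 the function returns the autonomous command unchanged, with a = 1 it returns the manual command, and intermediate values blend the two, mirroring the behavior described above.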
[097] Generally, the allocation function a(t) changes dynamically during the task and is a function of the independent variable x. Referring to FIG. 9, several allocation functions 800 are depicted, in accordance with embodiments of the present disclosure. Allocation function 802 is a function of tracking accuracy.
Allocation function 804 is a function of proximity to obstacles and/or desired locations. Allocation function 806 is a function of the accuracy of predicting human intentions in controlling the robot. Allocation function 808 is a function of the level of manipulation precision. Allocation function 810 is a fixed function and does not change based on the performance criteria. Generally, performance criteria determine the confidence and hence the allocation function, which is task
dependent. Allocation function 812 is a function of trust in the manual and/or autonomous control subsystems, and, more particularly, allocation function 812 is a function of the confidence in the manual and/or autonomous control subsystems and their dynamic uncertainties.
[098] Generation of this confidence-based allocation function a(t) requires identification tests for both manual and autonomous control modes to reveal their respective strengths and weaknesses, and is described in more detail below. The factors affecting manual control mode performance include the angle of camera 40 and the dissimilarities between the kinematics of haptic device 30 and robot 20. The factors affecting autonomous control mode performance include random failures in detecting the desired cutting trajectory as well as any imprecision in the calculation of tool 24 location via the robot kinematic chain.
[099] FIG. 3B depicts a block diagram of shared control subsystem 230, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3B is low level controller 240 which converts shared control command 232 into shared control signal 231. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
[0100] Scale function 233 applies the allocation function a(t) to manual control command 212, and scale function 234 applies the allocation function a(t) to autonomous control command 222 or the supervised autonomous control command 252. The scaled commands are then combined to form shared control command 232.
[0101 ] Generation of the allocation function a(t) is performed by an adaptive confidence-based autonomy allocation module 239, based on manual confidence indicator 237 and autonomous confidence indicator 238. Manual confidence indicator 237 is determined based on manual tracking error data 235 that is acquired when processor 120 is operating in a manual control mode during performance of a predetermined task using tool 24. Manual tracking error data 235 are associated with the trajectory of tool 24 during performance of the predetermined task.
Similarly, autonomous confidence indicator 238 is determined based on autonomous tracking error data 236 that are acquired when processor 120 is operating in an autonomous control mode during performance of the predetermined task using tool 24. The autonomous tracking error data 236 are associated with the trajectory of tool 24 during performance of the predetermined task. Performance of the predetermined task in manual control mode and autonomous control mode, in order to determine the manual and autonomous confidence indicators 237, 238, respectively, represents the identification tests noted above. This process is discussed in more detail below.
[0102] FIG. 3C depicts a block diagram of manual control subsystem 210, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3C are task space 2 including robot 20 and tissue 4, and low level controller 240 which converts manual control command 212 into manual control signal 211. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
[0103] To perform a predetermined task in manual control mode, real-time video images from camera 40 are presented on display 50, and the surgeon plans the desired tool trajectory based on a reference trajectory inscribed on tissue 4, such as, for example, a circular pattern cut, and then follows the desired tool trajectory using haptic device 30. The position feedback from robot 20 and the position commands from haptic device 30 are used to determine reference positions of robot 20 in task space 2. In one embodiment, the initial position of robot 20 when the predetermined task starts is identified, and new reference positions read from the displacement of haptic device 30 are added to produce the final position of robot 20 in the Cartesian task-space. Inverse kinematics are applied to generate manual control command 212 in joint-space, and low level controller 240 then converts manual control command 212 to manual control signal 211. The manual control signal 211 is then sent to robot 20 over the appropriate I/O interface 140. In an alternative embodiment, the manual control command 212 is sent to robot 20 over the appropriate I/O interface 140, which processes the command as necessary.
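A minimal sketch of the displacement mapping described above is given below, assuming positions are represented as simple 3-vectors and that a single scale factor (the hypothetical kMotionScale) relates stylus displacement to robot displacement; the inverse kinematics and low level controller stages are omitted.

```cpp
// Minimal sketch: Cartesian reference position of the robot in manual control mode,
// formed from the initial robot position plus the scaled haptic stylus displacement.
#include <array>
#include <cstddef>

using Vec3 = std::array<double, 3>;

constexpr double kMotionScale = 1.0;  // stylus-to-robot displacement scaling (assumed value)

Vec3 manualReferencePosition(const Vec3& robotInitial,
                             const Vec3& stylusInitial,
                             const Vec3& stylusCurrent)
{
    Vec3 target{};
    for (std::size_t i = 0; i < 3; ++i) {
        target[i] = robotInitial[i] + kMotionScale * (stylusCurrent[i] - stylusInitial[i]);
    }
    return target;  // inverse kinematics would then convert this to joint-space commands
}
```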
[0104] FIG. 3D depicts a block diagram of autonomous control subsystem 220, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 3D are task space 2 including robot 20 and tissue 4, and low level controller 240 which converts autonomous control command 222 into autonomous control signal 221. The functionality represented by this block diagram is provided by one or more software modules including shared control module 134, other software modules 136, etc.
[0105] To perform a predetermined task in autonomous control mode, real- time video frames from camera 40 are processed to detect a reference trajectory inscribed on tissue 4, such as, for example, a circular pattern cut. Edge and contour detection algorithms in OpenCV are used to detect the reference cutting trajectory. Then, the reference trajectory is converted from the image frame to the Cartesian robot frame using a homography transformation. The resulting reference and the real-time positions of robot 20 are used in the trajectory generator and planner to produce multiple equidistant waypoints for the desired trajectory starting from the closest point on the desired trajectory to robot 20. Smooth, time-based desired trajectory segments are produced between the waypoints using, for example, Reflexxes Motion Libraries. Kinematics and Dynamics Library (KDL) in Open Robot Control Systems (OROCOS) may be used, for example, to transform the task-space trajectories of robot 20 to the joint-space trajectories and generate autonomous control command 222. Low level controller 240 then converts autonomous control command 222 to autonomous control signal 221.
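The detection and homography steps above could be sketched with standard OpenCV calls as follows; the Canny thresholds, the largest-contour heuristic, and the precomputed homography H are assumptions for this sketch rather than details taken from the disclosure.

```cpp
// Minimal sketch: detect an inked reference trajectory and map it to the robot X-Y plane.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

std::vector<cv::Point2f> detectReferenceTrajectory(const cv::Mat& bgrFrame,
                                                   const cv::Mat& H /* 3x3 homography, CV_64F */)
{
    cv::Mat gray, edges;
    cv::cvtColor(bgrFrame, gray, cv::COLOR_BGR2GRAY);
    cv::GaussianBlur(gray, gray, cv::Size(5, 5), 0.0);
    cv::Canny(gray, edges, 50.0, 150.0);          // illustrative edge thresholds

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(edges, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_NONE);
    if (contours.empty()) return {};

    // Keep the longest contour as the inscribed reference pattern (assumption).
    std::size_t best = 0;
    for (std::size_t i = 1; i < contours.size(); ++i)
        if (contours[i].size() > contours[best].size()) best = i;

    std::vector<cv::Point2f> imagePts;
    imagePts.reserve(contours[best].size());
    for (const cv::Point& p : contours[best])
        imagePts.emplace_back(static_cast<float>(p.x), static_cast<float>(p.y));

    std::vector<cv::Point2f> robotPts;
    cv::perspectiveTransform(imagePts, robotPts, H);  // image frame -> robot X-Y frame
    return robotPts;
}
```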
[0106] FIG. 4 depicts a graphical user interface 300 for shared control system 200, in accordance with an embodiment of the present disclosure.
[0107] Graphical user interface (GUI) 300 depicts a video image of tissue 4 within task space 2, with reference trajectory 310 for the predetermined task inscribed thereon. GUI 300 also overlays a computer-generated image depicting desired trajectory 320 for the autonomous control mode, one or more suggested autonomous control mode regions 330, one or more suggested manual control mode regions 340, and control mode indicator 350. Suggested autonomous control mode regions 330 and suggested manual control mode regions 340 are determined based on the allocation function a(t). In certain embodiments, the shared control mode automatically switches between autonomous control mode and manual control mode based on the allocation function a(t) during the performance of the predetermined task. In other embodiments, the surgeon manually switches between the control modes, using haptic device 30, during the performance of the predetermined task.
[0108] FIG. 5 illustrates a series of tissue samples 400, in accordance with an embodiment of the present disclosure.
[0109] As discussed above, in order to determine manual confidence indicator 237 and autonomous confidence indicator 238, a predetermined task is first performed on different tissue samples in both manual control mode and autonomous control mode. In certain embodiments, the predetermined task is a circular pattern cut; other surgical procedures are also contemplated. Tissue samples 400 include tissue sample 410 without pseudo-blood occlusions and with reference trajectory 412, tissue sample 420 with a small pseudo-blood occlusion and reference trajectory 422, tissue sample 430 with a medium pseudo-blood occlusion and reference trajectory 432, tissue sample 440 with a large pseudo-blood occlusion and reference trajectory 442, tissue sample 450 with pseudo-blood occlusions of different sizes and reference trajectory 452, and tissue sample 460 with symmetric, medium pseudo-blood occlusions and reference trajectory 462.
[0110] In certain embodiments, a laser pointer is attached to tool 24 and used to project a laser dot on tissue samples 400. Performance of the circular cut pattern on tissue samples 400 using a laser pointer attached to tool 24 sufficiently identifies the tracking accuracy of the autonomous and manual control modes. Tool 24 and attached laser pointer follow the desired cutting trajectory for each control mode for each tissue sample 400. In one embodiment, the motion of robot 20 was
constrained to a plane parallel to the X-Y plane of tissue samples 400 at a fixed height and orientation to minimize laser-pointing inaccuracies.
[0111 ] In one embodiment, two identification tests are performed on each tissue sample 400. The first identification test performs the circular cut pattern on the tissue sample 400 under manual control mode, and the second identification test performs the circular cut pattern on the tissue sample under autonomous control mode. For each identification test, the actual trajectory of the laser dot is captured by camera 40, and the image data are processed to determine the tracking error of tool 24 by comparing the actual trajectory of the laser dot to the reference trajectory. In this embodiment, the laser dot and the location and size of any pseudo-blood occlusions are detected using functionality provided by the OpenCV library.
Perspective transformations are applied to the image data to generate a top view of the laser dot trajectory, and then the image data is mapped to a new image frame that is a square 500 x 500 pixel plane. In this embodiment, each pixel represents 0.2 mm on the trajectory plane. The location of the laser dot is then tracked using color thresholding and blob detection, and the locations of any pseudo-blood occlusions in that tissue sample are similarly determined. The position of the laser dot is compared to the reference trajectory for that tissue sample, and the tracking error for that identification test is determined.
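A minimal sketch of this laser-dot measurement pipeline is shown below, assuming a precomputed top-view warp W onto the 500 x 500 pixel plane and illustrative HSV thresholds for the laser color; only the 0.2 mm-per-pixel scale is taken from the description above.

```cpp
// Minimal sketch: warp to a top view, threshold the laser dot, and report its position in mm.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Returns the laser-dot position in millimetres on the trajectory plane,
// or (-1, -1) if no dot is detected in this frame.
cv::Point2d trackLaserDotMm(const cv::Mat& bgrFrame, const cv::Mat& W /* 3x3 warp */)
{
    cv::Mat topView, hsv, mask;
    cv::warpPerspective(bgrFrame, topView, W, cv::Size(500, 500));   // 500 x 500 pixel plane
    cv::cvtColor(topView, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(0, 120, 180), cv::Scalar(10, 255, 255), mask);  // assumed red-dot thresholds

    cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
    if (m.m00 < 1.0) return {-1.0, -1.0};          // nothing detected

    const double pxToMm = 0.2;                     // each pixel represents 0.2 mm (from the description)
    return {pxToMm * (m.m10 / m.m00), pxToMm * (m.m01 / m.m00)};
}
```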
[0112] FIG. 6 depicts average tracking error graph 500 for tissue samples 400, in accordance with an embodiment of the present disclosure. Also depicted in FIG. 6 are tool trajectory and tracking error graphs 510 for tissue sample 410, and tool trajectory and tracking error graphs 550 for tissue sample 450.
[0113] Average tracking error graph 500 depicts average tracking error for manual control mode 502 and average tracking error for autonomous control mode 504 for identification tests performed on tissue sample 410, i.e., “none,” tissue sample 420, i.e., “small,” tissue sample 430, i.e., “medium,” tissue sample 440, i.e., “large,” and tissue samples 450, 460, i.e., “multiple.”
[0114] Tissue sample 410 does not have pseudo-blood occlusions. Average tracking error graph 500 for tissue sample 410 indicates that the autonomous control mode outperforms the manual control mode - the average tracking error for the autonomous control mode was about 0.5 mm, while the average tracking error for the manual control mode was about 0.8 mm. However, as the complexity of the tissue sample increases due to the size and number of pseudo-blood occlusions, the average tracking error of the autonomous control mode increases from about 0.5 mm to about 1.6 mm, while the average tracking error of the manual control mode essentially remains within the same range for all of the samples, i.e., from about 0.6 mm to about 0.8 mm. More particularly, when pseudo-blood occlusions on the desired trajectory interfere with the detection algorithms of the autonomous control mode, the tracking error for the autonomous control mode locally increases near the pseudo-blood occlusions.
[0115] Tool trajectory and tracking error graphs 510 present more detailed data for tissue sample 410, including plots of the reference trajectory and the actual trajectory in the X-Y plane, and graphs of the trajectory tracking errors, for the manual control mode and the autonomous control mode. Tool trajectory and tracking error graphs 550 present more detailed data for tissue sample 450, including plots of the reference trajectory and the actual trajectory in the X-Y plane, and graphs of the trajectory tracking errors, for the manual control mode and the autonomous control mode. These data indicate that the local performance of the autonomous control mode on non-occluded regions of each desired trajectory is superior to the local performance of the manual control mode on these regions. Conversely, the local performance of the manual control mode on occluded regions of each desired trajectory is superior to the local performance of the autonomous control mode on these regions.
[0116] The shared control mode advantageously leverages the local performance strengths of both control modes to provide a more accurate control system by identifying confidence indicators for the autonomous control mode and the manual control mode in the vicinity of the occluded regions. The confidence indicators provide insight on how and when to switch the control modes to improve the overall task performance.
[0117] FIG. 7 depicts normalized tracking error graphs 600, in accordance with an embodiment of the present disclosure. FIG. 7 also depicts annotated tissue sample 640.
[0118] To determine the confidence indicators for the manual control mode and the autonomous control mode, in one embodiment, the tracking error data are normalized using a metric related to the size of the occlusion in each tissue sample 400. Other normalization metrics may also be used.
[0119] In this embodiment, the normalization metric, d, identifies the intersection of the reference trajectory with the pseudo-blood occlusion. Annotated tissue sample 640 depicts a portion of tissue sample 440 with reference trajectory 442, and several values for d. When approaching the pseudo-blood occlusion along the reference trajectory, the start of the pseudo-blood occlusion is defined as d = -1 , the middle of the pseudo-blood occlusion is defined as d = 0, and the end of the pseudo-blood occlusion is defined as d = 1. Using these definitions, the intersection of the reference trajectory with pseudo-blood occlusions is normalized based on the size of the occlusion. In one embodiment, OpenCV blob detection algorithms are used to find the location and size of the pseudo-blood occlusions or blobs on the reference trajectory, and to normalize their intersections. The tracking error along d for each identification test was determined and normalized based on the blob sizes. Other blob detection algorithms are also contemplated.
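Assuming arc-length coordinates along the reference trajectory are available, the normalization metric d can be sketched as follows; the function and argument names are illustrative.

```cpp
// Minimal sketch of the normalization metric d: -1 at the start of the occlusion,
// 0 at its middle, +1 at its end. Inputs are arc-length positions (e.g., in mm)
// of the tool and of the occlusion boundaries along the reference trajectory.
#include <stdexcept>

double normalizedOcclusionProximity(double s, double sOcclusionStart, double sOcclusionEnd)
{
    const double half = 0.5 * (sOcclusionEnd - sOcclusionStart);
    if (half <= 0.0) throw std::invalid_argument("occlusion has non-positive length");
    const double sMiddle = 0.5 * (sOcclusionStart + sOcclusionEnd);
    return (s - sMiddle) / half;   // values outside [-1, 1] lie before or after the occlusion
}
```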
[0120] The performances of autonomous control mode and the manual control mode, over all of the identification tests, are then analyzed based on the normalized proximity to the pseudo-blood occlusions. After the tracking error data is normalized for each control mode, a curve is fitted to each normalized control mode tracking error data set. In one example, the fitted curve for the manual control mode is a linear function, i.e. , manual control mode curve 602, while the fitted curve for the autonomous control mode is a skewed Gaussian function, i.e., autonomous control mode curve 604. In this example, the fitted function for the manual control mode is governed by Equation 2, while the fitted function for the autonomous control mode is governed by Equation 3.
yM = aM d + bM (2)
with aM = -0.002 and bM = 0.061
yA = fitted skewed Gaussian function of d (3)
with aA = 0.206, bA = 0.213, cA = 1.257
[0121 ] Normalized tracking error graphs 600 include manual control mode normalized tracking error data, autonomous control mode normalized tracking error data, and the fitted curves for each data set.
[0122] These data suggest that the manual control mode is effective in pseudo-blood occlusion regions, while the autonomous control mode is more effective elsewhere. Based on these data, the confidence indicator for manual control mode is defined as CM = 1 - yM, and the confidence indicator for the autonomous control mode is defined as CA = 1 - yA.
[0123] FIG. 8 depicts confidence indicator graph 700 and allocation function graph 710, in accordance with an embodiment of the present disclosure.
[0124] Confidence indicator graph 700 depicts manual control mode confidence indicator 702 and autonomous control mode confidence indicator 704 for the experimental tests described above. After confidence indicators 702, 704 are determined, the allocation function a(t) is generated based on these confidence indicators. In one embodiment, confidence indicators 702, 704 are used to locally select the most reliable control mode as the predetermined task is performed.
Because confidence indicator 702 is more or less constant, the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode are determined based on confidence indicator 704.
[0125] Referring to the confidence indicator graph 700, as d approaches 0 from negative values, confidence indicator 704 is greater than confidence indicator 702. In other words, confidence in the autonomous control mode is greater than the manual control mode. As confidence indicator 704 gradually decreases from 1, a lower decision threshold, Tlower 722, is reached at the point where confidence indicator 704 intersects confidence indicator 702 (Tlower = 0.93 at d = -1.15). As the middle of the pseudo-blood occlusion is approached (d ≈ 0), confidence in the autonomous control mode reaches a minimum level (Tminimum = 0.79), and then begins to increase until upper decision threshold Tupper 724 is reached at the point where confidence indicator 704 intersects confidence indicator 702 (Tupper = 0.94 at d = 1.6). Between Tlower 722 and Tupper 724, confidence indicator 702 is greater than confidence indicator 704. In other words, confidence in the manual control mode is greater than the autonomous control mode. As d approaches positive values after Tupper 724, confidence indicator 704 is greater than confidence indicator 702 and gradually increases back to 1. In other words, confidence in the autonomous control mode is again greater than the manual control mode.
[0126] Allocation function graph 710 depicts allocation function 712, which is a function of the confidence in the autonomous control mode, i.e. , confidence indicator 704.
[0127] In this embodiment, allocation function 712 returns a value of 0 or 1 based on the value of confidence indicator 704. Referring to Equation 1, the value 0 indicates that the autonomous control mode has been selected for the shared control mode, and the value 1 indicates that the manual control mode has been selected for the shared control mode. In one example, the shared control mode is initially set to the autonomous control mode, and allocation function 712 has an initial setting of 0. As tool 24 approaches the beginning of a pseudo-blood occlusion in tissue 4, the normalized distance d approaches lower decision threshold Tlower 722. When tool 24 crosses Tlower 722, allocation function 712 returns the value 1, which changes the shared control mode to the manual control mode. As tool 24 approaches the end of the pseudo-blood occlusion in tissue 4, the normalized distance d approaches upper decision threshold Tupper 724. When tool 24 crosses Tupper 724, allocation function 712 returns the value 0, which changes the shared control mode back to the autonomous control mode.
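A minimal sketch of this threshold-based allocation is given below, using the example crossing points d = -1.15 and d = 1.6 reported above; treating the allocation as a stateless function of d is a simplification made for illustration.

```cpp
// Minimal sketch: a(t) = 1 (manual) while the tool is between the lower and upper
// decision thresholds along the normalized distance d, and a(t) = 0 (autonomous)
// elsewhere. Threshold defaults are the example crossing points from the text.
double thresholdAllocation(double d, double dLower = -1.15, double dUpper = 1.6)
{
    return (d >= dLower && d <= dUpper) ? 1.0 : 0.0;
}
```

The returned value would then be supplied to the blending of Equation 1, where a value of 1 selects the manual command and 0 selects the autonomous command.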
[0128] FIGS. 10A and 10B present flow diagrams depicting at least some of the functionality of shared control module 134 depicted in FIG. 2, in accordance with embodiments of the present disclosure.
[0129] FIG. 10A presents a flow diagram for controlling an articulating member including a tool, in accordance with an embodiment of the present disclosure.
[0130] At 910, a manual control mode confidence indicator is determined based on a manual control mode for the articulating member of the robot. As discussed above, tracking error data are acquired during the performance of a predetermined task under the manual control mode. The tracking error data represent the deviations between a reference trajectory and the actual trajectory of the tool. The manual control mode confidence indicator is determined based on this tracking error data. In one embodiment, the tracking error data may be normalized using a metric associated with the predetermined task, such as, for example, occlusion size, and then a curve may be fitted to the data to produce a normalized tracking error function. The manual control mode confidence indicator is then derived from the normalized tracking error function.
[0131 ] At 920, an autonomous control mode confidence indicator is determined based on an autonomous control mode for the articulating member of the robot. As discussed above, tracking error data are acquired during the performance of a predetermined task under the autonomous control mode. The tracking error data represent the deviations between a reference trajectory and the actual trajectory of the tool. The autonomous control mode confidence indicator is determined based on this tracking error data. In one embodiment, the tracking error data may be normalized using a metric associated with the predetermined task, such as, for example, occlusion size, and then a curve may be fitted to the data to produce a normalized tracking error function. The autonomous control mode confidence indicator is then derived from the normalized tracking error function.
[0132] At 930, an allocation function is generated based on the manual control mode confidence indicator and the autonomous control mode confidence indicator. As discussed above, the manual and autonomous control mode
confidence indicators are used to locally select the most reliable control mode as the predetermined task is performed. For example, if the manual control mode confidence indicator is more or less constant, the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode may be determined based on the autonomous control mode confidence indicator. Conversely, if the autonomous control mode confidence indicator is more or less constant, the allocation function a(t) and the decision thresholds for locally switching between manual control mode and autonomous control mode may be determined based on the manual control mode confidence indicator. In another example, the manual and autonomous control mode confidence indicators are blended to yield an allocation function a(t) that combines control commands from the manual control mode and control commands from the
autonomous control mode.
[0133] At 940, a control command is generated for the articulating member of the robot based on the allocation function. As discussed above, when a(t) is 0, the autonomous control command A(t) is selected as the control command. In other words, the control command is not influenced by the manual control command when a(t) is 0. Conversely, when a(t) is 1 , the manual control command M(t) is selected as the control command. In other words, the control command is not influenced by the autonomous control command when a(t) is 1. When a(t) is a number between 0 and 1 , the manual control command and the autonomous control command are blended, based on the value of the allocation function a(t), to generate the control command. As discussed above, the allocation function a(t) changes as a function of the independent variable x. For example, the independent variable x may be the confidence in the autonomous control mode, as discussed above.
[0134] At 950, the control command is sent to the articulating member. As discussed above, in one embodiment, the control command is input to a low level controller, which converts the control command to a robot-specific control signal.
The robot-specific control signal is then sent to the robot over the appropriate I/O Interface. In another embodiment, the control command is sent directly to the robot, which converts the control command to the appropriate robot-specific control signal.
[0135] It should be understood that the "autonomous control mode" referred to in connection with FIG. 10A may correspond to a fully autonomous control mode (e.g., in connection with subsystem 220 of FIG. 3D) or to a supervised autonomous control mode (e.g., in connection with subsystem 250 of FIG. 16).
[0136] FIG. 10B presents a flow diagram for controlling an articulating member including a tool, in accordance with an embodiment of the present disclosure.
[0137] At 960, the control command is converted to a robot-specific control signal, as discussed above.
[0138] At 970, the control signal is sent to the articulating member, as discussed above.
[0139] FIGS. 11-18 depict various features of a RAS system, in accordance with the present disclosure.
[0140] Generally, surgical imaging is challenging, particularly when it comes to intra-operative tracking of soft tissue. During the surgery, a variety of inevitable and unpredictable factors, such as breathing, heartbeat, patient movements and interventional surgical procedures, cause organ shifts and tissue deformation. The direct vision of an MIS surgical scene may be obstructed by the patient's body, such as in head and neck cancer surgeries. Oral cavity squamous cell carcinoma (OSCC) and oropharyngeal squamous cell carcinoma (OPSCC) are the two most common cancers in the head and neck region, and minimally invasive transoral robotic surgery (TORS) is an effective therapeutic method for their removal. Pre-operative examination with flexible endoscopy, computed tomography (CT) and magnetic resonance imaging (MRI) may be used to identify the tumor margins.
Narrow-band imaging (NBI) with only green and blue light to highlight the malignant lesions improves the identification of disease-free resection margins, and is an alternative to white light (WL) endoscopy. Based on pre-operative images, surgeons can mark the tumor margins with ink and use them as references during the resection while receiving 2D/3D and WL/NBI visual feedback from the endoscope. However, the marked tumor margins can easily get obscured by blood and charred tissue when observing the surgery on a video display.
[0141] As will be described, biocompatible near-infrared (NIR) markers may be used for robot guidance in these and other surgical situations, and provide strong penetration of the NIR light, durability, and bio-compatibility. More specifically, because the NIR light has a longer wavelength than visible light, the NIR markers can be seen intra-operatively with a high signal-to-noise ratio (SNR), even when obstructed by blood and tissue. In long-term multimodality tumor treatment scenarios, several rounds of chemotherapy are performed before the surgery and the tumor dimensions shrink over time. In one embodiment, the location of the tumor is marked before chemotherapy, which provides surgeons with the original tumor region intra-operatively rather than the shrunken post-chemotherapy tumor.
[0142] In some embodiments, NIR markers described herein may be made from the FDA-approved NIR fluorophore Indocyanine Green (ICG), cyanoacrylate
(Dermabond) and acetone. Once the marker is injected into the tissue, it forms a solid long lasting bead. NIR markers may be used on target tissue locations for suture planning via linear interpolation as well as 2D pattern cutting for pseudo-tumor resection. Additionally, NIR markers may be used on soft and unstructured 3D tissues in combination with more complex control methods compared to the 2D scenario. In some embodiments, a single point cloud of a tissue surface may be acquired (e.g., using a NIR camera, a RGBD camera, or a combination of the two), and a straight-line, 3D incision path for the robot may be determined. The start and end points may be manually selected in some embodiments.
[0143] FIG. 11 shows an illustrative RAS system 1100 that is included in a testbed. The RAS system 1100 may include a robot 20 having a robotic arm 22 (e.g., a 7-DOF KUKA lightweight robotic arm), a near-infrared (NIR) camera 1102 (e.g., which may be a 845 nm± 55 nm NIR camera), a Red-Green-Blue-Depth (RGBD) camera 1104 (e.g., which may be a Realsense D415 RGBD camera), a light source 1106 (e.g., which may be an infrared or NIR light source, and which may include a 760 nm high power light emitting diode), and an electrocautery tool 24. Examples of various functions of the RAS system 1100 will be illustrated with respect to a tissue sample 4.
[0144] In at least some ways, the RAS system 1100 may correspond to the RAS system 10 of FIGS. 1 and 2, but with a dual camera imaging system that includes the cameras 1102 and 1104 instead of the camera 40, which may allow for 3D and NIR imaging. Components of the RAS system 1100 having counterparts in the RAS system 10 may be referred to with the same reference numerals.
Accordingly, some details of the RAS system 1100 that were already described in connection with RAS system 10 are not repeated here for the sake of brevity.
[0145] As will be described, the NIR camera 1102 and the RGBD camera 1104 may be included in a supervised autonomous control subsystem 250 (e.g., which may correspond to the supervised autonomous control subsystem 250 shown in FIG. 3A), shown in FIG. 16, of a shared control system that includes the robot 20. The subsystem 250 may control the robot 20 and electrocautery tool 24 to produce precise and consistent incisions on complex three-dimensional (3D) soft tissues, such as the tissue sample 4. The supervised autonomous control subsystem 250 may provide a supervised autonomous control mode in which an operator (e.g., a surgeon) may identify key points on a tissue of interest, such as, for example, a tumor, by selecting the NIR markers outlining the tumor using a GUI. In the supervised autonomous control mode, the operator may validate the electrocautery path before autonomous control is initiated. The subsystem may autonomously generate and filter a complete 3D electro-surgery pattern between multiple key points marking the tumor bed. Since the path planning and filtering are performed via continuous and multiple measurements of the 3D tissue surface information, the resulting executed incision may be more accurate than an incision planned with a conventional single-step, offline path planning method. Compared to 2D image-based visual servoing, the supervised autonomous control mode may provide a more accurate 3D incision on real tissues, including a more accurate depth of cut.
[0146] Electrocautery tool 24 may be added to robot 20 for performing incisions on the tissue samples. Electrocautery tool 24 may use a needle electrode to send a cutting waveform, which may be generated via an electro-surgical generator (e.g., a DRE ASG-300 electro-surgical generator), to the target tissue. The cutting waveform may vaporize tissues in contact with the electrode.
[0147] FIG. 12A shows a perspective view of the testbed of FIG. 11 and the dual camera imaging system of the RAS system 1100 that includes the NIR camera 1102 and the RGBD camera 1104. The dual camera system may detect NIR markers 1220 disposed in or on the tissue sample 4, their 3D positions, and a tissue surface (e.g., of tissue sample 4); a manual control interface may also be provided for result comparisons. The RGBD camera 1104 may obtain 3D tissue surface information. The NIR camera 1102 may detect the NIR markers 1220 when they are illuminated by the light source 1106. In some embodiments, other 3D cameras, such as plenoptic cameras and structured illumination cameras, may be used instead of or in addition to the RGBD camera 1104.
[0148] In some embodiments, in order to prevent interference of the projector of the RGBD camera 1104 with the readings captured by the NIR camera 1102, the projector may be periodically switched back and forth between on and off states (e.g., with a state transition occurring every 0.22 seconds) via software triggers that control the RGBD camera 1104. The NIR camera 1102 may be configured to capture images only when the projector of the RGBD camera 1104 is turned off.
[0149] A real-time imaging system (e.g., which may be included in subsystem 250 of FIG. 16) may extract the 3D position of the biocompatible NIR markers 1220 by ray tracing the positions of the NIR markers 1220 via a co-registered point cloud generated by the RGBD camera 1104. The positions of the RGBD camera 1104 and the NIR camera 1102 may be calibrated using a checkerboard, and the relative positions of the cameras with respect to each other may be determined (e.g., using the
"transform package in Robot Operating System (ROS)). A hand-eye calibration may be performed by finding the position of the checkerboard in the robot coordinates. The 3D position and orientation of the cameras 1102, 1104 compared to the robot 20 are then determined. A visual servoing platform (VISP) may be used to track portions of the NIR images captured by the NIR camera 1102 corresponding to the NIR markers 1220 between NIR image frames captured by the NIR camera. The operator may select the markers via mouse clicks (e.g., with a mouse of the I/O devices 142 of the computer 100 shown in FIG. 2). It should be understood that while NIR cameras, NIR images, and NIR markers are described, these elements are intended to be illustrative and not limiting. In alternate embodiments, other suitable camera types, image types, and/or marker types may be used in place of the NIR camera 1120, the NIR image data, and the NIR markers 1220 to provide landmark/feature detection as a basis for path planning.
[0150] FIG. 12B shows an illustrative image captured by the NIR camera 1102, showing the NIR markers 1220 disposed on the tissue sample 4.
[0151 ] FIG. 12C shows an illustrative point cloud image of the tissue sample 4 captured by the RGBD camera 1104. In the present example, the point cloud image has been overlaid with the 3D positions of the NIR markers 1220 (e.g., by the subsystem 250).
[0152] In one embodiment, the RAS system 1100 may also provide a manual control mode (e.g., via manual control subsystem 210), some aspects (e.g., system components) of which are shown in FIG. 13. Using this interface, a surgeon manually controls the 3D motion of the tool-tip of the robot 20 using the input device 30. The coordinate frame transformations between input device 30, the camera view frame, and the robot frame are done in real-time via the ROS transform package, which matches all the motions that supervised autonomous control subsystem 250 may perform. Camera 40 provides high-resolution real-time visual feedback to the surgeon, and the NIR marker positions are overlaid on this view (e.g., green dots shown in FIG. 13.c) as a reference for the surgeon. In such embodiments, a third camera (not shown), which may be an RGB camera (e.g., camera 40 of FIG. 1), may be included in the RAS system 10, and may capture high-resolution video 1306, which is displayed on the monitor 1304 (e.g., which may correspond to the display 50 of FIG. 1) to provide real-time visual feedback to the operator. The positions of NIR markers 1220 (e.g., having been previously identified from the NIR image(s) and point cloud image(s) captured by the cameras 1102 and 1104, respectively) may be overlaid over the video 1306.
[0153] FIG. 14A depicts illustrative overlays 1400 and 1410 corresponding to an exemplary cutting task that uses 4 NIR markers 1420. Overlay 1400 depicts the desired incision pattern 1402 and an incision path 1404 on a tissue sample 1412 (e.g., which may correspond to tissue sample 4 of FIG. 11 ). The tissue sample 1412 may have been cut using subsystem 210, 220, 250, or a combination of these.
Overlay 1410 depicts an approximation 1406 of corresponding edges of the incision path 1404, which may be compared to the desired incision pattern 1402 for surface error measurement. FIG. 14B shows an example comparison between the desired incision pattern 1402 and the approximation 1406. Two regions are shown in higher resolution to illustrate surface errors 1422 and 1424. The comparison may be performed by a post-processing system (not shown) in order to estimate error.
[0154] FIG. 15A depicts a side view of the tissue sample 1412 of FIG. 14A. A post-processing system (not shown) may extract an estimated top edge 1502 and an estimated bottom edge 1504 of a cut portion 1506 of the tissue sample 1412 for depth error measurement. The estimated top edge 1502 and the estimated bottom edge 1504 may be identified automatically by the subsystem 250 in some
embodiments, or may alternatively be identified based on manual input by an operator in other embodiments. FIG. 15B shows an illustrative comparison of the estimated top edge 1502 and bottom edge 1504. Distances (e.g., d1, d2) between corresponding pixels of the top edge 1502 and the bottom edge 1504 may be calculated by the subsystem 250 and may be compared to a desired depth to calculate error. While only one side of the sample 1412 is shown here, it should be understood that the depths of all four sides of the incision may be measured and corresponding error values may be calculated in this way.
[0155] While the examples of FIGS. 14A-15B are provided in the context of a manual cutting task, it should be understood that the results of automated cutting tasks and/or automation-assisted cutting tasks may be similarly analyzed to determine error.
[0156] FIG. 16 depicts a block diagram of a portion of the RAS system 1100, which includes the supervised autonomous control subsystem 250 and a low-level controller 240. As shown, the subsystem 250 may include the NIR camera 1102, the RGBD camera 1104, a 3D marker position module 1606, a path planner module 1608, a filter 1610, a trajectory generator and planner module 1612, and an inverse kinematics module 1614. In some embodiments, the supervised autonomous control subsystem 250 may be included in a shared control system (e.g., system 230 of FIGS. 3A and 3B), and supervised autonomous control commands 252 that may be generated by the subsystem 250 may be analyzed by such a control system to estimate corresponding error and/or confidence indicators, and may, in combination with separate manual control commands, be used as the basis for generating an allocation function and shared control commands, as will be described.
[0157] Real-time video frames from the RGBD camera 1104 and the NIR camera 1102 are collected and processed by the 3D marker position module 1606 to obtain the 3D coordinates of the NIR markers (e.g., markers 1220) in the robot frame. An offsetting technique is applied by the path planner module 1608 to project the NIR marker positions outwards on the point cloud and allow planning an incision path with specified margins around the NIR markers. The offsetting technique executed by the path planner module 1608 uses the 3D vectors formed from the previous and next markers to the current marker, calculates a 5 mm offset on the superposition of the vectors and projects it to the tissue surface by finding the closest point on the point cloud. A path planning algorithm executed by the path planner module 1608 calculates a 3D path on the point cloud model of the tissue surface between each two consecutive projected NIR marker positions (e.g., the corners of the desired pattern 1402 in overlay 1400 of FIG. 14A). The path planner module may thereby generate and output reference waypoints. The filter 1610 eliminates the dynamic inter-frame noise of the resulting path so that it is usable in the robot controllers. Real-time position feedback may be sent from the robot to the subsystem 250, to be processed by the trajectory generator and planner module 1612. The reference waypoints output by the path planner 1608 and filtered by the filter 1610 and the real-time robot positions may be received and used by the trajectory generator and planner module 1612 to obtain smooth time-based trajectories using, for example, Reflexxes Motion Libraries in the robot frame. The task-space trajectories of the robot 20 may be converted to the joint-space trajectories by the inverse kinematics module 1614 using, for example, Kinematics and Dynamics Library (KDL) of Open Robot Control Systems (OROCOS). Low-level closed-loop robot controllers may be implemented so that the robot 20 follows the desired joint space trajectories and hence the 3D path waypoints on the tissue 4. For example, the subsystem 250 may output a supervised autonomous control command 252 to the low level controller 240, which then converts autonomous control command 252 to supervised autonomous control signal 241 and sends the autonomous control signal 241 to the robot 20.
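The offsetting step could be sketched as follows, assuming the marker positions are already expressed in the robot frame and using a k-d tree nearest-neighbour search to project the offset point back onto the tissue point cloud; the Eigen-based vector math and helper name are illustrative assumptions rather than the disclosed implementation.

```cpp
// Minimal sketch: offset a marker outwards along the superposition of the vectors from
// its neighbouring markers, then project the offset point onto the tissue point cloud.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <Eigen/Core>
#include <vector>

pcl::PointXYZ offsetAndProject(const Eigen::Vector3f& prev,
                               const Eigen::Vector3f& current,
                               const Eigen::Vector3f& next,
                               const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud,
                               float offsetMeters = 0.005f /* 5 mm margin */)
{
    // Superpose the vectors from the previous and next markers to the current marker
    // and push the current marker outwards along that direction by the margin.
    Eigen::Vector3f dir = (current - prev) + (current - next);
    if (dir.norm() > 1e-6f) dir.normalize();
    const Eigen::Vector3f offset = current + offsetMeters * dir;

    // Project the offset point onto the tissue surface: closest point in the cloud.
    pcl::KdTreeFLANN<pcl::PointXYZ> tree;
    tree.setInputCloud(cloud);
    pcl::PointXYZ query(offset.x(), offset.y(), offset.z());
    std::vector<int> idx(1);
    std::vector<float> sqDist(1);
    tree.nearestKSearch(query, 1, idx, sqDist);
    return (*cloud)[idx[0]];
}
```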
[0158] As another example, in connection with both FIGS. 3A and 16, the autonomous control command 252 may instead be sent to a shared control system 230, which may process the autonomous control command 252 and a separate manual control command 212 and apply an allocation function a(t) that defines respective percentages of the manual control command 212 and the supervised autonomous control command 252 that are combined to form a shared control command 232, which is then sent to the low level controller 240, which converts the shared control command 232 to a shared control signal 231 , which is sent to the robot 20 to control the robot 20.
[0159] The 3D path planning algorithm implemented by the path planner module 1608 may determine a 3D path between a start point and an end point on a point cloud using, for example, PCL in C++. FIG. 17 shows an illustrative 3D graph 1700 of a point cloud 1702, which may be generated by the RGBD camera 1104.
The 3D path planning algorithm may generate a path 1704 that connects a defined start point on the point cloud to a defined end point on the point cloud. First, for example, the point cloud 1702 may be captured by the RGBD camera 1104 and a pass-through filter may be applied to extract the point cloud from a region of interest near the tissue sample 4. Applying the region of interest on the point cloud 1702 may avoid the need for processing the entire raw point cloud and hence may reduce the computation time. Next, a statistical outlier removal (SOR) filter may be applied by the planner module 1608 to reduce the noise in the current point cloud 1702. The SOR filter measures the average distance m and standard deviation s of the distances from each point to its k nearest neighbors, and rejects points that lie beyond the distance m + a·s, where a is a standard deviation multiplier. In addition, a moving least square (MLS) filter may be applied by the path planner module 1608 to create a smooth distribution of the point cloud 1702 by calculating a fitting surface on each point in a sphere of radius r through a higher-order polynomial that fits the original points, and resampling missing points based on the fitting surface. In one embodiment, the parameters may be set to around k = 10 neighbors and around a = 1 to filter the outliers, and around r = 0.01 to create a smoother point cloud. A mesh is then created by the path planner module 1608 using, for example, Delaunay triangulation among the point cloud. The shortest path between a start point and an end point is then computed using, for example, the Dijkstra algorithm, which determines an optimal path (i.e., shortest distance) if it exists.
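A minimal sketch of the point-cloud conditioning steps described above using PCL (pass-through region of interest, statistical outlier removal, and moving least squares smoothing) is given below; the mesh construction and Dijkstra search are omitted, the region-of-interest limits are illustrative assumptions, and k = 10, a = 1 and r = 0.01 follow the example parameters above.

```cpp
// Minimal sketch: condition a raw RGBD point cloud before meshing and path search.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>
#include <pcl/filters/statistical_outlier_removal.h>
#include <pcl/surface/mls.h>
#include <pcl/search/kdtree.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr conditionCloud(const pcl::PointCloud<pcl::PointXYZ>::Ptr& raw)
{
    auto roi = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::PassThrough<pcl::PointXYZ> pass;          // keep only the region near the tissue
    pass.setInputCloud(raw);
    pass.setFilterFieldName("z");
    pass.setFilterLimits(0.2f, 0.6f);              // illustrative depth window (metres)
    pass.filter(*roi);

    auto denoised = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
    sor.setInputCloud(roi);
    sor.setMeanK(10);                              // k nearest neighbours
    sor.setStddevMulThresh(1.0);                   // reject points beyond m + a*s with a = 1
    sor.filter(*denoised);

    auto smooth = pcl::PointCloud<pcl::PointXYZ>::Ptr(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::MovingLeastSquares<pcl::PointXYZ, pcl::PointXYZ> mls;
    mls.setInputCloud(denoised);
    mls.setSearchMethod(pcl::search::KdTree<pcl::PointXYZ>::Ptr(new pcl::search::KdTree<pcl::PointXYZ>));
    mls.setSearchRadius(0.01);                     // r = 0.01 smoothing radius
    mls.setPolynomialOrder(2);                     // higher-order local polynomial fit
    mls.process(*smooth);
    return smooth;                                 // meshing and Dijkstra search would follow
}
```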
[0160] To determine the start and end points of the path 1704, NIR markers (e.g., NIR markers 1220) may be used on the tissue and their positions are projected on the point cloud with a desired offset, as described above. This process is repeated for each two consecutive projected markers as start and end points of each segment of incision (e.g., as illustrated in FIG. 19).
[0161 ] Due to the inherent limitations of various sensing technologies and motions of cameras or objects in the scene, in-frame noise and inter-frame noise may affect the quality of results when using 3D point clouds for real-time
measurements and control. In-frame noise may distort the surface of an object of interest, such as, for example, causing a flat surface to appear bumpy. For solid objects, template matching may reduce and/or eliminate the effect of in-frame noise. Other methods for in-frame noise reduction include smoothing and removing outliers for reducing surface or volume noise. Such techniques may be applied (e.g., by the filter 1610 in conjunction with the path planner 1608) to each measurement of point cloud data.
[0162] Inter-frame noise, however, occurs in real-time measurements and is related to the slight noisy motion of the point cloud from the previous camera frame to the current one. When used in real-time path planning, inter-frame noise may cause a time-varying number of waypoints at the output of the path planning algorithm (e.g., the output of the path planner module 1608), and/or a noisy motion of these points between the frames. Inter-frame noise may affect autonomous control when performing delicate and precise tasks such as tumor resection with small margins of error. FIG. 18A shows a sequence of frames 1802 that include raw/unfiltered paths generated over time by the path planner module 1608. Each path includes a start point 1804, an end point 1806, and several waypoints 1808. As shown, the waypoints 1808 may experience noisy motion between frames due to inter-frame noise.
[0163] FIG. 18B shows an individual frame 1810 with tracked waypoints 1812, which are a subset of the waypoints 1808 selected for tracking between frames to counter inter-frame noise. For example, a fixed number n (e.g., 4) of waypoints 1808 (i.e., the tracked waypoints 1812) on the 3D path may be tracked between frames. A filtering or estimation algorithm may then be applied to the fixed number n of waypoints (e.g., by the filter 1610) over time to obtain a filtered path as additional measurements are acquired. To counter inter-frame noise, a recursive least squares (RLS) estimation method may be used to track the waypoints 1812 on the path. Alternatively, for dynamic cases, a Kalman filter (KF), an extended KF, an unscented KF, a particle filter, or the like may be used. FIG. 18C shows, over time, the tracked waypoints 1812 and the filtered waypoints 1814 generated by the filter 1610 based on the tracked waypoints 1812.
[0164] A fixed number of candidate waypoints 1812 and their positions on the noisy path (denoted wi) are first determined, and then a filtering method is applied by the filter 1610. In one embodiment, the candidate waypoints 1812 are determined using a waypoint extraction method, and then the candidate waypoints are filtered using a recursive least squares (RLS) method. Other methods are also contemplated.
[0165] For the waypoint extraction method, s ∈ R3 and e ∈ R3 are the start and end points of the desired path segment on the point cloud, and Psek is the current calculated path at time instant k, with nk path points between s and e and length lk. The elements nk (i.e., the number of waypoints 1808 in FIG. 18A) and lk change dynamically depending on the current reading of the noisy point cloud data and on how the path planning algorithm detects the trajectory at that time instant. The path will include at least nmin > 0 waypoints (i.e., nk ≥ nmin, ∀ k ≥ 0). A fixed number of waypoints wi ∈ R3, i ∈ {1, ..., n} (e.g., which may interchangeably refer to the tracked waypoints 1812 of FIGS. 18B and 18C) are selected from this path, and then tracked and filtered as more measurements of the noisy path Psek are collected. Because n < nmin waypoints are selected for tracking, a fixed number of points is always available to the filtering algorithm to track their dynamics over time, and nmin may be determined dynamically over time based on the resolution and density of the point cloud obtained from the 3D sensor/camera in the neighborhood of s and e. In order to find the positions of the tracked waypoints wi, they are equally distributed along Psek using the total length lk (i.e., breaking lk into n + 1 equal sections). The position of each wi is determined as the point on the current path Psek that is closest to the location at arc length i·lk/(n + 1) from the start point (e.g., at lk/4, lk/2, and 3·lk/4 for n = 3).
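A minimal sketch of the waypoint extraction step described in this paragraph, assuming the current noisy path is an ordered M×3 array of points; the function name and the snap-to-nearest-path-point strategy are illustrative.

```python
# Distribute n tracked waypoints equally along the current noisy path P_se^k by
# arc length, i.e. at i * l_k / (n + 1) from the start point, and snap each
# target location to the nearest existing path point.
import numpy as np

def extract_waypoints(path_points, n=3):
    seg = np.linalg.norm(np.diff(path_points, axis=0), axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg)))   # arc length at each path point
    l_k = arc[-1]
    targets = [i * l_k / (n + 1) for i in range(1, n + 1)]
    idx = [int(np.argmin(np.abs(arc - t))) for t in targets]
    return path_points[idx]                         # the n tracked waypoints w_i
```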
[0166] The RLS method, which may be applied after the waypoint extraction method, filters the positions of the waypoints wi to produce filtered waypoints (i.e., the filtered waypoints 1814 in FIG. 18C) using the noisy measurements of the path. With unknown but constant waypoint positions on the tissue, the augmented vector of the wi is w ∈ R^(3n×1) and the corresponding obtained measurements are: yk = Hk w + vk
[0167] where Hk ∈ R^(3n×3n) is the output/measurement matrix, yk ∈ R^(3n×1) is the current measurement of w, obtained by augmenting the positions of the wi detailed above, and vk ∈ R^(3n×1) is the measurement noise. The augmented vector of the estimates of the wi at time step k is ŵk, and the estimation error is:

εk = w − ŵk
[0168] The cost function is the aggregated variance of the estimation errors:

Jk = E[εk^T εk]     (4)
[0169] The following sequential algorithm minimizes the cost function (i.e., Equation 4) in order to obtain an accurate estimate of w:

Kk = Pk−1 Hk^T (Hk Pk−1 Hk^T + Rk)^(−1)
ŵk = ŵk−1 + Kk (yk − Hk ŵk−1)
Pk = (I − Kk Hk) Pk−1
[0170] Here, Kk ∈ R^(3n×3n) is the estimation gain matrix, Rk ∈ R^(3n×3n) is the measurement noise covariance matrix, Pk ∈ R^(3n×3n) is the estimation-error covariance matrix, and I is the identity matrix. After each measurement of w, the estimate ŵk (e.g., which may correspond to the filtered waypoints 1814) is updated and, over time, converges to constant values. If the positions of the start and end points s and e suffer from the point cloud noise similarly to the waypoints on the path, then with a simple change of index i to include i ∈ {1, ..., n + 1}, these points are tracked and filtered as well.
[0171] FIG. 19 shows a snapshot example of an autonomous incision path 1900 with an offset around four NIR markers 1920 (e.g., NIR markers 1120 of FIG. 11) and n = 3 filtered waypoints 1914 (e.g., filtered waypoints 1814 of FIG. 18C) between the start/end points 1902 of each path. For comparison, the noisy (i.e., unfiltered) path 1904 is also shown. In one example embodiment, the initial filter parameter values may be P0 = 100·I3n (i.e., wi unknown a priori), H0 = I3n (i.e., all x, y, z readings of wi are obtained from the point cloud), R0 = 2·I3n (adjusted accordingly), and K0 = 0.01·I3n, where I3n is the identity matrix. The measurement matrix Hk may be constant, with a fixed level of noise during the measurements for Rk; therefore, Hk = H0 and Rk = R0, ∀ k. However, the filter may continue updating Kk, Pk, and ŵk until a steady-state level is achieved.
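The following is a minimal sketch of the RLS waypoint filter under the standard recursive least-squares update consistent with the gain, covariance, and initialization values described above; the class and variable names are illustrative, and the initial estimate could equally be seeded with the first measurement instead of zeros.

```python
# Recursive least-squares filter over the stacked waypoint positions, with the
# example initial values P0 = 100*I, H0 = I, R0 = 2*I quoted above.
import numpy as np

class WaypointRLS:
    def __init__(self, n_waypoints):
        dim = 3 * n_waypoints
        self.w_hat = np.zeros(dim)       # stacked x, y, z estimates of the waypoints
        self.P = 100.0 * np.eye(dim)     # estimation-error covariance, P0
        self.H = np.eye(dim)             # measurement matrix, H0
        self.R = 2.0 * np.eye(dim)       # measurement-noise covariance, R0

    def update(self, y):
        """y: stacked noisy waypoint positions measured in the current frame."""
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)             # estimation gain K_k
        self.w_hat = self.w_hat + K @ (y - self.H @ self.w_hat)
        self.P = (np.eye(len(y)) - K @ self.H) @ self.P
        return self.w_hat.reshape(-1, 3)                     # filtered waypoints
```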
[0172] In order to control the depth of incisions, the planned incision path may be shifted by about 5 mm below the tissue surface along the z axis of the robot tool direction, which is perpendicular to the tissue, so that the robot 20 may perform the cut with the desired depth according to Equation 5, where T_t^R is the homogeneous transformation for converting the electrocautery tool 24 coordinates to the robot base coordinates, T_C^R is the transformation for converting the camera coordinates to robot coordinates, and [x y z 1]_C is formed by the coordinates of the planned path points in the camera frame.
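A small sketch of the depth-offset and frame-conversion step, assuming for simplicity that the camera z axis is aligned with the tool direction normal to the tissue and that the camera-to-robot transform is available as a 4×4 homogeneous matrix; the names T_RC and depth_m, and the sign of the offset, are illustrative assumptions.

```python
# Shift a camera-frame waypoint ~5 mm into the tissue and map it to the robot
# base frame with a 4x4 homogeneous transform.
import numpy as np

def to_robot_frame(waypoint_cam, T_RC, depth_m=0.005):
    """waypoint_cam: (x, y, z) in camera coordinates; T_RC: 4x4 camera-to-robot
    transform. The sign of the depth offset depends on the camera z-axis
    orientation relative to the tissue normal."""
    p_c = np.array([waypoint_cam[0], waypoint_cam[1], waypoint_cam[2] + depth_m, 1.0])
    return (T_RC @ p_c)[:3]   # [x, y, z] in robot base coordinates
```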
[0173] The tissue (e.g., tissue sample 4) in contact with the electrode vaporizes almost immediately when the power setting on the electrocautery tool 24 matches the clinical setting and proper robot velocities are chosen for following the path. The contact forces with the tissue during electrocautery are negligible, and no significant disturbances interfere with the robot controllers.
[0174] FIGS. 20-26 depict various additional features of a RAS system (e.g., the RAS system 1100 of FIG. 11 ), in accordance with another embodiment of the present disclosure.
[0175] Different confidence indicator identification methods may be used for the supervised autonomous control subsystem 250. More specifically, the accuracies of the NIR marker position estimation and path planning algorithms may be evaluated using an identification pattern that is positioned at different configurations with respect to the camera system and is also subjected to different noise sources. These factors may affect the accuracy of the incision paths performed by the autonomous robot controller.
[0176] FIG. 20 depicts an identification pattern 2002 mounted on the arm 22 of the robot 20. In order to assess the accuracy of the 3D NIR marker projection method described above, a pattern with a known geometry may be used, such as, for example, the identification pattern 2002. The pattern 2002 shown in the present example includes 36 marker wells that are equally spaced at 1-cm horizontal and vertical intervals to form a symmetric grid about the center of the identification pattern. This known geometry is used as the ground truth or baseline to evaluate how the accuracy of the camera system (e.g., cameras 1102 and 1104 of FIGS. 11 and 16) for 3D marker position projection varies as parameters such as the distance to the camera, angular positions, etc., are varied.
[0177] FIG. 21 shows a graph 2100, which provides an example of evaluating marker projection errors at a 31 cm distance with a -40 degree roll angle and a 0 degree pitch angle. Examples of baseline data and projected marker positions via the camera system are shown. The projection error is calculated using the average 3D distances between the baseline and the corresponding projections.
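A brief sketch of the projection-error metric described above, assuming the baseline and projected marker positions are given as matching N×3 arrays; the function name is illustrative.

```python
# Mean 3D Euclidean distance between the known baseline marker positions and
# their projections estimated from the camera system.
import numpy as np

def projection_error(baseline_xyz, projected_xyz):
    return float(np.linalg.norm(baseline_xyz - projected_xyz, axis=1).mean())
```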
[0178] FIG. 22 depicts a graph 2202 illustrating the effects of changes in roll angle on marker projection error, and a graph 2204 illustrating the corresponding identified confidence indicator. [0179] FIG. 23 depicts a graph 2302 illustrating the effects of changes in pitch angle on marker projection error, and a graph 2304 illustrating the
corresponding identified confidence indicator.
[0180] FIG. 24 depicts a graph 2402 illustrating the effects of changes in distance on marker projection error, and a graph 2404 illustrating the corresponding identified confidence indicator.
[0181] The results of evaluating marker projection errors for a combination of three different pattern distances from the camera system (i.e., 31 cm, 35 cm, 39 cm), five different angular positions for roll (i.e., at −40 degrees, −20 degrees, 0 degrees, 20 degrees, 40 degrees), and five different angular positions for pitch (i.e., at −35 degrees, −17.5 degrees, 0 degrees, 17.5 degrees, 35 degrees) of the identification pattern are summarized in FIGS. 22-24. In graphs 2202, 2302, and 2402, raw data is shown with star markers and the fitted model with a solid curve. As can be seen in FIGS. 22 and 23, for both the roll and pitch angles, the error models have a minimum error at a certain angular position (e.g., −24 degrees for roll and −16 degrees for pitch).
[0182] As the angular positions decrease or increase away from these minimum-error locations, the marker projection error increases. In general, depending on the camera system configuration, the angles at which the minimum error occurs can be denoted generically as rmin for roll and pmin for pitch.
[0183] The confidence indicators are calculated by inverting and shifting the curve fitted to the error models so that lower errors are associated with higher confidence values. As a representative example for FIG. 22, the error model for roll angle is err = 0.0001r² + 0.0058r + 1.815 (with r being the roll angle), and the corresponding confidence indicator is CA1 = 2.745 − err. Similarly, for FIG. 23, the error model for pitch angle is err = 0.0002p² + 0.0051p + 1.761 (with p being the pitch angle), and the corresponding confidence indicator is CA2 = 2.725 − err.
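The following sketch illustrates how such a confidence indicator can be derived: fit a quadratic error model to projection error versus roll angle, then invert and shift it so that lower error maps to higher confidence, as for CA1 above. The sample data points are hypothetical placeholders, not measurements from the disclosure.

```python
# Fit err(r) = c2*r^2 + c1*r + c0 to calibration data and derive a confidence
# indicator by inverting and shifting the fitted model.
import numpy as np

roll_deg = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])     # tested roll angles
proj_err = np.array([1.75, 1.72, 1.82, 2.00, 2.25])      # hypothetical errors (mm)

c2, c1, c0 = np.polyfit(roll_deg, proj_err, deg=2)
err_model = np.poly1d([c2, c1, c0])

def confidence_roll(r, shift=2.745):
    """Invert and shift the fitted error model, e.g. CA1 = shift - err(r)."""
    return shift - err_model(r)
```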
[0184] When considering the effect of distance on the marker projection error, as seen in FIG. 24, the error increases with distance. For example, the error model identified for the data shown in FIG. 24 is err = 0.00634d − 0.3296, where d is the distance of the pattern from the camera system. The corresponding identified model, obtained by inverting and shifting the distance error model, is CA3 = 2.45 − err. Additionally, a scaling method may be implemented to normalize the confidence values between 0 and 1.
[0185] The identification pattern 2002 can also be used for testing the accuracy of the path planning algorithm when the 3D point cloud is locally subjected to noise resulting in low density data. Clinical sources of noise include external or additional light sources used for illuminating the surgical scene for the surgeon. These light sources may cause local or global reflections and point cloud density degradations in real-time data coming from the camera system. FIG. 25A depicts an illustrative desired path planning pattern 2502 overlaid on the identification pattern 2002 of FIG. 20. To determine the magnitude of the error introduced by light sources, a symmetric set of 12 paths on the identification pattern between 4 different markers is first determined (i.e. , the desired path planning pattern 2502). In the present example, a white light source is used to project random external noises on the pattern to disturb the 3D point cloud obtained from the camera system. Data are collected at different angles and distances similar to the pattern configurations described for marker projection error.
[0186] FIG. 25B shows a 3D graph 2504 that illustrates the effects of local noise on the point cloud density and path planning accuracy, and a 3D graph 2506 that illustrates path planning under no external noise. As shown in graph 2504, the path planner determines a less-than-optimal path between the markers within the low-density point cloud regions in the presence of local noise. As shown in graph 2506, the path planner may determine a more accurate path when noise is minimally present on the identification pattern point cloud.
[0187] FIG. 26 depicts a graph 2602 illustrating a path planning error model and corresponding effects of changes in point cloud density on path planning error, and a graph 2604 illustrating the corresponding identified confidence indicator.
[0188] As shown, the path planning error decreases exponentially as the point cloud density increases, because the path planning algorithm relies on the density of the point cloud to produce accurate paths between the markers. In this example, the error model (e.g., density error model) is err = 10.55·e^(−899s) (where s is the point cloud density), and the confidence indicator CA4 is obtained by scaling and inverting this error model.
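A combined sketch of the four identified confidence indicators using the error models quoted above, followed by a simple min-max scaling to keep each value in [0, 1]. The scaling bounds and the form of the density-based indicator CA4 (a plain inversion before scaling) are assumptions for illustration; units follow those used when the models were fitted.

```python
import numpy as np

def raw_confidences(roll_deg, pitch_deg, dist, density):
    """Evaluate the quoted error models and invert/shift them into raw
    confidence indicators CA1-CA4."""
    err_roll = 0.0001 * roll_deg**2 + 0.0058 * roll_deg + 1.815
    err_pitch = 0.0002 * pitch_deg**2 + 0.0051 * pitch_deg + 1.761
    err_dist = 0.00634 * dist - 0.3296
    err_dens = 10.55 * np.exp(-899.0 * density)
    return np.array([
        2.745 - err_roll,    # CA1 (roll)
        2.725 - err_pitch,   # CA2 (pitch)
        2.45 - err_dist,     # CA3 (distance)
        -err_dens,           # CA4 before scaling (inverted density model)
    ])

def scale_to_unit(c, c_min, c_max):
    """Min-max scaling of a raw confidence value into [0, 1]."""
    return float(np.clip((c - c_min) / (c_max - c_min), 0.0, 1.0))
```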
[0189] The confidence indicators identified above can be used during surgery to estimate the autonomous incision error based on the distance and angular positions of the target tissue in the camera system, as well as the quality of the 3D point cloud data from the tissue. FIG. 27 shows an illustrative block diagram of a multi-criteria confidence-based allocation system for 3D incision tasks, according to an embodiment of the present disclosure.
[0190] As shown, the system 2700 may include a camera system 2702 (e.g., which may include some or all of the subsystem 250 of FIG. 16), a roll estimator 2704, a pitch estimator 2706, a distance estimator 2708, a density and noise estimator 2710, the confidence indicators (e.g., "confidence models") CA1, CA2, CA3, and CA4, and allocation functions 2720. As shown, image data captured by the camera system 2702 may be processed (e.g., in parallel) by the estimators 2704-2710 to generate roll, pitch, distance, and point cloud density estimates, respectively, along with their corresponding errors. The confidence indicators CA1, CA2, CA3, and CA4 may be used to calculate confidence values based on the errors from the estimators 2704-2710. In some embodiments, the confidence indicators CA1, CA2, CA3, and CA4 may, in combination, correspond to either of the confidence indicators 237 and 238 of FIG. 3B, and the allocation functions 2720 may correspond to at least a portion of the adaptive confidence-based autonomy allocation module 239 of FIG. 3B.
[0191] In some embodiments, the allocation functions 2720 may correspond to the allocation functions 800 shown in FIG. 9. As described in connection with FIG. 9, the allocation functions may include ax, ay, az, aroll, apitch, ayaw (e.g., corresponding to movement of the tool 24 of the robot 20 along the x, y, and z axes, and roll, pitch, and yaw of the tool 24 of the robot 20) and, in the context of the present example of FIG. 27, x = w1·CA1 + w2·CA2 + w3·CA3 + w4·CA4 is the weighted combination of the confidence indicator outputs.
[0192] Each translational (x-y-z) and rotational (roll-pitch-yaw) motion of the robot tool can be considered separately and can be assigned one of the allocation functions 2720 that uses the weighted confidence indicators as input (i.e., with the weights w1-w4). The confidence-based allocation functions
ax, ay, az, aroll, apitch, ayaw can be selected from the different forms shown and described in connection with FIG. 9 and may be used (e.g., when controlling the robot 20, the arm 22, and/or the tool 24) to generate shared control commands and/or shared control signals (e.g., shared control command 232 and/or shared control signal 231 of FIGS. 3A and 3B) to minimize the incision error. The allocation functions do not necessarily need to be different and in some cases, similar allocation functions can be used for controlling different degrees of freedom of the robot 20 (e.g., and the corresponding arm 22 and/or tool 24).
[0193] In some embodiments, the allocation functions 2720 may additionally or alternatively include the allocation functions described below in connection with FIG. 28 and Equations 7 and 8.
[0194] An example of the design and implementation of the allocation function a is now described for the case in which a similar function is used for all of the translational (x-y-z) and rotational (roll-pitch-yaw) motions of the robot 20 (e.g., and the corresponding arm 22 and/or tool 24). FIG. 28 shows an illustrative graph 2800 of an allocation function 2802 that is calculated based on both the manual control confidence indicators CM, described above, and the automatic control confidence indicators (e.g., which may include or be calculated based on CA1-CA4). The total error for the manual control may be normalized as errM(t) ∈ [0, 1] and for the autonomous control as errA(t) ∈ [0, 1]. The total error resulting from a confidence-based shared control may be defined according to Equation 6: err(t) = a(t)·errM(t) + (1 − a(t))·errA(t)     (6)
which is the weighted compound error from the manual and autonomous control sources. The optimal solution to minimizing err(t) by a choice of a(t) can be found according to Equation 7:
a(t) = 1 if errM(t) ≤ errA(t); a(t) = 0 otherwise     (7)
[0195] However, the allocation function in Equation 7 may cause noisy/jittery allocations of autonomy, which are not necessarily easy for a human to follow due to sudden and frequent changes. An alternate solution is provided in Equation 8:
a(x) = 1 / (1 + e^(s·(x − b)))     (8)
where x ∈ [xmin, xmax] is the independent variable, with lower and upper bounds xmin and xmax, for the allocation function. In the allocation function of Equation 8, b is a bias at which a = 0.5, and s is a steepness control parameter. When s → ∞, the allocation function of Equation 8 turns into a step (non-smooth) function. In some embodiments, x = CA − CM ∈ [−1, 1] may be selected, that is, the difference between the overall confidence in the autonomous control CA and the manual control CM. In some embodiments, the upper and lower bounds may be selected as xmax = 1 and xmin = −1, with b = 0, for a symmetric and normalized allocation function a according to Equations 9-12.
With the intermediate Equations 9-11 substituting x = CA − CM, the bounds xmax = 1 and xmin = −1, and b = 0 into Equation 8, we obtain

a(t) = 1 / (1 + e^(s·(CA(t) − CM(t))))     (12)
[0196] Since CA − CM ∈ [−1, 1], a(t) is bounded in [0, 1] if s > 0 is chosen. If CA and CM are Lipschitz continuous (i.e., no sudden failure occurs in either the manual or the autonomous control mode that would cause a discontinuous change in the confidence indicators), the rate of change ȧ is largely affected by s. For example, s may be set to 5, or around 5, for both smooth and fast allocation of autonomy between the manual and autonomous robot controllers.
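The following sketch implements a smooth confidence-based allocation and the corresponding shared command blending. The logistic form used here is an assumption that matches the properties described above (a = 0.5 when CA = CM, smaller a when autonomous confidence dominates, step-like behavior as s grows) rather than the exact closed form of Equations 8-12, and the equal weights w1-w4 are illustrative.

```python
import numpy as np

def overall_autonomous_confidence(ca, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted combination of CA1..CA4; equal weights are illustrative."""
    return float(np.dot(weights, ca))

def allocation(c_auto, c_manual, s=5.0):
    """Manual share a(t) in [0, 1]; 1 - a(t) is handled autonomously.
    Assumed logistic arbitration: small a when autonomous confidence wins."""
    return 1.0 / (1.0 + np.exp(s * (c_auto - c_manual)))

def shared_command(u_manual, u_auto, c_auto, c_manual, s=5.0):
    """Blend manual and autonomous commands for one degree of freedom."""
    a = allocation(c_auto, c_manual, s)
    return a * np.asarray(u_manual, dtype=float) + (1.0 - a) * np.asarray(u_auto, dtype=float)
```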
[0197] FIG. 29 shows a graph 2900 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a high level of skill, resulting in little allocation of autonomous control. [0198] FIG. 30 shows a graph 3000 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a moderate level of skill, resulting in moderate allocation of autonomous control.
[0199] FIG. 31 shows a graph 3100 illustrating autonomy allocation for time-varying confidence in autonomous control and manual control, where the manually controlling operator has a low level of skill, resulting in high allocation of autonomous control.
[0200] FIG. 32A shows an example of autonomy allocation using the RAS system 1100 of FIG. 11. As shown in the illustrative graph 3200, the shared control strategy smoothly produces a smaller a(t) when the autonomous control is superior to manual control and vice versa.
[0201 ] FIG. 32B shows a graph 3202 illustrating autonomous and manual commands sent to the 2nd joint of the robot 20 and the overall shared output applied to the 2nd joint.
[0202] FIG. 32C shows a graph 3204 illustrating autonomous and manual commands sent to the 6th joint of the robot 20 and the overall shared output applied to the 6th joint.
[0203] FIG. 33 shows an illustrative user interface 3300 that may be displayed (e.g., via the monitor 1304 of FIG. 13) to the operator and that includes an indicator 3302 depicting, in real-time or near-real-time, how much control the operator has over the robot (e.g., compared to the percentage of control that is being handled autonomously by the system 250).
[0204] FIGS. 34 and 35 depict graphical user interfaces for the RAS system 1100 (e.g., which may be displayed on the monitor 1304 of FIG. 13), in accordance with embodiments of the present disclosure. These GUIs are depicted for a suturing procedure.
[0205] As shown in FIG. 34, graphical user interface (GUI) 3402 depicts a video image of a task space (e.g., task space 2 of FIG. 16) and unidimensional control mode indicator 3404. A graphical user interface (GUI) 3406 depicts a video image of a task space (e.g., task space 2 of FIG. 16) and multidimensional control mode indicator 3408. [0206] As shown in FIG. 35, a graphical user interface (GUI) 3500 depicts a visual video image 3504 of a task space (e.g., task space 2, FIG. 16), a NIR video image 3506 of the task space, and procedure and control mode indicator 3502.
[0207] In this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "includes," "including," "has," "having," or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ... a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
[0208] Reference throughout this document to "one embodiment," "certain embodiments," "an embodiment," "implementation(s)," "aspect(s)," or similar terms means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment.
Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
[0209] The term "or" as used herein is to be interpreted as an inclusive or meaning any one or any combination. Therefore, "A, B or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C." An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive. Also, grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context. Thus, the term "or" should generally be understood to mean "and/or" and so forth. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. [0210] Recitation of ranges of values herein is not intended to be limiting, referring instead individually to any and all values falling within the range, unless otherwise indicated, and each separate value within such a range is incorporated into the specification as if it were individually recited herein. The words "about," "approximately," or the like, when accompanying a numerical value, are to be construed as indicating a deviation as would be appreciated by one of ordinary skill in the art to operate satisfactorily for an intended purpose. Ranges of values and/or numeric values are provided herein as examples only, and do not constitute a limitation on the scope of the described embodiments. The use of any and all examples, or exemplary language ("e.g.," "such as," "for example," or the like) provided herein, is intended merely to better illuminate the embodiments and does not pose a limitation on the scope of the embodiments. No language in the specification should be construed as indicating any unclaimed element as essential to the practice of the embodiments.
[0211 ] For simplicity and clarity of illustration, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
Numerous details are set forth to provide an understanding of the embodiments described herein. The embodiments may be practiced without these details. In other instances, well-known methods, procedures, and components have not been described in detail to avoid obscuring the embodiments described. The description is not to be considered as limited to the scope of the embodiments described herein.
[0212] In the following description, it is understood that terms such as "first," "second," "top," "bottom," "up," "down," "above," "below," and the like, are words of convenience and are not to be construed as limiting terms. Also, the terms apparatus, device, system, etc. may be used interchangeably in this text.
[0213] The many features and advantages of the disclosure are apparent from the detailed specification, and, thus, it is intended by the appended claims to cover all such features and advantages of the disclosure which fall within the scope of the disclosure. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the disclosure to the exact construction and operation illustrated and described, and, accordingly, all suitable modifications and equivalents may be resorted to that fall within the scope of the disclosure.

Claims

WHAT IS CLAIMED IS:
1. A system comprising:
a camera system that includes a first camera and a second camera;
an articulating member that includes a tool;
a computer comprising:
at least one processor; and
a non-transitory memory configured to store computer-readable instructions which, when executed, cause the at least one processor to:
receive image data from the first camera;
receive point cloud image data from the second camera, wherein the image data and the point cloud image data correspond to a tissue on which markers are disposed;
identify marker positions of the markers based on the image data and the point cloud image data;
generate a path between a first point on the point cloud and a second point on the point cloud based at least on the marker positions;
filter the path;
receive real-time position data corresponding to the articulating member;
generate a three-dimensional (3D) trajectory based on the filtered path and the real-time position data;
generate control commands based on the 3D trajectory; and control the articulating member and the tool to follow the 3D trajectory based on the control commands.
2. The system of claim 1 , wherein the tool comprises an electrocautery tool, and wherein the computer-readable instructions which cause the at least one processor to control the articulating member and the tool further cause the electrocautery tool to cut the tissue along the path.
3. The system of claim 1 , wherein the first camera comprises a near-infrared (NIR) camera, wherein the second camera comprises a red-blue-green-depth (RGBD) camera, wherein the image data comprises NIR image data, and wherein the markers comprise NIR markers.
4. The system of claim 1 , wherein the computer-readable instructions which cause the at least one processor to generate the path further cause the at least one processor to:
identify projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data; and
generate reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separate from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
5. The system of claim 4, wherein the computer-readable instructions which cause the at least one processor to filter the path further cause the at least one processor to:
select tracked waypoints as a subset of the reference waypoints; and generate filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
6. The system of claim 5, wherein the filtering algorithm is selected from the group consisting of: a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
7. The system of claim 1 , wherein the computer-readable instructions, when executed, further cause the at least one processor to:
calculate at least one autonomous confidence indicator based on autonomous incision error;
calculate a manual confidence indicator based on manual incision error;
generate at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator; and
generate the control commands based on the at least one allocation function.
8. The system of claim 7, wherein the at least one autonomous confidence indicator is selected from the group consisting of: a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error; and wherein the at least one allocation function comprises a plurality of allocation functions corresponding to movement of the articulating member in three-dimensional directions, and roll, pitch, and yaw of the articulated member.
9. A method comprising:
generating image data and point cloud image data corresponding to a region of interest on which markers are disposed;
identifying marker positions of the markers based on the image data and the point cloud image data;
generating a path between a first point of the point cloud image data and a second point of the point cloud image data, based at least on the marker positions; receiving real-time position data corresponding to an articulating member; generating a three-dimensional (3D) trajectory for the articulating member based on the path and the real-time position data;
generating control commands based on the 3D trajectory; and
controlling the articulating member to follow the 3D trajectory based on the control commands.
10. The method of claim 9, wherein the articulating member comprises a robotic arm, and wherein controlling the articulating member comprises:
causing the robotic arm to cut tissue in the region of interest along the path.
11. The method of claim 10, wherein generating the path comprises:
identifying projected marker positions by applying an offsetting technique to project the marker positions outward on a point cloud of the point cloud image data; and
generating reference waypoints on the point cloud between two of the projected marker positions, such that the reference waypoints of the path are separate from the marker positions by at least a predetermined margin, wherein the path comprises the reference waypoints.
12. The method of claim 11 , wherein filtering the path comprises:
selecting tracked waypoints as a subset of the reference waypoints; and generating filtered waypoints by applying a filtering algorithm to track the tracked waypoints.
13. The method of claim 12, wherein the filtering algorithm is selected from the group consisting of: a recursive least square algorithm, a Kalman filter, an extended Kalman filter, an unscented Kalman filter, and a particle filter.
14. The method of claim 9, further comprising:
calculating at least one autonomous confidence indicator based on
autonomous incision error;
calculating a manual confidence indicator based on manual incision error; generating at least one allocation function based on the manual confidence indicator and the at least one autonomous confidence indicator; and
generating the control commands based on the at least one allocation function.
15. The method of claim 14, wherein the at least one autonomous confidence indicator comprises at least one confidence indicator selected from a group consisting of: a roll angle confidence indicator which is generated based on roll angle error, a pitch angle confidence indicator which is generated based on pitch angle error, a distance confidence indicator which is generated based on distance error, and a density confidence indicator which is generated based on density error; and wherein the at least one allocation function comprises a plurality of allocation functions corresponding to movement of the articulating member in three- dimensional directions, and roll, pitch, and yaw of the articulated member.
16. The method of claim 9, wherein the image data comprises near-infrared (NIR) image data, and wherein the markers comprise NIR markers.
PCT/US2020/033270 2018-05-16 2020-05-15 Confidence-based robotically-assisted surgery system WO2020232406A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/098,990 US20210077195A1 (en) 2018-05-16 2020-11-16 Confidence-based robotically-assisted surgery system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962848979P 2019-05-16 2019-05-16
US62/848,979 2019-05-16
US201962907872P 2019-09-30 2019-09-30
US62/907,872 2019-09-30

Related Child Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/032635 Continuation WO2019222480A1 (en) 2018-05-16 2019-05-16 Confidence-based robotically-assisted surgery system

Publications (1)

Publication Number Publication Date
WO2020232406A1 true WO2020232406A1 (en) 2020-11-19

Family

ID=73289691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/033270 WO2020232406A1 (en) 2018-05-16 2020-05-15 Confidence-based robotically-assisted surgery system

Country Status (1)

Country Link
WO (1) WO2020232406A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022237166A1 (en) * 2021-05-11 2022-11-17 梅卡曼德(北京)机器人科技有限公司 Trajectory generation method and apparatus, electronic device, storage medium, and 3d camera

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018209042A2 (en) * 2017-05-10 2018-11-15 Mako Surgical Corp. Robotic spine surgery system and methods

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018209042A2 (en) * 2017-05-10 2018-11-15 Mako Surgical Corp. Robotic spine surgery system and methods

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LE ET AL.: "Semi-Autonomous Laparoscopic Robotic Electro-surgery with a Novel 3D Endoscope*", IEEE INT CONF ROBOT AUTOM., May 2018 (2018-05-01), pages 1 - 22, XP033403486, Retrieved from the Internet <URL:https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6716798> [retrieved on 20200619], DOI: 10.1109/ICRA.2018.8461060 *
LIU ET AL.: "Robotic Online Path Planning on Point Cloud", IEEE TRANSACTIONS ON CYBERNETICS, vol. 46, no. 5, May 2016 (2016-05-01), pages 1217 - 1228, XP011606145, Retrieved from the Internet <URL:https://www.researchgate.net/publication/277413655_Robotic_Online_Path_Planning_on_Point_Cloud#fullTextFileContent> [retrieved on 20200619], DOI: 10.1109/TCYB.2015.2430526 *


Similar Documents

Publication Publication Date Title
US20210077195A1 (en) Confidence-based robotically-assisted surgery system
US20230107693A1 (en) Systems and methods for localizing, tracking and/or controlling medical instruments
US9101267B2 (en) Method of real-time tracking of moving/flexible surfaces
US11751948B2 (en) Methods and systems for robot-assisted surgery
US20200222146A1 (en) Endoscopic imaging with augmented parallax
US20220160445A1 (en) Robotic surgical collision detection systems
US20230000565A1 (en) Systems and methods for autonomous suturing
CN112472297B (en) Pose monitoring system, pose monitoring method, surgical robot system and storage medium
Zhang et al. Autonomous scanning for endomicroscopic mosaicing and 3D fusion
CN111317567A (en) Thoracic imaging, distance measurement and notification system and method
CN112263332B (en) System, method, medium, and terminal for adjusting surgical robot
WO2015151098A2 (en) An articulated structured light based-laparoscope
US20230165649A1 (en) A collaborative surgical robotic platform for autonomous task execution
US11897127B2 (en) Systems and methods for master/tool registration and control for intuitive motion
US20220415006A1 (en) Robotic surgical safety via video processing
EP3414737A1 (en) Autonomic system for determining critical points during laparoscopic surgery
Saeidi et al. Supervised autonomous electrosurgery via biocompatible near-infrared tissue tracking techniques
Beyl et al. Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room
Wang et al. Robot-assisted occlusion avoidance for surgical instrument optical tracking system
WO2020232406A1 (en) Confidence-based robotically-assisted surgery system
Zevallos et al. A surgical system for automatic registration, stiffness mapping and dynamic image overlay
US20230210627A1 (en) Three-dimensional instrument pose estimation
US20240070875A1 (en) Systems and methods for tracking objects crossing body wallfor operations associated with a computer-assisted system
US20230092980A1 (en) Surgical robotic system setup
WO2023018684A1 (en) Systems and methods for depth-based measurement in a three-dimensional view

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20806769

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20806769

Country of ref document: EP

Kind code of ref document: A1