US20190254759A1 - Reconfigurable display in computer-assisted tele-operated surgery - Google Patents


Info

Publication number: US20190254759A1
Authority: US (United States)
Prior art keywords: surgical, data sources, multiple data, user, arrangement
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US16/347,298
Inventor: Mahdi Azizian
Current assignee: Intuitive Surgical Operations Inc (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Intuitive Surgical Operations Inc
Application filed by Intuitive Surgical Operations Inc
Priority to US16/347,298
Assigned to Intuitive Surgical Operations, Inc. (Assignors: AZIZIAN, Mahdi)
Publication of US20190254759A1

Classifications

    • A61B 34/30: Surgical robots
    • A61B 34/25: User interfaces for surgical systems
    • A61B 34/35: Surgical robots for telesurgery
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/252: User interfaces indicating steps of a surgical procedure
    • A61B 2034/254: User interfaces being adapted depending on the stage of the surgical procedure
    • A61B 2034/258: User interfaces providing specific settings for specific users
    • A61B 2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B 2090/374: Surgical systems with images on a monitor during operation using NMR or MRI
    • A61B 2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762: Surgical systems with images on a monitor during operation using computed tomography systems [CT]
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • This disclosure relates to devices and methods for minimally invasive computer-assisted tele-operated surgery.
  • Tele-surgery is a general term for surgical systems where the surgeon uses some form of remote control, e.g., a servomechanism, or the like, to manipulate surgical instrument movements rather than directly holding and moving the instruments by hand.
  • The surgeon is provided with an image of the surgical site at the remote location. The surgeon performs the surgical procedures on the patient by manipulating master control input devices, which in turn control the motion of robotic instruments.
  • This document features a method that includes operating a surgical system to perform a surgical process, the surgical system including a display device, and receiving, at one or more processing devices, data from multiple data sources.
  • The method also includes determining a current phase of the surgical process, and displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
  • This document features a surgical system that includes a display device and one or more processing devices.
  • The one or more processing devices are configured to operate the surgical system to perform a surgical process, and receive data from multiple data sources.
  • The one or more processing devices are also configured to determine a current phase of the surgical process, and display, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
  • This document features one or more machine-readable non-transitory storage devices encoded with machine-readable instructions configured to cause one or more processing devices to perform various operations.
  • The operations include operating a surgical system to perform a surgical process, the surgical system including a display device, and receiving, at one or more processing devices, data from multiple data sources.
  • The operations also include determining a current phase of the surgical process, and displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
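The phase-dependent selection described in these aspects can be illustrated as a lookup from surgical phase to a pair of (data sources, arrangement). This is a hypothetical sketch; the phase names, source names, and the `select_view` function are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch: mapping surgical phases to data sources and layouts.
from enum import Enum, auto

class Phase(Enum):
    ACCESS = auto()
    DISSECTION = auto()
    SUTURING = auto()

# Each phase maps to (data sources to show, arrangement of their tiles).
# The source names and tile positions here are purely illustrative.
LAYOUTS = {
    Phase.ACCESS:     (["endoscope", "ct"],         {"endoscope": "main", "ct": "top-right"}),
    Phase.DISSECTION: (["endoscope", "ultrasound"], {"endoscope": "main", "ultrasound": "overlay"}),
    Phase.SUTURING:   (["endoscope"],               {"endoscope": "main"}),
}

def select_view(phase):
    """Return the set of sources and their arrangement for the current phase."""
    sources, arrangement = LAYOUTS[phase]
    return sources, arrangement

sources, arrangement = select_view(Phase.DISSECTION)
```

On a phase change, the display would simply be re-rendered from the pair returned for the new phase.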
  • Implementations of the above aspects may include one or more of the following.
  • A new phase of the surgical process may be determined, and at least one of the first set of the multiple data sources and the first arrangement may be updated in response to determining the new phase of the surgical process. Updating the at least one of the first set of the multiple data sources and the first arrangement can be based on a user preference record for a current user of the surgical system. Updating the at least one of the first set of the multiple data sources and the first arrangement can be based on a predetermined safety profile for the surgical system.
  • User-input indicative of adjustments to one or more of the visual representations may be received, and the display device may be updated in accordance with the adjustments.
  • A user-profile may also be updated in accordance with the adjustments.
  • The user-profile can be stored at a storage location and be made accessible to other users.
  • The multiple data sources can include at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device.
  • The multiple data sources can include at least one of: (i) a computing device generating one or more of an image, text, interactive graphics, or a graphical user interface (GUI), and (ii) a storage device providing one or more pre-stored images or videos. Determining the current phase can be based on a user-input indicative of the current phase.
  • Determining the current phase can be based on an image analysis process executed on the data from at least one of the multiple data sources.
  • The data from one or more of the multiple data sources can include positional information with respect to a common reference frame.
  • Displaying the visual representations can include overlaying a first visual representation on a second visual representation, wherein the first visual representation is registered with respect to the second visual representation based on the common reference frame.
  • The first arrangement can be determined based on a user profile loaded prior to commencement of the surgical process.
  • The user profile can identify an individual performing the surgical process, and include user-preferences of the individual regarding organization of the visual representations corresponding to the data from the multiple data sources during different phases of the surgical process.
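One way such a user profile might be represented is a record keyed by surgical phase, consulted with a fallback to a default arrangement. This is a hypothetical sketch; the `UserProfile` class and its field names are illustrative, not from the patent.

```python
# Hypothetical sketch of a per-user profile holding phase-specific display
# preferences. All names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    # phase name -> preferred arrangement (source name -> screen position)
    phase_preferences: dict = field(default_factory=dict)

    def arrangement_for(self, phase, default):
        """Return this user's preferred arrangement for a phase, else a default."""
        return self.phase_preferences.get(phase, default)

profile = UserProfile("surgeon_a", {"suturing": {"endoscope": "main", "ghost_tools": "overlay"}})
layout = profile.arrangement_for("suturing", default={"endoscope": "main"})
```

A record like this could be serialized to shared storage so that, as the text describes, other users can load a colleague's preferences.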
  • The display device can include multiple screens.
  • This document features a method for controlling configurability of visual representations of data from multiple data sources on a display device during a surgical process.
  • The method includes receiving data from the multiple data sources, displaying, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations, and receiving, via an input device, user-input indicative of adjustments to one or more of the visual representations.
  • The method also includes determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and in response, generating a control signal configured to alert a user of the violation.
  • This document features a surgical system that includes a display device and one or more processing devices.
  • The one or more processing devices are configured to receive data from multiple data sources, and display, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations.
  • The one or more processing devices are also configured to receive, via an input device, user-input indicative of adjustments to one or more of the visual representations, determine that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and, responsive to determining the violation, generate a control signal configured to alert a user of the violation.
  • This document features one or more machine-readable non-transitory storage devices encoded with machine-readable instructions configured to cause one or more processing devices to perform various operations.
  • The operations include receiving data from the multiple data sources, displaying, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations, and receiving, via an input device, user-input indicative of adjustments to one or more of the visual representations.
  • The operations also include determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and in response, generating a control signal configured to alert a user of the violation.
  • Visual representations of data from multiple data sources can be displayed on a console of a surgical system based on user-preferences.
  • The display preferences of an individual (e.g., a senior surgeon) may be specific to phases of surgery, and may be automatically loaded upon detection of corresponding phases.
  • The technology described herein may allow for improved user experience for surgeons performing minimally invasive robotic surgery (also referred to herein as minimally invasive surgery (MIS)).
  • Displayed content can include virtual proctoring tools (also referred to as ghost tools).
  • Various safety protocols may govern the location and configuration of the images from different sources, for example, to guard against a surgeon accidentally missing important information. This in turn may increase patient safety by reducing chances of human errors that may otherwise affect MIS.
  • FIG. 1 is a perspective view of an example patient-side cart of a computer-assisted tele-operated surgery system.
  • FIG. 2 is a front view of an example surgeon console of a computer-assisted tele-operated surgery system.
  • FIG. 3 is a side view of an example robotic manipulator arm assembly of a computer-assisted tele-operated surgery system.
  • FIGS. 4A and 4B are example configurations of a display associated with a computer-assisted tele-operated surgery system.
  • FIG. 4C is an example illustrating display of data overlaid on an endoscope image.
  • FIGS. 5A and 5B show examples of how a display associated with a computer-assisted tele-operated surgery system may be configured by a user.
  • FIG. 6 is an example block diagram of a system for displaying images from multiple data sources.
  • FIG. 7 is a flowchart illustrating an example process of providing feedback to a healthcare professional during a surgical process.
  • FIG. 8 is a flowchart illustrating an example process of controlling configurability of visual representations from multiple data sources.
  • The technology allows for configuring locations of images from various sources on a display device associated with a surgeon's console. This may be done manually, for example, in accordance with the preferences of the surgeon as indicated via a user-input, or automatically, for example, based on pre-stored preferences, and/or based on detecting a phase of the surgery.
  • The technology may allow for various types of configurability (e.g., overlaying images on one another, minimizing or removing a feed from a particular image source, or concurrently displaying feeds from multiple data sources) that may in turn allow a surgeon to perform a surgery with increased effectiveness.
  • Configurability may be governed by safety protocols aimed at reducing the possibility of a surgeon missing useful information.
  • For example, safety protocols may prevent an endoscope image from being configured to a size less than a threshold size, or lock certain displays from being removed from the console.
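A minimal sketch of how such safety protocols might be enforced, assuming an illustrative minimum-size threshold and a set of locked views. The names `MIN_ENDOSCOPE_FRACTION`, `LOCKED_SOURCES`, and `check_adjustment`, and the threshold value, are hypothetical, not from the patent.

```python
# Hypothetical safety checks: reject an adjustment if it shrinks the
# endoscope view below a threshold, or removes a locked display.
MIN_ENDOSCOPE_FRACTION = 0.25   # illustrative minimum fraction of display area
LOCKED_SOURCES = {"endoscope"}  # illustrative set of views that may not be removed

def check_adjustment(source, new_size_fraction, removed=False):
    """Return a list of safety violations for a requested adjustment."""
    violations = []
    if removed and source in LOCKED_SOURCES:
        violations.append(f"{source} view is locked and cannot be removed")
    if source == "endoscope" and new_size_fraction < MIN_ENDOSCOPE_FRACTION:
        violations.append("endoscope view below minimum size")
    return violations
```

A non-empty result would trigger the alert control signal described above rather than silently applying the adjustment.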
  • Systems for minimally invasive computer-assisted tele-surgery can include a patient-side cart 100 and a surgeon console 40 .
  • The robotically manipulatable surgical instruments can be inserted through small, minimally invasive surgical apertures to treat tissues at surgical sites within the patient, avoiding the trauma associated with access for open surgery.
  • Robotic systems can move the working ends of the surgical instruments with sufficient dexterity to perform quite intricate surgical tasks, often by pivoting shafts of the instruments at the minimally invasive aperture, sliding the shaft axially through the aperture, rotating the shaft within the aperture, and/or the like.
  • The patient-side cart 100 includes a base 110 , a first robotic manipulator arm assembly 120 , a second robotic manipulator arm assembly 130 , a third robotic manipulator arm assembly 140 , and a fourth robotic manipulator arm assembly 150 .
  • Each robotic manipulator arm assembly 120 , 130 , 140 , and 150 is pivotably coupled to the base 110 .
  • Fewer than four or more than four robotic manipulator arm assemblies may be included as part of the patient-side cart 100 .
  • While the base 110 includes casters to allow ease of mobility, in some embodiments the patient-side cart 100 is fixedly mounted to a floor, ceiling, operating table, structural framework, or the like.
  • Two of the robotic manipulator arm assemblies 120 , 130 , 140 , or 150 hold surgical instruments and a third holds a stereo endoscope.
  • The remaining robotic manipulator arm assembly is available so that another instrument may be introduced at the work site.
  • Alternatively, the remaining robotic manipulator arm assembly may be used for introducing a second endoscope or another image capturing device, such as an ultrasound transducer, to the work site.
  • Each of the robotic manipulator arm assemblies 120 , 130 , 140 , and 150 is conventionally formed of links that are coupled together and manipulated through actuatable joints.
  • Each of the robotic manipulator arm assemblies 120 , 130 , 140 , and 150 includes a setup arm and a device manipulator.
  • The setup arm positions its held device so that a pivot point occurs at its entry aperture into the patient.
  • The device manipulator may then manipulate its held device so that it may be pivoted about the pivot point, inserted into and retracted out of the entry aperture, and rotated about its shaft axis.
  • The surgeon console 40 includes a stereo vision display 45 so that the user may view the surgical work site in stereo vision from images captured by the stereoscopic camera of the patient-side cart 100 .
  • Left and right eyepieces 46 and 47 are provided in the stereo vision display 45 so that the user may view left and right display screens inside the display 45 respectively with the user's left and right eyes.
  • While typically viewing an image of the surgical site on a suitable viewer or display, the surgeon performs the surgical procedures on the patient by manipulating master control input devices, which in turn control the motion of robotic instruments.
  • The surgeon console 40 also includes left and right input devices 41 , 42 that the user may grasp respectively with his/her left and right hands to manipulate devices (e.g., surgical instruments) being held by the robotic manipulator arm assemblies 120 , 130 , 140 , and 150 of the patient-side cart 100 , preferably in six degrees-of-freedom (“DOF”).
  • Foot pedals 44 with toe and heel controls are provided on the surgeon console 40 so the user may control movement and/or actuation of devices associated with the foot pedals.
  • A processing device 43 is provided in the surgeon console 40 for control and other purposes.
  • The processing device 43 performs various functions in the medical robotic system.
  • One function performed by processing device 43 is to translate and transfer the mechanical motion of input devices 41 , 42 to actuate their respective joints in their associated robotic manipulator arm assemblies 120 , 130 , 140 , and 150 so that the surgeon can effectively manipulate devices, such as the surgical instruments.
  • Another function of the processing device 43 is to implement the methods, cross-coupling control logic, and controllers described herein.
  • The processing device 43 can include one or more processors, digital signal processors (DSPs), and/or microcontrollers, and may be implemented as a combination of hardware, software, and/or firmware. Also, its functions as described herein may be performed by one unit or divided up among a number of subunits, each of which may be implemented in turn by any combination of hardware, software, and firmware. Further, although shown as part of or physically adjacent to the surgeon console 40 , the processing device 43 may also be distributed as subunits throughout the tele-surgery system. One or more of the subunits may be physically remote from the tele-surgery system (e.g., located on a remote server).
  • The robotic manipulator arm assemblies 120 , 130 , 140 , and 150 can manipulate devices such as surgical instruments to perform MIS.
  • The robotic manipulator arm assembly 120 is pivotably coupled to an instrument holder 122 .
  • A cannula 180 and a surgical instrument 200 are, in turn, releasably coupled to the instrument holder 122 .
  • The cannula 180 is a tubular member that is located at the patient interface site during a surgery.
  • The cannula 180 defines a lumen in which an elongate shaft 220 of the surgical instrument 200 is slidably disposed.
  • The cannula 180 includes a distal end portion with a body wall retractor member.
  • The instrument holder 122 is pivotably coupled to a distal end of the robotic manipulator arm assembly 120 .
  • The pivotable coupling between the instrument holder 122 and the distal end of robotic manipulator arm assembly 120 is a motorized joint that is actuatable from the surgeon console 40 and processor 43 .
  • The instrument holder 122 includes an instrument holder frame 124 , a cannula clamp 126 , and an instrument holder carriage 128 .
  • The cannula clamp 126 is fixed to a distal end of the instrument holder frame 124 .
  • The cannula clamp 126 can be actuated to couple with, or to uncouple from, the cannula 180 .
  • The instrument holder carriage 128 is movably coupled to the instrument holder frame 124 . More particularly, the instrument holder carriage 128 is linearly translatable along the instrument holder frame 124 .
  • The movement of the instrument holder carriage 128 along the instrument holder frame 124 is a motorized, translational movement that is actuatable/controllable by the processor 43 .
  • The surgical instrument 200 includes a transmission assembly 210 , the elongate shaft 220 , and an end effector 230 .
  • The transmission assembly 210 may be releasably coupled with the instrument holder carriage 128 .
  • The shaft 220 extends distally from the transmission assembly 210 .
  • The end effector 230 is disposed at a distal end of the shaft 220 .
  • The shaft 220 defines a longitudinal axis 222 that is coincident with a longitudinal axis of the cannula 180 .
  • As the instrument holder carriage 128 translates along the instrument holder frame 124 , the elongate shaft 220 of the surgical instrument 200 is moved along the longitudinal axis 222 .
  • The end effector 230 can thus be inserted into and/or retracted from a surgical workspace within the body of a patient.
  • FIGS. 4A and 4B are example configurations of a display associated with a computer-assisted tele-operated surgery system.
  • The configurations shown in the examples of FIGS. 4A and 4B may be visible in the stereo vision display 45 described above with reference to FIG. 2 .
  • The display configuration 400 can include visual representations of data from multiple data sources.
  • The example configuration 400 includes an endoscope video feed (or image) 410 , an X-Ray image 420 , an ultrasound image 430 , a representation 440 of surgical tools (e.g., real tools or virtual proctoring tools), and a three-dimensional (3D) visualization 450 .
  • Visual representations from other sources may also be displayed.
  • The sources can include an endoscope (providing images/video feed in the visual, near infra-red (NIR) or other parts of the spectrum), an ultrasound imaging device (2D or 3D), a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, a magnetic resonance imaging (MRI) device, and an X-Ray fluoroscope device.
  • The source can also be a storage device storing pre-recorded video or images. In such cases, stored images, data, or videos associated with a surgical process may also be displayed in accordance with user-preferences.
  • A source can include a processing device generating a graphical user interface (GUI) that includes one or more controls, for example, for adjusting or configuring the display device.
  • The images or video feeds from the multiple sources may be configured in various ways.
  • The visual representations from the multiple sources may be arranged within the available real-estate of the display device at substantially non-overlapping portions as per corresponding user-preferences. An example of such a configuration is shown in FIG. 4A .
  • The visual representations corresponding to one or more sources may be removed from (or minimized within), and/or added to, the visible area of the display device.
  • One or more visual representations may be displayed as an overlay on another image. This is shown as an example in the configuration 460 of FIG. 4B , where the endoscope image 410 is displayed as the main image, and the X-ray image 420 and the 3D visualization 450 are displayed as overlays on the endoscope image 410 .
  • Another example is shown in FIG. 4C , where a visual representation 480 including text and image based information is displayed as an overlay on the endoscope image 410 .
  • The two images may be registered with respect to one another.
  • If the images being displayed are geo-tagged with location information (e.g., position and orientation with respect to the origin of a known coordinate system), they may be aligned with respect to one another based on the location information associated with the individual images.
  • The alignment can be calculated, for example, via an image registration process that includes transforming the sets of data corresponding to the acquired images into one coordinate system based on location information corresponding to the images. This can also be referred to as warping, and can include various rigid or non-rigid transformations such as translation, rotation, shear, etc.
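As a toy illustration of the rigid case, points from one image's frame can be mapped into another's with a rotation and translation. This is a sketch under the assumption of a known 2-D rigid transform between the frames; `rigid_transform` and the sample values are illustrative, not the patent's registration method.

```python
# Minimal sketch of a rigid 2-D transform: rotate each point, then translate,
# mapping coordinates from one image's frame into another's.
import math

def rigid_transform(points, theta, tx, ty):
    """Rotate each (x, y) point by theta radians, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# Example: map a point from an ultrasound frame into an endoscope frame.
mapped = rigid_transform([(1.0, 0.0)], theta=math.pi / 2, tx=5.0, ty=0.0)
```

Non-rigid warping would replace the single rotation-plus-translation with a spatially varying deformation, but the coordinate-mapping idea is the same.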
  • The configuration of the visual representations from the multiple sources can be done in various ways. Examples of how a display associated with a computer-assisted tele-operated surgery system may be configured are shown in FIGS. 5A and 5B . In some implementations, and as shown in FIG. 5A , the configuration may be done using a touch-based interface. In such cases, a touchscreen device 520 (e.g., a tablet computer) can be provided to the user to perform touch based adjustments of the visual representations. The adjustments can be configured to be reflected on a second display device 510 such as the display device 45 associated with the surgeon's console 40 described above with reference to FIG. 2 .
  • Moving an ultrasound image 530 over an endoscope image 540 on the touchscreen device 520 can cause a corresponding ultrasound image 535 to move over a corresponding endoscope image 545 on the second display device 510 .
  • Configuration operations include moving visual representations to different portions of the screen, controlling 3D poses of displayed objects, rotating a rendered volumetric image, and performing adjustments for registering two images.
  • The user-interface can be touchless.
  • An example of such an interface is shown in FIG. 5B .
  • The visual representations may be configured, for example, using gestures but without touching the display device.
  • The index finger and the thumb of the user are detected and displayed on the display device as the points 550 and 560 , respectively. Moving the index finger and thumb causes the points 550 and 560 to move on the display device, thereby allowing the underlying visual representations to be adjusted, for example, as if the user is touching the display device at the points 550 and 560 .
  • Other user interfaces may also be used; for example, other touchless technologies such as gaze tracking may be used to configure the visual representations on a display device.
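A touchless interface like the one described needs to map tracked fingertip positions onto display coordinates. The sketch below assumes normalized [0, 1] camera coordinates and an illustrative display resolution; `fingertip_to_display` and the pinch-width idea are hypothetical, not the patent's method.

```python
# Hypothetical mapping from normalized fingertip coordinates to display pixels.
DISPLAY_W, DISPLAY_H = 1920, 1080  # illustrative display resolution

def fingertip_to_display(nx, ny):
    """Map a normalized (0..1) fingertip coordinate to a pixel on the display."""
    return (round(nx * (DISPLAY_W - 1)), round(ny * (DISPLAY_H - 1)))

# Index finger and thumb tracked as two display points (like points 550 and 560).
index_pt = fingertip_to_display(0.5, 0.5)
thumb_pt = fingertip_to_display(0.6, 0.5)
# The distance between the two points could drive a pinch-to-resize gesture.
pinch_width = abs(thumb_pt[0] - index_pt[0])
```

Gaze tracking would use the same pattern, with the tracked gaze point replacing the fingertip coordinates.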
  • Input devices for configuring the visual representations can include input device(s) for the physical operation of the surgical system (such as master tool manipulators (MTM) available on da Vinci® Surgical Systems, wireless input devices, or gesture detection systems, among others), or dedicated configuration control input systems (such as a keypad, touchpad, touchscreen, or joystick, among others).
  • FIG. 6 is an example block diagram of a system 600 for displaying images from multiple sources.
  • The system 600 may help maintain display latency below threshold levels associated with computer-assisted tele-operated surgery systems.
  • For example, certain operating guidelines may specify that the maximum tolerable perceived latency for a tele-operated system is less than 100 milliseconds. This can be the case, for example, in da Vinci® Surgical Systems where endoscopic images may be used for closing a master-slave control loop, thereby affecting the transparency of master-slave control.
  • The perceived latency can be the sum of (i) the latency from when an action happens until a representation of the action is displayed on the da Vinci® surgeon console and (ii) the latency from when the user moves the master manipulators until the slave instruments make the corresponding motion.
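The latency budget described above can be sketched as a simple sum-and-compare check. The component values below are illustrative placeholders, not measured figures from any actual system; only the 100-millisecond guideline comes from the text.

```python
# Sketch: checking a perceived-latency budget against the guideline that
# perceived latency stays below 100 ms. Perceived latency is modeled as
# the sum of (i) action-to-display latency and (ii) master-motion-to-
# slave-motion latency, as described above.

PERCEIVED_LATENCY_LIMIT_MS = 100.0

def perceived_latency_ms(display_latency_ms, control_loop_latency_ms):
    """Total latency perceived by the operator, in milliseconds."""
    return display_latency_ms + control_loop_latency_ms

def within_budget(display_latency_ms, control_loop_latency_ms,
                  limit_ms=PERCEIVED_LATENCY_LIMIT_MS):
    """True when the combined latency stays under the guideline limit."""
    return perceived_latency_ms(display_latency_ms,
                                control_loop_latency_ms) < limit_ms

# Example: 40 ms capture-to-display plus 45 ms master-slave loop is 85 ms,
# which fits within a 100 ms budget.
```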
  • The system 600 can include a frame grabber unit 605 that digitizes incoming video frames and transfers the digitized images to a graphics processing unit (GPU) 610 via remote direct memory access (RDMA) to the GPU memory 615.
  • The incoming video frames can be sent to the display device from the GPU 610, thereby avoiding the central processing unit (CPU) 620 in the processing path of the images. This in turn may keep the latency associated with the endoscopic images below a tolerable threshold.
  • Data from one or more other sources may traverse both the CPU 620 and the GPU 610 before being displayed on the display device.
  • The system 600 may also include a network interface card 635 for routing data from one or more sources to the CPU 620.
  • The composed images may be routed back to the frame grabber 605 via RDMA transfer, and the video output may be provided via the frame grabber.
  • The video inputs and video outputs to/from the frame grabber 605 may be routed via a high-bandwidth fiber optic link.
  • The CPU 620 may execute one or more applications 640 for processing the data received from the one or more sources.
  • One application may be logging of data, including video data from multiple sources, to a local storage 645.
  • One or more video sources connected to the CPU 620 may each be identified using a unique identifier and/or a set of parameters (e.g., image size, color or grayscale, mono or stereo, frame rate, etc.).
  • Each source may be a video stream or an image that occasionally gets updated.
  • Each source may also be linked to a particular position on the display device, for example, based on user preferences or user-inputs.
  • The display area on which the visual representation from a particular source may be positioned can be larger than the physically visible area of the display device. In such cases, portions of the display area may be moved in or out of the physical display area based on user input.
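Panning a viewport over a display area larger than the screen can be sketched as follows. The function names and coordinate convention (top-left origin, pixel units) are illustrative assumptions.

```python
# Sketch: a display area larger than the physically visible area, with a
# viewport moved by user input and clamped so it never leaves the display
# area, as described above.

def clamp_viewport(view_x, view_y, view_w, view_h, area_w, area_h):
    """Clamp a viewport's top-left corner so the visible window stays
    inside a display area that is larger than the physical screen."""
    x = min(max(view_x, 0), area_w - view_w)
    y = min(max(view_y, 0), area_h - view_h)
    return x, y

def pan(view_x, view_y, dx, dy, view_w, view_h, area_w, area_h):
    """Move the viewport by a user-supplied delta, keeping it in bounds."""
    return clamp_viewport(view_x + dx, view_y + dy,
                          view_w, view_h, area_w, area_h)
```

With a 3840x2160 display area and a 1920x1080 visible window, the viewport origin is confined to the range (0..1920, 0..1080), so off-screen content can be dragged into view without ever exposing empty space.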
  • A particular source may be linked, for example, with a transformation matrix defined with respect to a common coordinate system or with respect to another image's transformation.
  • The parameters and identifiers associated with each source may be stored as a module, and modules may be added, removed, or modified at run time to configure the visual representations from the corresponding sources.
  • A module corresponding to a source may be modified at run time, for example, on a real-time or near real-time basis.
  • Non-image information (e.g., text, charts, etc.) may be added to a module associated with a source so that the non-image information is displayed together with a visual representation of the corresponding source.
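The per-source modules described above can be sketched as a small registry that supports run-time addition, removal, and modification. The field names (width, frame_rate, transform, overlay_text) and the registry API are assumptions for illustration; the document only lists example parameters such as image size, color or grayscale, mono or stereo, and frame rate.

```python
# Sketch: modules storing the parameters and identifier of each source,
# added, removed, or modified while the system runs.
import dataclasses
from typing import Optional

@dataclasses.dataclass
class SourceModule:
    """Parameters and identifier stored per source (illustrative fields)."""
    source_id: str
    width: int
    height: int
    frame_rate: float
    grayscale: bool = False
    stereo: bool = False
    transform: Optional[list] = None   # e.g., a 3x3 matrix vs. a common frame
    overlay_text: Optional[str] = None # non-image info shown with the source

class ModuleRegistry:
    """Modules may be added, removed, or modified at run time."""
    def __init__(self):
        self._modules = {}

    def add(self, module: SourceModule):
        self._modules[module.source_id] = module

    def remove(self, source_id: str):
        self._modules.pop(source_id, None)

    def modify(self, source_id: str, **changes):
        module = self._modules[source_id]
        for key, value in changes.items():
            setattr(module, key, value)

    def get(self, source_id: str) -> Optional[SourceModule]:
        return self._modules.get(source_id)
```

Attaching non-image information to a source then amounts to `registry.modify("endoscope", overlay_text="...")`, and removing the module drops its visual representation from the configuration.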
  • The system 600 described with reference to FIG. 6 may provide a flexible framework for implementing a reconfigurable display system for a computer-assisted tele-operated surgery system.
  • The visual representations of signals from the multiple sources may be configured in various ways.
  • For example, a surgeon may configure the visual representations on the display device (using the user interfaces and/or input devices described above) and store the preferences within a user-profile.
  • The user-profile may then be saved at a storage location (e.g., a remote server or local storage device) and downloaded by others as needed.
  • For example, a senior surgeon may save his/her preferences in such a user-profile for junior/trainee surgeons to use or review.
  • The preferences stored within a user-profile may reflect preferences of an institution (e.g., a hospital) or regulations promulgated by a regulatory body.
  • The configuration can be specific to various phases of a surgery. For example, during an initial phase of a surgery (e.g., when a surgeon is making an incision), a surgeon may prefer to have the entire display area occupied by the endoscope image. However, during a later phase (e.g., when arteries are being clamped), the surgeon may prefer to see corresponding CT images showing the vasculature, either as an independent image, or registered over the endoscopic image.
  • The phase-dependent configurations may be stored on a storage device and loaded as needed upon determination of a particular phase of the surgical process. The determination that signals being received from one or more sources correspond to a particular phase of the surgery may be made in various ways.
  • In some cases, the determination may be made based on manual user-input (e.g., voice-input or user-input received through an input device). For example, a surgeon may provide user-input indicative of a new phase in the surgery, and the corresponding phase-dependent display profile may be loaded accordingly.
  • In other cases, the phase determination may be made automatically, for example, by processing one or more images from a source. For example, the endoscope image/feed may be analyzed to detect the presence of a particular anatomical feature or surgical tool, and the phase of the surgery may be determined accordingly.
  • If an endoscope image is analyzed and the presence of a clamp is detected, a determination may be made that the surgeon intends to clamp a portion of the vasculature, and accordingly, a CT image that highlights the vasculature may be displayed.
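The phase-dependent configuration loading described above can be sketched as a lookup from a detected phase to a stored layout. The phase names, layouts, and the toy detection rule (a clamp seen in the endoscope feed implies a clamping phase) are hypothetical illustrations, not part of the described system.

```python
# Sketch: loading a stored display configuration when a surgical phase is
# determined, e.g., from tools detected in the endoscope feed.

PHASE_LAYOUTS = {
    "incision": {"endoscope": "fullscreen"},
    "clamping": {"endoscope": "main", "ct_vasculature": "overlay"},
}

def detect_phase(detected_tools):
    """Toy detector: infer the surgical phase from tools seen in the feed."""
    if "clamp" in detected_tools:
        return "clamping"
    return "incision"

def layout_for(detected_tools):
    """Return the stored layout for the currently detected phase."""
    return PHASE_LAYOUTS[detect_phase(detected_tools)]
```

In a realistic system the detector would be an image-analysis or machine-learning component rather than a set membership test, but the lookup from phase to stored configuration would have the same shape.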
  • The contents of a visual representation corresponding to one or more sources may be analyzed in various ways to make such determinations.
  • The determination may also be based on events generated by the surgical system (e.g., a da Vinci® surgical system), or may be made using artificial intelligence processes (e.g., machine learning based processes).
  • The configurability of the display device may be limited, for example, based on one or more safety conditions.
  • If a surgeon attempts to configure the visual representations from the sources in a way that violates a safety condition, the system may prevent such a configuration and/or generate one or more alerts to flag the violation.
  • For example, a safety condition may require that the endoscope feed is always displayed on the display device.
  • If a surgeon attempts to make an adjustment that violates this condition, the system may prevent the surgeon from making such an adjustment, and/or generate an alert (e.g., a visual or audible alarm) indicating that the safety condition has been violated.
  • A supervisor process may monitor the configuration of the display device, and take one or more actions if a safety condition is violated. For example, if a safety condition is violated, control of the surgical tools and/or robotic arms may be affected to alert the surgeon of the violation. For example, if a determination is made that the surgeon is not looking at the endoscope feed (e.g., using gaze tracking), the system may limit the sensitivity of one or more instruments to reduce the chances of accidental injuries to unintended portions of the anatomy. In some cases, such generation of control signals based on determining a violation of a safety condition may improve the safety of surgical processes performed using computer-assisted tele-operated surgical systems.
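The supervisor behavior described above can be sketched as a single check that limits instrument sensitivity and raises an alert when the gaze-tracking safety condition is violated. The rule and the scaling factor are assumptions chosen for illustration only.

```python
# Sketch: a supervisor check that reduces instrument sensitivity when gaze
# tracking suggests the user is not looking at the endoscope feed.

def supervise(gaze_on_endoscope: bool, base_sensitivity: float):
    """Return (sensitivity, alerts).

    Sensitivity is limited and an alert is raised when the safety
    condition (gaze on the endoscope feed) is violated; otherwise the
    base sensitivity passes through unchanged.
    """
    if gaze_on_endoscope:
        return base_sensitivity, []
    # Illustrative policy: quarter the sensitivity and flag the violation.
    return base_sensitivity * 0.25, ["gaze off endoscope feed"]
```

A production supervisor would also debounce the gaze signal (brief glances away should not throttle the instruments) and restore full sensitivity once the condition clears.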
  • One or more image sources or other sources of information may be processed to detect the occurrence of one or more events, and one or more actions may be taken based on associated rules. For example, if processing of endoscope images reveals the occurrence of bleeding while the current display configuration has a preoperative image covering most of the display, the system may be configured to automatically change the display configuration to bring the endoscope images to the surgeon's attention, possibly in conjunction with one or more visual or audio warnings. Rules for such safety measures can be set based on predetermined safety logic. The rules can also be implemented via machine learning tools such as neural networks trained on pre-collected and annotated datasets.
  • One or more image sources may be processed in the background to detect certain events that may trigger changes in the display mode, for example, to present more information to the surgeon when required.
  • Fluorescence imaging is often used to investigate tissue perfusion or to view lymph nodes during surgery.
  • Fluorescent compounds can be injected locally or via the vasculature, and an endoscope camera can be operated in an appropriate mode to capture fluorescence.
  • The user may continue working with normal endoscope images, while a background process analyzes the observed fluorescence.
  • The display can be configured to automatically switch to showing the fluorescence images (or a composed white-light fluorescence endoscope image) upon detecting that an amount of the fluorescence exceeds a threshold.
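The threshold-triggered mode switch described above can be sketched as follows. The fluorescence metric (mean pixel intensity of the fluorescence channel) and the threshold value are assumptions for illustration.

```python
# Sketch: a background process that switches the display mode once the
# observed fluorescence exceeds a threshold, as described above.

FLUORESCENCE_THRESHOLD = 0.2  # fraction of maximum intensity; illustrative

def mean_intensity(pixels):
    """Average intensity of a flattened fluorescence-channel frame."""
    return sum(pixels) / len(pixels)

def display_mode(fluorescence_pixels, current_mode="white_light"):
    """Switch to the fluorescence view when the signal crosses threshold;
    otherwise keep the current display mode unchanged."""
    if mean_intensity(fluorescence_pixels) > FLUORESCENCE_THRESHOLD:
        return "fluorescence"
    return current_mode
```

Running this on each analyzed frame lets the user keep working with normal endoscope images until the fluorescence signal becomes strong enough to warrant switching views.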
  • One or more processes or filters may be applied to an image source based on detection of the phase of surgery.
  • For example, the system can be configured to detect that an instrument is being used to burn and cut soft tissue, and accordingly, a haze removal process can be automatically applied to the processing pipeline for the endoscope images.
  • As another example, when an energy instrument (e.g., an instrument used for burning or cutting soft tissue) is in use, the ultrasound image display can be automatically hidden (or at least reduced in size), for example, to make sure that the user has an adequately large view of the endoscopic images.
  • If the ultrasound images are affected by noise from the energy instruments, such images may also be prevented from being displayed.
  • FIG. 7 is a flowchart illustrating an example process 700 of displaying visual representations of data during a surgical process.
  • In some implementations, the process 700 may be executed at a surgeon's console of a computer-assisted tele-operated surgery system (e.g., by the processing device 43 of the surgeon's console 40 depicted in FIG. 2).
  • Operations of the process 700 include operating a surgical system to perform a surgical process ( 710 ). This can include, for example, receiving one or more commands at the processing device 43 via input devices of the surgeon's console 40 , and generating control signals for operating the patient-side cart 100 .
  • Operations of the process 700 also include receiving data from multiple data sources ( 720 ).
  • The multiple data sources can include, for example, at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device.
  • The multiple data sources can include a computing device generating one or more of an image, text, interactive graphics, or a graphical user interface (GUI).
  • The multiple data sources can also include a storage device providing one or more pre-stored images or videos.
  • One or more of the signals from the multiple data sources may include location information (e.g., based on data from position sensors such as EM, optical, RF, or shape sensors, kinematic modeling, image recognition/tracking, or any other modality) with respect to a coordinate system.
  • Operations of the process 700 also include determining a current phase of the surgical process ( 730 ). This may be done in various ways. In some implementations, determining the current phase may be based on a user-input (e.g., voice input, or input provided through an input device) indicative of the current phase. In some implementations, the determination can be made automatically, for example, based on an image analysis process executed on signals from at least one of the multiple sources.
  • Operations of the process 700 further include displaying, on the display device, visual representations corresponding to the data from a first set of the multiple sources in a first arrangement within a display region of the display device ( 740 ). At least one of the first set of the multiple sources and the first arrangement is associated with the current phase of the surgical process. In some implementations, upon determination of a new phase of the surgical process, at least one of the first set of the multiple sources or the first arrangement is updated. The updating can be based on, for example, a user preference record for a current user of the surgical system. Such user preference records may be maintained, for example, as a user profile. In some implementations, the updating can be based on a predetermined safety profile for the surgical system.
  • The first arrangement can be determined, for example, based on a user profile.
  • The user-profile may be loaded prior to the commencement of the surgical process.
  • The user profile can identify an individual performing the surgical process, and include the user-preferences of the individual regarding organization of the visual representations of the signals from the multiple sources during different phases of the surgical process.
  • User-input indicative of adjustments to one or more of the visual representations may be received via an input device, and the display device may be updated in accordance with the adjustments.
  • The user profile may also be updated in accordance with such adjustments.
  • A representation of a user-profile may be stored at a storage location accessible to other users.
  • Data from one or more of the multiple data sources can include position data with respect to a common frame of reference (e.g., a common coordinate system).
  • In such cases, displaying the visual representations can include overlaying a first visual representation on a second visual representation in accordance with that common reference frame.
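Overlaying one representation on another via a common reference frame can be sketched with a homogeneous transform. The 3x3 matrix below (a pure translation mapping a hypothetical ultrasound frame into the display frame) is an illustrative value, not taken from the described system.

```python
# Sketch: placing one visual representation over another using position
# data expressed in a common reference frame. A 3x3 homogeneous transform
# maps a 2D point from one source's frame into the common (display) frame.

def apply_transform(matrix, point):
    """Apply a 3x3 homogeneous transform to a 2D point (x, y)."""
    x, y = point
    vec = (x, y, 1.0)
    out = [sum(matrix[r][c] * vec[c] for c in range(3)) for r in range(3)]
    # Divide by the homogeneous coordinate to return to 2D.
    return out[0] / out[2], out[1] / out[2]

# Illustrative transform: translate by (100, 50), i.e., where the
# ultrasound image's origin lands on the display once both sources are
# expressed in the common frame.
ultrasound_to_display = [
    [1.0, 0.0, 100.0],
    [0.0, 1.0, 50.0],
    [0.0, 0.0, 1.0],
]
```

Because the transform is homogeneous, the same machinery accommodates rotation and scaling by changing the upper-left 2x2 block, which is what per-source transformation matrices stored with each module would typically carry.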
  • The display device can include multiple screens or display areas (e.g., a main display that shows active visual representations, and a side display that shows the visual representations that are minimized or not being used).
  • The display area may be larger than the physical dimensions of the viewable area of the display device, and portions of the display area may be dragged in and out of the viewable area as needed.
  • When two visual representations share a common reference frame, dragging one of the visual representations onto the other can cause it to “snap” into registration with the other, thereby ensuring that the visual representations are properly aligned.
  • FIG. 8 is a flowchart illustrating an example process 800 of controlling configurability of visual representations from multiple sources.
  • the process 800 may be executed at a surgeon's console of a computer-assisted tele-operated surgery system (e.g., by the processing device 43 of the surgeon's console 40 depicted in FIG. 2 ).
  • Operations of the process 800 include receiving signals from multiple sources ( 810 ).
  • The multiple sources can include the sources described above with reference to FIG. 7.
  • Operations of the process 800 also include displaying visual representations corresponding to the signals from at least a subset of the multiple sources at locations determined for each of the visual representations ( 820 ).
  • The subset of the multiple sources and/or the corresponding locations may be determined based on user preferences stored within a user-profile.
  • Operations of the process 800 further include receiving user-input indicative of adjustments to one or more of the visual representations ( 830 ).
  • For example, a surgeon may need to readjust the default positions and/or sizes of the visual representations in accordance with the particular surgery at hand, and may make such adjustments using an input device or user interface described above.
  • A surgeon performing a nephrectomy may prefer to have ultrasound images aligned or registered to the endoscope feed, and have the CT images on one side for reviewing as needed.
  • The surgeon may make the necessary adjustments to an existing display configuration, for example, using an input device such as a tablet computer (e.g., as shown in FIG. 5A), a touchless input device (e.g., as shown in FIG. 5B), a haptic device, or a gaze-tracking based input device.
  • Operations of the process 800 further include determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation ( 840 ).
  • For example, a safety condition associated with an endoscope feed may specify that the visual representation of the feed may not be reduced to a size smaller than a threshold. In such a case, if a user attempts to reduce the visual representation of the endoscope feed to a size smaller than the threshold, a violation of the safety condition may be determined.
  • Determining the violation can include detecting that the user-input is requesting a particular visual representation to be removed from the display device, whereas the corresponding predetermined safety condition specifies that the particular visual representation be displayed throughout the surgical process.
  • Determining the violation can also include detecting (e.g., via gaze tracking) that a user is looking at an incorrect visual representation during the adjustment process.
  • Operations of the process 800 also include generating, responsive to determining the violation, a control signal configured to alert a user of the violation ( 850 ).
  • For example, the control signal can be configured to disable (or reduce the sensitivity of) one or more instruments being used in the surgical process.
  • The control signal may also cause the generation of a visible or audible message that alerts the user of the violation.
  • In some cases, the control signal may cause the violating adjustment to be undone or reversed, and the user may be alerted that such a readjustment has been made.
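The validate-then-revert behavior of operations 840 and 850 can be sketched as follows. The per-source minimum size and the function names are illustrative assumptions; only the rule shape (reject a resize below a safety threshold, undo it, and alert the user) comes from the text.

```python
# Sketch: validating a requested size adjustment against a per-source
# safety condition (a minimum on-screen size for the endoscope feed),
# reverting the adjustment and emitting an alert on violation.

MIN_SIZE = {"endoscope": (640, 360)}  # per-source minimum (width, height)

def apply_resize(source, current_size, requested_size):
    """Return (accepted_size, alert).

    A violating request is undone (the current size is kept) and an alert
    message is produced so the user knows the readjustment was made.
    """
    min_w, min_h = MIN_SIZE.get(source, (0, 0))
    w, h = requested_size
    if w < min_w or h < min_h:
        alert = f"safety condition violated: {source} below minimum size"
        return current_size, alert
    return requested_size, None
```

The same pattern extends to the other conditions described above, e.g., rejecting a request that would remove a representation that must stay displayed throughout the surgical process.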
  • The predetermined safety conditions can be specific to the various visual representations, and/or specific to a particular phase of the surgical process.
  • The functionality described herein, or portions thereof, and its various modifications can be implemented, at least in part, via a computer program product, e.g., a computer program tangibly embodied in an information carrier, such as one or more non-transitory machine-readable media or storage devices, for execution by, or to control the operation of, one or more data processing apparatus, e.g., a programmable processor, a DSP, a microcontroller, a computer, multiple computers, and/or programmable logic components.
  • A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • A computer program can be deployed to be executed on one or more processing devices at one site, or distributed across multiple sites and interconnected by a network.
  • Actions associated with implementing all or part of the functions can be performed by one or more programmable processors or processing devices executing one or more computer programs to perform the functions of the processes described herein. All or part of the functions can be implemented as special purpose logic circuitry, e.g., an FPGA (field-programmable gate array) and/or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • A processor will receive instructions and data from a read-only memory or a random access memory or both.
  • Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.

Abstract

The technology described in this document can be embodied in a method that includes operating a surgical system to perform a surgical process, the surgical system including a display device, and receiving, at one or more processing devices, data from multiple data sources. The method also includes determining a current phase of the surgical process, and displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 62/417,493, filed Nov. 4, 2016. The disclosure of the prior application is considered part of and is incorporated by reference in the disclosure of this application.
  • TECHNICAL FIELD
  • This disclosure relates to devices and methods for minimally invasive computer-assisted tele-operated surgery.
  • BACKGROUND
  • Minimally invasive tele-surgical systems for use in surgery are being developed to increase a surgeon's dexterity as well as to allow a surgeon to operate on a patient from a remote location. Tele-surgery is a general term for surgical systems where the surgeon uses some form of remote control, e.g., a servomechanism, or the like, to manipulate surgical instrument movements rather than directly holding and moving the instruments by hand. In such a tele-surgery system, the surgeon is provided with an image of the surgical site at the remote location. The surgeon performs the surgical procedures on the patient by manipulating master control input devices, which in turn control the motion of robotic instruments.
  • SUMMARY
  • In one aspect, this document features a method that includes operating a surgical system to perform a surgical process, the surgical system including a display device, and receiving, at one or more processing devices, data from multiple data sources. The method also includes determining a current phase of the surgical process, and displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
  • In another aspect this document features a surgical system that includes a display device and one or more processing devices. The one or more processing devices are configured to operate the surgical system to perform a surgical process, and receive data from multiple data sources. The one or more processing devices are also configured to determine a current phase of the surgical process, and display, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
  • In another aspect, this document features one or more machine-readable non-transitory storage devices encoded with machine-readable instructions configured to cause one or more processing devices to perform various operations. The operations include operating a surgical system to perform a surgical process, the surgical system including a display device, and receiving, at one or more processing devices, data from multiple data sources. The operations also include determining a current phase of the surgical process, and displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device. At least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
  • Implementations of the above aspects may include one or more of the following. A new phase of the surgical process may be determined, and at least one of the first set of the multiple data sources and the first arrangement may be updated in response to determining the new phase of the surgical process. Updating the at least one of the first set of the multiple data sources and the first arrangement can be based on a user preference record for a current user of the surgical system. Updating the at least one of the first set of the multiple data sources and the first arrangement can be based on a predetermined safety profile for the surgical system. User-input indicative of adjustments to one or more of the visual representations may be received, and the display device may be updated in accordance with the adjustments. A user-profile may also be updated in accordance with the adjustments. The user-profile can be stored at a storage location and be made accessible to other users. The multiple data sources can include at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device. The multiple data sources can include at least one of: (i) a computing device generating one or more of an image, text, interactive graphics, or a graphical user interface (GUI), and (ii) a storage device providing one or more pre-stored images or videos. Determining the current phase can be based on a user-input indicative of the current phase. Determining the current phase can be based on an image analysis process executed on the data from at least one of the multiple data sources. The data from one or more of the multiple data sources can include positional information with respect to a common reference frame. 
Displaying the visual representations can include overlaying a first visual representation on a second visual representation, wherein the first visual representation is registered with respect to the second visual representation based on the common reference frame. The first arrangement can be determined based on a user profile loaded prior to commencement of the surgical process. The user profile can identify an individual performing the surgical process, and include user-preferences of the individual regarding organization of the visual representations corresponding to the data from the multiple data sources during different phases of the surgical process. The display device can include multiple screens.
  • In another aspect, this document features a method for controlling configurability of visual representations of data from multiple data sources on a display device during a surgical process. The method includes receiving data from the multiple data sources, displaying, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations, and receiving, via an input device, user-input indicative of adjustments to one or more of the visual representations. The method also includes determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and in response, generating a control signal configured to alert a user of the violation.
  • In another aspect, this document features a surgical system that includes a display device and one or more processing devices. The one or more processing devices are configured to receive data from multiple data sources, and display, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations. The one or more processing devices are also configured to receive, via an input device, user-input indicative of adjustments to one or more of the visual representations, determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and responsive to determining the violation, generating a control signal configured to alert a user of the violation.
  • In another aspect, this document features one or more machine-readable non-transitory storage devices encoded with machine-readable instructions configured to cause one or more processing devices to perform various operations. The operations include receiving data from the multiple data sources, displaying, on the display device, visual representations corresponding to the data from at least a subset of the multiple data sources at locations determined for each of the visual representations, and receiving, via an input device, user-input indicative of adjustments to one or more of the visual representations. The operations also include determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation, and in response, generating a control signal configured to alert a user of the violation.
  • Some or all of the embodiments described herein may provide one or more of the following advantages. In some cases, visual representation of data from multiple data sources can be displayed on a console of a surgical system based on user-preferences. In some cases, the display preferences of an individual (e.g., a senior surgeon) may be saved as a profile, and later used by other individuals (e.g., junior surgeons, medical students etc.). The display preferences may be specific to phases of surgery, and may be automatically loaded upon detection of corresponding phases. By allowing for overlaying images from different sources (possibly warped and registered with one another), and providing a user control over the locations of the various images, the technology described herein may allow for improved user experience for surgeons performing minimally invasive robotic surgery (also referred to herein as minimally invasive surgery (MIS)). In some cases, virtual proctoring tools (also referred to as ghost tools) may be overlaid on images to allow a surgeon to rehearse a surgical procedure before using actual tools to perform the procedure. Various safety protocols may govern the location and configuration of the images from different sources, for example, to guard against a surgeon accidentally missing important information. This in turn may increase patient safety by reducing chances of human errors that may otherwise affect MIS.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of an example patient-side cart of a computer-assisted tele-operated surgery system.
  • FIG. 2 is a front view of an example surgeon console of a computer-assisted tele-operated surgery system.
  • FIG. 3 is a side view of an example robotic manipulator arm assembly of a computer-assisted tele-operated surgery system.
  • FIGS. 4A and 4B are example configurations of a display associated with a computer-assisted tele-operated surgery system.
  • FIG. 4C is an example illustrating display of data overlaid on an endoscope image.
  • FIGS. 5A and 5B show examples of how a display associated with a computer-assisted tele-operated surgery system may be configured by a user.
  • FIG. 6 is an example block diagram of a system for displaying images from multiple data sources.
  • FIG. 7 is a flowchart illustrating an example process of providing feedback to a healthcare professional during a surgical process.
  • FIG. 8 is a flowchart illustrating an example process of controlling configurability of visual representations from multiple data sources.
  • DETAILED DESCRIPTION
  • This document describes technology that, in some cases, improves visualization of surgical sites and anatomical parts during image-guided surgical processes such as minimally invasive robotic surgery (also referred to herein as minimally invasive surgery (MIS)). For example, the technology allows for configuring locations of images from various sources on a display device associated with a surgeon's console. This may be done manually, for example, in accordance with the preferences of the surgeon as indicated via a user-input, or automatically, for example, based on pre-stored preferences, and/or based on detecting a phase of the surgery. In some implementations, the technology may allow for various types of configurability (e.g., overlaying images on one another, minimizing or removing a feed from a particular image source, or concurrently displaying feeds from multiple data sources) that may in turn allow a surgeon to perform a surgery with increased effectiveness. In addition, the configurability may be governed using safety protocols aimed at reducing the possibility of a surgeon missing useful information. For example, safety protocols may prevent an endoscope image from being configured to a size less than a threshold size, or lock certain displays from being removed from the console.
  • Aspects of the invention are described primarily in terms of an implementation using a da Vinci® Surgical System, commercialized by Intuitive Surgical, Inc. of Sunnyvale, Calif. Examples of such surgical systems are the da Vinci® Xi™ Surgical System (Model IS4000) and the da Vinci® Si™ HD™ Surgical System (Model IS3000). It should be understood that aspects disclosed herein may be embodied and implemented in various ways, including computer-assisted, non-computer-assisted, and hybrid combinations of manual and computer-assisted embodiments and implementations. Implementations on da Vinci® Surgical Systems (e.g., the Model IS4000, the Model IS3000, the Model IS2000, the Model IS1200) are described for illustrative purposes, and are not to be considered as limiting the scope of the inventive aspects disclosed herein. As applicable, inventive aspects may be embodied and implemented in both relatively smaller, hand-held, hand-operated devices and relatively larger systems that have additional mechanical support, as well as in other embodiments of computer-assisted tele-operated medical devices.
  • Referring to FIGS. 1 and 2, systems for minimally invasive computer-assisted tele-surgery (also referred to as MIS) can include a patient-side cart 100 and a surgeon console 40. Tele-surgery is a general term for surgical systems where the surgeon uses some form of remote control, e.g., a servomechanism, or the like, to manipulate surgical instrument movements rather than directly holding and moving the instruments by hand. The robotically manipulatable surgical instruments can be inserted through small, minimally invasive surgical apertures to treat tissues at surgical sites within the patient, avoiding the trauma associated with open surgery access. These robotic systems can move the working ends of the surgical instruments with sufficient dexterity to perform quite intricate surgical tasks, often by pivoting shafts of the instruments at the minimally invasive aperture, sliding the shaft axially through the aperture, rotating the shaft within the aperture, and/or the like.
  • In the depicted embodiment, the patient-side cart 100 includes a base 110, a first robotic manipulator arm assembly 120, a second robotic manipulator arm assembly 130, a third robotic manipulator arm assembly 140, and a fourth robotic manipulator arm assembly 150. Each robotic manipulator arm assembly 120, 130, 140, and 150 is pivotably coupled to the base 110. In some embodiments, fewer than four or more than four robotic manipulator arm assemblies may be included as part of the patient-side cart 100. While in the depicted embodiment the base 110 includes casters to allow ease of mobility, in some embodiments the patient-side cart 100 is fixedly mounted to a floor, ceiling, operating table, structural framework, or the like.
  • In a typical application, two of the robotic manipulator arm assemblies 120, 130, 140, or 150 hold surgical instruments and a third holds a stereo endoscope. The remaining robotic manipulator arm assembly is available so that another instrument may be introduced at the work site. Alternatively, the remaining robotic manipulator arm assembly may be used for introducing a second endoscope or another image capturing device, such as an ultrasound transducer, to the work site.
  • Each of the robotic manipulator arm assemblies 120, 130, 140, and 150 is conventionally formed of links that are coupled together and manipulated through actuatable joints. Each of the robotic manipulator arm assemblies 120, 130, 140, and 150 includes a setup arm and a device manipulator. The setup arm positions its held device so that a pivot point occurs at its entry aperture into the patient. The device manipulator may then manipulate its held device so that it may be pivoted about the pivot point, inserted into and retracted out of the entry aperture, and rotated about its shaft axis.
  • In the depicted embodiment, the surgeon console 40 includes a stereo vision display 45 so that the user may view the surgical work site in stereo vision from images captured by the stereoscopic camera of the patient-side cart 100. Left and right eyepieces, 46 and 47, are provided in the stereo vision display 45 so that the user may view left and right display screens inside the display 45 respectively with the user's left and right eyes. While typically viewing an image of the surgical site on a suitable viewer or display, the surgeon performs the surgical procedures on the patient by manipulating master control input devices, which in turn control the motion of robotic instruments.
  • The surgeon console 40 also includes left and right input devices 41, 42 that the user may grasp respectively with his/her left and right hands to manipulate devices (e.g., surgical instruments) being held by the robotic manipulator arm assemblies 120, 130, 140, and 150 of the patient-side cart 100 in preferably six degrees-of-freedom (“DOF”). Foot pedals 44 with toe and heel controls are provided on the surgeon console 40 so the user may control movement and/or actuation of devices associated with the foot pedals.
  • A processing device 43 is provided in the surgeon console 40 for control and other purposes. The processing device 43 performs various functions in the medical robotic system. One function performed by processing device 43 is to translate and transfer the mechanical motion of input devices 41, 42 to actuate their respective joints in their associated robotic manipulator arm assemblies 120, 130, 140, and 150 so that the surgeon can effectively manipulate devices, such as the surgical instruments. Another function of the processing device 43 is to implement the methods, cross-coupling control logic, and controllers described herein.
  • The processing device 43 can include one or more processors, digital signal processors (DSPs), and/or microcontrollers, and may be implemented as a combination of hardware, software and/or firmware. Also, its functions as described herein may be performed by one unit or divided up among a number of subunits, each of which may be implemented in turn by any combination of hardware, software and firmware. Further, although being shown as part of or being physically adjacent to the surgeon console 40, the processing device 43 may also be distributed as subunits throughout the tele-surgery system. One or more of the subunits may be physically remote (e.g., located on a remote server) to the tele-surgery system.
  • Referring also to FIG. 3, the robotic manipulator arm assemblies 120, 130, 140, and 150 can manipulate devices such as surgical instruments to perform MIS. For example, in the depicted arrangement the robotic manipulator arm assembly 120 is pivotably coupled to an instrument holder 122. A cannula 180 and a surgical instrument 200 are, in turn, releasably coupled to the instrument holder 122. The cannula 180 is a tubular member that is located at the patient interface site during a surgery. The cannula 180 defines a lumen in which an elongate shaft 220 of the surgical instrument 200 is slidably disposed. As described further below, in some embodiments the cannula 180 includes a distal end portion with a body wall retractor member. The instrument holder 122 is pivotably coupled to a distal end of the robotic manipulator arm assembly 120. In some embodiments, the pivotable coupling between the instrument holder 122 and the distal end of the robotic manipulator arm assembly 120 is a motorized joint that is actuatable from the surgeon console 40 and processor 43.
  • The instrument holder 122 includes an instrument holder frame 124, a cannula clamp 126, and an instrument holder carriage 128. In the depicted embodiment, the cannula clamp 126 is fixed to a distal end of the instrument holder frame 124. The cannula clamp 126 can be actuated to couple with, or to uncouple from, the cannula 180. The instrument holder carriage 128 is movably coupled to the instrument holder frame 124. More particularly, the instrument holder carriage 128 is linearly translatable along the instrument holder frame 124. In some embodiments, the movement of the instrument holder carriage 128 along the instrument holder frame 124 is a motorized, translational movement that is actuatable/controllable by the processor 43. The surgical instrument 200 includes a transmission assembly 210, the elongate shaft 220, and an end effector 230. The transmission assembly 210 may be releasably coupled with the instrument holder carriage 128. The shaft 220 extends distally from the transmission assembly 210. The end effector 230 is disposed at a distal end of the shaft 220.
  • The shaft 220 defines a longitudinal axis 222 that is coincident with a longitudinal axis of the cannula 180. As the instrument holder carriage 128 translates along the instrument holder frame 124, the elongate shaft 220 of the surgical instrument 200 is moved along the longitudinal axis 222. In such a manner, the end effector 230 can be inserted and/or retracted from a surgical workspace within the body of a patient.
  • FIGS. 4A and 4B are example configurations of a display associated with a computer-assisted tele-operated surgery system. For example, the configurations shown in the examples of FIGS. 4A and 4B may be visible in the stereo vision display 45 described above with reference to FIG. 2. In some implementations, the display configuration 400 can include visual representations of data from multiple data sources. For example, the example configuration 400 includes an endoscope video feed (or image) 410, an X-Ray image 420, an ultrasound image 430, a representation 440 of surgical tools (e.g., real tools or virtual proctoring tools), and a three-dimensional (3D) visualization 450. Visual representations from other sources may also be displayed. For example, the sources can include an endoscope (providing images/video feed in the visual, near infra-red (NIR) or other parts of the spectrum), an ultrasound imaging device (2D or 3D), a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, a magnetic resonance imaging (MRI) device, and an X-Ray fluoroscope device. In some implementations, the source is a storage device storing pre-recorded video or images. In such cases stored images, data, or videos associated with a surgical process may also be displayed in accordance with user-preferences. For example, the display device may be connected to a laptop, smartphone, tablet, or other computing device to display pre-operative or intra-operative scans, test results, notes or other image or text-based information. In some implementations, a source can include a processing device generating a graphical user interface (GUI) that includes one or more controls, for example, for adjusting or configuring the display device.
  • The images or video feeds from the multiple sources may be configured in various ways. In some implementations, the visual representations from the multiple sources may be arranged within the available real-estate of the display device at substantially non-overlapping portions as per corresponding user-preferences. An example of such a configuration is shown in FIG. 4A. In some implementations, the visual representations corresponding to one or more sources may be removed from (or minimized) and/or added to the visible area of the display device. In some implementations, one or more visual representations may be displayed as an overlay on another image. This is shown as an example in the configuration 460 of FIG. 4B, where the endoscope image 410 is displayed as the main image, and the X-ray image 420 and the 3D visualization 450 are displayed as overlays on the endoscope image 410. Another example is shown in FIG. 4C, where a visual representation 480 including text and image based information is displayed as an overlay on the endoscope image 410.
  • In some implementations, where an image is displayed as an overlay on another image, the two images may be registered with respect to one another. For example, if the images being displayed are geo-tagged with location information (e.g., position and orientation with respect to the origin of a known coordinate system), they may be aligned with respect to one another based on the location information associated with the individual images. The alignment can be calculated, for example, via an image registration process that includes transforming the sets of data corresponding to the acquired images into one coordinate system based on location information corresponding to the images. This can also be referred to as warping, and can include various rigid or non-rigid transformations such as translation, rotation, shear etc.
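  • As a concrete illustration of the warping step described above, the sketch below composes a rigid 2D transform that maps overlay-image coordinates into base-image coordinates, given the poses of the two images in a common coordinate system. The function and variable names are hypothetical, and a real registration pipeline would additionally resample the overlay pixels; this only shows the coordinate math.

```python
import numpy as np

def pose_to_matrix(x, y, theta):
    # Homogeneous 2D transform for a pose (position and orientation)
    # expressed in a common "world" coordinate system.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def overlay_to_base(base_pose, overlay_pose):
    # Maps points in overlay-image coordinates to base-image coordinates:
    # world = T_base @ base_pt and world = T_overlay @ overlay_pt, so
    # base_pt = inv(T_base) @ T_overlay @ overlay_pt.
    return np.linalg.inv(pose_to_matrix(*base_pose)) @ pose_to_matrix(*overlay_pose)

# An overlay acquired 2 units to the right of the base image (no rotation)
# is shifted by +2 along x when drawn on top of the base image.
T = overlay_to_base((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
point_in_base = T @ np.array([0.0, 0.0, 1.0])
```

The same composition generalizes to 3D poses (4×4 matrices) for volumetric sources such as CT or MRI.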
  • The configuration of the visual representations from the multiple sources can be done in various ways. Examples of how a display associated with a computer-assisted tele-operated surgery system may be configured are shown in FIGS. 5A and 5B. In some implementations, and as shown in FIG. 5A, the configuration may be done using a touch-based interface. In such cases, a touchscreen device 520 (e.g., a tablet computer) can be provided to the user to perform touch-based adjustments of the visual representations. The adjustments can be configured to be reflected on a second display device 510 such as the display device 45 associated with the surgeon's console 40 described above with reference to FIG. 2. For example, moving an ultrasound image 530 over an endoscope image 540 on the touchscreen device 520 can cause a corresponding ultrasound image 535 to move over a corresponding endoscope image 545 on the second display device 510. Examples of configuration operations that may be possible using a touch-based user interface include moving visual representations to different portions of the screen, controlling 3D poses of displayed objects, rotating a rendered volumetric image, and performing adjustments for registering two images.
  • In some implementations, the user-interface can be touchless. An example of such an interface is shown in FIG. 5B. In such cases, the visual representations may be configured, for example, using gestures but without touching the display device. In the example shown in FIG. 5B, the index finger and the thumb of the user are detected and displayed on the display device as the points 550 and 560, respectively. Moving the index finger and thumb causes the points 550 and 560 to move on the display device, thereby allowing the underlying visual representations to be adjusted, for example, as if the user is touching the display device at the points 550 and 560.
  • Other forms of user interfaces may also be used. In some implementations, other touchless technologies such as gaze tracking may be used to configure the visual representations on a display device. In some implementations, input device(s) for the physical operation of the surgical system (such as master tool manipulators (MTM) available on da Vinci® Surgical Systems, wireless input devices, or gesture detection systems, among others) may be used for configuring the visual representations. In some implementations, dedicated configuration control input systems (such as a keypad, touchpad, touchscreen, or joystick, among others) may be used for configuring the visual representations (and optionally any other aspects of the surgical system not involving physical operation of the system).
  • FIG. 6 is an example block diagram of a system 600 for displaying images from multiple sources. In some implementations, the system 600 may help in maintaining display latency below threshold levels associated with computer-assisted tele-operated surgery systems. For example, certain operating guidelines may specify that the maximum tolerable perceived latency for a tele-operated system is less than 100 milliseconds. This can be the case, for example, in da Vinci® Surgical Systems where endoscopic images may be used for closing a master-slave control loop, thereby affecting the transparency of master-slave control. In such cases the perceived latency can be the sum of (i) a latency from when an action happens until a representation of the action is displayed on the da Vinci® surgeon console and (ii) the latency from when the user moves master manipulators until the slave instruments make the corresponding motion. In some implementations, the system 600 can include a frame grabber unit 605 that digitizes incoming video frames and transfers the digitized images to a graphics processing unit (GPU) 610 via remote direct memory access (RDMA) to the GPU memory 615. The incoming video frames can be sent to the display device from the GPU 610, thereby avoiding the central processing unit (CPU) 620 in the processing path of the images. This in turn may keep the latency associated with the endoscopic images below a tolerable threshold. In some implementations, data from one or more other sources (e.g., a third party imaging device 625 or a tracking device 630) may traverse through both the CPU 620 and the GPU 610 before being displayed on the display device. The system 600 may also include a network interface card 635 for routing data from one or more sources to the CPU 620. In some implementations, the composed images may be routed back to the frame grabber 605 via RDMA transfer and the video output may be provided via the frame grabber.
In some implementations, the video inputs and video outputs to/from the frame grabber 605 may be routed via a high bandwidth fiber optic link. In some implementations, the CPU 620 may execute one or more applications 640 for processing the data received from the one or more sources. One such application may be the logging of data, including video data from multiple sources, to a local storage 645.
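  • The dual-path routing described above (endoscope frames bypassing the CPU via RDMA, other sources passing through the CPU) can be sketched schematically as follows. The source names and path labels are illustrative only; they are not identifiers from an actual system.

```python
# Sources whose frames must stay on the low-latency path (frame grabber
# directly to GPU memory via RDMA, bypassing the CPU).
LOW_LATENCY_SOURCES = {"endoscope"}

def processing_path(source_name):
    """Return the ordered processing stages a frame from the given source
    traverses before reaching the display."""
    if source_name in LOW_LATENCY_SOURCES:
        return ["frame_grabber", "gpu", "display"]
    # Third-party imaging or tracking data is routed through the CPU first.
    return ["network_interface", "cpu", "gpu", "display"]

endoscope_path = processing_path("endoscope")
ultrasound_path = processing_path("ultrasound")
```

Keeping the CPU out of the endoscope path is what bounds the control-loop-critical portion of the perceived latency, regardless of how heavily the CPU is loaded by other sources.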
  • In some implementations, one or more video sources connected to the CPU 620 may each be identified using a unique identifier and/or a set of parameters (e.g., image size, color or grayscale, mono or stereo, frame rate, etc.). The source may be a video stream or an image that occasionally gets updated. The source may also be linked to a particular position on the display device, for example, based on user preferences or user-inputs. In some implementations, the display area on which the visual representation from a particular source may be positioned is larger than the physical visible area of the display device. In such cases, portions of the display area may be moved in or out of the physical display area based on user input. In some implementations, where an image from a particular source is registered with respect to another image, the particular source may be linked, for example, with a transformation matrix with respect to a common coordinate system or with respect to the other image's transformation. The parameters and identifiers associated with each source may be stored as a module, and modules may be added, removed, or modified at run time to configure visual representations from corresponding sources. A module corresponding to a source may be modified at run time, for example, on a real-time or near real-time basis. For example, non-image information (e.g., text, charts, etc.) may be added to a module associated with a source for the non-image information to be displayed together with a visual representation of the corresponding source. In some cases, by making the latency associated with a particular source (e.g., endoscopic images) independent of the other image sources, the system 600 described with reference to FIG. 6 may provide a flexible framework for implementing a reconfigurable display system for a computer-assisted tele-operated surgery system.
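  • One way to realize the per-source modules described above is a small registry keyed by source identifier, with modules that can be added, removed, or annotated with non-image information at run time. The field names and defaults below are assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SourceModule:
    # Identifier and parameters for one video source (illustrative schema).
    source_id: str
    width: int
    height: int
    color: bool = True
    stereo: bool = False
    frame_rate: float = 30.0
    # Non-image information (text, charts) displayed with this source.
    annotations: list = field(default_factory=list)

class ModuleRegistry:
    def __init__(self):
        self._modules = {}

    def add(self, module):
        self._modules[module.source_id] = module

    def remove(self, source_id):
        self._modules.pop(source_id, None)

    def annotate(self, source_id, text):
        # Run-time modification of an existing module.
        self._modules[source_id].annotations.append(text)

    def get(self, source_id):
        return self._modules.get(source_id)

registry = ModuleRegistry()
registry.add(SourceModule("endoscope", 1920, 1080, stereo=True, frame_rate=60.0))
registry.annotate("endoscope", "Estimated blood loss: 20 mL")
```

A registered-overlay source would additionally carry a transformation matrix field referencing the common coordinate system, as described in the paragraph above.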
  • The visual representations of signals from the multiple sources may be configured in various ways. In some implementations, a surgeon may configure the visual representations on the display device (using user interfaces and/or input devices described above) and store the preferences within a user-profile. The user-profile may then be saved at a storage location (e.g., a remote server or local storage device) and downloaded by others as needed. For example, a senior surgeon may save his/her preferences in such a user-profile for junior/trainee surgeons to use or review. In some implementations, the preferences stored within a user-profile may reflect preferences of an institution (e.g., a hospital) or regulations promulgated by a regulatory body.
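  • A minimal sketch of such a shareable user-profile, assuming a simple JSON schema (the field names and layout values are invented for illustration):

```python
import json
import os
import tempfile

def save_profile(path, user, layout):
    # Persist display preferences so another user can later download them.
    with open(path, "w") as f:
        json.dump({"user": user, "layout": layout}, f)

def load_profile(path):
    with open(path) as f:
        return json.load(f)

# Fractional screen coordinates for each source (illustrative values).
layout = {
    "endoscope": {"x": 0.0, "y": 0.0, "w": 0.7, "h": 1.0},
    "ultrasound": {"x": 0.7, "y": 0.0, "w": 0.3, "h": 0.5},
}
profile_path = os.path.join(tempfile.gettempdir(), "surgeon_profile.json")
save_profile(profile_path, "senior_surgeon", layout)
restored = load_profile(profile_path)
```

In practice the storage location could be a remote server, so that a trainee's console can fetch and apply a senior surgeon's saved layout.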
  • In some implementations, the configuration can be specific to various phases of a surgery. For example, during an initial phase of a surgery (e.g., when a surgeon is making an incision) a surgeon may prefer to have the entire display area occupied by the endoscope image. However, during a later phase (e.g., when arteries are being clamped), the surgeon may prefer to see corresponding CT images showing the vasculature, either as an independent image, or registered over the endoscopic image. The phase-dependent configurations may be stored on a storage device and loaded as needed upon determination of a particular phase of the surgical process. The determination that signals being received from one or more sources correspond to a particular phase of the surgery may be made in various ways. In some implementations, the determination may be made based on manual user-input (e.g., voice-input or user-input received through an input device). For example, a surgeon may provide user-input indicative of a new phase in the surgery, and the corresponding phase-dependent display profile may be loaded accordingly. In some implementations, the phase determination may be made automatically, for example, by processing one or more images from a source. For example, the endoscope image/feed may be analyzed to detect the presence of a particular anatomical feature or surgical tool, and the phase of the surgery may be determined accordingly. If an endoscope image is analyzed to detect the presence of a clamp, a determination may be made that the surgeon intends to clamp a portion of the vasculature, and accordingly, a CT image that highlights the vasculature may be displayed. As such, the contents of a visual representation corresponding to one or more sources may be analyzed in various ways to make such determinations.
In some implementations, the events generated by the surgical system (e.g., da Vinci® surgical system) may be used to determine the phase of surgery or be used as indications for change in the display layout. In some implementations, artificial intelligence processes (e.g., machine learning based processes) may be used in determining phases of a surgical process. In some cases, such dynamic phase-based reconfiguration may help in improving a surgeon's user-experience, for example, by loading an appropriate display configuration automatically.
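  • The phase-dependent loading described above can be sketched as a lookup from a detected phase to a stored layout. The phase names, tool names, and mapping below are purely illustrative; as noted, a real system might infer the phase with trained machine-learning models or from system events.

```python
# Illustrative mapping from surgical phase to the sources shown on screen.
PHASE_LAYOUTS = {
    "incision": ["endoscope"],                    # endoscope fills the display
    "clamping": ["endoscope", "ct_vasculature"],  # CT shown with the endoscope
}

def detect_phase(detected_tools):
    # Toy inference: seeing a clamp suggests the vasculature-clamping phase.
    if "clamp" in detected_tools:
        return "clamping"
    return "incision"

def layout_for_phase(phase):
    # Load the stored phase-dependent configuration, with a safe fallback.
    return PHASE_LAYOUTS.get(phase, ["endoscope"])

current_layout = layout_for_phase(detect_phase({"clamp", "scissors"}))
```

The fallback to an endoscope-only layout reflects the safety bias discussed below: when the phase is unknown, the primary surgical view remains visible.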
  • In some implementations, the configurability of the display device may be delimited, for example, based on one or more safety conditions. In such cases, if a surgeon attempts to configure the visual representations from the sources in a way that violates a safety condition, the system may prevent such a configuration and/or generate one or more alerts to flag the violation. For example, in certain surgical processes, a safety condition may require that the endoscope feed is always displayed on the display device. In such cases, if a surgeon attempts to remove the endoscope feed from the display device (or reduce it to a size smaller than an allowable threshold), the system may prevent the surgeon from making such an adjustment, and/or generate an alert (e.g., a visual or audible alarm) indicating that the safety condition has been violated. Other safety conditions may cause one or more visual representations to be “locked” within the display device such that a user may not remove (or possibly even resize) such visual representations. In some implementations, a supervisor process may monitor the configuration of the display device, and take one or more actions if a safety condition is violated. For example, if a safety condition is violated, control of the surgical tools and/or robotic arms may be affected to alert the surgeon of the violation. For example, if a determination is made that the surgeon is not looking at the endoscope feed (e.g., using gaze tracking), the system may limit the sensitivity of one or more instruments to reduce the chances of accidental injury to unintended portions of the patient's anatomy. In some cases, such generation of control signals based on determining a violation of a safety condition may improve the safety of surgical processes performed using computer-assisted tele-operated surgical systems.
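  • A sketch of the kind of safety check described above, with an invented minimum-size threshold and lock list. A real supervisor process would also raise alerts and possibly adjust instrument control; here the check simply reports whether the requested adjustment is allowed.

```python
# Illustrative safety parameters (not values from an actual system).
MIN_ENDOSCOPE_FRACTION = 0.25   # minimum allowed fraction of the display
LOCKED_SOURCES = {"endoscope"}  # sources that may not be removed

def validate_adjustment(source, action, new_size_fraction=None):
    """Return (allowed, reason) for a requested display adjustment."""
    if action == "remove" and source in LOCKED_SOURCES:
        return False, f"{source} is locked and cannot be removed"
    if (action == "resize" and source in LOCKED_SOURCES
            and new_size_fraction is not None
            and new_size_fraction < MIN_ENDOSCOPE_FRACTION):
        return False, f"{source} cannot occupy less than {MIN_ENDOSCOPE_FRACTION:.0%} of the display"
    return True, "ok"

too_small = validate_adjustment("endoscope", "resize", new_size_fraction=0.1)
remove_xray = validate_adjustment("xray", "remove")
```

When `validate_adjustment` returns `False`, the user interface would leave the layout unchanged and surface the reason string as the alert.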
  • In some implementations, one or more image sources or other sources of information may be processed to detect the occurrence of one or more events, and take one or more actions based on associated rules. For example, if processing of endoscope images reveals the occurrence of bleeding, and the user display configuration in the meantime has a preoperative image covering most of the display, the system may be configured to automatically change the display configuration to bring the endoscope images to the surgeon's attention, possibly in conjunction with one or more visual or audio warnings. Rules for such safety measures can be set based on predetermined safety logic. The rules can also be implemented via machine learning tools such as neural networks trained on pre-collected and annotated datasets.
  • In some implementations, one or more image sources may be processed in the background to detect certain events that may trigger changes in the display mode, for example, to present more information to the surgeon when required. Fluorescence imaging is often used to investigate tissue perfusion or to view lymph nodes during surgery. For fluorescence imaging, fluorescent compounds can be injected locally or via vasculature and an endoscope camera can be operated in an appropriate mode to capture fluorescence. In some implementations, the user may continue working with normal endoscope images, while a background process analyzes the observed fluorescence. In some implementations, the display can be configured to automatically switch to showing the fluorescence images (or a composed white-light fluorescence endoscope image) upon detecting that an amount of the fluorescence exceeds a threshold.
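  • The background fluorescence check described above can be sketched as a per-frame threshold test on a near-infrared channel. Both thresholds below are invented for illustration; a real analysis would be considerably more involved.

```python
import numpy as np

PIXEL_LEVEL = 0.5     # intensity above which a pixel counts as fluorescing
AREA_FRACTION = 0.15  # fraction of fluorescing pixels that triggers a switch

def should_show_fluorescence(nir_frame):
    # Switch the display when enough of the frame shows fluorescence.
    fraction = float(np.mean(nir_frame > PIXEL_LEVEL))
    return fraction > AREA_FRACTION

quiet = np.zeros((4, 4))
active = np.zeros((4, 4)); active[:2, :] = 1.0  # half the pixels fluoresce
switch_quiet = should_show_fluorescence(quiet)
switch_active = should_show_fluorescence(active)
```

Running this check in a background process lets the surgeon keep working with the white-light endoscope view until the fluorescence signal actually warrants attention.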
  • In some implementations, one or more processes or filters may be applied to an image source based on detection of the phase of surgery. For example, the system can be configured to detect that an instrument is being used to burn and cut soft tissue, and accordingly, a haze removal process can be automatically applied to the processing pipeline for the endoscope images. As another example, when an energy instrument (e.g., an instrument used for burning or cutting soft tissue) is used, the ultrasound image display can be automatically hidden (or at least reduced in size), for example, to make sure that the user has an adequately large view of the endoscopic images. In some implementations, if the ultrasound images are affected by the noise from the energy instruments, such images may also be prevented from being displayed.
  • FIG. 7 is a flowchart illustrating an example process 700 of displaying visual representations of data during a surgical process. In some implementations, at least a portion of the process 700 may be executed at a surgeon's console of a computer-assisted tele-operated surgery system (e.g., by the processing device 43 of the surgeon's console 40 depicted in FIG. 2). Operations of the process 700 include operating a surgical system to perform a surgical process (710). This can include, for example, receiving one or more commands at the processing device 43 via input devices of the surgeon's console 40, and generating control signals for operating the patient-side cart 100. Operations of the process 700 also include receiving data from multiple data sources (720). The multiple data sources can include, for example, at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device. In some implementations, the multiple data sources can include a computing device generating one or more of an image, text, interactive graphics, or a graphical user interface (GUI). The multiple data sources can also include a storage device providing one or more pre-stored images or videos. In some implementations, one or more of the signals from the multiple data sources may include location information (e.g., based on data from position sensors such as EM, optical, RF, or shape sensors, kinematic modeling, image recognition/tracking, or any other modality) with respect to a coordinate system.
  • Operations of the process 700 also include determining a current phase of the surgical process (730). This may be done in various ways. In some implementations, determining the current phase may be based on a user-input (e.g., voice input, or input provided through an input device) indicative of the current phase. In some implementations, the determination can be made automatically, for example, based on an image analysis process executed on signals from at least one of the multiple sources.
  • Operations of the process 700 further include displaying, on the display device, visual representations corresponding to the data from a first set of the multiple sources in a first arrangement within a display region of the display device (740). At least one of the first set of the multiple sources and the first arrangement is associated with the current phase of the surgical process. In some implementations, upon determination of a new phase of the surgical process, at least one of the first set of the multiple sources or the first arrangement is updated. The updating can be based on, for example, a user preference record for a current user of the surgical system. Such user preference records may be maintained, for example, as a user profile. In some implementations, the updating can be based on a predetermined safety profile for the surgical system.
  • The first arrangement can be determined, for example, based on a user profile. In some implementations, the user profile may be loaded prior to the commencement of the surgical process. In some implementations, the user profile can identify an individual performing the surgical process, and include the user-preferences of the individual regarding organization of the visual representations of the signals from the multiple sources during different phases of the surgical process. In some implementations, if the user adjusts one or more visual representations, the user profile may be updated in accordance with such adjustments. In some implementations, a representation of a user profile may be stored at a storage location accessible to other users.
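A per-user, per-phase preference record of the kind described above could be structured as in the following sketch. The class and field names (`Tile`, `layouts`, and so on) are assumptions introduced for illustration, not names from the disclosure.

```python
# Illustrative-only sketch of a "user profile" holding per-phase display
# arrangements, updated when the user manually adjusts a layout.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Tile:
    source: str  # e.g., "endoscope", "ultrasound", "ct"
    x: int       # position within the display region
    y: int
    w: int       # size of the visual representation
    h: int

@dataclass
class UserProfile:
    user_id: str
    # phase name -> ordered list of tiles shown for that phase
    layouts: Dict[str, List[Tile]] = field(default_factory=dict)

    def arrangement_for(self, phase: str) -> List[Tile]:
        """Return the stored arrangement for a phase, or an empty default."""
        return self.layouts.get(phase, [])

    def record_adjustment(self, phase: str, tiles: List[Tile]) -> None:
        """Persist manual adjustments so they become the new default."""
        self.layouts[phase] = list(tiles)
```

On a phase change, the system would look up `arrangement_for(new_phase)`; after a manual adjustment, `record_adjustment` captures it so the profile reflects the user's latest preference, as the description contemplates.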
  • In some implementations, user-input indicative of adjustments to one or more of the visual representations may be received via an input device, and the display device may be updated in accordance with the adjustments. In some implementations, data from one or more of the multiple data sources can include position data with respect to a common frame of reference (e.g., a common coordinate system). In such cases, displaying the visual representations can include overlaying a first visual representation on a second visual representation in accordance with that common reference frame. In some implementations, the display device can include multiple screens or display areas (e.g., a main display that shows active visual representations, and a side display that shows visual representations that are minimized or not in active use). In some implementations, the display area may be larger than the physical dimensions of the viewable area of the display device, and portions of the display area may be dragged in and out of the viewable area as needed. In some implementations, when two visual representations from two sources are associated with a common frame of reference, dragging one of those visual representations onto the other can cause it to “snap” into registration with the other, thereby ensuring that the visual representations are properly aligned.
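The "snap into registration" behavior can be sketched as a simple drop handler. The dictionary fields and the snap radius below are illustrative assumptions; a real implementation would use the system's registration transforms rather than a raw position swap.

```python
# Minimal sketch: when a dragged view shares a common reference frame with
# the view it is dropped on, its position snaps to the registered overlay
# position; otherwise the drop position is kept. Names are illustrative.
def snap_if_registered(dragged, target, drop_x, drop_y, snap_radius=40):
    """Return the final (x, y) for the dragged view after a drop."""
    same_frame = (dragged["frame"] is not None
                  and dragged["frame"] == target["frame"])
    near = (abs(drop_x - target["x"]) <= snap_radius
            and abs(drop_y - target["y"]) <= snap_radius)
    if same_frame and near:
        # Snap: overlay exactly on the target so the two stay aligned.
        return target["x"], target["y"]
    return drop_x, drop_y
```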
  • FIG. 8 is a flowchart illustrating an example process 800 of controlling configurability of visual representations from multiple sources. In some implementations, at least a portion of the process 800 may be executed at a surgeon's console of a computer-assisted tele-operated surgery system (e.g., by the processing device 43 of the surgeon's console 40 depicted in FIG. 2). Operations of the process 800 include receiving signals from multiple sources (810). The multiple sources can include the sources described above with reference to FIG. 7. The operations of the process 800 also include displaying visual representations corresponding to the signals from at least a subset of the multiple sources at locations determined for each of the visual representations (820). In some implementations, the subset of the multiple sources and/or the corresponding locations may be determined based on user preferences stored within a user profile.
  • Operations of the process 800 further include receiving user-input indicative of adjustments to one or more of the visual representations (830). A surgeon may need to readjust the default positions and/or sizes of the visual representations in accordance with the particular surgery at hand, and may make such adjustments using an input device or user interface described above. For example, a surgeon performing a nephrectomy may prefer to have ultrasound images aligned or registered to the endoscope feed, and have the CT images on one side for reviewing as needed. Accordingly, the surgeon may make the necessary adjustments to an existing display configuration, for example, using an input device such as a tablet computer (e.g., as shown in FIG. 5A), a touchless input device (e.g., as shown in FIG. 5B), a haptic device, or a gaze-tracking based input device.
  • Operations of the process 800 further include determining that at least a portion of the adjustments is in violation of a predetermined safety condition associated with the corresponding visual representation (840). For example, a safety condition associated with an endoscope feed may specify that the visual representation of the feed may not be reduced to a size smaller than a threshold. In such a case, if a user attempts to reduce the visual representation of the endoscope feed to a size smaller than the threshold, a violation of the safety condition may be determined. In another example, determining the violation can include detecting that the user-input is requesting a particular visual representation to be removed from the display device, whereas the corresponding predetermined safety condition specifies that the particular visual representation is to be displayed throughout the surgical process. In some implementations, determining the violation can also include detecting (e.g., via gaze tracking) that a user is looking at an incorrect visual representation during the adjustment process.
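The two example safety conditions above (a minimum display size, and a "must stay displayed" requirement) can be captured in a small validation sketch. The condition fields are assumptions introduced for illustration.

```python
# Hedged sketch of step 840: check one proposed adjustment against the
# predetermined safety conditions for its visual representation, returning
# a list of violation messages (empty if the adjustment is allowed).
def check_adjustment(adjustment, safety):
    """adjustment/safety are plain dicts; field names are illustrative."""
    violations = []
    # A representation flagged must_display may not be removed at all.
    if adjustment.get("remove") and safety.get("must_display"):
        violations.append(
            "representation must be displayed throughout the procedure")
    # A representation may not be resized below its safety threshold.
    min_w, min_h = safety.get("min_size", (0, 0))
    w, h = adjustment.get("size", (min_w, min_h))
    if w < min_w or h < min_h:
        violations.append(f"size {w}x{h} below minimum {min_w}x{min_h}")
    return violations
```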
  • Operations of the process 800 also include generating, responsive to determining the violation, a control signal configured to alert a user of the violation (850). In some implementations, the control signal can be configured to disable (or reduce the sensitivity of) one or more instruments being used in the surgical process. The control signal may also cause the generation of a visible or audible message that alerts the user of the violation. In some implementations, the control signal may cause the violating adjustment to be undone or reversed, and alert the user that such a readjustment has been made. The predetermined safety conditions can be specific to the various visual representations, and/or specific to a particular phase of the surgical process.
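The responses enumerated above (alerting the user, reducing instrument sensitivity, and reverting the adjustment) can be composed as in this sketch; the `ui`, `instruments`, and `revert` interfaces are hypothetical stand-ins for the system's actual control paths.

```python
# Illustrative response to a detected violation (step 850): alert the user,
# reduce instrument sensitivity (or disable, per system policy), and revert
# the offending adjustment. Handler interfaces are assumptions.
def handle_violation(violations, ui, instruments, revert):
    """Return True if a violation was handled, False if there was none."""
    if not violations:
        return False
    ui.alert("; ".join(violations))   # visible and/or audible message
    instruments.reduce_sensitivity()  # or instruments.disable()
    revert()                          # undo the violating adjustment
    return True
```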
  • The functionality described herein, or portions thereof, and its various modifications (hereinafter “the functions”) can be implemented, at least in part, via a computer program product, e.g., a computer program tangibly embodied in an information carrier, such as one or more non-transitory machine-readable media or storage device, for execution by, or to control the operation of, one or more data processing apparatus, e.g., a programmable processor, a DSP, a microcontroller, a computer, multiple computers, and/or programmable logic components.
  • A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one or more processing devices at one site, or distributed across multiple sites interconnected by a network.
  • Actions associated with implementing all or part of the functions can be performed by one or more programmable processors or processing devices executing one or more computer programs to perform the functions of the processes described herein. All or part of the functions can be implemented as special purpose logic circuitry, e.g., an FPGA and/or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.
  • While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described herein as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described herein should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single product or packaged into multiple products.
  • Elements described in detail with reference to one embodiment, implementation, or application optionally may be included, whenever practical, in other embodiments, implementations, or applications in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one embodiment, implementation, or application may be incorporated into other embodiments, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an embodiment or implementation non-functional, or unless two or more of the elements provide conflicting functions.
  • Particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.

Claims (33)

1. A method comprising:
operating a surgical system to perform a surgical process, the surgical system including a display device;
receiving, at one or more processing devices, data from multiple data sources;
determining a current phase of the surgical process; and
displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device,
wherein at least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
2. The method of claim 1, further comprising:
determining a new phase of the surgical process; and
updating at least one of the first set of the multiple data sources and the first arrangement in response to determining the new phase of the surgical process.
3. The method of claim 2, wherein updating the at least one of the first set of the multiple data sources and the first arrangement is based on a user preference record for a current user of the surgical system.
4. The method of claim 2, wherein updating the at least one of the first set of the multiple data sources and the first arrangement is based on a predetermined safety profile for the surgical system.
5. The method of claim 1, further comprising:
receiving, via an input device, user-input indicative of adjustments to one or more of the visual representations corresponding to the data from the multiple data sources; and
updating the display device in accordance with the adjustments.
6. (canceled)
7. (canceled)
8. The method of claim 1, wherein the multiple data sources comprise at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device.
9. (canceled)
10. The method of claim 1, wherein determining the current phase is based on a user-input indicative of the current phase.
11. (canceled)
12. The method of claim 1, wherein data from one or more of the multiple data sources includes positional information with respect to a common reference frame, and wherein displaying the visual representations comprises overlaying a first visual representation on a second visual representation, wherein the first visual representation is registered with respect to the second visual representation based on the common reference frame.
13. (canceled)
14. The method of claim 1, wherein the first arrangement is determined based on a user profile loaded prior to commencement of the surgical process, the user profile identifying an individual performing the surgical process, and includes user-preferences of the individual regarding organization of the visual representations corresponding to the data from the multiple data sources during different phases of the surgical process.
15.-25. (canceled)
26. A surgical system comprising:
a display device; and
one or more processing devices configured to:
operate the surgical system to perform a surgical process;
receive data from multiple data sources,
determine a current phase of the surgical process, and
display, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device,
wherein at least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
27. The surgical system of claim 26, wherein the one or more processing devices are further configured to:
determine a new phase of the surgical process; and
update at least one of the first set of the multiple data sources and the first arrangement in response to determining the new phase of the surgical process.
28. The surgical system of claim 27, wherein updating the at least one of the first set of the multiple data sources and the first arrangement is based on a user preference record for a current user of the surgical system.
29. The surgical system of claim 27, wherein updating the at least one of the first set of the multiple data sources and the first arrangement is based on a predetermined safety profile for the surgical system.
30. The surgical system of claim 26, wherein the one or more processing devices are further configured to:
receive, via an input device, user-input indicative of adjustments to one or more of the visual representations corresponding to the data from the multiple data sources; and
update the display device in accordance with the adjustments.
31. (canceled)
32. (canceled)
33. The surgical system of claim 26, wherein the multiple data sources comprise at least two of: an endoscope, an ultrasound imaging device, a computed tomography (CT) imaging device, a nuclear imaging device, a radiography imaging device, and a magnetic resonance imaging (MRI) device.
34. (canceled)
35. The surgical system of claim 26, wherein the one or more processing devices are configured to determine the current phase based on a user-input indicative of the current phase or on an image analysis process executed on the data from at least one of the multiple data sources.
36. (canceled)
37. The surgical system of claim 26, wherein data from one or more of the multiple data sources includes positional information with respect to a common reference frame, and wherein displaying the visual representations comprises overlaying a first visual representation on a second visual representation, wherein the first visual representation is registered with respect to the second visual representation based on the common reference frame.
38. (canceled)
39. The surgical system of claim 26, wherein the first arrangement is determined based on a user profile loaded prior to commencement of the surgical process, the user profile identifying an individual performing the surgical process, and includes user-preferences of the individual regarding organization of the visual representations corresponding to the data from the multiple data sources during different phases of the surgical process.
40.-50. (canceled)
51. One or more machine-readable non-transitory storage devices encoded with machine-readable instructions configured to cause one or more processing devices to perform operations comprising:
operating a surgical system to perform a surgical process, the surgical system including a display device;
receiving data from multiple data sources;
determining a current phase of the surgical process; and
displaying, on the display device, visual representations corresponding to the data from a first set of the multiple data sources in a first arrangement within a display region of the display device,
wherein at least one of the first set of the multiple data sources and the first arrangement is associated with the current phase of the surgical process.
52. The one or more machine-readable non-transitory storage devices of claim 51, further comprising instructions for:
determining a new phase of the surgical process; and
updating at least one of the first set of the multiple data sources and the first arrangement in response to determining the new phase of the surgical process.
53.-75. (canceled)
US16/347,298 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery Abandoned US20190254759A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/347,298 US20190254759A1 (en) 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662417493P 2016-11-04 2016-11-04
US16/347,298 US20190254759A1 (en) 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery
PCT/US2017/060000 WO2018085694A1 (en) 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/060000 A-371-Of-International WO2018085694A1 (en) 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/677,253 Continuation US20220175470A1 (en) 2016-11-04 2022-02-22 Reconfigurable display in computer-assisted tele-operated surgery

Publications (1)

Publication Number Publication Date
US20190254759A1 true US20190254759A1 (en) 2019-08-22

Family

ID=62076639

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/347,298 Abandoned US20190254759A1 (en) 2016-11-04 2017-11-03 Reconfigurable display in computer-assisted tele-operated surgery
US17/677,253 Pending US20220175470A1 (en) 2016-11-04 2022-02-22 Reconfigurable display in computer-assisted tele-operated surgery

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/677,253 Pending US20220175470A1 (en) 2016-11-04 2022-02-22 Reconfigurable display in computer-assisted tele-operated surgery

Country Status (4)

Country Link
US (2) US20190254759A1 (en)
EP (1) EP3534817A4 (en)
CN (2) CN117717411A (en)
WO (1) WO2018085694A1 (en)

US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11589932B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11596291B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11696760B2 (en) 2017-12-28 2023-07-11 Cilag Gmbh International Safety systems for smart powered surgical stapling
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11786319B2 (en) * 2017-12-14 2023-10-17 Verb Surgical Inc. Multi-panel graphical user interface for a robotic surgical system
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11963729B2 (en) * 2019-06-21 2024-04-23 Procept Biorobotics Corporation Artificial intelligence for robotic surgery

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3793468A4 (en) * 2018-05-15 2022-01-26 Intuitive Surgical Operations, Inc. Method and apparatus for manipulating tissue
CN112971688A (en) * 2021-02-07 2021-06-18 杭州海康慧影科技有限公司 Image processing method and device and computer equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080269596A1 (en) * 2004-03-10 2008-10-30 Ian Revie Orthopaedic Monitoring Systems, Methods, Implants and Instruments
US20130345718A1 (en) * 2007-02-16 2013-12-26 Excelsius Surgical, L.L.C. Surgical robot platform
US20140005484A1 (en) * 2012-06-27 2014-01-02 CamPlex LLC Interface for viewing video from cameras on a surgical visualization system
US20190254757A1 (en) * 2016-10-31 2019-08-22 Cameron Anthony Piron 3D Navigation System and Methods

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8862200B2 (en) * 2005-12-30 2014-10-14 DePuy Synthes Products, LLC Method for determining a position of a magnetic source
US7525309B2 (en) * 2005-12-30 2009-04-28 Depuy Products, Inc. Magnetic sensor array
US8280483B2 (en) * 2006-06-14 2012-10-02 Koninklijke Philips Electronics N.V. Multi-modality medical image viewing
US7794396B2 (en) * 2006-11-03 2010-09-14 Stryker Corporation System and method for the automated zooming of a surgical camera
US9161817B2 (en) 2008-03-27 2015-10-20 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic catheter system
US20100063400A1 (en) * 2008-09-05 2010-03-11 Anne Lindsay Hall Method and apparatus for catheter guidance using a combination of ultrasound and x-ray imaging
AU2010216283A1 (en) * 2009-02-20 2011-07-21 Sunpower Corporation Automated solar collector installation design including exceptional condition management and display
KR101390383B1 (en) * 2010-11-16 2014-04-29 한국전자통신연구원 Apparatus for managing a reconfigurable platform for virtual reality based training simulator
JP6021353B2 (en) * 2011-08-04 2016-11-09 オリンパス株式会社 Surgery support device
JP6193554B2 (en) * 2011-11-04 2017-09-06 ファナック アメリカ コーポレイション Robot teaching apparatus having a three-dimensional display unit
KR20150079577A (en) * 2012-10-25 2015-07-08 엘지전자 주식회사 Method and apparatus for processing edge violation phenomenon in multi-view 3dtv service
US9448407B2 (en) * 2012-12-13 2016-09-20 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and work supporting system
DE102013000250A1 (en) * 2013-01-09 2014-07-10 Kuka Laboratories Gmbh Configurable security monitoring for a robot arrangement
US9788906B2 (en) * 2013-03-15 2017-10-17 Synaptive Medical (Barbados) Inc. Context aware surgical systems for intraoperatively configuring imaging devices
WO2014200016A1 (en) * 2013-06-11 2014-12-18 Tanji Atsushi Surgical assistance system, surgical assistance device, surgical assistance method, surgical assistance program, and information processing device
KR101645624B1 (en) * 2013-06-12 2016-08-05 삼성전자주식회사 Method and apparatus for providing medical information
KR102071530B1 (en) * 2013-07-12 2020-01-30 삼성전자주식회사 Apparatas and method for proposing a response manual of occurring denial in an electronic device

Cited By (206)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11127116B2 (en) * 2015-12-01 2021-09-21 Sony Corporation Surgery control apparatus, surgery control method, program, and surgery system
US20210085425A1 (en) * 2017-05-09 2021-03-25 Boston Scientific Scimed, Inc. Operating room devices, methods, and systems
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11793537B2 (en) 2017-10-30 2023-10-24 Cilag Gmbh International Surgical instrument comprising an adaptive electrical system
US11109878B2 (en) 2017-10-30 2021-09-07 Cilag Gmbh International Surgical clip applier comprising an automatic clip feeding system
US10932806B2 (en) 2017-10-30 2021-03-02 Ethicon Llc Reactive algorithm for surgical system
US11759224B2 (en) 2017-10-30 2023-09-19 Cilag Gmbh International Surgical instrument systems comprising handle arrangements
US11696778B2 (en) 2017-10-30 2023-07-11 Cilag Gmbh International Surgical dissectors configured to apply mechanical and electrical energy
US11648022B2 (en) 2017-10-30 2023-05-16 Cilag Gmbh International Surgical instrument systems comprising battery arrangements
US11819231B2 (en) 2017-10-30 2023-11-21 Cilag Gmbh International Adaptive control programs for a surgical system comprising more than one type of cartridge
US10959744B2 (en) 2017-10-30 2021-03-30 Ethicon Llc Surgical dissectors and manufacturing techniques
US11602366B2 (en) 2017-10-30 2023-03-14 Cilag Gmbh International Surgical suturing instrument configured to manipulate tissue using mechanical and electrical power
US11564703B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Surgical suturing instrument comprising a capture width which is larger than trocar diameter
US10980560B2 (en) 2017-10-30 2021-04-20 Ethicon Llc Surgical instrument systems comprising feedback mechanisms
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11123070B2 (en) 2017-10-30 2021-09-21 Cilag Gmbh International Clip applier comprising a rotatable clip magazine
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11026712B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Surgical instruments comprising a shifting mechanism
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11026713B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Surgical clip applier configured to store clips in a stored state
US11413042B2 (en) 2017-10-30 2022-08-16 Cilag Gmbh International Clip applier comprising a reciprocating clip advancing member
US11406390B2 (en) 2017-10-30 2022-08-09 Cilag Gmbh International Clip applier comprising interchangeable clip reloads
US11045197B2 (en) 2017-10-30 2021-06-29 Cilag Gmbh International Clip applier comprising a movable clip magazine
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11051836B2 (en) 2017-10-30 2021-07-06 Cilag Gmbh International Surgical clip applier comprising an empty clip cartridge lockout
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11291465B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Surgical instruments comprising a lockable end effector socket
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11071560B2 (en) 2017-10-30 2021-07-27 Cilag Gmbh International Surgical clip applier comprising adaptive control in response to a strain gauge circuit
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11207090B2 (en) 2017-10-30 2021-12-28 Cilag Gmbh International Surgical instruments comprising a biased shifting mechanism
US11141160B2 (en) 2017-10-30 2021-10-12 Cilag Gmbh International Clip applier comprising a motor controller
US11129636B2 (en) 2017-10-30 2021-09-28 Cilag Gmbh International Surgical instruments comprising an articulation drive that provides for high articulation angles
US11925373B2 (en) 2017-10-30 2024-03-12 Cilag Gmbh International Surgical suturing instrument comprising a non-circular needle
US11103268B2 (en) 2017-10-30 2021-08-31 Cilag Gmbh International Surgical clip applier comprising adaptive firing control
US11786319B2 (en) * 2017-12-14 2023-10-17 Verb Surgical Inc. Multi-panel graphical user interface for a robotic surgical system
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11701185B2 (en) 2017-12-28 2023-07-18 Cilag Gmbh International Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11114195B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Surgical instrument with a tissue marking assembly
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11931110B2 (en) 2017-12-28 2024-03-19 Cilag Gmbh International Surgical instrument comprising a control system that uses input from a strain gage circuit
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US10695081B2 (en) 2017-12-28 2020-06-30 Ethicon Llc Controlling a surgical instrument according to sensed closure parameters
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11918302B2 (en) 2017-12-28 2024-03-05 Cilag Gmbh International Sterile field interactive control displays
US11179204B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US10755813B2 (en) 2017-12-28 2020-08-25 Ethicon Llc Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11903587B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Adjustment to the surgical stapling control based on situational awareness
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11213359B2 (en) 2017-12-28 2022-01-04 Cilag Gmbh International Controllers for robot-assisted surgical platforms
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US11234756B2 (en) 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US11890065B2 (en) 2017-12-28 2024-02-06 Cilag Gmbh International Surgical system to limit displacement
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11864845B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Sterile field interactive control displays
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11844579B2 (en) 2017-12-28 2023-12-19 Cilag Gmbh International Adjustments based on airborne particle properties
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US10849697B2 (en) 2017-12-28 2020-12-01 Ethicon Llc Cloud interface for coupled surgical devices
US11069012B2 (en) 2017-12-28 2021-07-20 Cilag Gmbh International Interactive surgical systems with condition handling of devices and data capabilities
US11058498B2 (en) 2017-12-28 2021-07-13 Cilag Gmbh International Cooperative surgical actions for robot-assisted surgical platforms
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10892899B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Self describing data packets generated at an issuing instrument
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11304763B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11051876B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Surgical evacuation flow paths
US11056244B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks
US10898622B2 (en) 2017-12-28 2021-01-26 Ethicon Llc Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11779337B2 (en) 2017-12-28 2023-10-10 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US11775682B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US10932872B2 (en) 2017-12-28 2021-03-02 Ethicon Llc Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set
US11751958B2 (en) 2017-12-28 2023-09-12 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11737668B2 (en) 2017-12-28 2023-08-29 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US11382697B2 (en) 2017-12-28 2022-07-12 Cilag Gmbh International Surgical instruments comprising button circuits
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11712303B2 (en) 2017-12-28 2023-08-01 Cilag Gmbh International Surgical instrument comprising a control circuit
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11045591B2 (en) 2017-12-28 2021-06-29 Cilag Gmbh International Dual in-series large and small droplet filters
US10943454B2 (en) * 2017-12-28 2021-03-09 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11696760B2 (en) 2017-12-28 2023-07-11 Cilag Gmbh International Safety systems for smart powered surgical stapling
US11678881B2 (en) 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11672605B2 (en) 2017-12-28 2023-06-13 Cilag Gmbh International Sterile field interactive control displays
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11013563B2 (en) 2017-12-28 2021-05-25 Ethicon Llc Drive arrangements for robot-assisted surgical platforms
US10944728B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Interactive surgical systems with encrypted communication capabilities
US20190206216A1 (en) * 2017-12-28 2019-07-04 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11612408B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Determining tissue composition via an ultrasonic system
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US10987178B2 (en) 2017-12-28 2021-04-27 Ethicon Llc Surgical hub control arrangements
US10966791B2 (en) 2017-12-28 2021-04-06 Ethicon Llc Cloud-based medical analytics for medical facility segmented individualization of instrument function
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11596291B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws
US11601371B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11589932B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11701162B2 (en) 2018-03-08 2023-07-18 Cilag Gmbh International Smart blade application for reusable and disposable devices
US11589915B2 (en) 2018-03-08 2023-02-28 Cilag Gmbh International In-the-jaw classifier based on a model
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11844545B2 (en) 2018-03-08 2023-12-19 Cilag Gmbh International Calcified vessel identification
US11839396B2 (en) 2018-03-08 2023-12-12 Cilag Gmbh International Fine dissection mode for tissue classification
US11534196B2 (en) 2018-03-08 2022-12-27 Cilag Gmbh International Using spectroscopy to determine device use state in combo instrument
US11617597B2 (en) 2018-03-08 2023-04-04 Cilag Gmbh International Application of smart ultrasonic blade technology
US11298148B2 (en) 2018-03-08 2022-04-12 Cilag Gmbh International Live time tissue classification using electrical parameters
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11337746B2 (en) 2018-03-08 2022-05-24 Cilag Gmbh International Smart blade and power pulsing
US11344326B2 (en) 2018-03-08 2022-05-31 Cilag Gmbh International Smart blade technology to control blade instability
US11464532B2 (en) 2018-03-08 2022-10-11 Cilag Gmbh International Methods for estimating and controlling state of ultrasonic end effector
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
US11678927B2 (en) 2018-03-08 2023-06-20 Cilag Gmbh International Detection of large vessels during parenchymal dissection using a smart blade
US11457944B2 (en) 2018-03-08 2022-10-04 Cilag Gmbh International Adaptive advanced tissue treatment pad saver mode
US11678901B2 (en) 2018-03-08 2023-06-20 Cilag Gmbh International Vessel sensing for adaptive advanced hemostasis
US11707293B2 (en) 2018-03-08 2023-07-25 Cilag Gmbh International Ultrasonic sealing algorithm with temperature control
US11701139B2 (en) 2018-03-08 2023-07-18 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11399858B2 (en) 2018-03-08 2022-08-02 Cilag Gmbh International Application of smart blade technology
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11197668B2 (en) 2018-03-28 2021-12-14 Cilag Gmbh International Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout
US11937817B2 (en) 2018-03-28 2024-03-26 Cilag Gmbh International Surgical instruments with asymmetric jaw arrangements and separate closure and firing systems
US11931027B2 (en) 2018-03-28 2024-03-19 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11129611B2 (en) 2018-03-28 2021-09-28 Cilag Gmbh International Surgical staplers with arrangements for maintaining a firing member thereof in a locked configuration unless a compatible cartridge has been installed therein
US11096688B2 (en) 2018-03-28 2021-08-24 Cilag Gmbh International Rotary driven firing members with different anvil and channel engagement features
US11166716B2 (en) 2018-03-28 2021-11-09 Cilag Gmbh International Stapling instrument comprising a deactivatable lockout
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US10973520B2 (en) 2018-03-28 2021-04-13 Ethicon Llc Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11406382B2 (en) 2018-03-28 2022-08-09 Cilag Gmbh International Staple cartridge comprising a lockout key configured to lift a firing member
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US11213294B2 (en) 2018-03-28 2022-01-04 Cilag Gmbh International Surgical instrument comprising co-operating lockout features
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US11589865B2 (en) 2018-03-28 2023-02-28 Cilag Gmbh International Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems
US20210121251A1 (en) * 2018-06-21 2021-04-29 Procept Biorobotics Corporation Artificial intelligence for robotic surgery
US20220095903A1 (en) * 2019-01-25 2022-03-31 Intuitive Surgical Operations, Inc. Augmented medical vision systems and methods
US11331100B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Staple cartridge retainer system with authentication keys
US11517309B2 (en) 2019-02-19 2022-12-06 Cilag Gmbh International Staple cartridge retainer with retractable authentication key
US11291445B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical staple cartridges with integral authentication keys
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11298130B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Staple cartridge retainer with frangible authentication key
US11272931B2 (en) 2019-02-19 2022-03-15 Cilag Gmbh International Dual cam cartridge based feature for unlocking a surgical stapler lockout
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11298129B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
US11259807B2 (en) 2019-02-19 2022-03-01 Cilag Gmbh International Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device
US11925350B2 (en) 2019-02-19 2024-03-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
US11291444B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a closure lockout
US11751872B2 (en) 2019-02-19 2023-09-12 Cilag Gmbh International Insertable deactivator element for surgical stapler lockouts
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11331101B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Deactivator element for defeating surgical stapling device lockouts
US20220060678A1 (en) * 2019-03-29 2022-02-24 Sony Group Corporation Control device and master slave system
US11582438B2 (en) * 2019-03-29 2023-02-14 Sony Group Corporation Control device and master slave system
US11963729B2 (en) * 2019-06-21 2024-04-23 Procept Biorobotics Corporation Artificial intelligence for robotic surgery
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
US11633247B2 (en) * 2020-03-03 2023-04-25 Verb Surgical Inc. Graphical user guidance for a robotic surgical system
US20210275264A1 (en) * 2020-03-03 2021-09-09 Verb Surgical Inc. Graphical User Guidance for a Robotic Surgical System
US20230013884A1 (en) * 2021-07-14 2023-01-19 Cilag Gmbh International Endoscope with synthetic aperture multispectral camera array

Also Published As

Publication number Publication date
WO2018085694A1 (en) 2018-05-11
EP3534817A1 (en) 2019-09-11
CN117717411A (en) 2024-03-19
EP3534817A4 (en) 2020-07-29
CN109890311B (en) 2024-01-02
US20220175470A1 (en) 2022-06-09
CN109890311A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
US20220175470A1 (en) Reconfigurable display in computer-assisted tele-operated surgery
US20210157403A1 (en) Operating room and surgical site awareness
Qian et al. A review of augmented reality in robotic-assisted surgery
US10918445B2 (en) Surgical system with augmented reality display
US10182873B2 (en) Structural adjustment systems and methods for a teleoperational medical system
US10542908B2 (en) Surgical equipment control input visualization field
US20200038124A1 (en) Systems and methods for constraining a virtual reality surgical system
US11918306B2 (en) Multi-dimensional visualization in computer-assisted tele-operated surgery
US20230225816A1 (en) Graphical user guidance for a robotic surgical system
US11204640B2 (en) Methods for determining if teleoperation should be disengaged based on the user's gaze
Kogkas et al. Free-view, 3D gaze-guided robotic scrub nurse
US20190220097A1 (en) System and method for assisting operator engagement with input devices
EP4106660A1 (en) Robotic surgical system and method for providing a stadium view with arm set-up guidance
US11960645B2 (en) Methods for determining if teleoperation should be disengaged based on the user's gaze
Kogkas A Gaze-contingent Framework for Perceptually-enabled Applications in Healthcare

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AZIZIAN, MAHDI;REEL/FRAME:049071/0759

Effective date: 20190423

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION