US20230225810A1 - Guiding a robotic surgical system to perform a surgical procedure - Google Patents

Guiding a robotic surgical system to perform a surgical procedure

Info

Publication number
US20230225810A1
Authority
US
United States
Prior art keywords
images
robotic endoscope
endoscope
robotic
pov
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/174,287
Inventor
Justin Esterberg
Jeffrey Roh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IX Innovation LLC
Original Assignee
IX Innovation LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by IX Innovation LLC filed Critical IX Innovation LLC
Priority to US 18/174,287
Publication of US20230225810A1
Legal status: Pending

Classifications

    • All of the following classifications fall under A61B (A: Human Necessities; A61: Medical or Veterinary Science; Hygiene; A61B: Diagnosis; Surgery; Identification):
    • A61B 34/30: Surgical robots
    • A61B 17/1615: Drill bits, i.e. rotating tools extending from a handpiece to contact the worked material
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 17/00234: Surgical instruments, devices or methods for minimally invasive surgery
    • A61B 17/1622: Drill handpieces
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2048: Tracking techniques using an accelerometer or inertia sensor
    • A61B 2034/256: User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
    • A61B 2034/305: Details of wrist mechanisms at distal ends of robotic arms
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/371: Surgical systems with images on a monitor during operation, with simultaneous use of two cameras
    • A61B 2090/372: Details of monitor hardware
    • A61B 2090/3937: Visible markers
    • A61B 2090/502: Supports for surgical instruments; headgear, e.g. helmet, spectacles
    • A61B 34/74: Manipulators with manual electric input means

Abstract

A robotic surgical system may be used to perform a surgical procedure. Providing guidance for the robotic surgical system includes integrating a Point of View (PoV) surgical drill with a camera to capture a PoV image of a surgical area of a subject patient, and displaying an image of the surgical area based on a viewing angle of the PoV surgical drill, thus enabling a surgeon to operate on the surgical area using the PoV surgical drill. The PoV surgical drill operates based on the surgeon's control of a guidance drill. The content of the images may change based on a change in the viewing angle of the PoV surgical drill.

Description

    FIELD OF THE DISCLOSURE
  • The present disclosure is generally related to a robotic surgical system in a Virtual Reality (VR) environment.
  • BACKGROUND
  • The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also correspond to implementations of the claimed technology.
  • Robotic surgical devices are now routinely used for numerous surgical procedures such as general surgery, pediatric surgery, and those related to the medical fields of gynecology, urology, cardiology, and otorhinolaryngology. Robotic devices continue to evolve and are being more frequently utilized in surgical procedures.
  • The robotic devices are used most in surgical procedures that require a high degree of accuracy and/or precision. Such robotic devices include autonomous, tele-operated, and interactive type robotic systems. Interactive robotic systems are most frequently used for providing the surgeon with direct hands-on control of the surgical procedure, thus achieving a high degree of accuracy and/or precision. For example, in a knee surgery, a surgeon can use an interactive robotic arm to sculpt a bone to receive a knee implant. In a laparoscopic surgical procedure, using a robotic system, a surgeon may directly control and manipulate tissue, albeit at some distance from the patient, through a fulcrum point in the abdominal wall.
  • In other surgical procedures performed using robotic devices, the surgeon may sit at a console in the operating room, but outside the sterile field, directing and controlling the movements of one or more robotic arms. However, robotic devices can be intrusive during a surgical procedure, blocking the surgeon's point of view and occupying substantial space around an operating table, increasing the likelihood of an operator error.
  • Instruments for robotic devices, like movement detection equipment and navigation markers, may be implemented in surgical procedures as safeguards. Such instruments help guide the robotic devices and assist the surgeons in avoiding errors. The movement detection equipment and the navigation markers help determine the position of the instrument in space and prevent the instrument from deviating beyond the path set by the surgeon. For example, in neurosurgery, neuromonitors with sensors are used to detect the threshold level, and a signal is sent to an appropriate system to stop insertion of the surgical instrument, or to move the instrument away to prevent damage, when an error is detected as imminent. Further, actuators are also used for controlled movement and positioning of end effectors in the robotic arm.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate various embodiments of systems, methods, and embodiments of various other aspects of the disclosure. Any person with ordinary skill in the art will appreciate that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one example of the boundaries. It may be that in some examples one element may be designed as multiple elements or that multiple elements may be designed as one element. In some examples, an element shown as an internal component of one element may be implemented as an external component in another, and vice versa. Furthermore, elements may not be drawn to scale. Non-limiting and non-exhaustive descriptions are provided with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles.
  • FIG. 1 shows a network connection diagram 100 for a system 102 to guide a robotic surgical system when performing a surgical procedure, according to an embodiment.
  • FIG. 2 shows a virtual grid used to determine locations of different instruments present in a Three-Dimensional (3D) space corresponding to an operating table, according to an embodiment.
  • FIG. 3A shows a front point of view (PoV) of a surgical drill, according to an embodiment.
  • FIG. 3B shows a cross-section PoV of a surgical drill along a Z-Z′ axis, according to an embodiment.
  • FIG. 4 shows a Virtual Reality (VR) drill set at a first angle, relative to images of the subject patient, according to an embodiment.
  • FIG. 5 shows the VR drill set at a second angle relative to the images of the subject patient, according to an embodiment.
  • DETAILED DESCRIPTION
  • Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words “comprising,” “having,” “containing,” and “including,” and other forms thereof, are intended to be equivalent in meaning and be open-ended, in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
  • It must also be noted that, as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the preferred systems and methods are now described.
  • Embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings in which like numerals represent like elements throughout the several figures, and in which example embodiments are shown. Embodiments of the claims may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. The examples set forth herein are non-limiting examples and are merely examples among other possible examples.
  • FIG. 1 shows a network connection diagram 100 for a system 102 to guide a robotic surgical system when performing surgery, according to an embodiment. The system 102 may be connected to a communication network 104. The communication network 104 may further be connected to an image database 106 and a position database 108 to facilitate data transfer between the databases and the system 102.
  • The image database 106 may store images of a subject patient, as well as images of previous patients who have undergone similar surgeries. The images may be captured using X-ray, ultrasound, and Magnetic Resonance Imaging (MRI). Further, the images may be present in raw form, as Three-Dimensional (3D) models, Augmented Reality (AR) images, Virtual Reality (VR) images, and Point of View (PoV) images. The position database 108 may store real-time position information of a PoV surgical drill 122 and that of a virtual drill that may be shown to a surgeon during surgery.
  • The communication network 104 may be a wired and/or a wireless network. The communication network 104, if wireless, may be implemented using communication techniques such as Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE™), Wireless Local Area Network (WLAN), Infrared (IR) communication, Public Switched Telephone Network (PSTN), radio waves, and other communication techniques known in the art.
  • The system 102 may further include a processor 110, interface(s) 112, and a memory 114. The processor 110 may execute an algorithm stored in the memory 114 for processing the PoV images and for guiding the robotic surgical system when performing a surgical procedure. The processor 110 may also be configured to decode and execute any instructions received from one or more other electronic devices or server(s).
  • In at least one embodiment, the processor 110 may include one or more general purpose processors (e.g., INTEL® or Advanced Micro Devices® (AMD) microprocessors) and/or one or more special purpose processors (e.g., digital signal processors or Xilinx System On Chip (SOC) Field Programmable Gate Array (FPGA) processor). The processor 110 may be configured to execute one or more computer-readable program instructions, such as program instructions to carry out any of the functions described in this description.
  • The interface(s) 112 may facilitate interaction between a surgeon and the system 102. The interface(s) 112 may accept an input from the surgeon or other user who is associated with an on-going surgery and/or provide an output to the surgeon or other user. The interface(s) 112 may either be a Command Line Interface (CLI), Graphical User Interface (GUI), or a voice interface.
  • The memory 114 may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, Compact Disc Read-Only Memories (CD-ROMs), magneto-optical disks, and semiconductor memories such as ROMs, Random Access Memories (RAMs), Programmable Read-Only Memories (PROMs), Erasable PROMs (EPROMs), Electrically Erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other types of media/machine-readable medium suitable for storing electronic instructions.
  • FIG. 1 also shows a user device 116, AR/VR display 118, guidance drill 120, and a Point of View (PoV) surgical drill 122 that are all connected to the communication network 104. The surgeon may maneuver the guidance drill 120 to remotely control the actual PoV surgical drill 122. Locations and motions of the guidance drill 120 may be tracked at all times by an operating table to surgical drill reference system (hereafter “reference system”) 124 and a reference holder system 126. The reference holder system 126 and the reference system 124 may control the PoV surgical drill 122 based on the tracked locations and motions thereof.
  • The user device 116 is shown as a tablet in FIG. 1; however, other user devices having Graphical User Interfaces (GUIs) may also be used. Other implementations of the user device 116 may include, but are not limited to, a smartphone, phablet, laptop, and desktop.
  • FIG. 1 also shows the AR/VR display 118 as VR glasses in the present case, although the example embodiments are not so limited.
  • FIG. 2 shows a virtual grid 200 used by the reference system 124 to determine locations of different instruments present in a Three-Dimensional (3D) space, according to an embodiment. The virtual grid 200 is shown in relation to an operating table 202. A marker 204 for AR reference is used on the virtual grid 200 to help an AR imaging system determine a location of the AR reference. Based on the identified location of the AR marker 204, the AR imaging system may identify a location of a drill bit reference 206. Using the location of the drill bit reference 206, the AR/VR display 118 may identify a location of a virtual drill in the 3D space. The virtual drill may be shown to the surgeon, using the AR/VR display 118, during an actual surgical procedure. A minimal sketch of this localization chain follows.
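  • The sketch below illustrates one way the marker-to-bit chain could be computed, assuming the AR imaging system reports the marker pose as a 4×4 homogeneous transform in the operating-table frame and the drill bit reference as an offset in the marker frame. The function and variable names are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch of the FIG. 2 localization chain (marker 204 -> bit reference 206).
import numpy as np

def locate_drill_bit(T_table_marker: np.ndarray, p_marker_bit: np.ndarray) -> np.ndarray:
    """Return the drill bit reference position in the operating-table XYZ frame.

    T_table_marker: 4x4 pose of AR marker 204 in the table frame (virtual grid 200).
    p_marker_bit:   3-vector offset of drill bit reference 206 in the marker frame.
    """
    p_h = np.append(p_marker_bit, 1.0)   # homogeneous coordinates
    return (T_table_marker @ p_h)[:3]    # table-frame XYZ of reference 206

# Example: marker 0.5 m along the table X axis, bit reference 10 cm above it.
T = np.eye(4)
T[0, 3] = 0.5
print(locate_drill_bit(T, np.array([0.0, 0.0, 0.10])))  # -> [0.5 0.  0.1]
```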
  • In at least one embodiment, a drill bit may be placed in an opening 208 of a drill holder 210 of the PoV surgical drill 122. Once the drill bit is placed in the drill holder 210, a module 212 connected to the drill holder 210 may identify parameters of the PoV surgical drill 122, including a type of the surgical drill, a type of the drill bit, a size of the drill bit, and an absolute position of a tip of the drill bit in an XYZ coordinate system referencing the operating table 202.
  • In at least one embodiment, the module 212 may further comprise a surgical drill reader configured to read a serial number present on the drill bit. The serial number may be related to the PoV surgical drill 122 and/or the drill bit thereof. Serial numbers respectively corresponding to different drill bits and different categories of surgical drills may be stored in a memory corresponding to the module 212. The received serial number may be matched with the serial numbers stored in the memory to identify details related to the PoV surgical drill 122 and the drill bit. In at least one example, the surgical drill reader may be implemented as a Near Field Communication (NFC) reader, and an NFC-encoded chip may be attached to the drill bit. The NFC reader may therefore communicate with the NFC-encoded chip to receive the serial number of the drill bit. A sketch of such a lookup appears below.
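  • A minimal sketch of the serial-number matching described above, assuming the NFC reader yields the serial as a string; the table contents and field names are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical serial-number table for module 212; real entries would be provisioned separately.
KNOWN_BITS = {
    "BIT-0412": {"drill_type": "pedicle", "bit_type": "twist", "bit_size_mm": 3.2},
    "BIT-0977": {"drill_type": "cranial", "bit_type": "burr", "bit_size_mm": 4.0},
}

def identify_drill_bit(serial: str) -> dict:
    """Match a serial read over NFC against the records stored in the module's memory."""
    record = KNOWN_BITS.get(serial)
    if record is None:
        raise LookupError(f"Unknown drill bit serial: {serial}")
    return record
```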
  • In at least one embodiment, the module 212 may identify and/or determine the drill bit being cradled, reference the position of the drill bit with the virtual grid 200, identify the surgical drill and the drill bit, convert the surgical drill identification to an associated virtual surgical icon, and convert the drill bit identification to an associated virtual surgical drill bit icon. The module 212 may further transmit the virtual surgical icon and the virtual drill bit icon, referenced to the XYZ coordinate system of the operating table 202, to an AR imaging system and to the reference holder system 126.
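  • Putting the two prior sketches together, the module 212 pipeline might look like the following, where identify_drill_bit() and locate_drill_bit() come from the sketches above and transmit stands in for whatever link carries data to the AR imaging system and the reference holder system 126. The icon identifiers are invented for illustration.

```python
# Hypothetical end-to-end pipeline for module 212: identify, localize, convert to icons, transmit.
def publish_virtual_icons(serial, T_table_marker, p_marker_bit, transmit):
    record = identify_drill_bit(serial)                   # which drill/bit is cradled
    xyz = locate_drill_bit(T_table_marker, p_marker_bit)  # position on virtual grid 200
    payload = {
        "drill_icon": f"icon/{record['drill_type']}-drill",             # invented icon ids
        "bit_icon": f"icon/{record['bit_type']}-{record['bit_size_mm']}mm",
        "xyz_table": xyz.tolist(),                        # XYZ in the table frame
    }
    transmit(payload)                                     # to AR imaging system + holder 126
    return payload
```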
  • In at least one other embodiment, the reference holder system 126, shown and described with regard to FIG. 1, may be an integral unit of the PoV surgical drill 122, configured to identify and store, in real-time, XYZ coordinates of the PoV surgical drill 122 and drill bit tip, and an angle of the PoV surgical drill 122. The reference holder system 126 may include an accelerometer to detect changes in position of the PoV surgical drill 122, e.g., a three-axis accelerometer. In at least one other example, the data recorded by the reference holder system 126 may be transmitted to the AR imaging system. The data may include the virtual icon of the PoV surgical drill 122 and the virtual icon of the drill bit along with their real positions relative to the operating table 202.
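  • The disclosure does not spell out how accelerometer samples become position updates; the sketch below shows the simplest possibility, double integration of gravity-compensated acceleration. Pure double integration drifts quickly, so a real tracker would fuse additional references (e.g., the AR marker); treat this strictly as an illustration with invented names.

```python
# Hypothetical position update for reference holder system 126 from a three-axis accelerometer.
import numpy as np

class ReferenceHolder:
    def __init__(self, xyz0):
        self.xyz = np.asarray(xyz0, dtype=float)  # drill tip position, table frame (m)
        self.vel = np.zeros(3)                    # velocity estimate (m/s)

    def on_accel_sample(self, accel, dt):
        """Integrate one gravity-compensated accelerometer sample over dt seconds."""
        self.vel += np.asarray(accel, dtype=float) * dt
        self.xyz += self.vel * dt
        return self.xyz
```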
  • FIG. 3A shows a front view of the PoV surgical drill 122; and FIG. 3B shows a cross-section of the PoV surgical drill 122 along a Z-Z′ axis, in accordance with at least one example embodiment. Drill bit 302 is centrally located, length-wise, on the PoV surgical drill 122. A Light Amplification by Stimulated Emission of Radiation (LASER) device 304 is disposed on the periphery of a block 306 surrounding the drill bit 302. The LASER device 304 may guide a surgeon in a direction in which the PoV surgical drill 122 is moving. Further, cameras 308, 310, and 312 are shown on the periphery of the block surrounding the drill bit 302, although the cameras are not so limited in quantity. At least one of the cameras 308, 310, and 312 may capture a PoV image of a surgical area of a subject patient.
  • In at least one embodiment, images captured by any one or more of the cameras 308, 310, and 312 may be integrated to produce one composite PoV image using known image processing tools and techniques. In at least one example, the composite PoV image may be cropped in a circle and centered with regard to the drill bit 302 based on default settings stored by the surgeon. The cropped image may then be sent to the reference holder system 126.
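  • A minimal sketch of the circular crop, assuming the composite is an H×W×3 pixel array and the drill bit 302 projects to pixel (cx, cy); the radius would come from the surgeon's stored default settings. Names are illustrative only.

```python
# Hypothetical circular crop of the composite PoV image, centred on the drill bit pixel.
import numpy as np

def crop_circle(composite: np.ndarray, cx: int, cy: int, radius: int) -> np.ndarray:
    """Zero out every pixel outside a circle of the given radius around (cx, cy)."""
    h, w = composite.shape[:2]
    yy, xx = np.ogrid[:h, :w]                              # coordinate grids
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2  # inside-circle mask
    out = np.zeros_like(composite)
    out[mask] = composite[mask]
    return out
```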
  • In at least one embodiment, while performing a surgical procedure on the subject patient, the surgeon may maneuver the guidance drill 120 to control the PoV surgical drill 122 based on the PoV images seen on the AR/VR display 118. As set forth above, the PoV images may be collected using one or more of the cameras 308, 310, and 312 positioned on the head of the PoV surgical drill 122. Thus, content of the PoV images may change based on an orientation and direction faced by the PoV surgical drill 122.
  • In at least one embodiment, the processor 110 may synchronize the position of the PoV surgical drill 122 with a reference linked with augmented images shown on the AR display device 118. Based on such synchronization, a Virtual Reality (VR) drill may be shown to the surgeon on the augmented images displayed using the AR display device 118. The VR drill may move based on changes in position of the PoV surgical drill 122 as the surgeon manipulates the guidance drill 120. Thus, such synchronization of the VR drill and the PoV surgical drill 122 provides a realistic experience to the surgeon. Further, operating room cameras may also be used for capturing images of the surgical procedure from a fixed angle, as set by the surgeon or based on a positioning of the operating room cameras. Such images may be stored in the image database 106, and may be displayed to the surgeon using an image display, e.g., AR display device 118.
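  • A minimal sketch of that synchronization step, assuming both the tracked drill pose and the display reference are available as 4×4 homogeneous transforms; the frame names are assumptions, not the patent's terminology.

```python
# Hypothetical frame mapping: tracked drill pose (table frame) -> AR display frame.
import numpy as np

def vr_drill_pose(T_display_table: np.ndarray, T_table_drill: np.ndarray) -> np.ndarray:
    """Return the pose at which the VR drill overlay should be rendered."""
    return T_display_table @ T_table_drill

# On each tracking update, re-rendering the VR drill 402 at the returned pose keeps it
# aligned with the PoV surgical drill 122 as the surgeon moves the guidance drill 120.
```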
  • FIG. 4 shows the VR drill 402 set at a first angle, relative to images of the subject patient, according to an embodiment. The VR drill 402 is displayed, via a GUI, to the surgeon and other users associated with the surgery; and the VR drill 402 replicates the position and the direction of the PoV surgical drill 122. FIG. 4 also shows a LASER marker 404, a PoV image region 406, and a line of sight 408 of the VR drill 402 (replicating a line of sight of the PoV surgical drill 122). The surgeon may be able to see an actual image 410 of, e.g., the spine of the patient, an AR image 412 of the spine, and a VR image 414 of the spine. The images may be seen based on the surgeon's preferences, e.g., the images may be overlaid over each other, or shown in parallel in a side-by-side arrangement. A highlighted section 416 is also shown as the PoV image captured by one or more of the cameras 308, 310, and 312 of the PoV surgical drill 122, in accordance with at least one example.
  • FIG. 5 shows the VR drill 402 set at a second angle relative to the images of the subject patient, according to an embodiment. FIG. 5 also shows a LASER marker 504, a PoV image region 506, and a line of sight 508 of the VR drill 402 (replicating a line of sight of the PoV surgical drill 122). The surgeon or other user associated with the surgery may be able to see the actual image 410 of, e.g., the spine of the patient, the AR image 412 of the spine, and the VR image 414 of the spine. The images may be seen based on the surgeon's preferences. A highlighted section 516 is also shown as the PoV image captured by at least one of the cameras 308, 310, and 312 of the PoV surgical drill 122, in accordance with at least one example. As evident from a comparison of FIG. 4 and FIG. 5, the position and the direction of the VR drill have changed and, thus, content of the PoV images 416 and 516 captured by the cameras of the PoV surgical drill is different. In this way, the surgeon may leverage different points of view to improve accuracy and to reduce errors while performing the surgical procedure.
  • In an illustrative embodiment, any of the operations, processes, etc. described herein can be implemented as computer-readable instructions stored on a computer-readable medium. The computer-readable instructions can be executed by a processor of a mobile unit, a network element, and/or any other computing device.
  • There is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. There are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.
  • The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
  • Those skilled in the art will recognize that it is common within the art to describe devices and/or processes in the fashion set forth herein, and thereafter use engineering practices to integrate such described devices and/or processes into data processing systems. That is, at least a portion of the devices and/or processes described herein can be integrated into a data processing system via a reasonable amount of experimentation. Those having skill in the art will recognize that a typical data processing system generally includes one or more of a system unit housing, a video display device, a memory such as volatile and non-volatile memory, processors such as microprocessors and digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices, such as a touch pad or screen, and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A typical data processing system may be implemented utilizing any suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
  • The herein described subject matter sometimes shows different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
  • From the foregoing, it will be appreciated that various embodiments of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made without departing from the scope and spirit of the present disclosure. Accordingly, the various embodiments disclosed herein are not intended to be limiting.

Claims (17)

What is claimed is:
1. A method of guiding a robotic endoscope, the method comprising:
identifying a region of interest of an affected body part of a subject patient;
creating distance-based rules for the region of interest for generating alerts during a surgical procedure;
endoscope image referencing by displaying real images of the subject patient captured by the robotic endoscope along with augmented reality (AR) anatomical images of the region of interest;
wherein after the surgical procedure including inserting the robotic endoscope into the subject patient has been performed, guiding the robotic endoscope based on the captured real images and the AR anatomical images.
2. The method of claim 1, wherein performing the endoscope image referencing includes displaying images taken by the robotic endoscope with the AR anatomical images.
3. The method of claim 2, wherein the distance-based rules are created based on at least one of the AR anatomical image, the real images from the robotic endoscope, or the image of the robotic endoscope.
4. The method of claim 1, wherein the images of the robotic endoscope are provided by an ultrasound device.
5. The method of claim 1,
wherein a plurality of cameras is provided around a periphery of a head of the robotic endoscope, and
wherein a field of view of the plurality of cameras includes a point of view (PoV) image region.
6. The method of claim 5, further comprising:
displaying the PoV image region of the plurality of cameras with the AR anatomical images.
7. The method of claim 6, wherein the displaying of the field of view of the plurality of cameras includes:
displaying the field of view of the plurality of cameras in a subwindow of the AR anatomical images.
8. The method of claim 6, wherein the displaying of the field of view of the plurality of cameras includes:
automatically changing a size of a viewing window based on a distance of the robotic endoscope from one or more internal organs.
9. The method of claim 1, wherein the guiding of the robotic endoscope is based on at least one of a point of view (PoV) of the robotic endoscope, a location of the robotic endoscope, or a distance of the robotic endoscope from one or more internal organs.
10. The method of claim 1, wherein the distance-based rules include using a distance between one or more internal organs.
11. A non-transitory computer-readable medium having executable instructions stored thereon that, when executed, cause one or more processors to:
receive an identification of a region of interest of an affected body part of a subject patient;
create distance-based rules for the region of interest for generating alerts during a surgical procedure;
perform endoscope image referencing by displaying captured real images of the subject patient from a robotic endoscope with augmented reality (AR) anatomical images of the region of interest; and
after the surgical procedure including inserting the robotic endoscope into the subject patient has been performed, guide the robotic endoscope based on the captured real images and the AR anatomical images.
12. A system for a robotic endoscope, the system comprising:
a robotic endoscope;
a measurement recognition module to receive an identification of a region of interest of an affected body part of a subject patient, wherein the measurement recognition module is configured to:
create distance-based rules for the region of interest for generating alerts during a surgical procedure, and
perform endoscope image referencing by displaying captured real images of the subject patient from the robotic endoscope with augmented reality (AR) anatomical images of the region of interest; and
an endoscope control system configured to guide the robotic endoscope based on the captured real images and the AR anatomical images.
13. The system of claim 12, wherein the measurement recognition module is further configured to:
display images of the robotic endoscope with the AR anatomical images.
14. The system of claim 13, wherein the distance-based rules are created based on at least one of the AR anatomical images, the real images from the robotic endoscope, or the images of the robotic endoscope.
15. The system of claim 12, wherein a plurality of cameras is provided around a periphery of a head of the robotic endoscope, wherein a field of view of the plurality of cameras includes a point of view (PoV) image region.
16. The system of claim 15, wherein the measurement recognition module is further configured to:
display the PoV image region of the plurality of cameras with the AR anatomical images.
17. The system of claim 16, wherein the displaying of the field of view of the plurality of cameras includes at least one of:
displaying the field of view of the plurality of cameras in a subwindow of the AR anatomical images; or
automatically changing a size of a viewing window based on a distance of the robotic endoscope from one or more internal organs.
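For illustration only: claims 8 and 17 recite automatically changing the size of a viewing window based on the distance of the robotic endoscope from one or more internal organs. The sketch below shows one plausible mapping, a clamped linear interpolation between a near and a far distance; the function name and all constants are assumptions rather than disclosed values.

def subwindow_size_px(distance_mm, near_mm=5.0, far_mm=100.0,
                      min_px=240, max_px=960):
    # Clamp the measured distance to [near_mm, far_mm], then interpolate:
    # the PoV subwindow is largest (max_px) when the endoscope is close to
    # an organ and smallest (min_px) when it is far away.
    d = min(max(distance_mm, near_mm), far_mm)
    t = (far_mm - d) / (far_mm - near_mm)   # 1.0 when near, 0.0 when far
    return round(min_px + t * (max_px - min_px))

for d in (5.0, 30.0, 100.0):
    print(d, "mm ->", subwindow_size_px(d), "px")
# prints: 5.0 mm -> 960 px, 30.0 mm -> 771 px, 100.0 mm -> 240 px

A larger subwindow at close range would give the surgeon more point-of-view detail precisely when the distance-based rules of claim 1 are most likely to trigger alerts.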
US18/174,287 2017-06-29 2023-02-24 Guiding a robotic surgical system to perform a surgical procedure Pending US20230225810A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/174,287 US20230225810A1 (en) 2017-06-29 2023-02-24 Guiding a robotic surgical system to perform a surgical procedure

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201762526439P 2017-06-29 2017-06-29
US201762528102P 2017-07-02 2017-07-02
US201762529019P 2017-07-06 2017-07-06
US16/023,014 US11589933B2 (en) 2017-06-29 2018-06-29 Guiding a robotic surgical system to perform a surgical procedure
US18/174,287 US20230225810A1 (en) 2017-06-29 2023-02-24 Guiding a robotic surgical system to perform a surgical procedure

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/023,014 Continuation US11589933B2 (en) 2017-06-29 2018-06-29 Guiding a robotic surgical system to perform a surgical procedure

Publications (1)

Publication Number Publication Date
US20230225810A1 (en) 2023-07-20

Family

ID=64735117

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/023,014 Active 2040-07-25 US11589933B2 (en) 2017-06-29 2018-06-29 Guiding a robotic surgical system to perform a surgical procedure
US18/174,287 Pending US20230225810A1 (en) 2017-06-29 2023-02-24 Guiding a robotic surgical system to perform a surgical procedure

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/023,014 Active 2040-07-25 US11589933B2 (en) 2017-06-29 2018-06-29 Guiding a robotic surgical system to perform a surgical procedure

Country Status (1)

Country Link
US (2) US11589933B2 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10991070B2 (en) 2015-12-18 2021-04-27 OrthoGrid Systems, Inc Method of providing surgical guidance
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US11540794B2 2018-09-12 2023-01-03 Orthogrid Systems Holdings, LLC Artificial intelligence intra-operative surgical guidance system and method of use
US11589928B2 2018-09-12 2023-02-28 Orthogrid Systems Holdings, LLC Artificial intelligence intra-operative surgical guidance system and method of use
CN109700550B (en) * 2019-01-22 2020-06-26 雅客智慧(北京)科技有限公司 Augmented reality method and device for dental surgery
CN109758231A * 2019-03-05 2019-05-17 钟文昭 Mixed reality-based intrathoracic surgery navigation method and system
CN110200676B * 2019-06-05 2020-08-28 解涛 VR technology-based cranial drilling device for neurosurgery
KR102362149B1 (en) * 2019-12-06 2022-02-10 서울대학교산학협력단 Augmented reality tool for performing implant operation and method for visualizing information thereof
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11607277B2 (en) 2020-04-29 2023-03-21 Globus Medical, Inc. Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11510733B1 (en) * 2021-12-10 2022-11-29 Ix Innovation Llc Placement of surgical implants
US11918296B1 (en) 2022-08-26 2024-03-05 Ix Innovation Llc Digital image analysis for in vivo robotic assembly of multi-component implants

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090036902A1 (en) * 2006-06-06 2009-02-05 Intuitive Surgical, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20090248036A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Controlling a robotic surgical tool with a display monitor
US20110238079A1 (en) * 2010-03-18 2011-09-29 SPI Surgical, Inc. Surgical Cockpit Comprising Multisensory and Multimodal Interfaces for Robotic Surgery and Methods Related Thereto
US20130046523A1 (en) * 2009-08-18 2013-02-21 Paul Van Dinther Endoscope Simulator
US20170172662A1 (en) * 2014-03-28 2017-06-22 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030179308A1 (en) * 2002-03-19 2003-09-25 Lucia Zamorano Augmented tracking using video, computed data and/or sensing technologies
US9469034B2 (en) * 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
CN103764061B * 2011-06-27 2017-03-08 内布拉斯加大学评议会 On-tool tracking system and computer-aided surgery method
US10507066B2 (en) * 2013-02-15 2019-12-17 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
EP3753518A1 (en) * 2014-09-04 2020-12-23 Memic Innovative Surgery Ltd. Control of device including mechanical arms
EP3273854B1 (en) * 2015-03-26 2021-09-22 Universidade de Coimbra Systems for computer-aided surgery using intra-operative video acquired by a free moving camera

Also Published As

Publication number Publication date
US20190000570A1 (en) 2019-01-03
US11589933B2 (en) 2023-02-28

Similar Documents

Publication Publication Date Title
US20230225810A1 (en) Guiding a robotic surgical system to perform a surgical procedure
US11717376B2 (en) System and method for dynamic validation, correction of registration misalignment for surgical navigation between the real and virtual images
CN110169822B (en) Augmented reality navigation system for use with robotic surgical system and method of use thereof
US11514576B2 (en) Surgical system with combination of sensor-based navigation and endoscopy
US11666385B2 (en) Systems and methods for augmented reality guidance
US11278359B2 (en) Graphical user interface for use in a surgical navigation system with a robot arm
US10925676B2 (en) Method, system and apparatus for controlling a surgical navigation system
JP7233841B2 (en) Robotic Navigation for Robotic Surgical Systems
US20240033014A1 (en) Guidance for placement of surgical ports
JP2024014885A (en) Teleoperated surgical system with scanning-based positioning
US9918798B2 (en) Accurate three-dimensional instrument positioning
WO2019181632A1 (en) Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system
US20210290317A1 (en) Systems and methods for tracking a position of a robotically-manipulated surgical instrument
KR20200097747A (en) Systems and methods that support visualization during surgery
JP2016503676A (en) Positioning and navigation using 3D tracking sensors
CN110494096B (en) Method and system for determining one or more points on a surgical path
US10154882B2 (en) Global laparoscopy positioning systems and methods
US20230210627A1 (en) Three-dimensional instrument pose estimation
WO2023161848A1 (en) Three-dimensional reconstruction of an instrument and procedure site
WO2023018685A1 (en) Systems and methods for a differentiated interaction environment
WO2023126754A1 (en) Three-dimensional model reconstruction
WO2018154601A1 (en) Multi-camera imaging and visualization system for minimally invasive surgery

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED