US20160038253A1 - Method, system and apparatus for controlling a surgical navigation system - Google Patents


Info

Publication number
US20160038253A1
Authority
US
United States
Prior art keywords
processor
identifier
display
computing device
output data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/775,192
Inventor
Cameron Anthony Piron
Michael Frank Gunter WOOD
Gal Sela
Joshua Lee RICHMOND
Murugathas Yuwaraj
Stephen B.E. MCFADYEN
Kelly Noel Dyer
Monroe M. Thomas
Wes Hodges
Simon Alexander
David Gallop
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Synaptive Medical Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US14/775,192
Assigned to SYNAPTIVE MEDICAL (BARBADOS) INC. (assignment of assignors interest). Assignors: ALEXANDER, SIMON; DYER, KELLY; GALLOP, DAVID; HODGES, WES; MCFADYEN, STEPHEN; PIRON, CAMERON; RICHMOND, JOSHUA; SELA, GAL; THOMAS, MONROE M.; WOOD, MICHAEL; YUWARAJ, MURUGATHAS
Publication of US20160038253A1
Assigned to Synaptive Medical Inc. (assignment of assignors interest). Assignors: SYNAPTIVE MEDICAL (BARBADOS) INC.
Assigned to ESPRESSO CAPITAL LTD. (security interest). Assignors: Synaptive Medical Inc.


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B19/5244
    • A61B19/2203
    • A61B19/56
    • A61B34/25 User interfaces for surgical systems
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 Identification means for patients or instruments, e.g. tags
    • A61B90/98 Identification means for patients or instruments using electromagnetic means, e.g. transponders
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00207 Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2059 Mechanical position encoders
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/2072 Reference field transducer attached to an instrument or patient
    • A61B2034/2074 Interface software
    • A61B2034/256 User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
    • A61B90/10 Instruments, implements or accessories for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B2090/103 Cranial plugs for access to brain
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04 Constructional details of apparatus
    • A61B2560/0475 Special features of memory means, e.g. removable memory cards
    • A61B34/30 Surgical robots
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms

Definitions

  • the specification relates generally to navigation systems, and specifically to a method, system and apparatus for navigation systems for use in image guided medical procedures.
  • An aspect of the specification provides a method of controlling a surgical navigation system, comprising: receiving, at a processor, an identifier of a surgical instrument within a field of view of a tracking system; generating, at the processor, output data based on the identifier of the surgical instrument; and transmitting the output data to an output device connected to the processor, for controlling the output device.
  • Further aspects of the specification include a computing device configured to perform the above method, and a non-transitory computer-readable medium storing a plurality of computer readable instructions executable by a processor for implementing the above method.
  • FIG. 1 depicts a navigation system in use in a surgical procedure, according to a non-limiting embodiment.
  • FIG. 2 depicts a schematic diagram of the navigation system of FIG. 1, according to a non-limiting embodiment.
  • FIG. 3 depicts a schematic diagram of the navigation system of FIG. 1, according to another non-limiting embodiment.
  • FIG. 4 depicts a computing device of the system of FIG. 1, according to a non-limiting embodiment.
  • FIG. 5 depicts a method of controlling a surgical navigation system, according to a non-limiting embodiment.
  • FIG. 6 depicts an example performance of block 505 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 7 depicts instrument definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment.
  • FIG. 8 depicts gesture definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment.
  • FIG. 9 depicts output control rule definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment.
  • FIG. 10 depicts an example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 11 depicts another example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 12 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 13 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 14 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIGS. 15A, 15B and 15C depict further example performances of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • FIG. 1 depicts a surgeon 104 conducting a minimally invasive port-based surgical procedure on a patient 108 in an operating room (OR) environment.
  • the surgical procedure is supported by a navigation system 112 , including a computing device connected to a variety of input devices (e.g. a tracking sensor such as a camera, a keyboard and mouse and the like) and controlling a variety of output devices (e.g. a display, illumination equipment and the like).
  • System 112 also includes a variety of surgical instruments, whose motions may be tracked by system 112 .
  • An assistant or operator 116 can also be present, and both surgeon 104 and assistant 116 can operate system 112 .
  • system 112 is configured to control the output devices based on input from a variety of sources, including not only the above mentioned input devices, but also the tracked surgical instruments that are manipulated by surgeon 104 during the procedure.
  • system 112 includes an equipment tower 200 supporting a computing device 204 , along with other equipment.
  • Equipment tower 200 is mounted on a rack, cart, or the like, and may also support a power supply for the remaining components of system 112 .
  • Computing device 204 is connected to output devices including a display, such as displays 208 and 212 , and a robotic arm 216 .
  • Each of displays 208 and 212 can be based on any suitable display technology.
  • display 208 can be a flat panel display comprising any one of, or any suitable combination of, a Liquid Crystal Display (LCD), a plasma display, an Organic Light Emitting Diode (OLED) display, and the like.
  • Tracking camera 224 may be configured to receive visible light, IR, or both.
  • tracking camera 224 is discussed herein as an example tracking sensor, it is to be understood that other tracking sensors may also be used instead of, or in addition to, tracking camera 224 .
  • any references to tracking camera 224 below may also refer, in other embodiments, to any of a variety of suitable tracking systems known to those skilled in the art.
  • Minimally invasive brain surgery using access ports is a recently conceived method of performing surgery on brain tumors previously considered inoperable. Such minimally invasive procedures are performed through a relatively small opening in a patient's skull.
  • system 112 also includes an access port 228 for insertion through the skull of patient 108 —which is immobilized by a holder 230 —and into the brain of patient 108 .
  • An introducer 234 with an atraumatic tip (for reducing damage to brain tissue during the insertion of access port 228 ) is inserted into access port 228 , and access port 228 and introducer 234 together are inserted into the skull of patient 108 .
  • Introducer 234 includes fiduciary markers 236 such as IR-reflecting markers, that are detectable by tracking camera 224 .
  • tracking camera 224 can emit infrared light, which is reflected by markers 236 and permits tracking camera 224 (which is sensitive to IR radiation) to capture images from which markers 236 can readily be isolated.
  • robotic arm 216 and other instrumentation can also carry fiduciary markers.
  • Camera 224 in conjunction with computing device 204 can determine the spatial positions of markers 236 using conventional motion tracking algorithms. Computing device 204 is therefore configured to track the position of markers 236 (and by extension, the position of introducer 234 ) as introducer 234 is moved within the field of view of tracking camera 224 .
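The tracking described above can be sketched minimally as follows. The patent does not specify an algorithm; this illustration simply estimates an instrument's position as the centroid of its tracked markers and computes its displacement between two frames (the coordinates, units and function names are assumptions, not part of the disclosure):

```python
from statistics import mean

def instrument_position(marker_points):
    """Estimate an instrument's position as the centroid of its
    tracked fiduciary markers (each a 3-D point from the tracker)."""
    xs = [p[0] for p in marker_points]
    ys = [p[1] for p in marker_points]
    zs = [p[2] for p in marker_points]
    return (mean(xs), mean(ys), mean(zs))

# Two frames of tracked marker positions for one instrument (mm).
frame1 = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
frame2 = [(1.0, 0.0, 0.0), (11.0, 0.0, 0.0), (1.0, 10.0, 0.0)]

p1 = instrument_position(frame1)
p2 = instrument_position(frame2)
displacement = tuple(b - a for a, b in zip(p1, p2))
```

A real tracker would instead fit a rigid-body transform to the marker set, but the centroid suffices to convey how marker positions translate into an instrument position over time.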
  • the spatial position of patient 108 's skull was previously determined and stored by computing device 204 .
  • markers 236 allow computing device 204 to track not only introducer 234 , but also access port 228 itself, even if access port 228 does not carry any markers.
  • the tracked position of introducer 234 relative to the known position of the skull of patient 108 can be presented on one or both of displays 208 and 212 .
  • Various views e.g. axial, sagittal, coronal, perpendicular to tool tip, in-plane of tool shaft, and the like) of the relative positions of introducer 234 , access port 228 and the skull can be presented on displays 208 and 212 .
  • introducer 234 may be removed from access port 228 to allow access to the brain tissue through a central opening of access port 228 .
  • access port 228 does not carry any fiduciary markers, and therefore may not be directly trackable after the removal of introducer 234 .
  • other surgical instruments carrying markers can be used to indirectly track access port 228 .
  • access port 228 itself can carry fiduciary markers 236 .
  • System 112 can also include an articulated arm 238 anchored at one end to holder 230 , and having at an opposite end a clamp for engaging access port 228 .
  • Arm 238 may be employed to fix the position of access port 228 after insertion.
  • Arm 238 may also have locked and unlocked positions, such that in the locked position access port 228 is not permitted to move, while in the unlocked position movement (at least in certain axes) by access port 228 is permitted.
  • In FIG. 3 , another depiction of system 112 is illustrated, in which only display 208 is included. Additional surgical instruments 300 are also shown (such as a probing instrument and a suction instrument, for example), each carrying fiduciary markers 236 . Further, as mentioned above, scope 220 also carries markers 236 in FIG. 3 .
  • computing device 204 can control the output devices of system 112 based on those tracked movements.
  • the control of output devices need not be based only on tracked movements—output control can also be based on other contextual data, including the specific identity of the tracked instruments, as well as surgical planning data.
  • the surgical planning data can include an identifier of the current phase or stage of the surgical procedure, which can be determined at computing device 204 either via receipt of an input from an operator (e.g. surgeon 104 ), or by other triggers automatically detected by computing device 204 .
  • Those triggers can include detection of a tip of access port 228 traversing the outer boundary of the skull, indicating that cannulation is occurring.
  • displays 208 and 212 can be controlled to present various selectable interface elements (including menus) based on instrument identities and movements. The components and operation of computing device 204 will now be discussed in greater detail.
  • Computing device 204 includes a central processing unit (also referred to as a microprocessor or simply a processor) 400 interconnected with a non-transitory computer readable storage medium such as a memory 404 .
  • processor 400 and memory 404 generally comprise one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided).
  • Memory 404 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory.
  • memory 404 includes both a volatile memory and a non-volatile memory.
  • Other types of non-transitory computer readable storage medium are also contemplated, such as compact discs (CD-ROM, CD-RW) and digital video discs (DVD).
  • Computing device 204 also includes a network interface 408 interconnected with processor 400 .
  • Network interface 408 allows computing device 204 to communicate with other computing devices via a network (e.g. a local area network (LAN), a wide area network (WAN) or any suitable combination thereof).
  • Network interface 408 thus includes any necessary hardware for communicating over such networks.
  • Computing device 204 also includes an input/output interface 412 , including the necessary hardware for interconnecting processor 400 with various input and output devices.
  • Interface 412 can include, among other components, a Universal Serial Bus (USB) port, an audio port for sending and receiving audio data, a Video Graphics Array (VGA), Digital Visual Interface (DVI) or other port for sending and receiving display data, and any other suitable components.
  • computing device 204 is connected to input devices including a keyboard and mouse 416 , a microphone 420 , as well as scope 220 and tracking camera 224 , mentioned above. Also via interface 412 , computing device 204 is connected to output devices including illumination or projection components (e.g. lights, projectors and the like), as well as display 208 and robotic arm 216 mentioned above. Other input (e.g. touch screens) and output devices (e.g. speakers) will also occur to those skilled in the art.
  • Computing device 204 stores, in memory 404 , an interface management application 428 (also referred to herein as application 428 ) comprising a plurality of computer readable instructions executable by processor 400 .
  • When processor 400 executes the instructions of application 428 (or, indeed, any other application stored in memory 404 ), processor 400 performs various functions implemented by those instructions, as will be discussed below.
  • Processor 400 , or computing device 204 more generally, is therefore said to be “configured” to perform those functions via the execution of application 428 .
  • Patient data 432 includes a surgical plan defining the various steps of the minimally invasive surgical procedure, as well as image data relating to patient 108 , such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) scans, three-dimensional models of the brain of patient 108 and the like.
  • Instrument definitions 436 includes data defining characteristics of at least one of the surgical instruments to be used in the surgical procedure—such characteristics allow computing device 204 to differentiate between instruments in image data received from tracking camera 224 .
  • Gesture definitions 440 include data defining various movements of the instruments defined in instrument definitions 436 .
  • rules 444 contain associations between the gestures defined in gesture definitions 440 and output operations to be effected by computing device 204 .
  • repositories 432 , 436 , 440 and 444 are shown as databases in FIG. 4 , their data structures are not particularly limited—the data contained within each repository can be stored in any suitable structure.
  • Turning to FIG. 5 , a method 500 of controlling a navigation system, such as system 112 , is shown.
  • Method 500 will be described in conjunction with its performance on system 112 , and particularly on computing device 204 , although it is contemplated that method 500 , and variants thereof, can also be adapted to other systems.
  • computing device 204 is configured to control one or more output devices of system 112 .
  • processor 400 is configured to generate output data and transmit the output data, via I/O interface 412 , to the relevant output devices.
  • the nature of the control at block 505 is not particularly limited.
  • processor 400 is configured to control display 208 to present a video feed received from scope 220 on display 208 .
  • An example of such a feed is shown in FIG. 6 , where an image 600 representing a frame of the video feed is presented on display 208 .
  • In image 600 , a portion of access port 228 is visible, and brain tissue 604 is visible through access port 228 . Access port 228 and brain tissue 604 may be visible on display 208 at a configurable magnification greater than one. Also visible in image 600 are the tips of two surgical instruments 300 .
  • an overhead light or projector 424 is controlled by computing device 204 to project white light at a predefined brightness onto access port 228 to illuminate brain tissue 604 .
  • control mechanisms are contemplated, and they need not include overhead lighting in some embodiments.
  • overhead lights may not be controlled by computing device 204 in some embodiments.
  • computing device 204 is configured to identify surgical instruments that are active (that is, present in the field of view of tracking camera 224 ).
  • Computing device 204 receives image data from tracking camera 224 via interface 412 .
  • the received image data contains artifacts representing reflected light from markers 236 , and computing device 204 is configured to compare the image data, including such artifacts, to instrument definitions 436 to determine which surgical instruments, if any, are present within the field of view of tracking camera 224 .
  • Instrument definitions 436 includes a plurality of records 700 , each including an instrument identifier (e.g. “suction”) and one or more instrument characteristics.
  • each record 700 includes an indication of the geometry of markers 236 attached to that particular instrument (that is, the positions of markers 236 relative to each other).
  • computing device 204 is configured to compare the geometry of markers in image data received from tracking camera 224 to the geometries specified in definitions 436 . When the geometry of one or more markers in the image data matches the geometry specified in a given record 700 , the corresponding instrument identifier in that record 700 is selected for further processing.
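The geometry comparison at block 510 can be sketched as follows. This is a hedged illustration only: the patent does not specify a matching algorithm, and the instrument identifiers, distances and tolerance below are invented for the example. It matches observed markers to stored definitions by comparing sorted pairwise distances, which are invariant to where the instrument sits in the camera's field of view:

```python
from itertools import combinations
from math import dist

# Hypothetical instrument definitions (records 700): an identifier plus
# the sorted pairwise distances between that instrument's markers (mm).
INSTRUMENT_DEFINITIONS = {
    "suction": [40.0, 50.0, 65.0],
    "probe":   [30.0, 55.0, 70.0],
}

def marker_geometry(points):
    """Sorted pairwise distances; invariant to rotation and translation."""
    return sorted(dist(a, b) for a, b in combinations(points, 2))

def identify_instrument(points, tolerance=2.0):
    """Return the identifier whose stored geometry matches the observed
    markers within `tolerance`, or None if no record 700 matches."""
    observed = marker_geometry(points)
    for ident, geometry in INSTRUMENT_DEFINITIONS.items():
        if len(geometry) == len(observed) and all(
            abs(o - g) <= tolerance for o, g in zip(observed, geometry)
        ):
            return ident
    return None

# Example: markers at these coordinates are roughly 40, 50 and 65 mm
# apart, matching the hypothetical "suction" definition above.
observed_points = [(0.0, 0.0, 0.0), (40.0, 0.0, 0.0), (-1.5625, 49.9756, 0.0)]
matched = identify_instrument(observed_points)
```

Other characteristics mentioned in the specification, such as marker reflectivity or size, could be added as further fields on each record and further clauses in the match.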
  • instrument characteristics can be included in records 700 instead of, or in addition to, marker geometry.
  • Other examples of instrument characteristics include marker reflectivity, marker size, and the like.
  • surgical instruments can be equipped with RFID tags or other near-field communication devices that broadcast instrument identities to computing device 204 .
  • tool definitions 436 can be omitted entirely from computing device 204 .
  • tracking camera 224 (or, as mentioned earlier, any other suitable tracking system) can be configured to identify instruments and transmit instrument identifiers and position data to computing device 204 , instead of transmitting image data for computing device 204 to process.
  • computing device 204 can be configured to perform block 513 .
  • computing device 204 is configured to generate updated output data for controlling the output devices of system 112 based on the identities of the active instruments.
  • instrument definitions 436 can include output commands in addition to the instrument identifiers and characteristics. Such output commands can cause computing device 204 to select a particular menu of selectable interface elements for presentation on display 208 , among a plurality of selectable interface elements contained in application 428 .
  • Such output commands can also configure computing device 204 to control illumination and projection equipment 424 in a predefined manner, or to control display 208 to overlay data from repository 432 on image 600 (for example, a three dimensional model of the patient's brain, a CT scan, or the like).
  • each stage of the surgical procedure can contain data identifying the instruments expected to be used for that stage, and specifying output commands for controlling the output devices of system 112 .
  • the identification of instruments matching those in a certain stage of the planning data can indicate that the procedure has reached that certain stage, and computing device 204 can be configured to implement the output commands associated with the stage.
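The stage-detection logic at block 513 can be sketched as follows. The stage names, instrument sets and command tuples are illustrative assumptions; the patent only states that each stage of the planning data identifies expected instruments and associated output commands:

```python
# Hypothetical surgical plan (patient data 432): each stage lists the
# instruments expected for it, plus output commands to apply when the
# procedure is determined to have reached that stage.
SURGICAL_PLAN = [
    {"stage": "cannulation",
     "instruments": {"introducer", "access_port"},
     "commands": [("display", "show_trajectory_view")]},
    {"stage": "resection",
     "instruments": {"suction", "probe"},
     "commands": [("light", "set_brightness:80"),
                  ("display", "show_scope_feed")]},
]

def commands_for_active_instruments(active):
    """Return the output commands of the first stage whose expected
    instruments are all present in the tracker's field of view."""
    for stage in SURGICAL_PLAN:
        if stage["instruments"] <= set(active):
            return stage["stage"], stage["commands"]
    return None, []
```

For example, once both a suction instrument and a probe are identified at block 510, the sketch above would report the "resection" stage and hand its commands to the output devices.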
  • the performance of block 513 can be omitted.
  • the performance of method 500 therefore proceeds from either of blocks 510 or 513 to block 515 .
  • computing device 204 is configured to determine whether an input mode has been activated. In an input mode, the movements of the instruments identified at block 510 can control the output devices of system 112 connected to computing device 204 .
  • the determination at block 515 can take a variety of forms. For example, computing device 204 can be configured simply to detect whether one or more of the instruments identified at block 510 is moving, based on image data continually received from tracking camera 224 . If the instruments are stationary (or show movement below a predetermined threshold), the determination at block 515 is negative, and the performance of method 500 returns to block 510 .
  • the determination at block 515 is affirmative, and the performance of method 500 proceeds to block 520 , to be discussed below.
  • the determination at block 515 can be affirmative (that is, the input mode is active) if an instrument remains stationary and within a certain set distance of another instrument for a set amount of time.
  • computing device 204 may be configured to await specific input data, such as an audible command (e.g. a voice command such as “input on”) recorded by microphone 420 .
  • computing device 204 may be configured to await a specific input from keyboard or mouse 416 , or from another input device such as a foot pedal (not shown) available to surgeon 104 .
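The movement-based variant of the block 515 determination can be sketched as follows. The threshold value and position format are assumptions for illustration; the patent only requires that movement below a predetermined threshold yields a negative determination:

```python
from math import dist

def input_mode_active(positions, threshold=3.0):
    """Block 515 sketch: treat the input mode as active when an
    instrument's recent tracked positions show net movement above a
    threshold (positions in mm; threshold is an illustrative value)."""
    if len(positions) < 2:
        return False
    return dist(positions[0], positions[-1]) > threshold

# A near-stationary instrument versus one being deliberately moved.
stationary = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.1, 0.0)]
moving = [(0.0, 0.0, 0.0), (2.0, 1.0, 0.0), (5.0, 3.0, 0.0)]
```

In practice this check would be combined with the other activation triggers described above (instrument proximity dwell, a voice command, or a foot pedal input).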
  • computing device 204 is configured to determine whether the tracked movements of the instruments identified at block 510 match any of the gesture definitions in repository 440 .
  • processor 400 continually receives image data (or instrument identifiers and positions, as mentioned above) from tracking camera 224 and processes such data according to conventional motion-tracking mechanisms to generate motion data (e.g. speed, direction, coordinates) for the instruments substantially in real-time.
  • Processor 400 is therefore configured to compare the motion data to the definitions in repository 440 , and determine whether the motion data matches any of the definitions.
  • Repository 440 includes a plurality of records 800 , each defining a gesture.
  • Each record 800 includes a gesture identifier, and corresponding characteristics of that gesture.
  • a “shake” gesture is defined in the present example as three reversals in movement velocity of an instrument within a time period of one second
  • a “tap” gesture is defined as a minimum of one second of overlap between the positions of two instruments, as determined by processor 400 .
  • a “135 degree” gesture is defined as an instrument being held at an angle of one hundred and thirty five degrees relative to the center of the access port.
  • a wide variety of other gestures can also be defined, and other characteristics can be used to define such gestures.
  • certain gestures can be defined by the relative position of an instrument in comparison to segments of the field of view of scope 220 , such that the presence of an instrument in a certain quadrant of the field of view for a certain time period is interpreted as a gesture by computing device 204 .
  • Other gestures can be defined by the speed or timing of a rotation of the instrument, the distance between the tips of two instruments, and the like.
  • Each record 800 can also specify tolerances (not shown) for the characteristics.
  • the time periods shown in FIG. 8 may have tolerances of 10%, such that three velocity reversals occurring in 1.1 seconds would still be interpreted as a “shake”.
  • Such tolerances, and any other gesture characteristics can also be defined in association with a specific surgeon or surgical procedure. For example, a first surgeon may require gesture definitions with greater tolerances than a second surgeon.
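The "shake" definition and its 10% tolerance, as described above, can be sketched as a simple reversal counter over a sliding time window. The code below is an illustrative assumption of how such matching might be implemented, not the patent's implementation; a real system would match against all records 800, not one hard-coded gesture.

```python
def count_reversals(velocities):
    """Count sign changes (direction reversals) in a 1D velocity sequence."""
    signs = [v > 0 for v in velocities if v != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def matches_shake(samples, window=1.0, reversals=3, tolerance=0.10):
    """Return True if at least `reversals` velocity reversals occur within
    `window` seconds, with the window stretched by `tolerance` (so three
    reversals in 1.1 s still count as a 'shake', mirroring FIG. 8).
    `samples` is a chronological list of (t_seconds, velocity)."""
    limit = window * (1.0 + tolerance)
    for i, (t0, _) in enumerate(samples):
        vs = [v for t, v in samples[i:] if t - t0 <= limit]
        if count_reversals(vs) >= reversals:
            return True
    return False
```

Per-surgeon tolerances, as suggested above, would simply substitute a different `tolerance` value looked up from the surgeon's profile.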
  • computing device 204 is configured to confirm whether or not an input mode remains active, and to monitor for any further movements that may match defined gestures.
  • computing device 204 is configured to select a command corresponding to the gesture detected at block 520 , based on output control rules 444 .
  • Rules 444 include a plurality of records 900 each defining an output control rule.
  • Each record 900 includes a command definition for controlling one or more output devices of system 112 .
  • Each record 900 can also include, corresponding to the command definition, a gesture identifier and an instrument identifier.
  • the first of records 900 defines a command that will cause robotic arm 216 to follow the motion of the suction instrument for a certain time after the suction instrument has registered a “shake” gesture. Such a command can be used to reposition scope 220 .
  • the second of records 900 defines a command that will cause an overhead light 424 to increase in brightness when a probe instrument registers a “shake” gesture.
  • the third of records 900 defines a command that will cause display 208 to be updated to present a menu containing selectable interface elements relevant to tumor resection when the suction and probe instruments register a “tap” gesture.
  • the fourth of records 900 defines a command that will cause a particular selectable element of the resection menu to be selected when the suction device is held at an angle of one hundred thirty five degrees in relation to the center of access port 228 .
  • application 428 can contain a plurality of menus, each including various selectable elements.
  • Rules 444 can contain one or more records defining conditions under which each of the plurality of menus is to be selected for presentation on display 208 .
  • additional parameters corresponding to the command definition can be included in a record 900 , while in other embodiments some parameters can be omitted.
  • other parameters include a stage of the surgical procedure (as defined in patient data 432 ); an identifier of a surgeon; characteristics of the image currently shown on display 208 (for example, image characteristics indicative of tumor tissue, such as brightness, contrast, or colour values); and other output data already provided to the output devices, such as which menu is currently presented on display 208 .
  • rules 444 define associations between the context in which surgical instruments are being used, and commands to control the output devices of system 112 .
  • computing device 204 is configured to compare the identities of the instruments identified at block 510 and the context of use of those instruments (e.g. gestures detected at block 520 , the stage of the procedure, the identity of the surgeon) to rules 444 , and to select a rule that matches the current context.
  • the command of that particular rule is the command selected at block 525 .
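The rule selection at block 525 can be sketched as a lookup over records like those of FIG. 9. The record fields, command names, and first-match policy below are illustrative assumptions; actual records 900 may carry additional parameters (procedure stage, surgeon identifier, current menu) that would extend the match condition.

```python
# Illustrative output control rules in the spirit of records 900.
RULES = [
    {"instrument": "suction", "gesture": "shake", "command": "arm_follow_suction"},
    {"instrument": "probe", "gesture": "shake", "command": "light_brighter"},
    {"instrument": ("suction", "probe"), "gesture": "tap",
     "command": "show_resection_menu"},
]

def select_command(instruments, gesture, rules=RULES):
    """Return the command of the first rule whose required instruments are
    all active and whose gesture matches, or None if no rule applies.
    `instruments` is the set of identifiers found at block 510."""
    for rule in rules:
        wanted = rule["instrument"]
        wanted = {wanted} if isinstance(wanted, str) else set(wanted)
        if wanted <= set(instruments) and rule["gesture"] == gesture:
            return rule["command"]
    return None
```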
  • computing device 204 is configured to update the control of the output devices of system 112 based on the selected command.
  • the nature of the control effected at block 530 is defined by the particular command selected at block 525 , and can therefore vary greatly.
  • An example of a performance of block 530 is shown in FIG. 10 .
  • FIG. 10 depicts an updated interface presented on display 208 , in which image 600 is shown following a “tap” gesture with the suction and probe instruments.
  • a menu 1000 is presented on display 208 .
  • Menu 1000 is one of the plurality of menus within application 428 , and includes a plurality of selectable elements. Each element is selectable for causing computing device 204 to execute a specific operation implemented by the instructions of application 428 .
  • a record element 1004 causes computing device 204 to begin (or cease, if recording is already underway) storing the feed shown on display 208 in memory 404 as a video file.
  • An annotation element 1005 allows text input for annotating image 600 .
  • a panning element 1006 allows image 600 to be panned in a plane parallel to the page of FIG. 10 .
  • a reset element 1007 resets the view shown on display 208 to a previous state (for example, before a recent panning operation).
  • a brightness element 1008 causes computing device 204 to present a further one of the plurality of menus within application 428 on display 208 for adjusting the brightness of display 208 .
  • Also included are a stack element 1010 and a magnification element 1012 , which causes computing device 204 to present a still further one of the plurality of menus within application 428 on display 208 for adjusting the magnification of the video feed from scope 220 .
  • selectable elements include a tool selection element for selecting one of a plurality of tools identified by computing device 204 . Such a selection may be used to restrict output control to the movements of a particular tool, for example.
  • a port visibility element 1014 allows a rendering of access port 228 on display 208 to be toggled on and off (this functionality may also be extended to other tools).
  • a region of interest element 1016 causes computing device 204 to begin tracking the movement of a given surgical instrument to draw a region of interest on image 600 .
  • a tract visibility element 1018 turns the presentation of fluid flow tracts (e.g. nerve fibre tracts, vasculature, and the like) on display 208 on and off.
  • a skull stripping toggle element 1020 and a 2D/3D mode toggle element 1022 can be provided.
  • computing device 204 is configured to return to block 510 and continue monitoring the movements of any active instruments. Assuming that the instruments detected in the previous iteration of method 500 have not been removed from the field of view of tracking camera 224 , the performance of blocks 510 , 513 (optionally) and 515 will not effect any changes, and the performance of block 520 will determine whether any further input gestures have been made. Such input gestures may include a selection of an element of menu 1000 (for example, as specified in the fourth record 900 of FIG. 9 ). In response to selection of a menu element, computing device 204 is configured to generate further updated output data to enable the function corresponding to the selected element. As will now be apparent to those skilled in the art, numerous iterations of method 500 can be performed to control system 112 , while reducing or avoiding the need for surgeon 104 to abandon the surgical instruments in favour of more conventional input devices (such as keyboard and mouse 416 ).
  • FIGS. 11 and 12 provide further examples of output device control during the performance of method 500 .
  • FIG. 11 depicts display 208 presenting a menu 1100 (containing the same selectable elements as menu 1000 , although in a different orientation), and image data 1104 retrieved from repository 432 , in addition to image 600 as discussed above.
  • FIG. 12 depicts image 600 on display 208 . Menus 1000 and 1100 are no longer presented in FIG. 12 (they may be dismissed by certain instrument gestures, or by the selection of certain elements of menus 1000 or 1100 ). However, two regions of interest 1200 and 1204 are highlighted within image 600 on display 208 .
  • Regions of interest 1200 and 1204 are the result of further performances of method 500 , in which a region of interest element such as element 1016 was selected, and further instrument gestures were detected to draw the regions.
  • Computing device 204 can be configured to take various actions in connection with regions of interest 1200 and 1204 . For example, computing device 204 can apply a mask to image 600 to hide all of image 600 with the exception of regions 1200 and 1204 .
  • Other examples of output device control achieved through the performance of method 500 will occur to those skilled in the art.
  • images can be projected onto the patient's skull, and optical properties (e.g. magnification, focus and the like) of scope 220 can be altered.
  • individual selectable elements within the menus discussed above can be presented on display 208 in various orders and combinations.
  • a further example of output device control involves masking out one or more portions of the surgical instruments identified at block 510 .
  • scope 220 may have a shallow depth of field, and thus portions of the instruments that extend out of access port 228 towards scope 220 may appear out of focus on display 208 .
  • Computing device 204 can be configured, following the identification of the instruments, to generate output data including a mask of the identified instruments that can be combined with the video feed from scope 220 to obscure the unfocussed portions of the instruments with in-focus images of the instruments.
  • Another example of output device control, referring now to FIG. 13 , includes activating a display mode at block 530 referred to as radial stacking.
  • computing device 204 is configured to present a three dimensional rendering 1300 of the brain in which a slice 1304 of brain tissue may be selected.
  • Computing device 204 is configured to determine the location and plane of slice 1304 based on, for example, instrument movements matched with known gestures at block 520 .
  • Computing device 204 can also be configured to control display 208 to present a two dimensional cross-section 1308 of three dimensional model 1300 , taken in the plane of slice 1304 .
  • a variety of display layers can be presented on cross section 1308 , again based on further motions of surgical instruments detected by computing device 204 .
  • fluid flow tracts 1312 are presented on cross section 1308 (for example, in response to a selection of a menu element such as element 1018 ).
  • Computing device 204 can also be configured to control display 208 to update the position of slice 1304 , and the appearance of cross section 1308 , based on further movement of surgical instruments.
  • Turning to FIG. 14 , an updated rendering 1400 is shown, in which an updated slice 1404 is depicted.
  • Slice 1404 may be a rotated version of slice 1304 , the rotation of which can be controlled by movement of surgical instruments.
  • a cross section 1408 is also presented on display 208 , representing an updated version of cross section 1308 resulting from the new position of slice 1404 .
  • Tracts 1412 are also updated in cross section 1408 .
  • displayed elements such as tracts 1312 and 1412 may also be restricted to only certain areas of the current slice, such as an area within the current slice and also within a predetermined distance of a tool tip (detected at block 520 ).
  • Elements such as tracts 1312 and 1412 in cross sections 1308 and 1408 can also have configurable depths; that is, tracts 1312 and 1412 can be displayed not only for the exact plane of slices 1304 and 1404 , but also for a configurable number of adjacent planes parallel to those planes.
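The two restrictions above (configurable depth around the slice plane, and proximity to a tracked tool tip) amount to a point filter over tract data. A simplifying Python sketch, assuming axis-aligned slices and illustrative parameter names:

```python
import math

def visible_tract_points(points, slice_z, depth=1.0, tip=None, max_tip_dist=None):
    """Filter 3D tract points for display on a cross section: keep points
    within `depth` of the slice plane (here z = slice_z, axis-aligned for
    simplicity), and optionally within `max_tip_dist` of a tracked tool tip."""
    out = []
    for x, y, z in points:
        if abs(z - slice_z) > depth:
            continue  # outside the configured band of adjacent planes
        if tip is not None and max_tip_dist is not None:
            if math.dist((x, y, z), tip) > max_tip_dist:
                continue  # too far from the tool tip detected at block 520
        out.append((x, y, z))
    return out
```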
  • movements of surgical instruments detected by computing device 204 can be used to present three dimensional renderings of those instruments on display 208 , in addition to or instead of a video feed from scope 220 .
  • a model such as rendering 1300 can be updated to show the position of surgical instruments, including access port 228 , tracking their movements substantially in real time. Additional information can also be presented on such renderings.
  • a rendering of a tumor whose location and size are stored in repository 432 can be presented on display 208 .
  • a rendering of a tumor or other data from repository 432 can be overlaid on a video feed (that is, a non-virtual feed) from scope 220 .
  • the rendering can be located and scaled on display 208 (for example, on image 600 discussed above) based on the current magnification of scope 220 and the location of access port 228 .
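Locating and scaling the overlaid rendering, as described above, can be reduced to a simple screen-space transform. The sketch below is an illustrative simplification (real registration involves a full camera-to-patient transform); all names and the pixels-per-millimetre factor are assumptions.

```python
def overlay_transform(tumour_mm, port_screen_xy, magnification, px_per_mm=1.0):
    """Compute a screen-space bounding box for a tumour rendering overlaid
    on the video feed: physical size (mm) scaled by the scope's current
    magnification, centred on the tracked access-port screen location."""
    scale = magnification * px_per_mm
    w = tumour_mm[0] * scale
    h = tumour_mm[1] * scale
    return {
        "x": port_screen_xy[0] - w / 2,  # centre the overlay on the port
        "y": port_screen_xy[1] - h / 2,
        "w": w,
        "h": h,
    }
```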
  • Turning to FIGS. 15A, 15B and 15C , examples of output data presented on display 208 in this embodiment are shown.
  • FIG. 15A shows a modified version of the display shown in FIG.
  • a video feed from scope 220 showing access port 228 is supplemented with a (virtual) rendering of a tumour 1500 in two or three dimensions, indicating that the tumour is larger than the field of view into the patient's brain afforded by access port 228 .
  • FIG. 15B shows a rendering of access port 228 and tumour 1500 in an orientation perpendicular to the axis of access port 228 .
  • the display of FIG. 15B is generally virtual rather than being supplemented with video from scope 220 .
  • FIG. 15C shows an additional rendering of tumour 1500 , the patient's brain 1504 , and access port 228 , depicting the scale of access port 228 relative to the entire brain 1504 and tumour 1500 .
  • the views of FIGS. 15A-15C can be controlled through method 500 , and can also be combined on display 208 (for example, in three panes).
  • equipment tower 200 can be omitted entirely or replaced with two or more towers.
  • computing device 204 need not be co-located with the remainder of system 112 . Instead, computing device 204 can be connected to the remainder of system 112 via a network, such as the Internet. In still other variations, computing device 204 can be implemented in a distributed computing framework.
  • markers and tracking technologies other than IR can be employed.
  • markers 236 can include RFID tags, electromagnetic sensors, LEDs or the like.
  • markers 236 can be omitted entirely, and computing device 204 can instead be configured to employ known image processing techniques to locate and identify surgical instruments in the field of view of tracking camera 224 or any other suitable tracking system.

Abstract

A method, system and apparatus for controlling a surgical navigation system are provided. The method comprises receiving image data at a processor from a tracking system; receiving, at the processor, an identifier of a surgical instrument within a field of view of the tracking system; generating, at the processor, output data based on the identifier of the surgical instrument; and transmitting the output data to at least one output device connected to the processor, for controlling the output device.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application claims priority from U.S. Provisional Application Nos. 61/801,530, filed Mar. 15, 2013; 61/800,155, filed Mar. 15, 2013; 61/818,280, filed May 1, 2013; and 61/924,993, filed Jan. 8, 2014. The contents of all the above-mentioned provisional applications are incorporated herein by reference.
  • FIELD
  • The specification relates generally to navigation systems, and specifically to a method, system and apparatus for navigation systems for use in image guided medical procedures.
  • BACKGROUND
  • The performance of surgical procedures often calls for a surgeon to access significant volumes of information. As a result, various surgical assistance systems exist that place some of this information at the surgeon's disposal. However, conventional technologies for navigating such systems to access the desired information may require the surgeon to deposit surgical instruments and manipulate other devices, or to attempt to communicate desired system interactions to an assistant. As a result, although all the required information may be present, access to that information during the procedure may be hampered.
  • SUMMARY
  • An aspect of the specification provides a method of controlling a surgical navigation system, comprising: receiving, at a processor, an identifier of a surgical instrument within a field of view of a tracking system; generating, at the processor, output data based on the identifier of the surgical instrument; and transmitting the output data to an output device connected to the processor, for controlling the output device. Further aspects of the specification include a computing device configured to perform the above method, and a non-transitory computer-readable medium storing a plurality of computer readable instructions executable by a processor for implementing the above method.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • Embodiments are described with reference to the following figures, in which:
  • FIG. 1 depicts a navigation system in use in a surgical procedure, according to a non-limiting embodiment;
  • FIG. 2 depicts a schematic diagram of the navigation system of FIG. 1, according to a non-limiting embodiment;
  • FIG. 3 depicts a schematic diagram of the navigation system of FIG. 1, according to another non-limiting embodiment;
  • FIG. 4 depicts a computing device of the system of FIG. 1, according to a non-limiting embodiment;
  • FIG. 5 depicts a method of controlling a surgical navigation system, according to a non-limiting embodiment;
  • FIG. 6 depicts an example performance of block 505 of the method of FIG. 5, according to a non-limiting embodiment;
  • FIG. 7 depicts instrument definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment;
  • FIG. 8 depicts gesture definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment;
  • FIG. 9 depicts output control rule definitions stored by the computing device of FIG. 4, according to a non-limiting embodiment;
  • FIG. 10 depicts an example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment;
  • FIG. 11 depicts another example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment; and
  • FIG. 12 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment;
  • FIG. 13 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment;
  • FIG. 14 depicts a further example performance of block 530 of the method of FIG. 5, according to a non-limiting embodiment; and
  • FIGS. 15A, 15B and 15C depict further example performances of block 530 of the method of FIG. 5, according to a non-limiting embodiment.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Various embodiments and aspects are described below. The following description, and the accompanying drawings, are illustrative and are not to be construed as limiting in scope.
  • FIG. 1 depicts a surgeon 104 conducting a minimally invasive port-based surgical procedure on a patient 108 in an operating room (OR) environment. The surgical procedure is supported by a navigation system 112, including a computing device connected to a variety of input devices (e.g. a tracking sensor such as a camera, a keyboard and mouse and the like) and controlling a variety of output devices (e.g. a display, illumination equipment and the like). System 112 also includes a variety of surgical instruments, whose motions may be tracked by system 112. An assistant or operator 116 can also be present, and both surgeon 104 and assistant 116 can operate system 112. In particular, as will be discussed below, system 112 is configured to control the output devices based on input from a variety of sources, including not only the above mentioned input devices, but also the tracked surgical instruments that are manipulated by surgeon 104 during the procedure.
  • Turning now to FIG. 2, a block diagram illustrating certain components of system 112 is depicted. As seen in FIG. 2, system 112 includes an equipment tower 200 supporting a computing device 204, along with other equipment. Equipment tower 200 is mounted on a rack, cart, or the like, and may also support a power supply for the remaining components of system 112.
  • Computing device 204 is connected to output devices including a display, such as displays 208 and 212, and a robotic arm 216. Each of displays 208 and 212 can be based on any suitable display technology. For example, display 208 can be a flat panel display comprising any one of, or any suitable combination of, a Liquid Crystal Display (LCD), a plasma display, an Organic Light Emitting Diode (OLED) display, and the like. Other display technologies on which displays 208 and 212 can be based include projection systems and cathode ray tube (CRT) displays. Computing device 204 is also connected to input devices including an optical scope 220 (also referred to as an exoscope), and a tracking sensor such as a tracking camera 224, which can be a stereoscopic camera. Examples of such cameras, such as the “Polaris” unit available from Northern Digital Imaging (NDI), will occur to those skilled in the art. Tracking camera 224 may be configured to receive visible light, IR, or both. Although tracking camera 224 is discussed herein as an example tracking sensor, it is to be understood that other tracking sensors may also be used instead of, or in addition to, tracking camera 224. Thus, any references to tracking camera 224 below may also refer, in other embodiments, to any of a variety of suitable tracking systems known to those skilled in the art.
  • Minimally invasive brain surgery using access ports is a recently conceived method of performing surgery on brain tumors previously considered inoperable. Such minimally invasive procedures are performed through a relatively small opening in a patient's skull. To that end, system 112 also includes an access port 228 for insertion through the skull of patient 108—which is immobilized by a holder 230—and into the brain of patient 108. An introducer 234 with an atraumatic tip (for reducing damage to brain tissue during the insertion of access port 228) is inserted into access port 228, and access port 228 and introducer 234 together are inserted into the skull of patient 108.
  • Introducer 234 includes fiduciary markers 236 such as IR-reflecting markers, that are detectable by tracking camera 224. In the present embodiment, tracking camera 224 can emit infrared light, which is reflected by markers 236 and permits tracking camera 224 (which is sensitive to IR radiation) to capture images from which markers 236 can readily be isolated. As will be discussed below, robotic arm 216 and other instrumentation can also carry fiduciary markers. Camera 224 in conjunction with computing device 204 can determine the spatial positions of markers 236 using conventional motion tracking algorithms. Computing device 204 is therefore configured to track the position of markers 236 (and by extension, the position of introducer 234) as introducer 234 is moved within the field of view of tracking camera 224. In addition, it is contemplated that the spatial position of patient 108's skull was previously determined and stored by computing device 204.
  • Because introducer 234 is held within access port 228 during insertion of access port 228 into the skull of patient 108, markers 236 allow computing device 204 to track not only introducer 234, but also access port 228 itself, even if access port 228 does not carry any markers. The tracked position of introducer 234 relative to the known position of the skull of patient 108 can be presented on one or both of displays 208 and 212. Various views (e.g. axial, sagittal, coronal, perpendicular to tool tip, in-plane of tool shaft, and the like) of the relative positions of introducer 234, access port 228 and the skull can be presented on displays 208 and 212.
  • Once introducer 234 and access port 228 have been inserted into the brain of patient 108, introducer 234 may be removed from access port 228 to allow access to the brain tissue through a central opening of access port 228. In some embodiments, access port 228 does not carry any fiduciary markers, and therefore may not be able to be directly tracked after the removal of introducer 234. However, other surgical instruments carrying markers can be used to indirectly track access port 228. In other embodiments, including the embodiments discussed in detail below, access port 228 itself can carry fiduciary markers 236.
  • System 112 can also include an articulated arm 238 anchored at one end to holder 230, and having at an opposite end a clamp for engaging access port 228. Arm 238 may be employed to fix the position of access port 228 after insertion. Arm 238 may also have locked and unlocked positions, such that in the locked position access port 228 is not permitted to move, while in the unlocked position movement (at least in certain axes) by access port 228 is permitted.
  • Turning to FIG. 3, another depiction of system 112 is illustrated, in which only display 208 is included. Additional surgical instruments 300 are also shown (such as a probing instrument and a suction instrument, for example), each carrying fiduciary markers 236. Further, as mentioned above, scope 220 also carries markers 236 in FIG. 3.
  • In general, therefore, the movements of certain components of system 112, particularly surgical instruments, can be tracked in space. As will be discussed below in greater detail, computing device 204 can control the output devices of system 112 based on those tracked movements. The control of output devices need not be based only on tracked movements—output control can also be based on other contextual data, including the specific identity of the tracked instruments, as well as surgical planning data. The surgical planning data can include an identifier of the current phase or stage of the surgical procedure, which can be determined at computing device 204 either via receipt of an input from an operator (e.g. surgeon 104), or by other triggers automatically detected by computing device 204. Those triggers can include detection of a tip of access port 228 traversing the outer boundary of the skull, indicating that cannulation is occurring. For example, as will be discussed below, displays 208 and 212 can be controlled to present various selectable interface elements (including menus) based on instrument identities and movements. The components and operation of computing device 204 will now be discussed in greater detail.
  • Turning to FIG. 4, a schematic diagram of certain components of computing device 204 is shown in relation to other components of system 112. Computing device 204 includes a central processing unit (also referred to as a microprocessor or simply a processor) 400 interconnected with a non-transitory computer readable storage medium such as a memory 404. Processor 400 and memory 404 are generally comprised of one or more integrated circuits (ICs), and can have a variety of structures, as will now occur to those skilled in the art (for example, more than one CPU can be provided).
  • Memory 404 can be any suitable combination of volatile (e.g. Random Access Memory (“RAM”)) and non-volatile (e.g. read only memory (“ROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory, magnetic computer storage device, or optical disc) memory. In the present example, memory 404 includes both a volatile memory and a non-volatile memory. Other types of non-transitory computer readable storage medium are also contemplated, such as compact discs (CD-ROM, CD-RW) and digital video discs (DVD).
  • Computing device 204 also includes a network interface 408 interconnected with processor 400. Network interface 408 allows computing device 204 to communicate with other computing devices via a network (e.g. a local area network (LAN), a wide area network (WAN) or any suitable combination thereof). Network interface 408 thus includes any necessary hardware for communicating over such networks.
  • Computing device 204 also includes an input/output interface 412, including the necessary hardware for interconnecting processor 400 with various input and output devices. Interface 412 can include, among other components, a Universal Serial Bus (USB) port, an audio port for sending and receiving audio data, a Video Graphics Array (VGA), Digital Visual Interface (DVI) or other port for sending and receiving display data, and any other suitable components.
  • Via interface 412, computing device 204 is connected to input devices including a keyboard and mouse 416, a microphone 420, as well as scope 220 and tracking camera 224, mentioned above. Also via interface 412, computing device 204 is connected to output devices including illumination or projection components (e.g. lights, projectors and the like), as well as display 208 and robotic arm 216 mentioned above. Other input (e.g. touch screens) and output devices (e.g. speakers) will also occur to those skilled in the art.
  • Computing device 204 stores, in memory 404, an interface management application 428 (also referred to herein as application 428) comprising a plurality of computer readable instructions executable by processor 400. When processor 400 executes the instructions of application 428 (or, indeed, any other application stored in memory 404), processor 400 performs various functions implemented by those instructions, as will be discussed below. Processor 400, or computing device 204 more generally, is therefore said to be “configured” to perform those functions via the execution of application 428.
  • Also stored in memory 404 are various data repositories, including patient data 432, surgical instrument definitions 436, input gesture definitions 440, and output control rules 444. Patient data 432 includes a surgical plan defining the various steps of the minimally invasive surgical procedure, as well as image data relating to patient 108, such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) scans, three-dimensional models of the brain of patient 108 and the like. Instrument definitions 436 includes data defining characteristics of at least one of the surgical instruments to be used in the surgical procedure—such characteristics allow computing device 204 to differentiate between instruments in image data received from tracking camera 224. Gesture definitions 440 include data defining various movements of the instruments defined in instrument definitions 436. Finally, rules 444 contain associations between the gestures defined in gesture definitions 440 and output operations to be effected by computing device 204. These repositories will be described in further detail below.
  • It is to be understood that although repositories 432, 436, 440 and 444 are shown as databases in FIG. 4, their data structures are not particularly limited—the data contained within each repository can be stored in any suitable structure.
  • Turning now to FIG. 5, a method 500 of controlling a navigation system, such as system 112, is shown. Method 500 will be described in conjunction with its performance on system 112, and particularly on computing device 204, although it is contemplated that method 500, and variants thereof, can also be adapted to other systems.
  • At block 505, computing device 204 is configured to control one or more output devices of system 112. To control the output devices of system 112, processor 400 is configured to generate output data and transmit the output data, via I/O interface 412, to the relevant output devices. The nature of the control at block 505—which output devices are controlled and what output data is generated—is not particularly limited. In the present example performance of method 500, at block 505 processor 400 is configured to control display 208 to present a video feed received from scope 220 on display 208. An example of such a feed is shown in FIG. 6, where an image 600 representing a frame of the video feed is presented on display 208. In image 600, a portion of access port 228 is visible, and brain tissue 604 is visible through access port 228. Access port 228 and brain tissue 604 may be visible on display 208 at a configurable magnification greater than 1. Also visible in image 600 are the tips of two surgical instruments 300.
  • It is also contemplated that at block 505, an overhead light or projector 424 is controlled by computing device 204 to project white light at a predefined brightness onto access port 228 to illuminate brain tissue 604. As will be discussed below, a wide variety of control mechanisms are contemplated; in some embodiments, for example, overhead lights are not controlled by computing device 204 at all.
  • Proceeding to block 510, computing device 204 is configured to identify surgical instruments that are active (that is, present in the field of view of tracking camera 224). Computing device 204 receives image data from tracking camera 224 via interface 412. The received image data contains artifacts representing reflected light from markers 236, and computing device 204 is configured to compare the image data, including such artifacts, to instrument definitions 436 to determine which surgical instruments, if any, are present within the field of view of tracking camera 224.
  • Turning briefly to FIG. 7, an example of instrument definitions 436 is shown. Instrument definitions 436 includes a plurality of records 700, each including an instrument identifier (e.g. “suction”) and one or more instrument characteristics. In the present example, each record 700 includes an indication of the geometry of markers 236 attached to that particular instrument (that is, the positions of markers 236 relative to each other). Thus, at block 510, computing device 204 is configured to compare the geometry of markers in image data received from tracking camera 224 to the geometries specified in definitions 436. When the geometry of one or more markers in the image data matches the geometry specified in a given record 700, the corresponding instrument identifier in that record 700 is selected for further processing.
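  • By way of illustration only, the comparison at block 510 of marker geometry against instrument definitions 436 might be sketched as follows. The instrument names, distances and tolerance below are hypothetical placeholders, not values taken from the disclosure; each definition records the pairwise distances between an instrument's markers, and detected marker positions are matched against those distances:

```python
from itertools import combinations
import math

# Hypothetical instrument definitions: each identifier maps to the sorted
# pairwise distances (in mm) between that instrument's reflective markers.
INSTRUMENT_DEFINITIONS = {
    "suction": [40.0, 50.0, 65.0],
    "probe": [30.0, 55.0, 70.0],
}

def pairwise_distances(points):
    """Sorted distances between every pair of detected marker positions."""
    return sorted(math.dist(a, b) for a, b in combinations(points, 2))

def identify_instrument(marker_positions, tolerance=1.0):
    """Return the identifier whose marker geometry matches the detected
    markers, or None when no definition matches within tolerance."""
    observed = pairwise_distances(marker_positions)
    for identifier, geometry in INSTRUMENT_DEFINITIONS.items():
        expected = sorted(geometry)
        if len(observed) == len(expected) and all(
            abs(o - e) <= tolerance for o, e in zip(observed, expected)
        ):
            return identifier
    return None
```

Because the distances are sorted before comparison, the match is invariant to the order in which markers appear in the tracking camera's image data.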
  • A wide variety of instrument characteristics can be included in records 700 instead of, or in addition to, marker geometry. Other examples of instrument characteristics include marker reflectivity, marker size, and the like. In still other embodiments, surgical instruments can be equipped with RFID tags or other near-field communication devices that broadcast instrument identities to computing device 204.
  • In some embodiments, tool definitions 436 can be omitted entirely from computing device 204. Instead, tracking camera 224 (or, as mentioned earlier, any other suitable tracking system) can be configured to identify instruments and transmit instrument identifiers and position data to computing device 204, instead of transmitting image data for computing device 204 to process.
  • Having identified active instruments at block 510, computing device 204 can be configured to perform block 513. At block 513, computing device 204 is configured to generate updated output data for controlling the output devices of system 112 based on the identities of the active instruments. For example, instrument definitions 436 can include output commands in addition to the instrument identifiers and characteristics. Such output commands can cause computing device 204 to select a particular menu of selectable interface elements for presentation on display 208, among a plurality of selectable interface elements contained in application 428. Such output commands can also configure computing device 204 to control illumination and projection equipment 424 in a predefined manner, or to control display 208 to overlay data from repository 432 on image 600 (for example, a three dimensional model of the patient's brain, a CT scan, or the like).
  • The above-mentioned output commands need not be specified in instrument definitions 436. Instead, such output commands can be specified in planning data in repository 432. For example, each stage of the surgical procedure can contain data identifying the instruments expected to be used for that stage, and specifying output commands for controlling the output devices of system 112. The identification of instruments matching those in a certain stage of the planning data (or matching certain relative states, such as instrument positions, e.g. probe tip within the skull boundary) can indicate that the procedure has reached that certain stage, and computing device 204 can be configured to implement the output commands associated with the stage.
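  • A minimal sketch of this stage-matching logic follows; the stage names, instrument sets and command strings are hypothetical, serving only to illustrate how identification of the instruments expected for a stage can trigger that stage's output commands:

```python
# Hypothetical surgical plan: each stage lists the instruments expected to
# be active, and the output commands to apply once that stage is detected.
SURGICAL_PLAN = [
    {"stage": "cannulation", "instruments": {"introducer", "access_port"},
     "commands": ["light:brightness=80"]},
    {"stage": "resection", "instruments": {"suction", "probe"},
     "commands": ["display:menu=resection", "light:brightness=100"]},
]

def commands_for_active_instruments(active):
    """Return the first stage whose expected instruments are all present
    in the tracking camera's field of view, plus that stage's commands."""
    for stage in SURGICAL_PLAN:
        if stage["instruments"] <= set(active):
            return stage["stage"], stage["commands"]
    return None, []
```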
  • In other embodiments, the performance of block 513 can be omitted. The performance of method 500 therefore proceeds from either of blocks 510 or 513 to block 515.
  • At block 515, computing device 204 is configured to determine whether an input mode has been activated. In an input mode, the movements of the instruments identified at block 510 can control the output devices of system 112 connected to computing device 204. The determination at block 515 can take a variety of forms. For example, computing device 204 can be configured simply to detect whether one or more of the instruments identified at block 510 is moving, based on image data continually received from tracking camera 224. If the instruments are stationary (or show movement below a predetermined threshold), the determination at block 515 is negative, and the performance of method 500 returns to block 510.
  • If, on the other hand, the instruments do show movement beyond zero, or beyond some other predetermined lower bound, the determination at block 515 is affirmative, and the performance of method 500 proceeds to block 520, to be discussed below. Alternatively, the determination at block 515 can be affirmative (that is, the input mode is active) if an instrument remains stationary and within a certain set distance of another instrument for a set amount of time.
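  • The movement-threshold variant of the block 515 determination can be sketched as below. The threshold value and the use of summed displacement over recent position samples are assumptions for illustration:

```python
import math

def input_mode_active(positions, threshold=2.0):
    """Decide whether an instrument's recent tracked positions show
    movement beyond a predetermined lower bound (in mm); stationary or
    near-stationary instruments leave the determination negative."""
    total = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    return total > threshold
```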
  • In other embodiments, the determination by computing device 204 at block 515 can take other forms. For example, at block 515 computing device 204 may be configured to await specific input data, such as an audible command (e.g. a voice command such as "input on") recorded by microphone 420. In another example, computing device 204 may be configured to await a specific input from keyboard or mouse 416, or from another input device such as a foot pedal (not shown) available to surgeon 104.
  • Having determined that an input mode has been activated, at block 520 computing device 204 is configured to determine whether the tracked movements of the instruments identified at block 510 match any of the gesture definitions in repository 440. As will now be apparent to those skilled in the art, processor 400 continually receives image data (or instrument identifiers and positions, as mentioned above) from tracking camera 224 and processes such data according to conventional motion-tracking mechanisms to generate motion data (e.g. speed, direction, coordinates) for the instruments substantially in real-time. Processor 400 is therefore configured to compare the motion data to the definitions in repository 440, and determine whether the motion data matches any of the definitions.
  • Turning to FIG. 8, an example of gesture definitions repository 440 is shown. Repository 440 includes a plurality of records 800, each defining a gesture. Each record 800 includes a gesture identifier, and corresponding characteristics of that gesture. For example, a “shake” gesture is defined in the present example as three reversals in movement velocity of an instrument within a time period of one second, and a “tap” gesture is defined as a minimum of one second of overlap between the positions of two instruments, as determined by processor 400. A “135 degree” gesture is defined as an instrument being held at an angle of one hundred and thirty five degrees relative to the center of the access port. A wide variety of other gestures can also be defined, and other characteristics can be used to define such gestures. For example, certain gestures can be defined by the relative position of an instrument in comparison to segments of the field of view of scope 220, such that the presence of an instrument in a certain quadrant of the field of view for a certain time period is interpreted as a gesture by computing device 204. Other gestures can be defined by the speed or timing of a rotation of the instrument, the distance between the tips of two instruments, and the like.
  • Each record 800 can also specify tolerances (not shown) for the characteristics. For example, the time periods shown in FIG. 8 may have tolerances of 10%, such that three velocity reversals occurring in 1.1 seconds would still be interpreted as a “shake”. Such tolerances, and any other gesture characteristics, can also be defined in association with a specific surgeon or surgical procedure. For example, a first surgeon may require gesture definitions with greater tolerances than a second surgeon.
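  • The "shake" definition of FIG. 8 and its 10% tolerance can be sketched as follows; the representation of motion data as (timestamp, velocity) samples is an assumption, not part of the disclosure:

```python
def is_shake(samples, window=1.0, tolerance=0.10):
    """Detect three sign reversals of an instrument's velocity within a
    tolerance-extended time window. `samples` is a time-ordered list of
    (timestamp_seconds, velocity) pairs for one tracked instrument."""
    limit = window * (1 + tolerance)  # e.g. 1.1 s for a 10% tolerance
    # A reversal occurs wherever consecutive velocities change sign.
    reversal_times = [
        t2 for (t1, v1), (t2, v2) in zip(samples, samples[1:])
        if v1 * v2 < 0
    ]
    # Any three consecutive reversals inside the window count as a shake.
    for i in range(len(reversal_times) - 2):
        if reversal_times[i + 2] - reversal_times[i] <= limit:
            return True
    return False
```

Under this sketch, three reversals spanning 1.1 seconds would still register as a "shake", matching the tolerance example above.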
  • Returning to FIG. 5, if the determination at block 520 is negative (that is, the motion data representing the movement of the identified instruments does not match any predefined gestures), the performance of method 500 returns to block 515. In other words, computing device 204 is configured to confirm whether or not an input mode remains active, and to monitor for any further movements that may match defined gestures.
  • If, on the other hand, the determination at block 520 is affirmative (that is, the movement of the identified instruments does match a predefined gesture), the performance of method 500 proceeds to block 525.
  • At block 525, computing device 204 is configured to select a command corresponding to the gesture detected at block 520, based on output control rules 444. Turning to FIG. 9, an example of rules 444 is shown. Rules 444 include a plurality of records 900 each defining an output control rule. Each record 900 includes a command definition for controlling one or more output devices of system 112. Each record 900 can also include, corresponding to the command definition, a gesture identifier and an instrument identifier.
  • In the present example, four rules are defined in rules 444. The first of records 900 defines a command that will cause robotic arm 216 to follow the motion of the suction instrument for a certain time after the suction instrument has registered a "shake" gesture. Such a command can be used to reposition scope 220. The second of records 900 defines a command that will cause an overhead light 424 to increase in brightness when a probe instrument registers a "shake" gesture. The third of records 900 defines a command that will cause display 208 to be updated to present a menu containing selectable interface elements relevant to tumor resection when the suction and probe instruments register a "tap" gesture. The fourth of records 900 defines a command that will cause a particular selectable element of the resection menu to be selected when the suction device is held at an angle of one hundred and thirty five degrees in relation to the center of access port 228.
  • It will be understood that the rules shown in FIG. 9 are merely examples, and that a wide variety of other rules are also contemplated. As mentioned earlier, application 428 can contain a plurality of menus, each including various selectable elements. Rules 444 can contain one or more records defining conditions under which each of the plurality of menus is to be selected for presentation on display 208.
  • In some embodiments, additional parameters corresponding to the command definition can be included in a record 900, while in other embodiments some parameters can be omitted. Examples of other parameters include a stage of the surgical procedure (as defined in patient data 432); an identifier of a surgeon; characteristics of the image currently shown on display 208 (for example, image characteristics indicative of tumor tissue, such as brightness, contrast, or colour values); and other output data already provided to the output devices, such as which menu is currently presented on display 208. In general, rules 444 define associations between the context in which surgical instruments are being used, and commands to control the output devices of system 112.
  • Thus, at block 525 computing device 204 is configured to compare the identities of the instruments identified at block 510 and the context of use of those instruments (e.g. gestures detected at block 520, stage of the procedure, identity of the surgeon) to rules 444, and to select a rule that matches the current context. The command of that particular rule is the command selected at block 525.
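  • A minimal sketch of this rule selection, loosely mirroring the records of FIG. 9, follows; the rule records and command strings are hypothetical illustrations rather than the actual contents of rules 444:

```python
# Hypothetical output control rules: each maps an (instrument, gesture)
# context to a command for controlling an output device of the system.
OUTPUT_CONTROL_RULES = [
    {"instrument": "suction", "gesture": "shake",
     "command": "arm:follow=suction"},
    {"instrument": "probe", "gesture": "shake",
     "command": "light:brightness+=20"},
    {"instrument": "suction+probe", "gesture": "tap",
     "command": "display:menu=resection"},
]

def select_command(instrument, gesture):
    """Return the command of the first rule matching the current context,
    or None when no rule applies (in which case no output is updated)."""
    for rule in OUTPUT_CONTROL_RULES:
        if rule["instrument"] == instrument and rule["gesture"] == gesture:
            return rule["command"]
    return None
```

Additional context fields (procedure stage, surgeon identifier, currently presented menu) could be added to each record and matched in the same way.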
  • Having selected a command at block 525, at block 530 computing device 204 is configured to update the control of the output devices of system 112 based on the selected command. The nature of the control effected at block 530 is defined by the particular command selected at block 525, and can therefore vary greatly. An example of a performance of block 530 is shown in FIG. 10.
  • FIG. 10 depicts an updated interface presented on display 208, in which image 600 is shown following a "tap" gesture with the suction and probe instruments. In addition to image 600, which represents a frame of the video feed from scope 220 as discussed earlier, a menu 1000 is presented on display 208. Menu 1000 is one of the plurality of menus within application 428, and includes a plurality of selectable elements. Each element is selectable for causing computing device 204 to execute a specific operation implemented by the instructions of application 428. For example, a record element 1004 causes computing device 204 to begin (or cease, if recording is already underway) storing the feed shown on display 208 in memory 404 as a video file. An annotation element 1005 allows text input for annotating image 600. A panning element 1006 allows image 600 to be panned in a plane parallel to the page of FIG. 10. A reset element 1007 resets the view shown on display 208 to a previous state (for example, before a recent panning operation). A brightness element 1008 causes computing device 204 to present a further one of the plurality of menus within application 428 on display 208 for adjusting the brightness of display 208. Also included are a stack element 1010 and a magnification element 1012, which causes computing device 204 to present a still further one of the plurality of menus within application 428 on display 208 for adjusting the magnification of the video feed from scope 220.
  • Other examples of selectable elements include a tool selection element for selecting one of a plurality of tools identified by computing device 204. Such a selection may be used to restrict output control to the movements of a particular tool, for example. A port visibility element 1014 allows a rendering of access port 228 on display 208 to be toggled on and off (this functionality may also be extended to other tools). A region of interest element 1016 causes computing device 204 to begin tracking the movement of a given surgical instrument to draw a region of interest on image 600. A tract visibility element 1018 turns the presentation of fluid flow tracts (e.g. nerve fibre tracts, vasculature, and the like) on display 208 on and off. In addition, a skull stripping toggle element 1020 and a 2D/3D mode toggle element 1022 can be provided.
  • With menu 1000 presented on display 208, computing device 204 is configured to return to block 510 and continue monitoring the movements of any active instruments. Assuming that the instruments detected in the previous iteration of method 500 have not been removed from the field of view of tracking camera 224, the performance of blocks 510, 513 (optionally) and 515 will not effect any changes, and the performance of block 520 will determine whether any further input gestures have been made. Such input gestures may include a selection of an element of menu 1000 (for example, as specified in the fourth record 900 of FIG. 9). In response to selection of a menu element, computing device 204 is configured to generate further updated output data to enable the function corresponding to the selected element. As will now be apparent to those skilled in the art, numerous iterations of method 500 can be performed to control system 112, while reducing or avoiding the need for surgeon 104 to abandon the surgical instruments in favour of more conventional input devices (such as keyboard and mouse 416).
  • FIGS. 11 and 12 provide further examples of output device control during the performance of method 500. FIG. 11 depicts display 208 presenting a menu 1100 (containing the same selectable elements as menu 1000, although in a different orientation), and image data 1104 retrieved from repository 432, in addition to image 600 as discussed above. FIG. 12 depicts image 600 on display 208. Menus 1000 and 1100 are no longer presented in FIG. 12 (they may be dismissed by certain instrument gestures, or by the selection of certain elements of menus 1000 or 1100). However, two regions of interest 1200 and 1204 are highlighted within image 600 on display 208. Regions of interest 1200 and 1204 are the result of further performances of method 500, in which a region of interest element such as element 1016 was selected, and further instrument gestures were detected to draw the regions. Computing device 204 can be configured to take various actions in connection with regions of interest 1200 and 1204. For example, computing device 204 can apply a mask to image 600 to hide all of image 600 with the exception of regions 1200 and 1204.
  • Still other examples of output device control achieved through the performance of method 500 will occur to those skilled in the art. For example, images can be projected onto the patient's skull, and optical properties (e.g. magnification, focus and the like) of scope 220 can be altered. Further, individual selectable elements within the menus discussed above can be presented on display 208 in various orders and combinations.
  • A further example of output device control, particularly (although not exclusively) at block 513, involves masking out one or more portions of the surgical instruments identified at block 510. For example, scope 220 may have a shallow depth of field, and thus portions of the instruments that extend out of access port 228 towards scope 220 may appear out of focus on display 208. Computing device 204 can be configured, following the identification of the instruments, to generate output data including a mask of the identified instruments that can be combined with the video feed from scope 220 to obscure the unfocussed portions of the instruments with in-focus images of the instruments.
  • Another example of output device control, referring now to FIG. 13, includes activating a display mode at block 530 referred to as radial stacking. In this display mode, computing device 204 is configured to present a three dimensional rendering 1300 of the brain in which a slice 1304 of brain tissue may be selected. Computing device 204 is configured to determine the location and plane of slice 1304 based on, for example, instrument movements matched with known gestures at block 520. Computing device 204 can also be configured to control display 208 to present a two dimensional cross-section 1308 of three dimensional model 1300, taken in the plane of slice 1304. A variety of display layers can be presented on cross section 1308, again based on further motions of surgical instruments detected by computing device 204. In the present example, fluid flow tracts 1312 are presented on cross section 1308 (for example, in response to a selection of a menu element such as element 1018).
  • Computing device 204 can also be configured to control display 208 to update the position of slice 1304, and the appearance of cross section 1308, based on further movement of surgical instruments. Turning to FIG. 14, an updated rendering 1400 is shown, in which an updated slice 1404 is depicted. Slice 1404 may be a rotated version of slice 1304, the rotation of which can be controlled by movement of surgical instruments. Similarly, a cross section 1408 is also presented on display 208, representing an updated version of cross section 1308 resulting from the new position of slice 1404. Tracts 1412 are also updated in cross section 1408. It will now be apparent that through manipulation of surgical instruments, it is possible to cause slice 1304 to sweep through a full 360° rotation, or to relocate to any location or angle within the brain. In some embodiments, displayed elements such as tracts 1312 and 1412 may also be restricted to only certain areas of the current slice, such as an area within the current slice and also within a predetermined distance of a tool tip (detected at block 520). Elements such as tracts 1312 and 1412 in cross sections 1308 and 1408 can also have configurable depths; that is, tracts 1312 and 1412 can be displayed not only for the exact plane of slices 1304 and 1404, but also for a configurable number of adjacent planes parallel to those planes.
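  • The rotation of slice 1304 into slice 1404 can be sketched as a rotation of the slice plane's normal vector; the use of Rodrigues' rotation formula and the mapping of instrument motion to a rotation angle are assumptions for illustration only:

```python
import math

def rotate_slice_normal(normal, axis, angle_deg):
    """Rotate the slice plane's unit normal about a unit rotation axis by
    angle_deg, using Rodrigues' rotation formula:
    v' = v cos(t) + (k x v) sin(t) + k (k . v)(1 - cos(t))."""
    k, v = axis, normal
    t = math.radians(angle_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    cross = (k[1] * v[2] - k[2] * v[1],
             k[2] * v[0] - k[0] * v[2],
             k[0] * v[1] - k[1] * v[0])
    dot = sum(ki * vi for ki, vi in zip(k, v))
    return tuple(v[i] * cos_t + cross[i] * sin_t + k[i] * dot * (1 - cos_t)
                 for i in range(3))
```

Sweeping angle_deg through 360 degrees in response to instrument movement corresponds to the full rotation of slice 1304 described above; cross section 1308 would then be re-sampled from the model in the updated plane.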
  • In still further embodiments, movements of surgical instruments detected by computing device 204 can be used to present three dimensional renderings of those instruments on display 208, in addition to or instead of a video feed from scope 220. For example, a model such as rendering 1300 can be updated to show the position of surgical instruments, including access port 228, tracking their movements substantially in real time. Additional information can also be presented on such renderings.
  • For example, a rendering of a tumor whose location and size are stored in repository 432 can be presented on display 208. In some examples, a rendering of a tumor or other data from repository 432 can be overlaid on a video feed (that is, a non-virtual feed) from scope 220. The rendering can be located and scaled on display 208 (for example, on image 600 discussed above) based on the current magnification of scope 220 and the location of access port 228. Referring to FIGS. 15A, 15B and 15C, examples of output data presented on display 208 in this embodiment are shown. FIG. 15A shows a modified version of the display shown in FIG. 6, in which a video feed from scope 220 showing access port 228 is supplemented with a (virtual) rendering of a tumour 1500 in two or three dimensions, indicating that the tumour is larger than the field of view into the patient's brain afforded by access port 228.
  • FIG. 15B shows a rendering of access port 228 and tumour 1500 in an orientation perpendicular to the axis of access port 228. The display of FIG. 15B is generally virtual rather than being supplemented with video from scope 220.
  • FIG. 15C shows an additional rendering of tumour 1500, the patient's brain 1504, and access port 228, depicting the scale of access port 228 relative to the entire brain 1504 and tumour 1500. The views of FIGS. 15A-15C can be controlled through method 500, and can also be combined on display 208 (for example, in three panes).
  • Variations to the above systems and methods are contemplated. For example, in some embodiments equipment tower 200 can be omitted entirely or replaced with two or more towers. Additionally, in some embodiments computing device 204 need not be co-located with the remainder of system 112. Instead, computing device 204 can be connected to the remainder of system 112 via a network, such as the Internet. In still other variations, computing device 204 can be implemented in a distributed computing framework.
  • In still further variations, markers and tracking technologies other than IR can be employed. For example, markers 236 can include RFID tags, electromagnetic sensors, LEDs or the like. In still other variations, markers 236 can be omitted entirely, and computing device 204 can instead be configured to employ known image processing techniques to locate and identify surgical instruments in the field of view of tracking camera 224 or any other suitable tracking system.
  • A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
  • Persons skilled in the art will appreciate that there are yet more alternative implementations and modifications possible for implementing the embodiments, and that the above implementations and examples are only illustrations of one or more embodiments. The scope of the claims should not be limited by the embodiments set forth above, but should be given the broadest interpretation consistent with the description as a whole.

Claims (26)

We claim:
1. A method of controlling a surgical navigation system, comprising:
receiving, at a processor, an identifier of a surgical instrument within a field of view of a tracking system;
generating, at the processor, output data based on the identifier of the surgical instrument; and
transmitting the output data to an output device connected to the processor, for controlling the output device.
2. The method of claim 1, further comprising:
receiving image data at the processor from the tracking system;
storing a surgical instrument definition; wherein receiving the identifier comprises determining the identifier of the surgical instrument by comparing the image data with the surgical instrument definition.
3. The method of claim 1 or claim 2, further comprising:
storing a plurality of selectable interface elements;
wherein generating the output data comprises retrieving a subset of the selectable interface elements; and
wherein transmitting the output data comprises presenting the subset on a display.
4. The method of claim 3, further comprising:
selecting one of the subset of elements presented on the display based on the identifier; and
generating further output data based on the selected one of the subset.
5. The method of any one of claims 1 to 4, further comprising:
storing gesture definitions;
receiving motion data representing movement of the surgical instrument; and
determining whether the motion data matches one of the gesture definitions.
6. The method of claim 5, comprising:
when the motion data matches one of the gesture definitions, generating the output data based on the identifier of the surgical instrument and on the one of the gesture definitions.
7. The method of any one of claims 1 to 6, further comprising:
storing a plurality of output control rules each including an instrument identifier; wherein generating the output data comprises selecting one of the control rules having an instrument identifier matching the determined identifier.
8. The method of any one of claims 1 to 7, wherein transmitting the output data comprises transmitting respective portions of the output data to at least one of a display, a projector, and a robotic arm.
9. The method of claim 1, wherein receiving the identifier of the surgical instrument at the processor comprises receiving the identifier from the tracking system coupled to the processor.
10. The method of claim 1, wherein the output device comprises a display, and wherein transmitting the output data comprises controlling the display to present a rotatable three dimensional slice in a model of brain tissue.
11. The method of claim 10, wherein transmitting the output data further comprises controlling the display to rotate the slice about an axis based on the identifier of the surgical instrument.
12. The method of claim 1, wherein the output device comprises a display; the method further comprising:
storing a tumour definition in the memory;
transmitting the output data by controlling the display to present a model of the tumour in conjunction with one of a video feed of an access port instrument, and a model of the access port instrument.
13. The method of claim 1,
Figure US20160038253A1-20160211-P00999
14. A computing device, comprising:
a memory;
a processor connected to the memory;
an interface connecting the processor to a tracking system; and
an output device connected to the processor;
the processor configured to:
receive an identifier of a surgical instrument within a field of view of the tracking system;
generate output data based on the identifier of the surgical instrument; and
transmit the output data to the output device for controlling the output device.
15. The computing device of claim 14, wherein the memory stores a surgical instrument definition, and wherein the processor is further configured to:
receive image data from the tracking system; and
determine the received identifier of the surgical instrument by comparing the image data with the surgical instrument definition.
16. The computing device of claim 14 or claim 15, wherein the memory stores a plurality of selectable interface elements; wherein the output device includes a display, and wherein the processor is further configured to:
generate the output data by retrieving a subset of the selectable interface elements from the memory; and
transmit the output data by presenting the subset on the display.
17. The computing device of claim 16, the processor further configured to:
select one of the subset of elements presented on the display based on the identifier; and
generate further output data based on the selected one of the subset.
18. The computing device of any one of claims 14 to 17, wherein the memory stores gesture definitions, and wherein the processor is further configured to:
receive motion data representing movement of the surgical instrument; and
determine whether the motion data matches one of the gesture definitions.
19. The computing device of claim 18, the processor further configured to:
when the motion data matches one of the gesture definitions, generate the output data based on the identifier of the surgical instrument and on the one of the gesture definitions.
20. The computing device of any one of claims 14 to 19, wherein the memory stores a plurality of output control rules each including an instrument identifier, and wherein the processor is further configured to:
generate the output data by selecting one of the control rules having an instrument identifier matching the determined identifier.
21. The computing device of any one of claims 14 to 20, wherein the output device includes one or more of a display, a projector and a robotic arm; the processor further configured to transmit the output data by transmitting respective portions of the output data to at least one of the display, the projector, and the robotic arm.
22. The computing device of claim 14, the processor further configured to receive the identifier of the surgical instrument by receiving the identifier from the tracking system.
23. The computing device of claim 14, wherein the output device comprises a display, the processor further configured to transmit the output data by controlling the display to present a rotatable three dimensional slice in a model of brain tissue.
24. The computing device of claim 23, the processor further configured to transmit the output data further by controlling the display to rotate the slice about an axis based on the identifier of the surgical instrument.
25. The computing device of claim 14, wherein the memory stores a tumour definition, and wherein the output device comprises a display; the processor further configured to transmit the output data by controlling the display to present a model of the tumour in conjunction with one of a video feed of an access port instrument, and a model of the access port instrument.
26. A non-transitory computer-readable medium storing a plurality of computer readable instructions executable by a processor for implementing the method of any one of claims 1 to 13.
US14/775,192 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system Abandoned US20160038253A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/775,192 US20160038253A1 (en) 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201361801530P 2013-03-15 2013-03-15
US201361800155P 2013-03-15 2013-03-15
US201361818280P 2013-05-01 2013-05-01
US201461924993P 2014-01-08 2014-01-08
US14/775,192 US20160038253A1 (en) 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system
PCT/CA2014/000247 WO2014138916A1 (en) 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2014/000247 A-371-Of-International WO2014138916A1 (en) 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/694,241 Continuation US10925676B2 (en) 2013-03-15 2017-09-01 Method, system and apparatus for controlling a surgical navigation system

Publications (1)

Publication Number Publication Date
US20160038253A1 true US20160038253A1 (en) 2016-02-11

Family

ID=51535674

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/775,192 Abandoned US20160038253A1 (en) 2013-03-15 2014-03-14 Method, system and apparatus for controlling a surgical navigation system
US15/694,241 Active 2035-09-12 US10925676B2 (en) 2013-03-15 2017-09-01 Method, system and apparatus for controlling a surgical navigation system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/694,241 Active 2035-09-12 US10925676B2 (en) 2013-03-15 2017-09-01 Method, system and apparatus for controlling a surgical navigation system

Country Status (3)

Country Link
US (2) US20160038253A1 (en)
CA (1) CA2904766C (en)
WO (1) WO2014138916A1 (en)

Cited By (125)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160324583A1 (en) * 2014-10-14 2016-11-10 Leila KHERADPIR Patient reference tool
US20180014892A1 (en) * 2013-03-15 2018-01-18 Cameron Anthony Piron Method, system and apparatus for controlling a surgical navigation system
US9924871B2 (en) * 2015-03-05 2018-03-27 Synaptive Medical (Barbados) Inc. Optical coherence tomography system including a planarizing transparent material
US20190201123A1 (en) * 2017-12-28 2019-07-04 Ethicon Llc Surgical systems with autonomously adjustable control programs
US10755813B2 (en) 2017-12-28 2020-08-25 Ethicon Llc Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform
CN111787882A (en) * 2017-12-28 2020-10-16 Ethicon LLC Image acquisition of extra-abdominal region to improve placement and control of surgical device in use
US10849697B2 (en) 2017-12-28 2020-12-01 Ethicon Llc Cloud interface for coupled surgical devices
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10892899B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Self describing data packets generated at an issuing instrument
US10898622B2 (en) 2017-12-28 2021-01-26 Ethicon Llc Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device
US10932872B2 (en) 2017-12-28 2021-03-02 Ethicon Llc Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set
US10932806B2 (en) 2017-10-30 2021-03-02 Ethicon Llc Reactive algorithm for surgical system
US10943454B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US10944728B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Interactive surgical systems with encrypted communication capabilities
US10966791B2 (en) 2017-12-28 2021-04-06 Ethicon Llc Cloud-based medical analytics for medical facility segmented individualization of instrument function
US10973520B2 (en) 2018-03-28 2021-04-13 Ethicon Llc Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature
US10987178B2 (en) 2017-12-28 2021-04-27 Ethicon Llc Surgical hub control arrangements
US11013563B2 (en) 2017-12-28 2021-05-25 Ethicon Llc Drive arrangements for robot-assisted surgical platforms
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical Inc Robot-mounted retractor system
US11056244B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks
US11051876B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Surgical evacuation flow paths
US11058498B2 (en) 2017-12-28 2021-07-13 Cilag Gmbh International Cooperative surgical actions for robot-assisted surgical platforms
US11069012B2 (en) 2017-12-28 2021-07-20 Cilag Gmbh International Interactive surgical systems with condition handling of devices and data capabilities
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11096688B2 (en) 2018-03-28 2021-08-24 Cilag Gmbh International Rotary driven firing members with different anvil and channel engagement features
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11114195B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Surgical instrument with a tissue marking assembly
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11129611B2 (en) 2018-03-28 2021-09-28 Cilag Gmbh International Surgical staplers with arrangements for maintaining a firing member thereof in a locked configuration unless a compatible cartridge has been installed therein
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US11179204B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11234756B2 (en) 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11259807B2 (en) 2019-02-19 2022-03-01 Cilag Gmbh International Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US20220104911A1 (en) * 2020-10-02 2022-04-07 Ethicon Llc Cooperative surgical displays
US11298148B2 (en) 2018-03-08 2022-04-12 Cilag Gmbh International Live time tissue classification using electrical parameters
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11304777B2 (en) 2011-10-28 2022-04-19 Navigate Surgical Technologies, Inc System and method for determining the three-dimensional location and orientation of identification markers
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
US11337746B2 (en) 2018-03-08 2022-05-24 Cilag Gmbh International Smart blade and power pulsing
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11589932B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11596291B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11696760B2 (en) 2017-12-28 2023-07-11 Cilag Gmbh International Safety systems for smart powered surgical stapling
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag GmbH International Method for operating a powered articulating multi-clip applier
US11931027B2 (en) 2021-08-16 2024-03-19 Cilag Gmbh International Surgical instrument comprising an adaptive control system

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
US11278353B2 (en) 2016-03-16 2022-03-22 Synaptive Medical Inc. Trajectory alignment system and methods
US10149618B1 (en) 2014-03-12 2018-12-11 The Board Of Regents Of The University Of Texas System Subdural electrode localization and visualization using parcellated, manipulable cerebral mesh models
US20180011983A1 (en) * 2015-02-02 2018-01-11 Think Surgical, Inc. Method and system for managing medical data
WO2016203295A1 (en) * 2015-06-19 2016-12-22 Synaptive Medical (Barbados) Inc. A medical imaging system for determining a scan orientation
WO2017016947A1 (en) * 2015-07-24 2017-02-02 Navigate Surgical Technologies, Inc. Surgical systems and associated methods using gesture control
JP6835850B2 (en) * 2015-12-29 2021-02-24 Koninklijke Philips N.V. Systems, control units, and methods for controlling surgical robots
US11033341B2 (en) 2017-05-10 2021-06-15 Mako Surgical Corp. Robotic spine surgery system and methods
US11065069B2 (en) 2017-05-10 2021-07-20 Mako Surgical Corp. Robotic spine surgery system and methods
US11443501B2 (en) * 2018-12-05 2022-09-13 Verily Life Sciences Llc Robotic surgical safety via video processing
CN116919596B (en) * 2023-09-14 2024-01-09 武汉联影智融医疗科技有限公司 Instrument navigation method, system, device, equipment and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
US20070016008A1 (en) * 2005-06-23 2007-01-18 Ryan Schoenefeld Selective gesturing input to a surgical navigation system
US20070078340A1 (en) * 2005-09-30 2007-04-05 Siemens Medical Solutions Usa, Inc. Method and apparatus for controlling ultrasound imaging systems having positionable transducers
US20080200926A1 (en) * 2007-02-19 2008-08-21 Laurent Verard Automatic identification of instruments used with a surgical navigation system
US7643862B2 (en) * 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US20100210938A1 (en) * 2002-11-19 2010-08-19 Medtronic Navigation, Inc Navigation System for Cardiac Therapies

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US7831292B2 (en) 2002-03-06 2010-11-09 Mako Surgical Corp. Guidance system and method for surgical procedures with improved feedback
EP1595234A4 (en) * 2003-02-21 2007-01-03 Zachry Construction Corp Tagging and tracking system for assets and personnel of a commercial enterprise
WO2006063156A1 (en) * 2004-12-09 2006-06-15 Stryker Corporation Wireless system for providing instrument and implant data to a surgical navigation unit
US8398541B2 (en) * 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
EP2542296A4 (en) * 2010-03-31 2014-11-26 St Jude Medical Atrial Fibrill Intuitive user interface control for remote catheter navigation and 3d mapping and visualization systems
CA2904766C (en) * 2013-03-15 2022-02-08 Synaptive Medical (Barbados) Inc. Method, system and apparatus for controlling a surgical navigation system


Cited By (205)

Publication number Priority date Publication date Assignee Title
US11304777B2 (en) 2011-10-28 2022-04-19 Navigate Surgical Technologies, Inc System and method for determining the three-dimensional location and orientation of identification markers
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US10925676B2 (en) * 2013-03-15 2021-02-23 Synaptive Medical Inc. Method, system and apparatus for controlling a surgical navigation system
US20180014892A1 (en) * 2013-03-15 2018-01-18 Cameron Anthony Piron Method, system and apparatus for controlling a surgical navigation system
US9737370B2 (en) * 2014-10-14 2017-08-22 Synaptive Medical (Barbados) Inc. Patient reference tool
US20160324583A1 (en) * 2014-10-14 2016-11-10 Leila KHERADPIR Patient reference tool
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US9924871B2 (en) * 2015-03-05 2018-03-27 Synaptive Medical (Barbados) Inc. Optical coherence tomography system including a planarizing transparent material
US10182724B2 (en) * 2015-03-05 2019-01-22 Synaptive Medical (Barbados) Inc. Optical coherence tomography system including a planarizing transparent material
US10980560B2 (en) 2017-10-30 2021-04-20 Ethicon Llc Surgical instrument systems comprising feedback mechanisms
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag GmbH International Method for operating a powered articulating multi-clip applier
US11759224B2 (en) 2017-10-30 2023-09-19 Cilag Gmbh International Surgical instrument systems comprising handle arrangements
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11696778B2 (en) 2017-10-30 2023-07-11 Cilag Gmbh International Surgical dissectors configured to apply mechanical and electrical energy
US10932806B2 (en) 2017-10-30 2021-03-02 Ethicon Llc Reactive algorithm for surgical system
US11648022B2 (en) 2017-10-30 2023-05-16 Cilag Gmbh International Surgical instrument systems comprising battery arrangements
US11819231B2 (en) 2017-10-30 2023-11-21 Cilag Gmbh International Adaptive control programs for a surgical system comprising more than one type of cartridge
US10959744B2 (en) 2017-10-30 2021-03-30 Ethicon Llc Surgical dissectors and manufacturing techniques
US11602366B2 (en) 2017-10-30 2023-03-14 Cilag Gmbh International Surgical suturing instrument configured to manipulate tissue using mechanical and electrical power
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11109878B2 (en) 2017-10-30 2021-09-07 Cilag Gmbh International Surgical clip applier comprising an automatic clip feeding system
US11564703B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Surgical suturing instrument comprising a capture width which is larger than trocar diameter
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11026713B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Surgical clip applier configured to store clips in a stored state
US11026712B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Surgical instruments comprising a shifting mechanism
US11793537B2 (en) 2017-10-30 2023-10-24 Cilag Gmbh International Surgical instrument comprising an adaptive electrical system
US11413042B2 (en) 2017-10-30 2022-08-16 Cilag Gmbh International Clip applier comprising a reciprocating clip advancing member
US11406390B2 (en) 2017-10-30 2022-08-09 Cilag Gmbh International Clip applier comprising interchangeable clip reloads
US11045197B2 (en) 2017-10-30 2021-06-29 Cilag Gmbh International Clip applier comprising a movable clip magazine
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11051836B2 (en) 2017-10-30 2021-07-06 Cilag Gmbh International Surgical clip applier comprising an empty clip cartridge lockout
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11925373B2 (en) 2017-10-30 2024-03-12 Cilag Gmbh International Surgical suturing instrument comprising a non-circular needle
US11291465B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Surgical instruments comprising a lockable end effector socket
US11071560B2 (en) 2017-10-30 2021-07-27 Cilag Gmbh International Surgical clip applier comprising adaptive control in response to a strain gauge circuit
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11207090B2 (en) 2017-10-30 2021-12-28 Cilag Gmbh International Surgical instruments comprising a biased shifting mechanism
US11141160B2 (en) 2017-10-30 2021-10-12 Cilag Gmbh International Clip applier comprising a motor controller
US11129636B2 (en) 2017-10-30 2021-09-28 Cilag Gmbh International Surgical instruments comprising an articulation drive that provides for high articulation angles
US11103268B2 (en) 2017-10-30 2021-08-31 Cilag Gmbh International Surgical clip applier comprising adaptive firing control
US11123070B2 (en) 2017-10-30 2021-09-21 Cilag Gmbh International Clip applier comprising a rotatable clip magazine
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11114195B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Surgical instrument with a tissue marking assembly
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US20190201123A1 (en) * 2017-12-28 2019-07-04 Ethicon Llc Surgical systems with autonomously adjustable control programs
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US11918302B2 (en) 2017-12-28 2024-03-05 Cilag Gmbh International Sterile field interactive control displays
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US11179204B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US10755813B2 (en) 2017-12-28 2020-08-25 Ethicon Llc Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11903587B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Adjustment to the surgical stapling control based on situational awareness
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11213359B2 (en) 2017-12-28 2022-01-04 Cilag Gmbh International Controllers for robot-assisted surgical platforms
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11890065B2 (en) 2017-12-28 2024-02-06 Cilag Gmbh International Surgical system to limit displacement
US11234756B2 (en) 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
CN111787882A (en) * 2017-12-28 2020-10-16 爱惜康有限责任公司 Image acquisition of extra-abdominal region to improve placement and control of surgical device in use
US11864845B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Sterile field interactive control displays
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US11844579B2 (en) 2017-12-28 2023-12-19 Cilag Gmbh International Adjustments based on airborne particle properties
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US11069012B2 (en) 2017-12-28 2021-07-20 Cilag Gmbh International Interactive surgical systems with condition handling of devices and data capabilities
US11832899B2 (en) * 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US10849697B2 (en) 2017-12-28 2020-12-01 Ethicon Llc Cloud interface for coupled surgical devices
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10892899B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Self describing data packets generated at an issuing instrument
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11058498B2 (en) 2017-12-28 2021-07-13 Cilag Gmbh International Cooperative surgical actions for robot-assisted surgical platforms
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11304763B2 (en) * 2017-12-28 2022-04-19 Cilag Gmbh International Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11051876B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Surgical evacuation flow paths
US11601371B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11056244B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11779337B2 (en) 2017-12-28 2023-10-10 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11775682B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US10898622B2 (en) 2017-12-28 2021-01-26 Ethicon Llc Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device
US11751958B2 (en) 2017-12-28 2023-09-12 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11737668B2 (en) 2017-12-28 2023-08-29 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11712303B2 (en) 2017-12-28 2023-08-01 Cilag Gmbh International Surgical instrument comprising a control circuit
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US11382697B2 (en) 2017-12-28 2022-07-12 Cilag Gmbh International Surgical instruments comprising button circuits
US11701185B2 (en) 2017-12-28 2023-07-18 Cilag Gmbh International Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11696760B2 (en) 2017-12-28 2023-07-11 Cilag Gmbh International Safety systems for smart powered surgical stapling
US10932872B2 (en) 2017-12-28 2021-03-02 Ethicon Llc Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set
US11678881B2 (en) 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11045591B2 (en) 2017-12-28 2021-06-29 Cilag Gmbh International Dual in-series large and small droplet filters
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11672605B2 (en) 2017-12-28 2023-06-13 Cilag Gmbh International Sterile field interactive control displays
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US10943454B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US10944728B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Interactive surgical systems with encrypted communication capabilities
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11013563B2 (en) 2017-12-28 2021-05-25 Ethicon Llc Drive arrangements for robot-assisted surgical platforms
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11612408B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Determining tissue composition via an ultrasonic system
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US10987178B2 (en) 2017-12-28 2021-04-27 Ethicon Llc Surgical hub control arrangements
US10966791B2 (en) 2017-12-28 2021-04-06 Ethicon Llc Cloud-based medical analytics for medical facility segmented individualization of instrument function
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11589932B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11596291B2 (en) 2017-12-28 2023-03-07 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying of the location of the tissue within the jaws
US11701162B2 (en) 2018-03-08 2023-07-18 Cilag Gmbh International Smart blade application for reusable and disposable devices
US11701139B2 (en) 2018-03-08 2023-07-18 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11589915B2 (en) 2018-03-08 2023-02-28 Cilag Gmbh International In-the-jaw classifier based on a model
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11844545B2 (en) 2018-03-08 2023-12-19 Cilag Gmbh International Calcified vessel identification
US11534196B2 (en) 2018-03-08 2022-12-27 Cilag Gmbh International Using spectroscopy to determine device use state in combo instrument
US11839396B2 (en) 2018-03-08 2023-12-12 Cilag Gmbh International Fine dissection mode for tissue classification
US11617597B2 (en) 2018-03-08 2023-04-04 Cilag Gmbh International Application of smart ultrasonic blade technology
US11298148B2 (en) 2018-03-08 2022-04-12 Cilag Gmbh International Live time tissue classification using electrical parameters
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11464532B2 (en) 2018-03-08 2022-10-11 Cilag Gmbh International Methods for estimating and controlling state of ultrasonic end effector
US11457944B2 (en) 2018-03-08 2022-10-04 Cilag Gmbh International Adaptive advanced tissue treatment pad saver mode
US11337746B2 (en) 2018-03-08 2022-05-24 Cilag Gmbh International Smart blade and power pulsing
US11678901B2 (en) 2018-03-08 2023-06-20 Cilag Gmbh International Vessel sensing for adaptive advanced hemostasis
US11678927B2 (en) 2018-03-08 2023-06-20 Cilag Gmbh International Detection of large vessels during parenchymal dissection using a smart blade
US11344326B2 (en) 2018-03-08 2022-05-31 Cilag Gmbh International Smart blade technology to control blade instability
US11707293B2 (en) 2018-03-08 2023-07-25 Cilag Gmbh International Ultrasonic sealing algorithm with temperature control
US11399858B2 (en) 2018-03-08 2022-08-02 Cilag Gmbh International Application of smart blade technology
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US11406382B2 (en) 2018-03-28 2022-08-09 Cilag Gmbh International Staple cartridge comprising a lockout key configured to lift a firing member
US11589865B2 (en) 2018-03-28 2023-02-28 Cilag Gmbh International Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems
US11129611B2 (en) 2018-03-28 2021-09-28 Cilag Gmbh International Surgical staplers with arrangements for maintaining a firing member thereof in a locked configuration unless a compatible cartridge has been installed therein
US11166716B2 (en) 2018-03-28 2021-11-09 Cilag Gmbh International Stapling instrument comprising a deactivatable lockout
US10973520B2 (en) 2018-03-28 2021-04-13 Ethicon Llc Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature
US11197668B2 (en) 2018-03-28 2021-12-14 Cilag Gmbh International Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout
US11096688B2 (en) 2018-03-28 2021-08-24 Cilag Gmbh International Rotary driven firing members with different anvil and channel engagement features
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11213294B2 (en) 2018-03-28 2022-01-04 Cilag Gmbh International Surgical instrument comprising co-operating lockout features
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11931110B2 (en) 2018-12-14 2024-03-19 Cilag Gmbh International Surgical instrument comprising a control system that uses input from a strain gage circuit
US11331101B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Deactivator element for defeating surgical stapling device lockouts
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11291444B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a closure lockout
US11291445B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical staple cartridges with integral authentication keys
US11517309B2 (en) 2019-02-19 2022-12-06 Cilag Gmbh International Staple cartridge retainer with retractable authentication key
US11298130B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Staple cartridge retainer with frangible authentication key
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
US11272931B2 (en) 2019-02-19 2022-03-15 Cilag Gmbh International Dual cam cartridge based feature for unlocking a surgical stapler lockout
US11925350B2 (en) 2019-02-19 2024-03-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
US11259807B2 (en) 2019-02-19 2022-03-01 Cilag Gmbh International Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device
US11298129B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11331100B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Staple cartridge retainer system with authentication keys
US11751872B2 (en) 2019-02-19 2023-09-12 Cilag Gmbh International Insertable deactivator element for surgical stapler lockouts
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11737696B2 (en) 2019-03-22 2023-08-29 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical, Inc. Robot-mounted retractor system
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
US20220104911A1 (en) * 2020-10-02 2022-04-07 Ethicon Llc Cooperative surgical displays
US11931027B2 (en) 2021-08-16 2024-03-19 Cilag Gmbh International Surgical instrument comprising an adaptive control system

Also Published As

Publication number Publication date
US20180014892A1 (en) 2018-01-18
CA2904766A1 (en) 2014-09-18
CA2904766C (en) 2022-02-08
WO2014138916A1 (en) 2014-09-18
US10925676B2 (en) 2021-02-23

Similar Documents

Publication Publication Date Title
US10925676B2 (en) Method, system and apparatus for controlling a surgical navigation system
US20230225810A1 (en) Guiding a robotic surgical system to perform a surgical procedure
US20220192611A1 (en) Medical device approaches
US10390890B2 (en) Navigational feedback for intraoperative waypoint
US10265854B2 (en) Operating room safety zone
Sielhorst et al. Advanced medical displays: A literature review of augmented reality
Gavaghan et al. A portable image overlay projection device for computer-aided open liver surgery
US9805469B2 (en) Marking and tracking an area of interest during endoscopy
US10758212B2 (en) Automatic depth scrolling and orientation adjustment for semi-automated path planning
US20160166334A1 (en) Image annotation in image-guided medical procedures
US20220163785A1 (en) Systems and methods for displaying medical video images and/or medical 3d models
JP2017525418A (en) Intelligent display
US20170296293A1 (en) Method, system and apparatus for tracking surgical imaging devices
US20180235714A1 (en) Method, system and apparatus for maintaining patient registration in a surgical navigation system
JP2022513013A (en) Systematic placement of virtual objects for mixed reality
Liu et al. Toward intraoperative image-guided transoral robotic surgery
WO2020205714A1 (en) Surgical planning, surgical navigation and imaging system
WO2022147161A1 (en) Alignment of medical images in augmented reality displays
US20210117009A1 (en) Gesture control of medical displays
Gard et al. Image-based measurement by instrument tip tracking for tympanoplasty using digital surgical microscopy
CA2976320C (en) Method, system and apparatus for adjusting image data to compensate for modality-induced distortion
De Paolis et al. Visualization and interaction systems for surgical planning
Zhao et al. Guidance system development for radial-probe endobronchial ultrasound bronchoscopy
US20240062387A1 (en) An Augmented Reality System, an Augmented Reality HMD, an Augmented Reality Method and a Computer Program
US20230147826A1 (en) Interactive augmented reality system for laparoscopic and video assisted surgeries

Legal Events

Date Code Title Description
AS Assignment

Owner name: SYNAPTIVE MEDICAL (BARBADOS) INC., BARBADOS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PIRON, CAMERON;WOOD, MICHAEL;SELA, GAL;AND OTHERS;REEL/FRAME:036716/0957

Effective date: 20140624

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SYNAPTIVE MEDICAL INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SYNAPTIVE MEDICAL (BARBADOS) INC.;REEL/FRAME:054528/0770

Effective date: 20200902

AS Assignment

Owner name: ESPRESSO CAPITAL LTD., CANADA

Free format text: SECURITY INTEREST;ASSIGNOR:SYNAPTIVE MEDICAL INC.;REEL/FRAME:054922/0791

Effective date: 20201223