WO2008091917A2 - Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control


Info

Publication number
WO2008091917A2
Authority
WO
WIPO (PCT)
Prior art keywords
instrument
computer
neural structure
surgical
anatomical
Prior art date
Application number
PCT/US2008/051745
Other languages
French (fr)
Other versions
WO2008091917A4 (en)
WO2008091917A3 (en)
Inventor
William K. Adcox
Eric J. Ryterski
Original Assignee
Warsaw Orthopedic, Inc
Priority date
Filing date
Publication date
Application filed by Warsaw Orthopedic, Inc.
Priority to AU2008207954A (AU2008207954A1)
Priority to EP08713917.6A (EP2124735B1)
Priority to JP2009547389A (JP2010516406A)
Priority to CN200880007403A (CN101677778A)
Publication of WO2008091917A2
Publication of WO2008091917A3
Publication of WO2008091917A4


Classifications

    • A61N 1/0551: Electrodes for implantation or insertion into the body; spinal or peripheral nerve electrodes
    • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/1655: Bone cutting, breaking or removal means other than saws, e.g. osteoclasts, drills or chisels for bones, trepans; for tapping
    • A61B 17/8875: Osteosynthesis instruments; screwdrivers, spanners or wrenches
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • G16H 20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/63: ICT specially adapted for the operation of medical equipment or devices; for local operation
    • A61B 17/02: Surgical instruments, devices or methods for holding wounds open; tractors
    • A61B 17/1671: Bone cutting, breaking or removal means for particular parts of the body; for the spine
    • A61B 2017/00026: Electrical control of surgical instruments; sensing or detecting at the treatment site; conductivity or impedance, e.g. of tissue
    • A61B 2017/00039: Electrical control of surgical instruments; sensing or detecting at the treatment site; electric or electromagnetic phenomena other than conductivity, e.g. capacity, inductivity, Hall effect
    • A61B 2017/00734: Aspects not otherwise provided for; battery operated
    • A61B 2034/105: Computer-aided simulation of surgical operations; modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2051: Tracking techniques; electromagnetic tracking systems
    • A61B 2034/252: User interfaces for surgical systems indicating steps of a surgical procedure
    • A61B 2034/254: User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A61B 2034/256: User interfaces for surgical systems having a database of accessory information, e.g. including context-sensitive help or scientific articles
    • A61B 2090/067: Measuring instruments for measuring angles
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body; augmented reality, i.e. correlating a live optical image with another image
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/25: User interfaces for surgical systems

Definitions

  • Surgical procedures and, in particular, neuro-related procedures often employ a surgical navigational system to assist a surgeon in translating and positioning a surgical tool or probe.
  • Conventional surgical navigational systems use reflectors and/or markers to provide positional information of the surgical tool relative to a preoperative rendering of a patient anatomy.
  • Surgical navigational systems do not carry out neuromonitoring functions to determine the integrity of a neural structure or the proximity of the surgical tool to that neural structure.
  • neural integrity monitoring systems are designed to use electrostimulation to identify nerve location for predicting and preventing neurological injury.
  • neural integrity monitoring systems do not provide visual navigational assistance.
  • an integrated neuromonitoring and surgical navigational system that is capable of visually assisting a surgeon in navigating a surgical tool or probe as well as being capable of neuromonitoring to evaluate surgical tool proximity to a neural structure and/or the integrity of the neural structure.
  • this disclosure is directed to an apparatus that includes an instrument tracking system configured to track movement of an instrument and a database containing technical information regarding a surgical procedure and patient anatomy.
  • the apparatus also includes a computer operatively linked with the instrument tracking system and the database.
  • the computer is programmed to determine an anatomical structure proximate the instrument and determine a portion of the technical information contained in the database that relates to the anatomical structure.
  • the computer is further programmed to generate and display identifiers for the portion of the technical information in a user-selectable manner to allow a user to selectively obtain technical information relating to one of the surgical procedure and the anatomical structure.
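  • As an illustration of how such a context-sensitive lookup might be organized, the following minimal Python sketch maps a tracked instrument tip to the nearest known anatomical structure and returns user-selectable identifiers for related technical information. The structure names, coordinates, distance threshold, and the TECHNICAL_LIBRARY contents are hypothetical and are not taken from the disclosure.

```python
import math

# Hypothetical technical library keyed by anatomical structure name.
TECHNICAL_LIBRARY = {
    "L4-L5 pedicle": ["Pedicle screw placement guide", "Pedicle anatomy atlas"],
    "L5 nerve root": ["Nerve retraction tutorial", "EMG threshold reference"],
}

# Hypothetical structure centroids in image coordinates (millimetres).
STRUCTURE_POSITIONS = {
    "L4-L5 pedicle": (12.0, -4.5, 88.0),
    "L5 nerve root": (18.5, -2.0, 92.5),
}

def nearest_structure(instrument_tip, max_distance_mm=15.0):
    """Return the anatomical structure closest to the tracked instrument tip."""
    best_name, best_dist = None, max_distance_mm
    for name, pos in STRUCTURE_POSITIONS.items():
        dist = math.dist(instrument_tip, pos)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

def technical_identifiers(instrument_tip):
    """List user-selectable identifiers for technical information related to
    the structure proximate the instrument."""
    structure = nearest_structure(instrument_tip)
    return TECHNICAL_LIBRARY.get(structure, [])

print(technical_identifiers((13.0, -4.0, 89.0)))
```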
  • the disclosure is directed to a method that involves tracking a surgical instrument and applying electrostimulation at a given surgical instrument position.
  • the method also includes determining a location of a neural structure relative to the surgical instrument position from a neurological response of the neural structure to the electrostimulation.
  • the disclosure includes an apparatus having a computer programmed to determine a location of a neuromonitoring probe designed to apply electrostimulation to a patient.
  • the computer is further programmed to compare the determined location to an anatomical framework of the patient, wherein the anatomical framework provides a general localization of a neural structure.
  • the computer is also programmed to automatically determine one of electrostimulation intensity and electrostimulation pattern for electrostimulating the neural structure based on the position of the neuromonitoring probe and the neural structure.
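  • A minimal sketch of one way such an automatic selection could be made from the probe-to-nerve distance is shown below. The linear current ramp, the 20 mA ceiling, and the pulse-pattern cutoff are illustrative assumptions only and are not specified by the disclosure.

```python
import math

def select_stimulation(probe_position, nerve_position, max_current_ma=20.0):
    """Pick a stimulation intensity and a simple pulse pattern from the
    probe-to-nerve distance: farther from the nerve, use a stronger pulse so a
    response can still be evoked; close to the nerve, back the intensity off."""
    distance_mm = math.dist(probe_position, nerve_position)
    # Hypothetical linear ramp: 2 mA at contact, max_current_ma at roughly 20 mm.
    intensity_ma = min(max_current_ma, 2.0 + 0.9 * distance_mm)
    pattern = "single-pulse" if distance_mm < 5.0 else "pulse-train"
    return intensity_ma, pattern

print(select_stimulation((10.0, 0.0, 0.0), (12.0, 1.0, 0.0)))
```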
  • the disclosure is directed to a computer readable storage medium having instructions thereon that, when executed by a computer, cause the computer to access an anatomical visualization of a patient.
  • the instructions also cause the computer to access neurological information acquired from the patient and update the anatomical visualization to incorporate the neurological information.
  • the invention is directed to a surgical method that includes translating a surgical tool relative to patient anatomy containing a neural structure and applying an electrical stimulus to the neural structure with the surgical tool.
  • the surgical method also includes determining a position of the neural structure relative to other anatomical structures of the patient anatomy through inspection of a GUI displaying a visualization of the patient anatomy and the surgical tool.
  • Figure 1 is a pictorial view of an integrated surgical navigational and neuromonitoring system.
  • Figure 2 is a pictorial view of a surgical suite incorporating the integrated surgical navigational and neuromonitoring system of Fig. 1.
  • Figure 3 is a block diagram of the integrated surgical navigational and neuromonitoring system of Fig. 1.
  • Figure 4 is a front view of a GUI displayed by the integrated surgical navigational and neuromonitoring system of Figs 1-3.
  • Figure 5 is a front view of a portion of the GUI shown in Fig. 4.
  • Figure 6 is a block diagram of a wireless instrument tracking system for use with the integrated surgical navigational and neuromonitoring system of Figs. 1-3.
  • Figure 7 is a side view of surgical probe according to one aspect of the present disclosure.
  • Figure 8 is a side view of a cordless retractor capable of applying electrostimulation according to one aspect of the present disclosure.
  • Figure 9 is a side view of a corded retractor capable of applying electrostimulation according to one aspect of the present disclosure.
  • Figure 10 is a side view of a cordless bone screwdriver capable of applying electrostimulation according to one aspect of the present disclosure.
  • Figure 11 is a side view of a surgical tap capable of applying electrostimulation according to another aspect of the present disclosure.
  • Figure 12 is a side view of a surgical probe according to another aspect of the present disclosure.
  • Figure 13 is a cross-sectional view of the surgical probe of Fig. 12 taken along lines 13-13 thereof.
  • Figure 14 is an end view of the surgical probe shown in Figs. 12-13.
  • Figure 15 is a flow chart setting forth the steps of signaling instrument proximity to an anatomical structure according to one aspect of the present disclosure.
  • Figure 16 is a flow chart setting forth the steps of accessing and publishing technical resources according to an aspect of the present disclosure.
  • Figure 17 is a flow chart setting forth the steps of determining neural structure integrity according to one aspect of the invention.
  • the present disclosure relates generally to the field of neuro-related surgery, and more particularly to systems and methods for integrated surgical navigation and neuromonitoring.
  • For the purposes of promoting an understanding of the principles of the invention, reference will now be made to embodiments or examples illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein, are contemplated as would normally occur to one skilled in the art to which the disclosure relates.
  • the integrated image-based surgical navigation and neuromonitoring system 10 enables a surgeon to generate and display on monitor 12 the trajectory of instrument 14, which is preferably a surgical instrument also capable of facilitating the acquisition of neurological information, relative to a visualization of patient anatomy.
  • Data representing one or more pre-acquired images 16 is fed to computer 18.
  • Computer 18 tracks the position of instrument 14 in real-time utilizing detector 20.
  • Computer 18 registers and displays the trajectory of instrument 14 with images 16 in real-time.
  • An icon representing the trajectory of instrument 14 is superimposed on the pre-acquired images 16 and shown on monitor 12.
  • the real-time trajectory of instrument 14 can be stored in computer 18.
  • This command also creates a new static icon representing the trajectory of the instrument on display 12 at the time the surgeon's command was issued.
  • the surgeon has the option of issuing additional commands, each one storing a real-time trajectory and creating a new static icon for display by default. The surgeon can override this default and choose to not display any static icon.
  • the surgeon also has the option to perform a number of geometric measurements using the real-time and stored instrument trajectories.
  • In addition to displaying and storing a trajectory of instrument 14 relative to patient anatomy, computer system 18 also updates the visualization of patient anatomy shown on display 12 with indicators representative of neurological information acquired from the patient.
  • the neurological indicators can include color coding of certain anatomical structures, textual or graphical annotations superimposed on the pre-acquired images or visualization thereof, or other identifying markers.
  • Reference to a visualization of patient anatomy herein may include a pre-acquired image, a graphical representation derived from one or more pre-acquired images, atlas information, or a combination thereof.
  • a surgical suite 22 incorporating the image-based surgical navigation and neuromonitoring system 10 is shown.
  • Pre-acquired images of patient 24 are collected when a patient, lying on table 26, is placed within C-arm imaging device 28.
  • the term "pre-acquired,” as used herein, does not imply any specified time sequence.
  • the images are taken at some time prior to when surgical navigation is performed.
  • images are taken from two substantially orthogonal directions, such as anterior-posterior (A-P) and lateral, of the anatomy of interest.
  • the imaging device 28 includes x-ray source 30 and x-ray receiving section 32.
  • Receiving section 32 includes target tracking markers 34. Operation of the C-arm imaging device 28 is controlled by a physician or other user via C-arm control computer 36.
  • While C-arm imaging device 28 is shown for the acquisition of images from patient 24, it is understood that other imaging devices may be used to acquire anatomical and/or functional images of the patient.
  • images may be acquired using computed tomography (CT), magnetic resonance (MR), positron emission tomography (PET), ultrasound, and single photon emission computed tomography (SPECT).
  • An O-arm imaging system may also be used for image acquisition.
  • images may be acquired preoperatively with one type of imaging modality remote from the surgical suite 22 and acquired preoperatively or intraoperatively at the surgical suite 22 with another type of imaging modality. These multi-modality images can be registered using known registration techniques.
  • Acquired images are transmitted to computer 36 where they may be forwarded to surgical navigation computer 18.
  • Computer 18 provides the ability to display the received images via monitor 12.
  • Other devices, such as heads-up displays, may also be used to display the images.
  • system 10 generally performs the real-time tracking of instrument 14, and may also track the position of receiver section 32 and reference frame 38.
  • Detector 20 senses the presence of tracking markers on each object to be tracked.
  • Detector 20 is coupled to computer 18 which is programmed with software modules that analyze the signals transmitted by detector 20 to determine the position of each object in detector space. The manner in which the detector localizes the object is known in the art.
  • instrument 14 is tracked by the detector, which is part of an optical tracking system (not shown), using attached tracking markers 40, such as reflectors, so that its three-dimensional position can be determined in detector space.
  • Computer 18 is communicatively linked with the optical tracking system and integrates this information with the pre-acquired images of patient 24 to produce a display which assists surgeon 42 when performing surgical procedures.
  • An iconic representation of the trajectory of instrument 14 is simultaneously overlaid on the pre-acquired images of patient 24 and displayed on monitor 12. In this manner, surgeon 42 is able to see the trajectory of the instrument relative to the patient's anatomy in real-time.
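  • The overlay step can be pictured as a rigid transform from detector (tracking) space into image space followed by conversion to pixel indices, as in the minimal sketch below. The rotation, translation, spacing, and origin values are placeholders, and the disclosure does not prescribe this particular formulation.

```python
import numpy as np

def register_to_image(p_detector, rotation, translation):
    """Map a 3-D point from detector (tracking) space into pre-acquired image
    space using a rigid transform obtained during registration."""
    return rotation @ np.asarray(p_detector) + translation

def overlay_pixel(image_point_mm, spacing_mm, origin_mm):
    """Convert an image-space position (mm) to pixel indices so a trajectory
    icon can be drawn on the displayed slice."""
    return tuple(np.round((np.asarray(image_point_mm) - origin_mm) / spacing_mm).astype(int))

# Hypothetical registration result and image geometry.
rotation = np.eye(3)                       # detector-to-image rotation
translation = np.array([5.0, -2.0, 10.0])  # detector-to-image translation (mm)
tip_detector = [100.0, 40.0, 250.0]        # tracked tip in detector space (mm)

tip_image = register_to_image(tip_detector, rotation, translation)
print(overlay_pixel(tip_image,
                    spacing_mm=np.array([0.5, 0.5, 1.0]),
                    origin_mm=np.array([0.0, 0.0, 0.0])))
```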
  • the system according to the invention preferably has the ability to save the dynamic real-time trajectory of instrument 14.
  • computer 18 receives a signal to store the real-time trajectory of the instrument in the memory of computer 18.
  • the surgeon or other user may issue the command using other input devices, such as a push-button on the instrument, voice command, touchpad/touch screen input, and the like.
  • This "storage command” also instructs computer 18 to generate a new static icon representing the saved trajectory of the instrument, essentially “freezing" the icon at the point when the input was received.
  • the static icon, along with the icon representing the real-time trajectory of the instrument can be simultaneously superimposed over the pre-acquired image.
  • both static and real-time icons can be superimposed on all of the displayed images.
  • Other means of issuing the storage command such as, for example, through a GUI, may also be used.
  • the surgeon also has the option of storing multiple instrument trajectories. Each time a desired storage command is issued, the real-time trajectory of the instrument is stored and a new static icon representing the stored trajectory is displayed on the pre-acquired image, or if more than one image is being displayed, on all the pre-acquired images.
  • the system according to the invention preferably has the additional capability to measure angles between the real-time trajectory and one or more of the stored trajectories, or between stored trajectories, in a manner similar to that described in U.S. Pat. No. 6,920,347, the disclosure of which is incorporated herein.
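  • A simple way to compute such an angle between the real-time trajectory and a stored trajectory is from the dot product of their direction vectors, as sketched below. Representing a trajectory as a tip point plus a direction vector is an assumption made for illustration.

```python
import math

def trajectory_angle(traj_a, traj_b):
    """Angle in degrees between two instrument trajectories, each given as a
    (tip_point, direction_vector) pair."""
    (_, da), (_, db) = traj_a, traj_b
    dot = sum(x * y for x, y in zip(da, db))
    norm_a = math.sqrt(sum(x * x for x in da))
    norm_b = math.sqrt(sum(x * x for x in db))
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return math.degrees(math.acos(cos_theta))

real_time = ((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))    # current trajectory
stored    = ((0.0, 0.0, 0.0), (0.0, 0.5, 0.866))  # previously "frozen" trajectory
print(round(trajectory_angle(real_time, stored), 1))  # roughly 30.0 degrees
```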
  • neurological information can be acquired from the patient, and that information can be represented in a visible form that can be shown on display 12.
  • surgeon 42 may move the instrument 14 in a guided manner to an anatomical region containing neural structures and, using instrument 14 or another neurologically stimulating device together with electrodes (not shown), may then acquire neurological information from the neural structures.
  • the acquired neurological information is then passed to computer 18 which registers the neurological information with the neural structure from which the neurological information was acquired.
  • computer 18 can determine the location of the neural structure that was stimulated and then update the visualization of that neural structure on display 12 to include markers or other indices representative of the acquired neurological information.
  • computer 18 can determine the class of the stimulated neural structure and add an annotation to the visualization of the neural structure on display 12.
  • the neural structure may be assigned a designated color in the visualization on display 12 based on its class or other defining characteristics.
  • computer 18, together with positional information of the neural structure, may also predict the structure of the nerve and graphically display that predicted structure to the surgeon on display 12.
  • a portion of a nerve may be stimulated, but the entire nerve structure predicted and graphically displayed.
  • While the pre-acquired images and/or visualizations thereof provide the surgeon with a general understanding of the patient anatomy relative to the tracked instrument, the acquired neurological information supplements that understanding with greater precision with respect to neural structures.
  • the integrated system enhances the surgeon's understanding of the anatomy for the particular patient.
  • visual or audible indicators may be automatically provided by computer 18 to the surgeon when the instrument 14 is in proximity to a neural structure.
  • the indicators may be tailored to coincide with the class, position, or other characteristic of the neural structure.
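  • One hedged sketch of how computer 18 might derive such indicators is a distance check of the tracked tip against known neural-structure positions, with warning and alarm thresholds tailored to each structure. The threshold values, structure names, coordinates, and class labels below are hypothetical.

```python
import math

def proximity_alerts(tip, neural_structures, warn_mm=10.0, alarm_mm=3.0):
    """Return messages for neural structures near the instrument tip so the
    system can drive a visual cue or an audible tone for each one."""
    alerts = []
    for name, position, nerve_class in neural_structures:
        d = math.dist(tip, position)
        if d <= alarm_mm:
            alerts.append(f"ALARM: instrument {d:.1f} mm from {name} ({nerve_class})")
        elif d <= warn_mm:
            alerts.append(f"Warning: instrument {d:.1f} mm from {name} ({nerve_class})")
    return alerts

structures = [("L5 nerve root", (18.5, -2.0, 92.5), "motor")]
print(proximity_alerts((17.0, -2.0, 93.0), structures))
```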
  • surgeon 42 or other user may also add notes regarding the neural structure from which a neurological response was measured. Those notes may then be stored in memory of computer 18.
  • surgeon 42 wears a headphone 46 and microphone 48 to facilitate hands-free note making during the surgical procedure.
  • computer 18 may also broadcast on-demand audio information to the surgeon via an audio system connected to the headphone or other speakers.
  • Computer 18 includes a GUI system operating in conjunction with a display screen of display monitor 12.
  • the GUI system is implemented in conjunction with operating system 46 running on computer 18.
  • the GUI is implemented as part of the computer 18 to receive input data and commands from a user interface 47 such as a keyboard, mouse, lightwand, touchpad, touch screen, voice recognition module, foot switch, joystick, and the like.
  • a computer program used to implement the various steps of the present invention is generally located in memory unit 48, and the processes of the present invention are carried out through the use of a central processing unit (CPU) 50.
  • the memory unit 48 is representative of both read-only memory and random access memory.
  • the memory unit also contains a database 52 that stores data, for example, image data and tables, including such information as stored instrument positions, extension values, and geometric transform parameters, used in conjunction with the present invention.
  • Database 52 can also be used to store data, such as quantitative and qualitative assessments, of monitored neurological structures.
  • the memory unit further contains a technical data database 53 that stores data pertaining to, for example, surgical procedures, general anatomical structure information, videos, publications, tutorials, presentations, anatomical illustrations, surgical guides, and the like, that can be accessed by a surgeon or other user preoperatively, intraoperatively, or postoperatively to assist with diagnosis and treatment.
  • a communication software module 60 that facilitates communication, via modem 62, of the computer 18 to remote databases, e.g., technical data database 64.
  • computer 18 may access the databases via a network (not shown).
  • any acceptable network may be employed whether public, open, dedicated, private, or so forth.
  • the communications links to the network may be of any acceptable type, including conventional telephone lines, fiber optics, cable modem links, digital subscriber lines, wireless data transfer systems, or the like.
  • the computer 18 is provided with communications interface hardware 62 and software 60 of generally known design, permitting establishment of network links and the exchange of data with the databases.
  • CPU 50 in combination with the computer software comprising operating system 46, tracking software module 54, calibration software module 56, display software module 58, communication module 60, and neuromonitoring software module 66 controls the operations and processes of system 10.
  • the processes implemented by CPU 50 may be communicated as electrical signals along bus 68 to an I/O interface 70 and a video interface 72.
  • the I/O interface is connected to a printer 74, an image archive (remote or local) 76, and an audio (speaker) system 78.
  • Tracking software module 54 performs the processes necessary for tracking objects in an image-guided system as described herein, which are known to those skilled in the art.
  • Calibration software module 56 computes the geometric transform which corrects for image distortions and registers the images to the anatomical reference frame 38, and thus the patient's anatomy.
  • Display software module 58 applies, and if desired, computes the offsets between the guide tracking markers 40 and the instrument 14 in order to generate an icon representing the trajectory of the instrument for superposition over the images. For instruments with fixed lengths and angulations, these offsets can be measured once and stored in database 52. The user would then select, from a list of instruments, the one being used in the procedure so the proper offsets are applied by display software module 58. For instruments with variable lengths and angulations, the offsets could be measured manually and entered via keyboard 47, or measured in conjunction with a tracked pointer (not shown) or tracked registration jig (not shown).
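  • The offset bookkeeping for fixed-geometry instruments might look like the sketch below, in which a per-instrument marker-to-tip offset stored in a table is applied to the tracked marker pose. The instrument names and offset values are invented for illustration.

```python
import numpy as np

# Hypothetical marker-to-tip offsets (mm), measured once per fixed-geometry tool
# and expressed in the instrument's tracked marker frame.
INSTRUMENT_OFFSETS = {
    "probe": np.array([0.0, 0.0, 210.0]),
    "bone screwdriver": np.array([0.0, 0.0, 185.0]),
}

def tip_position(instrument_name, marker_position, marker_rotation):
    """Apply the stored marker-to-tip offset to the tracked marker pose to get
    the tip position that the display module turns into a trajectory icon."""
    offset = INSTRUMENT_OFFSETS[instrument_name]
    return marker_rotation @ offset + np.asarray(marker_position)

print(tip_position("probe", [50.0, 20.0, 100.0], np.eye(3)))
```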
  • Pre-acquired image data stored locally in image database 52 or remotely in image archive 76 can be fed directly into computer 18 digitally through I/O interface 70, or may be supplied as video data through video interface 72.
  • items shown as stored in memory can also be stored, at least partially, on a hard disk (not shown) or other memory device, such as flash memory, if memory resources are limited.
  • image data may also be supplied over a network, through a mass storage device such as a hard drive, optical disks, tape drives, or any other type of data transfer and storage devices.
  • computer 18 includes a neuromonitoring interface 80 as well as an instrument navigation interface 82.
  • the neuromonitoring interface 80 receives electrical signals from electrodes 84 proximate patient 24. The electrical signals are detected by electrodes 84 in response to electrostimulation applied to neural structures of the patient by instrument 14 or other electrostimulating probe (not shown).
  • the electrodes are electromyography (EMG) electrodes and record muscle response to nerve stimulation.
  • other neuromonitoring techniques, such as motor evoked potential (MEP) neuromonitoring and somatosensory evoked potential (SSEP) neuromonitoring, may be used.
  • a stimulator control 86 interfaces with instrument 14 and controls the intensity, direction, and pattern of stimulation applied by instrument 14. Inputs establishing desired stimulation characteristics may be received from the surgeon or other user via input interface 47 or on the instrument 14 itself.
  • the integrated system 10 also carries out real-time tracking of instrument 14 (and patient 24) using markers, reflectors, or other tracking devices.
  • instrument 14 includes markers 40 whose movements are tracked by instrument tracker 88, which may include a camera or other known tracking equipment.
  • the patient may include markers or reflectors so that patient movement can be tracked.
  • instrument 14 is also connected to a power supply 90. As will be shown, the instrument 14 may be powered by a battery housed within the instrument itself, a power supply housed within the computer cabinet, or inductively.
  • the integrated surgical navigational and neuromonitoring system is designed to assist a surgeon in navigating an instrument, e.g., surgical tool, probe, or other instrument, through visualization of the instrument relative to patient anatomy.
  • real-time positional and orientation information regarding the instrument relative to patient anatomy can be superimposed on an anatomical, functional, or derived image of the patient.
  • the integrated system 10 also performs neuromonitoring to assess the position and integrity of neural structures.
  • the surgeon can move the instrument to a desired location, view the placement of the instrument relative to patient anatomy on display 12, apply an electrical stimulus to neural structures proximate the instrument, and measure the response to that electrical stimulus.
  • The neural information gathered can then be added to the visualization of the patient anatomy through graphic or textual annotations, color or other coding of the neural structure, or other labeling techniques to convey, in human-discernible form, the neural information gathered from the application of an electrical stimulus.
  • the integrated system also helps the surgeon in visualizing patient anatomy, such as key nerve structures, and associating position or integrity with the patient anatomy. As will be shown with respect to Figs. 4-5, a GUI is used to convey and facilitate interaction with the surgical navigational and neuromonitoring information.
  • a GUI 92 designed to assist a surgeon or other user in navigating a surgical tool, such as a probe or a bone screwdriver, is shown.
  • the GUI 92 is bifurcated into an image portion 94 and a menu portion 96.
  • the image portion contains three image panes 98, 100, 102 that, in the illustrated example, contain a coronal, a sagittal, and an axial image, respectively, of patient anatomy.
  • the image portion also contains a rendering pane 104.
  • the menu portion 96 provides selectable links that, when selected by a surgeon, enable interfacing with what is displayed in the image panes 98, 100, 102 or with other data acquired from the patient.
  • the image panes provide an anatomical map or framework for a surgeon to track an instrument, which can be representatively displayed by pointer 106.
  • the integrated system described herein tracks movement of an instrument and provides a real-time visualization of the position of the pointer superimposed on the images contained in panes 98, 100, 102.
  • the displayed images can be derived from one or more diagnostic images acquired of the patient, an atlas model, or a combination thereof.
  • the images displayed in the image panes are automatically refreshed such that an instantaneous position of the instrument, via pointer 106, provides positional information to the surgeon.
  • the image panes and the positional feedback provided by pointer 106 can assist the surgeon in isolating a neural structure for neural monitoring. That is, a general understanding of nerve location can be determined from the images contained in the image panes 98, 100, 102. Through visual inspection of the panes, the surgeon can then move the instrument proximate a neural structure, apply an electrostimulation, and measure the neurological response. That neurological response can be used to assess the integrity of the neural structure in a manner consistent with known neuromonitoring studies. Additionally, the neurological information can also be used to localize more precisely the position of the stimulated neural structure.
  • the neurological response of a stimulated neural structure can then be used to pinpoint the position and orientation of that neural structure on the visualization of patient anatomy (e.g., the images contained in panes 98, 100, 102) using color-coding or other indicia.
  • using the measured response of a neural structure and its positional information, as indicated by the surgeon positioning the instrument proximate the structure, the computer can compare the measured response to data contained in a database and determine whether the measured response is consistent with that expected.
  • the integration of the navigation and neuromonitoring information enables the development of neural maps. That is, through repeated movement of the instrument and neurological monitoring, the combined information can be integrated to localize neural structure position, classify those neural structures based on position and/or response, and code through color or other indicia, a neurological, anatomically driven map of the patient.
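  • A minimal sketch of assembling such a map from repeated stimulation records is shown below. The response units, the 100 µV classification cutoff, and the colour assignments are assumptions rather than values from the disclosure.

```python
from collections import defaultdict

def build_neural_map(stimulation_records):
    """Aggregate repeated (nerve_id, position, response) measurements into a
    simple anatomically indexed neural map with a crude class and colour code."""
    neural_map = defaultdict(lambda: {"sites": [], "responses": []})
    for nerve_id, position, response_uv in stimulation_records:
        neural_map[nerve_id]["sites"].append(position)
        neural_map[nerve_id]["responses"].append(response_uv)
    for entry in neural_map.values():
        mean_response = sum(entry["responses"]) / len(entry["responses"])
        # Hypothetical coding: strong responses rendered green, weak ones red.
        entry["class"] = "responsive" if mean_response >= 100.0 else "attenuated"
        entry["color"] = "green" if entry["class"] == "responsive" else "red"
    return dict(neural_map)

records = [
    ("L5-root", (18.5, -2.0, 92.5), 240.0),
    ("L5-root", (19.0, -1.5, 95.0), 210.0),
    ("S1-root", (22.0, -3.0, 80.0), 40.0),
]
print(build_neural_map(records)["S1-root"]["color"])  # red
```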
  • The tip of the instrument is represented by pointer 106, though it is contemplated that tip, hind, or full instrument representations can be used to assist with navigation.
  • While three images of the same anatomy, but at different views, are shown, other image display approaches may be used.
  • one of the image panes 104 is illustratively used for a three-dimensional rendering of a patient anatomy, such as a neural structure bundle 108.
  • the rendering can be formed by registration of multi-angle images of the patient anatomy, derived from atlas information, or a combination thereof.
  • the surgeon positions the instrument proximate a target anatomical structure.
  • the surgeon selects "3D Rendering" tab 110 of menu 96.
  • the computer determines the position of the pointer 106 and generates a 3D rendering of the anatomical structure "pointed at" by the pointer. In this way, the surgeon can select an anatomical feature and then visually inspect that anatomical feature in a 3D rendering on the GUI 92.
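  • One plausible way for the computer to decide which structure is "pointed at" is to pick the structure centroid nearest the ray extending from the instrument tip along its axis, as sketched below. The off-axis tolerance and the structure coordinates are hypothetical, and the disclosure does not commit to this method.

```python
import numpy as np

def pointed_structure(tip, direction, structures, max_offaxis_mm=8.0):
    """Pick the structure whose centroid lies closest along the ray extending
    from the instrument tip; the 3D rendering pane would then show it."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    best_name, best_along = None, np.inf
    for name, centroid in structures.items():
        v = np.asarray(centroid, dtype=float) - np.asarray(tip, dtype=float)
        along = v @ d                            # distance along the pointing axis
        offaxis = np.linalg.norm(v - along * d)  # lateral miss distance
        if along > 0 and offaxis <= max_offaxis_mm and along < best_along:
            best_name, best_along = name, along
    return best_name

structures = {"neural bundle": (0.0, 2.0, 40.0), "vertebral body": (30.0, 0.0, 60.0)}
print(pointed_structure((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), structures))
```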
  • the integrated system maintains or has access to a technical library contained on one or more databases.
  • the surgeon can access that technical data through selection of "Technical Data" tab 112.
  • the computer causes display of available resources (not shown) in menu 96.
  • Another window may be displayed; however, in a preferred implementation, a single GUI is used to prevent superposition of screens and windows over the navigational images.
  • the technical resources may include links to internet web pages, intranet web pages, articles, publications, presentations, maps, tutorials, and the like.
  • the list of resources is tailored to the given position of the instrument when the surgeon selects tab 112.
  • access to the technical resource information can be streamlined for efficient access during a surgical procedure.
  • Menu 96 also includes a tracker sub-menu 114 and an annotation sub-menu 116.
  • the tracker sub-menu 114 includes a "current" tab 118, a "past trajectory” tab 120, and an "anticipated trajectory” tab 122 that provide on-demand view options for displaying instrument navigation information.
  • User selection of tab 118 causes the current position of the instrument to be displayed in the image panes.
  • User selection of tab 120 causes the traveled trajectory of the instrument to be displayed.
  • User selection of tab 122 causes the anticipated trajectory, based on the current position of the head of the instrument, to be displayed. It is contemplated that more than a single tab can be active or selected at a time.
  • the annotations sub-menu 116 contains a "New" tab 124, a "View” tab 126, and an "Edit” tab 128.
  • Tabs 124, 126, 128 facilitate making, viewing, and editing annotations regarding a surgical procedure and anatomical and neural observations.
  • a surgeon can make a general annotation or record notes regarding a specific surgical procedure or anatomical observation, such as an observation regarding a neural structure, its position, integrity, or neurological response.
  • the computer automatically associates an annotation with the position of the instrument when the annotation was made.
  • annotations can be made and associated with a neural or other structure during the course of a surgical procedure.
  • the computer will cause a list of annotations to appear in pane 116.
  • annotations made and associated with a neural structure will be viewable by positioning the instrument proximate the neural structure. Akin to a mouse-over technique, positioning the instrument proximate an annotated neural structure will cause any previous annotations to appear automatically if such a feature is enabled.
  • other tabs and selectors, both general (such as a patient information tab 130) and specific, can be incorporated into the menu pane 96. It is also understood that the presentation and arrangement of the tabs in menu pane 96 is merely one contemplated example.
  • image pane 102 is shown to further illustrate instrument tracking.
  • the instantaneous position of the instrument can be viewed relative to patient anatomy via localization of pointer 106.
  • selection of the "past trajectory" tab 120 on menu 96, Fig. 4 causes the past or traveled trajectory of the instrument to be shown by dashed trajectory line 132.
  • the anticipated trajectory 134 can also be viewed relative to the patient anatomy based on the instantaneous position and orientation of the tip or leading portion of the instrument.
  • trajectory paths can be stored and that stored trajectories can be recalled and viewed relative to the patient anatomy.
  • a current or real-time instrument trajectory can be compared to past trajectories.
  • the surgeon or other user can turn instrument tracking on and off as desired.
  • While the look-ahead technique described above projects the graphical representation of the instrument into the image, there is no requirement that the instrument's graphical representation be in the space of the image to be projected into the image. In other words, for example, the surgeon may be holding the instrument above the patient and outside the space of the image, so that the representation of the instrument does not appear in the images. However, it may still be desirable to project ahead a fixed length into the image to facilitate planning of the procedure.
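  • The look-ahead projection itself reduces to extending the tip position a fixed length along the instrument's current direction, as in the sketch below. The projection length and the number of sample points are arbitrary illustrative values.

```python
import numpy as np

def look_ahead(tip, direction, length_mm=30.0, n_points=10):
    """Project the anticipated trajectory a fixed length beyond the current tip,
    even when the tip itself lies outside the image volume; the returned points
    are what would be drawn as the look-ahead line."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    steps = np.linspace(0.0, length_mm, n_points)
    return [tuple(np.asarray(tip, dtype=float) + s * d) for s in steps]

for point in look_ahead((12.0, -4.0, 120.0), (0.0, 0.1, -1.0), length_mm=20.0, n_points=3):
    print(tuple(round(c, 1) for c in point))
```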
  • a trajectory is represented by a directional line. It is contemplated, however, that other representations may be used. For example, a trajectory can be automatically assigned a different color or unique numerical label. Other types of directional indicators may also be used, and different shapes, styles, sizes, and textures can be employed to differentiate among the trajectories.
  • the surgeon also has the option of not showing the label for any trajectory if desired.
  • the surgeon also has the option of changing the default color or label text for any trajectory through appropriate controls contained in menu 96. In one example, past trajectories are assigned one color whereas anticipated or look-ahead trajectories are assigned a different color. Also, while only a single trajectory is illustrated in Fig. 5, it is recognized that multiple instruments can be tracked at a time and their trajectories tracked, predicted, and displayed on the image.
  • the integrated system 10 tracks the position of an instrument, such as a surgical tool or probe, relative to patient anatomy using markers, reflectors, and the like.
  • the instrument is also capable of applying an electrical stimulus to a neural structure so that neurological information, such as nerve position and nerve integrity, can be determined without requiring introduction of another instrument to the patient anatomy.
  • the instrument can be tethered to a computer 18 via a stimulator control interface 86 and a power supply 90, or, in an alternate embodiment, the instrument can be wirelessly connected to the stimulator control interface 86 and be powered inductively or by a self-contained battery.
  • FIG. 6 illustrates operational circuitry for inductively powering the instrument and for wirelessly determining positional information of an instrument rather than using markers and reflectors.
  • the operational circuitry 136 includes a signal generator 138 for generating an electromagnetic field.
  • the signal generator 138 preferably includes multiple coils (not shown). Each coil of the signal generator 138 may be activated in succession to induce a number of magnetic fields thereby inducing a corresponding voltage signal in a sensing coil.
  • Signal generator 138 employs a distinct magnetic assembly so that the voltages induced in a sensing coil 140 corresponding to a transmitted time-dependent magnetic field produce sufficient information to describe the location, i.e. position and orientation, of the instrument.
  • a coil refers to an electrically conductive, magnetically sensitive element that is responsive to time-varying magnetic fields for generating induced voltage signals as a function of, and representative of, the applied time-varying magnetic field.
  • the signals produced by the signal generator 138 containing sufficient information to describe the position of the instrument are referred to hereinafter as reference signals.
  • the signal generator is also configured to induce a voltage in the sensing coil 140 sufficient to power electronic components of the instrument, such as a nerve stimulation unit 142 and a transmitter 144.
  • the signals transmitted by the signal generator 138 for powering the device are frequency multiplexed with the reference signals.
  • the frequency ranges of the reference signal and powering signal are modulated so as to occupy mutually exclusive frequency intervals. This technique allows the signals to be transmitted simultaneously over a common channel, such as a wireless channel, while keeping the signals apart so that they do not interfere with each other.
  • the reference and positional signals are preferably frequency modulated (FM); however, amplitude modulation (AM) may also be used.
  • the powering signals may be transmitted by separate signal generators, each at a different frequency.
  • the portion for receiving a reference signal further includes a sensing unit 146 and a power circuit 148.
  • Sensing unit 146 and power circuit 148 each may receive an induced voltage signal due to a frequency multiplexed reference signal and powering signal on sensing/powering coil 140.
  • Sensing unit 146 and power circuit 148 both may separate the voltage signals induced by the multiplexed magnetic signals into positional and powering signals.
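  • As a rough illustration of this separation, the sketch below multiplexes a low-frequency reference tone with a higher-frequency powering tone on one simulated coil signal and then recovers each by inspecting its own frequency band. The carrier frequencies, amplitudes, and the FFT-based filtering are assumptions standing in for the analogue circuitry of sensing unit 146 and power circuit 148.

```python
import numpy as np

# Hypothetical carriers: reference (position) signal at 1 kHz, powering signal
# at 20 kHz, both induced on the single sensing/powering coil.
FS, F_REF, F_PWR = 200_000, 1_000, 20_000

t = np.arange(0, 0.01, 1 / FS)
induced = 0.2 * np.sin(2 * np.pi * F_REF * t) + 1.5 * np.sin(2 * np.pi * F_PWR * t)

def band_amplitude(signal, f_lo, f_hi):
    """Crude spectral separation: keep only the FFT bins inside [f_lo, f_hi]
    and report the dominant amplitude in that band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / FS)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return 2 * np.max(np.abs(spectrum[mask])) / len(signal)

print(round(band_amplitude(induced, 500, 5_000), 2))      # reference portion, about 0.2
print(round(band_amplitude(induced, 10_000, 40_000), 2))  # powering portion, about 1.5
```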
  • the sensing unit 146 measures the induced voltage signal portion corresponding to a reference signal as a positional signal indicative of a current position of the instrument.
  • the positional signal is transmitted by transmitter 144.
  • power circuit 148 may retain the induced voltage signal portion corresponding to a powering signal for producing power sufficient to power the transmitter 144 and apply electrostimulation to a neural structure.
  • Power circuit 148 rectifies the induced voltage generated on the coil 140 by the powering signals to produce DC power that is used to power the transmitter 144 and the nerve stimulation unit 142.
  • Power circuit 148 may store the DC power using a capacitor, small battery, or other storage device for later use.
  • the integrated system 10 includes an electromagnetic control unit 150 that regulates operation of the signal generator 138 and includes a receiver (not shown) for receiving the positional information transmitted wirelessly by the transmitter 144.
  • the control unit 150 is adapted to receive magnetic field mode positional signals and transmit those positional signals to the CPU for processing to determine the position and/or orientation of the instrument.
  • the CPU preferably begins determining the position of the instrument by first determining the angular orientation of the sensing coil 140 and then using the orientation of the coil 140 to determine the position of the instrument.
  • the present invention is not limited to any specific method of determining the position of the instrument. While a single sensing/powering coil 140 is shown, it is contemplated that separate sensing and powering coils may be used.
  • a surgical instrument such as a probe, a retractor, or a bone screwdriver is also used to apply an electrical stimulus to a neural structure.
  • Figures 7-14 illustrate various examples of integrated surgical and electrostimulating tools.
  • Figure 7 illustrates a surgical probe 152 that includes an elongated and, preferably, textured handle 154 having a proximal end 156 and a distal end 158.
  • the surgical probe 152 is connectable to the neuromonitoring interface 80, Fig. 3, by jacks 160 extending from the handle proximal end 156.
  • The handle includes a transversely projecting actuator 162 proximate a tapered distal segment 164 terminating in handle distal end 158, which carries a distally projecting stainless steel shaft 166.
  • Shaft 166 is tapered and preferably has a larger outside diameter proximate the handle distal end 158, tapering to a smaller outside diameter proximate the shaft distal end 168, with a distally projecting length from handle distal end 158 to shaft distal end 168 encased in clear plastic, thin-wall, shrinkable tubing.
  • Extending from the handle 154 and electrically connected to conductors 170 are an anode 172 and a cathode 174.
  • the anode and cathode 172, 174 extend slightly past the shaft distal end 168 and are used to apply electrostimulation to a neural structure.
  • the outer surface of the handle 154 also includes a reflector/marker network 176 to facilitate tracking of the position and orientation of the probe 152.
  • the probe 152 is shown as having three reflectors 176 that may be permanently or removably fixed to the handle 154.
  • the size, shape, and position of the reflectors 176 are known by the surgical navigational system; thus, when captured by a camera, the position and orientation of the probe 152 can be readily ascertained. It is recognized that more or fewer than three reflectors may be used.
  • the actuator 162 enables the surgeon to selectively apply electrostimulation to patient anatomy during a surgical procedure.
  • the probe 152 can be used for surgical purposes without the application of electrostimulation and, when desired by the surgeon, used to elicit a neurological response from a neural structure.
  • the probe 152 is powered by a power supply (not shown) external to the probe 152 via the jacks 160.
  • Retractor 178 includes an elongated and, preferably, textured handle 180 having a proximal end 182 and a distal end 184. Extending from the distal end 184 is a tapered shaft 186 that terminates in a curved head 188 that includes an anode tip 190 and a cathode tip 192 that are coplanar with one another.
  • the handle 180 provides an interior volume 194 sized and shaped to hold batteries 196 that supply power sufficient to electrostimulate neural structures when desired by the surgeon.
  • the batteries 196 are permanently sealed within the interior volume 194 of the handle 180 so as to prevent contact with body fluids and cleaning fluids.
  • the batteries are removable and therefore replaceable by threadingly removing a cap portion of the handle. It is contemplated that rechargeable batteries may be used and that the batteries may be recharged without removing them from the handle.
  • the handle 180 also includes three reflectors 198 that provide visual feedback to a camera (not shown) or other detection device to determine the position and orientation of the retractor. Similar to that described with respect to Fig. 7, the retractor 178 further includes an actuator 200 that enables a surgeon to selectively turn the electrostimulation functionality of the retractor 178 on so as to apply electrostimulation to a neural structure.
  • Figure 9 illustrates a corded retractor 202 according to the present disclosure.
  • the retractor 202 is powered by a remote battery or other power supply through a conventional jack connection using jacks 204.
  • the handle 206 of the retractor 202 includes reflectors 208 to enable surgical navigational hardware and software to track the position and orientation of the retractor 202.
  • Retractor 202 also includes an actuator 210 to selectively apply electrostimulation to a neural structure. Electrostimulation is facilitated by an anode conductor 212 and a cathode conductor 214 extending past the shaft 216.
  • the anode and cathode conductors 212, 214 extend along the entire length of the shaft 216 and connect to a power supply via connection with jack connectors 217.
  • a bone screwdriver 218 is configured to provide electrostimulation in addition to driving a bone screw.
  • Screwdriver 218 includes a handle 220 with a driving shaft 222 extending from a distal end thereof.
  • the handle 220 is sized to accommodate batteries 224 to provide power for electrostimulation.
  • the handle 220 also includes reflectors 226 secured thereto in either a permanent or removable fashion.
  • the driving shaft 222 extends from the distal end 228 of the handle 220 to a driving head 230 sized and shaped to accommodate driving of a bone screw. Extending parallel to the driving shaft 222 are sheathed anode and cathode electrodes 232, 234.
  • the sheathed anode and cathode electrodes 232, 234 are preferably retractable so as to not interfere with the surgeon during driving of a bone screw.
  • the sheathed electrodes 232, 234 are extended and retracted manually by the surgeon using an eyelet 236.
  • the eyelet is positioned in sufficient proximity to the handle 220 so that a surgeon can extend and retract the electrodes 232, 234 while holding the handle 220 and still depress the actuator 238 to apply the electrical stimulation.
  • the handle includes a cavity (not shown) defined by appropriate stops to define the range of translation of the electrodes.
  • Figure 11 is an elevation view of a surgical tap according to another aspect of the present disclosure.
  • a surgical tap 240 is constructed for pedicle hole preparation, but is also capable of neurostimulation and providing navigational information.
  • the surgical tap 240 includes a handle 242 with a conductive shaft 244 extending therefrom.
  • An insulating sheath 246 surrounds only a portion of the shaft so as to limit electrostimulation to the conductive tip 248.
  • the conductive tip 248 includes a series of threads 250 that engage the pedicle or other bony structure during insertion of the tap.
  • the threads 250 are formed such that a longitudinal recess or channel 252 is defined along the length of the tip.
  • Handle 242 has an actuator switch 254 that allows a user to selectively apply electrostimulation during insertion of the tip. As such, electrostimulation can be applied while the surgical tap is forming a pedicle screw pilot hole or probing the pedicle.
  • Energy is applied to the conductive tip 248 via conductor 256, which is connectable to an energy source of the neuromonitoring system, Fig. 1.
  • batteries can be disposed in the handle and used to supply electrostimulating energy to the conductive tip 248.
  • the handle 242 also has three reflectors 258 which provide visual feedback to a camera (not shown) or other detection device to determine the position and orientation of the tap.
  • Figure 12 shows a surgical probe 260 according to another embodiment of the present disclosure. Similar to the examples described above, probe 260 has a handle 262 with a series of reflectors 264 coupled to or otherwise formed thereon. Extending from the proximal end of the handle are jacks 266 for connecting the probe 260 to the energy source of the neuromonitoring system, Fig. 2. Extending from the distal end of the handle 262 is a conductive shaft 268 partially shrouded by an insulating sheath 270. The unsheathed portion of the shaft 268 is a conductive tip 272 capable of probing the pedicle or other bony structure. The handle also has an actuator 274 for selectively energizing the conductive tip 272 for the application of electrostimulation during probing.
  • Figure 13 is a cross-sectional view of the conductive tip 272.
  • the conductive shaft 268 includes an anode conductive portion 274 and a cathode conductive portion 276 separated from the anode conductive portion 274 by an insulator 278. This is further illustrated in Fig. 14. With this construction, electrostimulation is applied between the anode conductive portion 274 and the electrically isolated cathode conductive portion 276 for bipolar electrostimulation.
  • the illustrative tools described above are designed to not only perform a surgical function, but also apply electrostimulation to a neural structure of the patient.
  • a surgeon can move the instrument, visualize that movement in real-time, and apply electrostimulation (uni-polar and bi-polar) as desired at various instrument positions without the need for a separate stimulation instrument.
  • electrostimulation can also be applied to enhance navigation through the application of a leading electrostimulation pattern.
  • electrostimulation is automatically applied ahead of the tip of the instrument.
  • neurological information is automatically acquired as the instrument is moved and the visualization of patient anatomy automatically updated to incorporate the neurological information.
  • the neurological information can be used to localize, with better specificity, the actual location and orientation of neural structures.
  • electrostimulation with a broad scope can be applied as the instrument is moved. If a neurological response is not measured, such broad electrostimulation continues. However, if a neurological response is measured, a pinpointing electrostimulation can be repeatedly applied with decreasing coverage to localize the position of the stimulated neural structure.
  • the leading electrostimulation can also be used to signal to the surgeon that the instrument is approaching a nerve or other neural structure.
  • the signal may be a visual identifier on the GUI or in the form of an audible warning broadcast through the audio system described herein.
  • the integrated system determines the instantaneous position of the instrument at 280. The system then compares the position of the instrument with information regarding the anatomical makeup of the patient to determine the proximity of the instrument to neural structures that may not be readily visible on the anatomical visualization at 282. If the instrument is not near a neural structure 282, 284, the process loops back to step 280 (a simplified sketch of this loop is provided following this list).
  • the neural structure is identified or classified from an anatomical framework of the patient and/or the neurological response of the structure.
  • an appropriate signal is output at 290, signaling that the instrument is near a neural structure.
  • the intensity and identifying character of the signal may be based on the type of neural structure identified as being proximal the instrument.
  • the volume and the pattern of an audible alarm may vary depending upon the type of neural structure.
  • the volume and/or pattern of audible alarm may change as the instrument moves closer to or farther away from the neural structure.
  • the audible signals provide real-time feedback to the surgeon regarding the position of the instrument relative to a neural structure.
  • the integrated system is also capable of performing measurements between trajectories or instrument positions.
  • bone measurements can be done to determine if sufficient bone has been removed for a particular surgical procedure.
  • the instrument can be tracked across the profile of a portion of a bone to be removed, and that path across the profile can then be stored as a trajectory. Following one or more bone removal procedures, the instrument can again be tracked across the bone now having a portion thereof removed.
  • the system can then compute the differences between those trajectories and provide a quantitative value to the surgeon, via the GUI, for example, to assist the surgeon in determining if enough bone has been removed for the particular surgical procedure.
  • the characteristics of the electrostimulation can be automatically adjusted based on the tracked instantaneous position of the instrument. That is, the integrated system, through real-time tracking of the instrument and a general understanding of patient anatomy layout from images, atlas models, and the like, can automatically set the intensity, scope, and type of electrostimulation based on the anatomy proximal the instrument when the surgeon directs application of electrostimulation. Rather than automatically setting the electrostimulation characteristics, the system could similarly display, on the GUI, the electrostimulation values derived by the system for consideration by the surgeon. In this regard, the surgeon could adopt, through appropriate inputs to the GUI, the suggested characteristics or define values different from those suggested by the system. Also, since an instrument could be used for bone milling or removal and electrostimulation, neurological responses could be measured during active milling or bone removal.
  • an implant such as a pedicle screw, when coupled to a conductive portion of a surgical tool, may also be conductive and thus used to apply electrostimulation during implantation of the implant.
  • a bone screw may also be used to apply electrostimulation when engaged with the driving and conductive end of a driver.
  • while surgical instruments having reflectors for optically determining instrument position and orientation have been illustratively shown, the surgical instruments may include circuitry such as that described with respect to Fig. 6 for electromagnetically determining instrument position and orientation and inductively powering the electrostimulation and transmitter circuits.
  • the surgical instruments described herein illustrate various examples in which the present disclosure can be implemented. It is recognized that instruments other than those described can be used. Further, preferably, the instruments are formed of biocompatible materials, such as stainless steel. It is recognized, however, that other biocompatible materials can be used.
  • the neuromonitoring information provided by a stand-alone neuromonitoring probe and system can be provided to a stand-alone surgical navigational system for the integrated visualization of navigational and neuromonitoring information.
  • the integrated system is also capable of providing on-demand access to technical resources to a surgeon. Moreover, the integrated system is designed to provide a list of on-demand resources based on instrument position, neural structure position, or neural structure neuroresponse. As set forth in Fig. 16, the integrated system is designed to receive a user input 292 from the surgeon or other user requesting publication of a technical resource. Responsive to that input, the integrated system determines the instantaneous position of the instrument 294 when the request is made. Based on the instrument position, anatomical structures proximal the instrument are then determined 296.
  • the system accesses corresponding portions of a technical resource database 298 to derive and display a list of related technical resources available for publication to the surgeon at 300.
  • the list is preferably in the form of selectable computer data links displayed on the GUI for surgeon selection and may link to articles, publications, tutorials, maps, presentations, video, instructions, and manuals, for example.
  • the selected technical resource is uploaded from the database and published to the surgeon or other user at 304. It is contemplated that the integrated system may upload the technical resource from a local or remote database.
  • Figure 17 sets forth the steps of a predictive process for providing feedback to a surgeon or other user in assessing neural integrity.
  • the process begins at step 306 with determining a position of the electrostimulation instrument when an electrostimulation is applied.
  • the location of the stimulated neural structure is also determined at 308.
  • the neural structure is identified 310. Identification of the neural structure can be determined from comparing anatomical information of the patient with previous neural maps, atlas models, anatomical maps, and the like. Based on identification of the neural structure, e.g., class, the neurological response of the neural structure to the electrostimulation is predicted 312.
  • the predicted neurological response is then compared to the actual, measured neurological response at 314.
  • the results of that comparison are then conveyed at 316 to the surgeon or other user via the GUI to assist with determining the neural integrity of the stimulated neural structure.
  • the visualization of the stimulated and measured neural structure can be automatically updated based on the comparison, e.g., color coded or annotated to indicate that the neurological response was not in line with that expected.
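By way of illustration only, the following Python sketch outlines the proximity-alert loop of Fig. 15 (steps 280-290) described in the list above. All function names, data structures, and the 10 mm alert threshold are assumptions introduced for this sketch and are not part of the disclosure; emit_signal stands in for the visual identifier on the GUI or the audible warning described above.

    import math
    import time

    ALERT_DISTANCE_MM = 10.0  # assumed threshold; the disclosure does not specify one

    def proximity_alert_loop(get_instrument_position, neural_structures, emit_signal, stop):
        """Repeatedly compare the instrument tip to known neural structure positions."""
        while not stop():
            tip = get_instrument_position()                        # step 280
            nearby = [(math.dist(tip, s["position"]), s)           # step 282
                      for s in neural_structures
                      if math.dist(tip, s["position"]) < ALERT_DISTANCE_MM]
            if not nearby:                                         # steps 282, 284: loop back
                time.sleep(0.05)
                continue
            dist_mm, structure = min(nearby, key=lambda pair: pair[0])  # closest structure
            emit_signal(kind=structure.get("class", "unknown"),    # step 290
                        intensity=1.0 - dist_mm / ALERT_DISTANCE_MM)   # stronger when closer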

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Neurosurgery (AREA)
  • Cardiology (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Neurology (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Surgical Instruments (AREA)

Abstract

The invention relates to an integrated surgical navigational and neuromonitoring system having automated surgical assistance and control. The integrated system provides real-time intraoperative assistance to a surgeon or other user. The integrated system can also automatically control neuromonitoring based on a position of a neuromonitoring probe.

Description

INTEGRATED SURGICAL NAVIGATIONAL AND NEUROMONITORING SYSTEM HAVING AUTOMATED SURGICAL ASSISTANCE AND CONTROL
BACKGROUND
Surgical procedures and, in particular, neuro-related procedures often rely on a surgical navigational system to assist a surgeon in translating and positioning a surgical tool or probe. Conventional surgical navigational systems use reflectors and/or markers to provide positional information of the surgical tool relative to a preoperative rendering of a patient anatomy. Surgical navigational systems, however, do not carry out neuromonitoring functions to determine the integrity of a neural structure or the proximity of the surgical tool to that neural structure. On the other hand, neural integrity monitoring systems are designed to use electrostimulation to identify nerve location for predicting and preventing neurological injury. However, neural integrity monitoring systems do not provide visual navigational assistance. Therefore, there is a need for an integrated neuromonitoring and surgical navigational system that is capable of visually assisting a surgeon in navigating a surgical tool or probe as well as being capable of neuromonitoring to evaluate surgical tool proximity to a neural structure and/or the integrity of the neural structure.
SUMMARY
In one aspect, this disclosure is directed to an apparatus that includes an instrument tracking system configured to track movement of an instrument and a database containing technical information regarding a surgical procedure and patient anatomy. The apparatus also includes a computer operatively linked with the instrument tracking system and the database. The computer is programmed to determine an anatomical structure proximate the instrument and determine a portion of the technical information contained on the database that relates to the anatomical structure. The computer is further programmed to generate and display identifiers for the portion of the technical information in a user-selectable manner to allow a user to selectively obtain technical information relating to one of the surgical procedure and the anatomical structure.
In another aspect, the disclosure is directed to a method that involves tracking a surgical instrument and applying electrostimulation at a given surgical instrument position. The method also includes determining a location of a neural structure relative to the surgical instrument position from a neurological response of the neural structure to the electrostimulation.
In a further aspect, the disclosure includes an apparatus having a computer programmed to determine a location of a neuromonitoring probe designed to apply electrostimulation to a patient. The computer is further programmed to compare the determined location to an anatomical framework of the patient, wherein the anatomical framework provides a general localization of a neural structure. The computer is also programmed to automatically determine one of electrostimulation intensity and electrostimulation pattern for electrostimulating the neural structure based on the position of the neuromonitoring probe and the neural structure.
In yet another aspect, the disclosure is directed to a computer readable storage medium having instructions thereon that, when executed by a computer, cause the computer to access an anatomical visualization of a patient. The instructions also cause the computer to access neurological information acquired from the patient and update the anatomical visualization to incorporate the neurological information.
In a further aspect, the invention is directed to a surgical method that includes translating a surgical tool relative to patient anatomy containing a neural structure and applying an electrical stimulus to the neural structure with the surgical tool. The surgical method also includes determining a position of the neural structure relative to other anatomical structures of the patient anatomy through inspection of a GUI displaying a visualization of the patient anatomy and the surgical tool.
These and other aspects, forms, objects, features, and benefits of the present invention will become apparent from the following detailed drawings and descriptions.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a pictorial view of an integrated surgical navigational and neuromonitoring system.
Figure 2 is a pictorial view of a surgical suite incorporating the integrated surgical navigational and neuromonitoring system of Fig. 1.
Figure 3 is a block diagram of the integrated surgical navigational and neuromonitoring system of Fig. 1.
Figure 4 is a front view of a GUI displayed by the integrated surgical navigational and neuromonitoring system of Figs 1-3.
Figure 5 is a front view of a portion of the GUI shown in Fig. 4.
Figure 6 is a block diagram of a wireless instrument tracking system for use with the integrated surgical navigational and neuromonitoring system of Figs. 1-3.
Figure 7 is a side view of a surgical probe according to one aspect of the present disclosure.
Figure 8 is a side view of a cordless retractor capable of applying electrostimulation according to one aspect of the present disclosure.
Figure 9 is a side view of a corded retractor capable of applying electrostimulation according to one aspect of the present disclosure.
Figure 10 is a side view of a cordless bone screwdriver capable of applying electrostimulation according to one aspect of the present disclosure.
Figure 11 is a side view of a surgical tap capable of applying electrostimulation according to another aspect of the present disclosure.
Figure 12 is a side view of a surgical probe according to another aspect of the present disclosure.
Figure 13 is a cross-sectional view of the surgical probe of Fig. 12 taken along lines 13-13 thereof.
Figure 14 is an end view of the surgical probe shown in Figs. 12-13.
Figure 15 is a flow chart setting forth the steps signaling instrument proximity to an anatomical structure according to one aspect of the present disclosure.
Figure 16 is a flow chart setting forth the steps of accessing and publishing technical resources according to an aspect of the present disclosure.
Figure 17 is a flow chart setting forth the steps of determining neural structure integrity according to one aspect of the invention.
DETAILED DESCRIPTION
The present disclosure relates generally to the field of neuro-related surgery, and more particularly to systems and methods for integrated surgical navigation and neuromonitoring. For the purposes of promoting an understanding of the principles of the invention, reference will now be made to embodiments or examples illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alteration and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the disclosure relates.
With reference to Fig. 1, there is shown an apparatus for the symbiotic display of surgical navigational and neuromonitoring information. The integrated image-based surgical navigation and neuromonitoring system 10 enables a surgeon to generate and display on monitor 12 the trajectory of instrument 14, which is preferably a surgical instrument also capable of facilitating the acquisition of neurological information, relative to a visualization of patient anatomy. Data representing one or more pre-acquired images 16 is fed to computer 18. Computer 18 tracks the position of instrument 14 in real-time utilizing detector 20. Computer 18 then registers and displays the trajectory of instrument 14 with images 16 in real-time. An icon representing the trajectory of instrument 14 is superimposed on the pre-acquired images 16 and shown on monitor 12. At the surgeon's command, the real-time trajectory of instrument 14 can be stored in computer 18. This command also creates a new static icon representing the trajectory of the instrument on display 12 at the time the surgeon's command was issued. The surgeon has the option of issuing additional commands, each one storing a real-time trajectory and creating a new static icon for display by default. The surgeon can override this default and choose to not display any static icon. The surgeon also has the option to perform a number of geometric measurements using the real-time and stored instrument trajectories.
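As a non-authoritative illustration of the trajectory-storage behavior described above, the short Python sketch below freezes the real-time trajectory into a static entry each time a storage command is issued. The Trajectory fields, the store layout, and the show_icon flag are assumptions for illustration, not the patented implementation; overriding the default display of a static icon corresponds to passing show_icon=False.

    from dataclasses import dataclass, field

    @dataclass
    class Trajectory:
        origin: tuple      # instrument tip position (x, y, z) in image space
        direction: tuple   # unit vector along the instrument axis

    @dataclass
    class TrajectoryStore:
        stored: list = field(default_factory=list)

        def storage_command(self, realtime: Trajectory, show_icon: bool = True):
            """Freeze the current real-time trajectory and keep a static icon flag."""
            self.stored.append({"trajectory": realtime, "show_icon": show_icon})
            return len(self.stored) - 1   # index usable as a display label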
In addition to displaying and storing a trajectory of instrument 14 relative to patient anatomy, computer system 18 also updates the visualization of patient anatomy shown on display 12 with indicators representative of neurological information acquired from the patient. As will be described in greater detail below, the neurological indicators can include color coding of certain anatomical structures, textual or graphical annotations superimposed on the pre-acquired images or visualization thereof, or other identifying markers. Reference to a visualization of patient anatomy herein may include a pre-acquired image, a graphical representation derived from one or more pre-acquired images, atlas information, or a combination thereof.
Referring to Fig. 2, a surgical suite 22 incorporating the image-based surgical navigation and neuromonitoring system 10 is shown. Pre-acquired images of patient 24 are collected when a patient, lying on table 26, is placed within C-arm imaging device 28. The term "pre-acquired," as used herein, does not imply any specified time sequence. Preferably, however, the images are taken at some time prior to when surgical navigation is performed. Usually, images are taken from two substantially orthogonal directions, such as anterior-posterior (A-P) and lateral, of the anatomy of interest. The imaging device 28 includes x-ray source 30 and x-ray receiving section 32. Receiving section 32 includes target tracking markers 34. Operation of the C-arm imaging device 28 is controlled by a physician or other user by C-arm control computer 36.
While a C-arm imaging device 28 is shown for the acquisition of images from patient 24, it is understood that other imaging devices may be used to acquire anatomical and/or functional images of the patient. For example, images may be acquired using computed tomography (CT), magnetic resonance (MR), positron emission tomography (PET), ultrasound, and single photon emission computed tomography (SPECT). An O-arm imaging system may also be used for image acquisition. Further, it is contemplated that images may be acquired preoperatively with one type of imaging modality remote from the surgical suite 22 and acquired preoperatively or intraoperatively at the surgical suite 22 with another type of imaging modality. These multi-modality images can be registered using known registration techniques.
Acquired images are transmitted to computer 36 where they may be forwarded to surgical navigation computer 18. Computer 18 provides the ability to display the received images via monitor 12. Other devices, for example, such as heads up displays, may also be used to display the images.
Further referring to Fig. 2, system 10 generally performs the real-time tracking of instrument 14, and may also track the position of receiver section 32 and reference frame 38. Detector 20 senses the presence of tracking markers on each object to be tracked. Detector 20 is coupled to computer 18 which is programmed with software modules that analyze the signals transmitted by detector 20 to determine the position of each object in detector space. The manner in which the detector localizes the object is known in the art.
In general, instrument 14 is tracked by the detector, which is part of an optical tracking system (not shown) using attached tracking markers 40, such as reflectors, in order for its three-dimensional position to be determined in detector space. Computer 18 is communicatively linked with the optical tracking system and integrates this information with the pre-acquired images of patient 24 to produce a display which assists surgeon 42 when performing surgical procedures. An iconic representation of the trajectory of instrument 14 is simultaneously overlaid on the pre-acquired images of patient 24 and displayed on monitor 12. In this manner, surgeon 42 is able to see the trajectory of the instrument relative to the patient's anatomy in real-time.
Further referring to Fig. 2, the system according to the invention preferably has the ability to save the dynamic real-time trajectory of instrument 14. By issuing a command using foot-switch 44, for example, computer 18 receives a signal to store the real-time trajectory of the instrument in the memory of computer 18. Alternately, the surgeon or other user may issue the command using other input devices, such as a push-button on the instrument, voice command, touchpad/touch screen input, and the like. This "storage command" also instructs computer 18 to generate a new static icon representing the saved trajectory of the instrument, essentially "freezing" the icon at the point when the input was received. The static icon, along with the icon representing the real-time trajectory of the instrument, can be simultaneously superimposed over the pre-acquired image. If multiple images are being displayed, both static and real-time icons can be superimposed on all of the displayed images. Other means of issuing the storage command, such as, for example, through a GUI, may also be used. The surgeon also has the option of storing multiple instrument trajectories. Each time a desired storage command is issued, the real-time trajectory of the instrument is stored and a new static icon representing the stored trajectory is displayed on the pre-acquired image, or if more than one image is being displayed, on all the pre-acquired images. The system according to the invention preferably has the additional capability to measure angles between the real-time trajectory and one or more of the stored trajectories, or between stored trajectories, in a manner similar to that described in U.S. Pat. No. 6,920,347, the disclosure of which is incorporated herein.
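The angle measurements mentioned above can be illustrated with a simple vector computation. The sketch below is a generic illustration only, treating each trajectory as a 3-D direction vector; it does not reproduce the specific method of U.S. Pat. No. 6,920,347.

    import math

    def angle_between_deg(d1, d2):
        """Angle in degrees between two 3-D trajectory direction vectors."""
        dot = sum(a * b for a, b in zip(d1, d2))
        n1 = math.sqrt(sum(a * a for a in d1))
        n2 = math.sqrt(sum(b * b for b in d2))
        cosine = max(-1.0, min(1.0, dot / (n1 * n2)))
        return math.degrees(math.acos(cosine))

    # Example: angle between a stored trajectory and the real-time trajectory
    # angle_between_deg((0.0, 0.0, 1.0), (0.0, 1.0, 1.0))  -> approximately 45.0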
In addition to tracking and storing instrument trajectory, as will be described, neurological information can be acquired from the patient and that information can be represented in a visible form that can be shown on display 12. For example, with the aid of pre-acquired images and trajectory information, surgeon 42 may move the instrument 14 in a guided manner to an anatomical region containing neural structures and, using instrument 14 or other neurologically stimulating device together with electrodes (not shown), may then acquire neurological information from the neural structures. The acquired neurological information is then passed to computer 18 which registers the neurological information with the neural structure from which the neurological information was acquired. Based on the position of the instrument 14, computer 18 can determine the location of the neural structure that was stimulated and then update the visualization of that neural structure on display 12 to include markers or other indices representative of the acquired neurological information. For example, based on the location, orientation, and neurological response, computer 18 can determine the class of the stimulated neural structure and add an annotation to the visualization of the neural structure on display 12. Alternately, the neural structure may be assigned a designated color in the visualization on display 12 based on its class or other defining characteristics.
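A minimal sketch of how an acquired neurological response might be registered with the stimulated structure and pushed to the display is given below. The dictionary layout, the field names, and the display.annotate call are hypothetical placeholders introduced for this sketch.

    def register_response(tip_position, response, neural_structures, display):
        """Attach a measured response to the neural structure nearest the instrument tip."""
        nearest = min(neural_structures,
                      key=lambda s: sum((a - b) ** 2
                                        for a, b in zip(tip_position, s["position"])))
        nearest.setdefault("responses", []).append(response)
        # Update the visualization, e.g. color code or annotate the structure.
        display.annotate(structure_id=nearest["id"],
                         text="response amplitude %.1f uV" % response["amplitude"])
        return nearest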
In addition to characterizing a stimulated neural structure, computer 18, together with positional information of the neural structure, may also predict the structure of the nerve and graphically display that predicted structure to the surgeon on display 12. In this regard, a portion of a nerve may be stimulated, but the entire nerve structure predicted and graphically displayed. Further, while the pre-acquired images and/or visualizations thereof provide the surgeon with a general understanding of the patient anatomy relative to the tracked instrument, the acquired neurological information supplements that understanding with greater precision with respect to neural structures. Thus, by localizing the position of neural structures, the integrated system enhances the surgeon's understanding of the anatomy for the particular patient. To further assist the surgeon, through localization of neural structures, viewable or audible indicators may be automatically given by the computer 18 to the surgeon when the instrument 14 is in proximity to a neural structure. Moreover, the indicators may be tailored to coincide with the class, position, or other characteristic of the neural structure.
Using voice recognition software and hardware, or other input devices, surgeon 42 or other user may also add notes regarding the neural structure from which a neurological response was measured. Those notes may then be stored in memory of computer 18. In one embodiment, surgeon 42 wears a headphone 46 and microphone 48 to facilitate hands-free note making during the surgical procedure. As will be explained further below, computer 18 may also broadcast on-demand audio information to the surgeon via an audio system connected to the headphone or other speakers.
Referring now to Fig. 3, a block diagram of the integrated surgical navigational and neuromonitoring system 10 is shown. Computer 18 includes a GUI system operating in conjunction with a display screen of display monitor 12. The GUI system is implemented in conjunction with operating system 46 running computer 18. The GUI is implemented as part of the computer 18 to receive input data and commands from a user interface 47 such as a keyboard, mouse, lightwand, touchpad, touch screen, voice recognition module, foot switch, joystick, and the like. For simplicity of the drawings and explanation, many components of a conventional computer have not been illustrated such as address buffers, memory buffers, and other standard control circuits because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.
A computer program used to implement the various steps of the present invention is generally located in memory unit 48, and the processes of the present invention are carried out through the use of a central processing unit (CPU) 50. The memory unit 48 is representative of both read-only memory and random access memory. The memory unit also contains a database 52 that stores data, for example, image data and tables, including such information as stored instrument positions, extension values, and geometric transform parameters, used in conjunction with the present invention. Database 52 can also be used to store data, such as quantitative and qualitative assessments, of monitored neurological structures. The memory unit further contains a technical data database 53 that stores data pertaining to, for example, surgical procedures, general anatomical structure information, videos, publications, tutorials, presentations, anatomical illustrations, surgical guides, and the like, that can be accessed by a surgeon or other user preoperatively, intraoperatively, or postoperatively to assist with diagnosis and treatment. Also contained in memory 48 is a communication software module 60 that facilitates communication, via modem 62, of the computer 18 to remote databases, e.g., technical data database 64.
It is understood that the single representations of an image archival database and a technical data database are for demonstrative purposes only, and it is assumed that there may be a need for multiple databases in such a system. Additionally, computer 18 may access the databases via a network (not shown). According to the present invention, any acceptable network may be employed whether public, open, dedicated, private, or so forth. The communications links to the network may be of any acceptable type, including conventional telephone lines, fiber optics, cable modem links, digital subscriber lines, wireless data transfer systems, or the like. In this regard, the computer 18 is provided with communications interface hardware 62 and software 60 of generally known design, permitting establishment of network links and the exchange of data with the databases.
CPU 50, in combination with the computer software comprising operating system 46, tracking software module 54, calibration software module 56, display software module 58, communication module 60, and neuromonitoring software module 66 controls the operations and processes of system 10. The processes implemented by CPU 50 may be communicated as electrical signals along bus 68 to an I/O interface 70 and a video interface 72. In addition to being connected to user interface 47, the I/O interface is connected to a printer 74, an image archive (remote or local) 76, and an audio (speaker) system 78.
Tracking software module 54 performs the processes necessary for tracking objects in an image guided system as described herein, which are known to those skilled in the art. Calibration software module 56 computes the geometric transform which corrects for image distortions and registers the images to the anatomical reference frame 38, and thus the patient's anatomy. Display software module 58 applies, and if desired, computes the offsets between the guide tracking markers 40 and the instrument 14 in order to generate an icon representing the trajectory of the instrument for superposition over the images. For instruments with fixed lengths and angulations, these offsets can be measured once and stored in database 52. The user would then select from a list of instruments, the one being used in the procedure so the proper offsets are applied by display software module 58. For instruments with variable lengths and angulations, the offsets could be measured manually and entered via keyboard 47, or measured in conjunction with a tracked pointer (not shown) or tracked registration jig (not shown).
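For a fixed-geometry instrument, applying a stored tip offset to the tracked marker pose might look like the following sketch. The offset table, its values, and the pose representation are invented for illustration only.

    import numpy as np

    # Hypothetical offsets of the instrument tip from the marker frame origin, in mm.
    INSTRUMENT_OFFSETS = {
        "probe_152": np.array([0.0, 0.0, 180.0]),
        "screwdriver_218": np.array([0.0, 0.0, 240.0]),
    }

    def tip_position(instrument_name, marker_position, marker_rotation):
        """marker_rotation is a 3x3 rotation matrix of the marker frame in detector space."""
        offset = INSTRUMENT_OFFSETS[instrument_name]
        return np.asarray(marker_position) + np.asarray(marker_rotation) @ offset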
Pre-acquired image data stored locally in image database 52 or remotely in image archive 76 can be fed directly into computer 18 digitally through I/O interface 70, or may be supplied as video data through video interface 72. In addition, items shown as stored in memory can also be stored, at least partially, on a hard disk (not shown) or other memory device, such as flash memory, if memory resources are limited. Furthermore, while not explicitly shown, image data may also be supplied over a network, through a mass storage device such as a hard drive, optical disks, tape drives, or any other type of data transfer and storage devices.
In addition to the modules and interfaces described above, computer 18 includes a neuromonitoring interface 80 as well as an instrument navigation interface 82. The neuromonitoring interface 80 receives electrical signals from electrodes 84 proximate patient 24. The electrical signals are detected by electrodes 84 in response to electrostimulation applied to neural structures of the patient by instrument 14 or other electrostimulating probe (not shown). In this example, the electrodes are electromyography (EMG) electrodes and record muscle response to nerve stimulation. Alternately, other neuromonitoring techniques, such as, motor evoked potentials (MEP) neuromonitoring and somatosensory evoked potentials (SSEP) neuromonitoring, may be used. A stimulator control 86 interfaces with instrument 14 and controls the intensity, direction, and pattern of stimulation applied by instrument 14. Inputs establishing desired stimulation characteristics may be received by the surgeon or other user via input interface 47 or on the instrument 14 itself. As described above, the integrated system 10 also carries out real-time tracking of instrument 14 (and patient 24) using markers, reflectors, or other tracking devices. In one example, instrument 14 includes markers 40 whose movements are tracked by instrument tracker 88, which may include a camera or other known tracking equipment. Similarly, the patient may include markers or reflectors so that patient movement can be tracked. To effectuate application of an electrical stimulus, instrument 14 is also connected to a power supply 90. As will be shown, the instrument 14 may be powered by a battery housed within the instrument itself, a power supply housed within the computer cabinet, or inductively.
The integrated surgical navigational and neuromonitoring system is designed to assist a surgeon in navigating an instrument, e.g., surgical tool, probe, or other instrument, through visualization of the instrument relative to patient anatomy. As described herein, using tracking tools and techniques, real-time positional and orientation information regarding the instrument relative to patient anatomy can be superimposed on an anatomical, functional, or derived image of the patient. In addition to assisting a surgeon with instrument tracking, the integrated system 10 also performs neuromonitoring to assess the position and integrity of neural structures. In this regard, the surgeon can move the instrument to a desired location, view the placement of the instrument relative to patient anatomy on display 12, apply an electrical stimulus to neural structures proximate the instrument, and measure the response to that electrical stimulus. This neural information gathered can then be added to the visualization of the patient anatomy through graphic or textual annotations, color or other coding of the neural structure, or other labeling techniques to convey, in human discernable form, the neural information gathered from the application of an electrical stimulus. The integrated system also helps the surgeon in visualizing patient anatomy, such as key nerve structures, and associating position or integrity with the patient anatomy. As will be shown with respect to Figs. 4-5, a GUI is used to convey and facilitate interaction with the surgical navigational and neuromonitoring information.
Referring now to Fig. 4, a GUI 92 designed to assist a surgeon or other user in navigating a surgical tool, such as a probe or a bone screwdriver, is shown. In the illustrated example, the GUI 92 is bifurcated into an image portion 94 and a menu portion 96. The image portion contains three image panes 98, 100, 102 that, in the illustrated example, contain a coronal, a sagittal, and an axial image, respectively, of patient anatomy. The image portion also contains a rendering pane 104. The menu portion 96 provides selectable links that, when selected by a surgeon, enables interfacing with that displayed in the image panes 98, 100, 102 or with other data acquired from the patient.
The image panes provide an anatomical map or framework for a surgeon to track an instrument, which can be representatively displayed by pointer 106. The integrated system described herein tracks movement of an instrument and provides a real-time visualization of the position of the pointer superimposed on the images contained in panes 98, 100, 102. It is noted that the displayed images can be derived from one or more diagnostic images acquired of the patient, an atlas model, or a combination thereof. As the instrument is moved relative to the patient anatomy, the images displayed in the image panes are automatically refreshed such that an instantaneous position of the instrument, via pointer 106, provides positional information to the surgeon.
Moreover, as the integrated system supports both surgical instrument navigation and neuromonitoring, the image panes and the positional feedback provided by pointer 106 can assist the surgeon in isolating a neural structure for neural monitoring. That is, a general understanding of nerve location can be determined from the images contained in the image panes 98, 100, 102. Through visual inspection of the panes, the surgeon can then move the instrument proximal a neural structure, apply an electrostimulation, and measure the neurological response. That neurological response can be used to assess the integrity of the neural structure in a manner consistent with known neuromonitoring studies. Additionally, the neurological information can also be used to localize more precisely the position of the stimulated neural structure. For example, the visualization of patient anatomy, e.g., the images contained in panes 98, 100, 102, provides a general visual understanding of anatomy position, orientation, and location. The neurological response of a stimulated neural structure can then be used to pinpoint the position and orientation of that neural structure on the patient anatomy visualization using color-coding or other indicia.
Moreover, based on the general location of a neural structure and its localized position, assessment of the neural structure can be enhanced. That is, the computer, using the measured response of a neural structure and its positional information, as indicated by the surgeon positioning the instrument proximal the structure, can compare the measured response to data contained in a database and determine if the measured response is consistent with that expected.
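In the simplest case, the consistency check described above could be reduced to comparing a measured response against an expected range stored per structure class, as in the hedged sketch below. The class names and amplitude ranges are invented examples, not values from the disclosure.

    # Hypothetical expected EMG amplitude ranges, in microvolts, keyed by structure class.
    EXPECTED_RESPONSE_UV = {
        "nerve_root": (200.0, 2000.0),
        "peripheral_nerve": (100.0, 1500.0),
    }

    def response_consistent(structure_class, measured_amplitude_uv):
        """Return True if the measured response falls within the expected range."""
        low, high = EXPECTED_RESPONSE_UV.get(structure_class, (0.0, float("inf")))
        return low <= measured_amplitude_uv <= high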
In addition to integrity assessment and positional localization, the integration of the navigation and neuromonitoring information enables the development of neural maps. That is, through repeated movement of the instrument and neurological monitoring, the combined information can be integrated to localize neural structure position, classify those neural structures based on position and/or response, and code through color or other indicia, a neurological, anatomically driven map of the patient.
It is noted that in the illustrated example, the tip of the instrument is represented by pointer 106. However, it is contemplated that tip, hind, or full instrument representations can be used to assist with navigation. Also, while three images of the same anatomy, but at different views are shown, other image display approaches may be used.
Still referring to Fig. 4, one of the image panes 104 is illustratively used for a three-dimensional rendering of a patient anatomy, such as a neural structure bundle 108. The rendering can be formed by registration of multi-angle images of the patient anatomy, derived from atlas information, or a combination thereof. In practice, the surgeon positions the instrument proximal a target anatomical structure. The surgeon then, if desired, selects "3D Rendering" tab 110 of menu 96. Upon such a selection, the computer then determines the position of the pointer 106 and generates a 3D rendering of the anatomical structure "pointed at" by the pointer. In this way, the surgeon can select an anatomical feature and then visually inspect that anatomical feature in a 3D rendering on the GUI 92.
Further, as referenced above, the integrated system maintains or has access to a technical library contained on one or more databases. The surgeon can access that technical data through selection of "Technical Data" tab 112. Upon such a selection, the computer causes display of available resources (not shown) in menu 96. It is contemplated that another window may be displayed; however, in a preferred implementation, a single GUI is used to prevent superposition of screens and windows over the navigational images. The technical resources may include links to internet web pages, intranet web pages, articles, publications, presentations, maps, tutorials, and the like. Moreover, in one preferred example, the list of resources is tailored to the given position of the instrument when the surgeon selects tab 112. Thus, it is contemplated that access to the technical resource information can be streamlined for efficient access during a surgical procedure.
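One simple way to tailor the resource list to instrument position, sketched below under the assumption that each resource is tagged with the anatomical structures it concerns, is to intersect those tags with the structures found near the instrument. All names and the example entries are hypothetical.

    def resources_for_position(nearby_structures, resource_db):
        """Return resources whose tags overlap the anatomy currently near the instrument."""
        tags = {s["name"] for s in nearby_structures}
        return [r for r in resource_db if tags & set(r["tags"])]

    # Hypothetical example:
    # resource_db = [{"title": "Pedicle preparation tutorial", "tags": ["pedicle"]}]
    # resources_for_position([{"name": "pedicle"}], resource_db)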
Menu 96 also includes a tracker sub-menu 114 and an annotation sub-menu 116. The tracker sub-menu 114, in the illustrated example, includes a "current" tab 118, a "past trajectory" tab 120, and an "anticipated trajectory" tab 122 that provide on-demand view options for displaying instrument navigation information. User selection of tab 118 causes the current position of the instrument to be displayed in the image panes. User selection of tab 120 causes the traveled trajectory of the instrument to be displayed. User selection of tab 122 causes the anticipated trajectory, based on the current position of the head of the instrument, to be displayed. It is contemplated that more than a single tab can be active or selected at a time.
The annotations sub-menu 116 contains a "New" tab 124, a "View" tab 126, and an "Edit" tab 128. Tabs 124, 126, 128 facilitate making, viewing, and editing annotations regarding a surgical procedure and anatomical and neural observations. In this regard, a surgeon can make a general annotation or record notes regarding a specific surgical procedure or anatomical observation, such as an observation regarding a neural structure, its position, integrity, or neurological response. In one preferred example, the computer automatically associates an annotation with the position of the instrument when the annotation was made. Thus, annotations can be made and associated with a neural or other structure during the course of a surgical procedure. Moreover, by depressing the "view" tab 126, the computer will cause a list of annotations to appear in pane 116. Alternately, or in addition thereto, annotations made and associated with a neural structure will be viewable by positioning the instrument proximal the neural structure. Akin to a mouse-over technique, positioning the instrument proximal an annotated neural structure will cause any previous annotations to appear automatically if such a feature is enabled.
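The position-linked annotation behavior can be illustrated with the small store below, which records each note with the instrument position at which it was made and recalls notes when the instrument returns within an assumed radius. All names and the 5 mm recall radius are placeholders, not details from the disclosure.

    import math

    class AnnotationStore:
        """Notes keyed by the instrument position at which they were recorded."""

        def __init__(self, recall_radius_mm=5.0):    # assumed recall radius
            self.notes = []                          # list of (position, text)
            self.recall_radius_mm = recall_radius_mm

        def add(self, position, text):
            self.notes.append((tuple(position), text))

        def nearby(self, position):
            return [text for pos, text in self.notes
                    if math.dist(pos, position) <= self.recall_radius_mm]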
It is understood that other tabs and selectors, both general, such as a patient information tab 130, or specific, can be incorporated into the menu pane 96. It is also understood that the presentation and arrangement of the tabs in menu pane 96 is merely one contemplated example.
Referring now to Fig. 5, image pane 102 is shown to further illustrate instrument tracking. As described above, through user selection of the appropriate input tab, the instantaneous position of the instrument can be viewed relative to patient anatomy via localization of pointer 106. Additionally, selection of the "past trajectory" tab 120 on menu 96, Fig. 4, causes the past or traveled trajectory of the instrument to be shown by dashed trajectory line 132. Similarly, the anticipated trajectory 134 can also be viewed relative to the patient anatomy based on the instantaneous position and orientation of the tip or leading portion of the instrument.
Additionally, it is contemplated that trajectory paths can be stored and that stored trajectories can be recalled and viewed relative to the patient anatomy. In this regard, a current or real-time instrument trajectory can be compared to past trajectories. Moreover, it is recognized that not all instrument movement is recorded. In this regard, the surgeon or other user can turn instrument tracking on and off as desired. Also, although the look-ahead technique described above projects the graphical representation of the instrument into the image, there is no requirement that the instrument's graphical representation be in the space of the image to be projected into the image. In other words, for example, the surgeon may be holding the instrument above the patient and outside the space of the image, so that the representation of the instrument does not appear in the images. However, it may still be desirable to project ahead a fixed length into the image to facilitate planning of the procedure.
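A minimal sketch of the look-ahead projection, assuming the anticipated trajectory is simply the instrument axis extended a fixed distance beyond the tip, is given below. The 30 mm projection length is an arbitrary placeholder.

    import math

    def anticipated_trajectory(tip, direction, look_ahead_mm=30.0):
        """Return the segment from the tip to a point projected ahead along the axis."""
        norm = math.sqrt(sum(d * d for d in direction))
        unit = [d / norm for d in direction]
        end = [t + look_ahead_mm * u for t, u in zip(tip, unit)]
        return tip, end   # endpoints in image coordinates for drawing the projected segment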
In the illustrated example, a trajectory is represented by a directional line. It is contemplated, however, that other representations may be used. For example, a trajectory can be automatically assigned a different color or unique numerical label. Other types of directional indicators may also be used, and different shapes, styles, sizes, and textures can be employed to differentiate among the trajectories. The surgeon also has the option of not showing the label for any trajectory if desired. The surgeon also has the option of changing the default color or label text for any trajectory through appropriate controls contained in menu 96. In one example, past trajectories are assigned one color whereas anticipated or look-ahead trajectories are assigned a different color. Also, while only a single trajectory is illustrated in Fig. 5, it is recognized that multiple instruments can be tracked at a time and their trajectories tracked, predicted, and displayed on the image.
As described with respect to Figs. 1-5, the integrated system 10 tracks the position of an instrument, such as a surgical tool or probe, relative to patient anatomy using markers, reflectors, and the like. In one aspect, the instrument is also capable of applying an electrical stimulus to a neural structure so that neurological information, such as nerve position and nerve integrity, can be determined without requiring introduction of another instrument to the patient anatomy. The instrument can be tethered to a computer 18 via a stimulator control interface 86 and a power supply 90, or, in an alternate embodiment, the instrument can be wirelessly connected to the stimulator control interface 86 and be powered inductively or by a self-contained battery.
Figure 6 illustrates operational circuitry for inductively powering the instrument and for wirelessly determining positional information of an instrument rather than using markers and reflectors. The operational circuitry 136 includes a signal generator 138 for generating an electromagnetic field. The signal generator 138 preferably includes multiple coils (not shown). Each coil of the signal generator 138 may be activated in succession to induce a number of magnetic fields thereby inducing a corresponding voltage signal in a sensing coil.
Signal generator 138 employs a distinct magnetic assembly so that the voltages induced in a sensing coil 140 corresponding to a transmitted time-dependent magnetic field produce sufficient information to describe the location, i.e. position and orientation, of the instrument. As used herein, a coil refers to an electrically conductive, magnetically sensitive element that is responsive to time -varying magnetic fields for generating induced voltage signals as a function of, and representative of, the applied time-varying magnetic field. The signals produced by the signal generator 138 containing sufficient information to describe the position of the instrument are referred to hereinafter as reference signals.
The signal generator is also configured to induce a voltage in the sensing coil 140 sufficient to power electronic components of the instrument, such as a nerve stimulation unit 142 and a transmitter 144. In the preferred embodiment, the signals transmitted by the signal generator 138 for powering the device, hereinafter referred to as powering signals, are frequency multiplexed with the reference signals. The frequency ranges of the reference signal and powering signal are modulated so as to occupy mutually exclusive frequency intervals. This technique allows the signals to be transmitted simultaneously over a common channel, such as a wireless channel, while keeping the signals apart so that they do not interfere with each other. The reference and positional signals are preferably frequency modulated (FM); however, amplitude modulation (AM) may also be used.
Alternatively, the powering signals may be transmitted by separate signal generators, each at a differing frequency. As embodied herein, the portion for receiving a reference signal further includes a sensing unit 146 and a power circuit 148. Sensing unit 146 and power circuit 148 each may receive an induced voltage signal due to a frequency multiplexed reference signal and powering signal on sensing/powering coil 140. Sensing unit 146 and power circuit 148 both may separate the voltage signals induced by the multiplexed magnetic signals into positional and powering signals.
The sensing unit 146 measures the induced voltage signal portion corresponding to a reference signal as a positional signal indicative of a current position of the instrument. The positional signal is transmitted by transmitter 144. Similarly, power circuit 148 may retain the induced voltage signal portion corresponding to a powering signal for producing power sufficient to power the transmitter 144 and apply electrostimulation to a neural structure. Power circuit 148 rectifies the induced voltage generated on the coil 140 by the powering signals to produce DC power that is used to power the transmitter 144 and the nerve stimulation unit 142. Power circuit 148 may store the DC power using a capacitor, small battery, or other storage device for later use.
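As one hedged illustration, the separation of the frequency-multiplexed reference and powering components could be performed by masking disjoint frequency bands of the sampled coil voltage, as sketched below. The band edges and the idea of digital FFT masking are assumptions for this sketch, not values or methods stated in the disclosure.

    import numpy as np

    def split_bands(coil_signal, sample_rate_hz,
                    ref_band=(1e3, 5e3), power_band=(20e3, 40e3)):
        """Split a sampled coil voltage into positional and powering components."""
        spectrum = np.fft.rfft(coil_signal)
        freqs = np.fft.rfftfreq(len(coil_signal), d=1.0 / sample_rate_hz)

        def band(lo, hi):
            mask = (freqs >= lo) & (freqs <= hi)
            return np.fft.irfft(spectrum * mask, n=len(coil_signal))

        positional = band(*ref_band)    # handled by the sensing unit 146
        powering = band(*power_band)    # rectified by the power circuit 148
        return positional, powering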
The integrated system 10 includes an electromagnetic control unit 150 that regulates operation of the signal generator 138 and includes a receiver (not shown) for receiving the positional information transmitted wirelessly by the transmitter 144. In this regard, the control unit 150 is adapted to receive magnetic field mode positional signals and transmit those positional signals to the CPU for processing to determine the position and/or orientation of the instrument. The CPU preferably begins determining the position of the instrument by first determining the angular orientation of the sensing coil 140 and then using the orientation of the coil 140 to determine the position of the instrument. However, the present invention is not limited to any specific method of determining the position of the instrument. While a single sensing/powering coil 140 is shown, it is contemplated that separate sensing and powering coils may be used.
As described herein, in one aspect of the disclosure, a surgical instrument, such as a probe, a retractor, or a bone screwdriver, is also used to apply an electrical stimulus to a neural structure. Figures 7-14 illustrate various examples of integrated surgical and electrostimulating tools.
Figure 7 illustrates a surgical probe 152 that includes an elongated and, preferably, textured handle 154 having a proximal end 156 and a distal end 158. The surgical probe 152 is connectable to the neuromonitoring interface 80, Fig. 3, by jacks 160 extending from the handle proximal end 156. The handle 154 includes a transversely projecting actuator 162 proximate a tapered distal segment 164 terminating in handle distal end 158 which carries a distally projecting stainless steel shaft 166. Shaft 166 is tapered and preferably has a larger outside diameter proximate the handle distal end 158, tapering to a smaller outside diameter proximate the shaft distal end 168, with a distally projecting length from handle distal end 158 to shaft distal end 168 encased in clear plastic, thin-wall, shrinkable tubing. Extending from the handle 154 and electrically connected to conductors 170 are an anode 172 and a cathode 174. The anode and cathode 172, 174 extend slightly past the shaft distal end 168 and are used to apply electrostimulation to a neural structure.
The outer surface of the handle 154 also includes a reflector/marker network 176 to facilitate tracking of the position and orientation of the probe 152. The probe 152 is shown as having three reflectors 176 that may be permanently or removably fixed to the handle 154. As is known in conventional surgical instrument tracking systems, the size, shape, and position of the reflectors 176 are known by the surgical navigational system, thus, when captured by a camera, the position and orientation of the probe 152 can be readily ascertained. It is recognized that more than or less than three reflectors may be used.
The actuator 162 enables the surgeon to selectively apply electrostimulation to patient anatomy during a surgical procedure. As such, the probe 152 can be used for surgical purposes without the application of electrostimulation and, when desired by the surgeon, used to elicit a neurological response from a neural structure. In the embodiment illustrated in Fig. 7, the probe 152 is powered by a power supply (not shown) external to the probe 152 via the jacks 160.
In Fig. 8, a battery-powered retractor according to another embodiment of the invention is shown. Retractor 178 includes an elongated and, preferably, textured handle 180 having a proximal end 182 and a distal end 184. Extending from the distal end 184 is a tapered shaft 186 that terminates in a curved head 188 that includes an anode tip 190 and a cathode tip 192 that are coplanar with one another. The handle 180 provides an interior volume 194 sized and shaped to hold batteries 196 that supply power sufficient to electrostimulate neural structures when desired by the surgeon. In one embodiment, the batteries 196 are permanently sealed within the interior volume 194 of the handle 180 so as to prevent contact with body fluids and cleaning fluids. In another embodiment, not illustrated herein, the batteries are removable and therefore replaceable by threadingly removing a cap portion of the handle. It is contemplated that rechargeable batteries may be used and that the batteries may be recharged without removing them from the handle.
The handle 180 also includes three reflectors 198 that provide visual feedback to a camera (not shown) or other detection device to determine the position and orientation of the retractor. Similar to that described with respect to Fig. 7, the retractor 178 further includes an actuator 200 that enables a surgeon to selectively turn the electrostimulation functionality of the retractor 178 on so as to apply electrostimulation to a neural structure.
Figure 9 illustrates a corded retractor 202 according to the present disclosure. In this example, the retractor 202 is powered by a remote battery or other power supply through a conventional jack connection using jacks 204. Like that described with respect to Fig. 8, the handle 206 of the retractor 202 includes reflectors 208 to enable surgical navigational hardware and software to track the position and orientation of the retractor 202. Retractor 202 also includes an actuator 210 to selectively apply electrostimulation to a neural structure. Electrostimulation is facilitated by an anode conductor 212 and a cathode conductor 214 extending past the shaft 216. The anode and cathode conductors 212, 214 extend along the entire length of the shaft 216 and connect to a power supply via the jack connectors 217.
In another example, as shown in Fig. 10, a bone screwdriver 218 is configured to provide electrostimulation in addition to driving a bone screw. Screwdriver 218 includes a handle 220 with a driving shaft 222 extending from a distal end thereof. The handle 220 is sized to accommodate batteries 224 to provide power for electrostimulation. The handle 220 also includes reflectors 226 secured thereto in either a permanent or removable fashion. The driving shaft 222 extends from the distal end 228 of the handle 220 to a driving head 230 sized and shaped to accommodate driving of a bone screw. Extending parallel to the driving shaft 222 are sheathed anode and cathode electrodes 232, 234. The sheathed electrodes 232, 234, when extended, extend beyond the driving head 230 of the driving shaft 222. The sheathed anode and cathode electrodes 232, 234 are preferably retractable so as not to interfere with the surgeon during driving of a bone screw.
The sheathed electrodes 232, 234 are extended and retracted manually by the surgeon using an eyelet 236. Preferably, the eyelet is positioned in sufficient proximity to the handle 220 so that a surgeon can extend and retract the electrodes 232, 234 while holding the handle 220 and still depress the actuator 238 to apply the electrical stimulation. Accordingly, the handle includes a cavity (not shown) bounded by appropriate stops that define the range of translation of the electrodes.
Figure 11 is an elevation view of a surgical tap according to another aspect of the present disclosure. In this example, a surgical tap 240 is constructed for pedicle hole preparation, but is also capable of applying neurostimulation and providing navigational information. In this regard, the surgical tap 240 includes a handle 242 with a conductive shaft 244 extending therefrom. An insulating sheath 246 surrounds only a portion of the shaft so as to limit electrostimulation to the conductive tip 248. The conductive tip 248 includes a series of threads 250 that engage the pedicle or other bony structure during insertion of the tap. The threads 250 are formed such that a longitudinal recess or channel 252 is defined along the length of the tip.
Handle 242 has an actuator switch 254 that allows a user to selectively apply electrostimulation during insertion of the tip. As such, electrostimulation can be applied while the surgical tap is forming a pedicle screw pilot hole or probing the pedicle. Energy is applied to the conductive tip 248 via conductor 256, which is connectable to an energy source of the neuromonitoring system, Fig. 1. Alternatively, batteries can be disposed in the handle and used to supply electrostimulating energy to the conductive tip 248. The handle 242 also has three reflectors 258 that provide visual feedback to a camera (not shown) or other detection device to determine the position and orientation of the tap. One skilled in the art will recognize that other techniques may be used to track the position of the tap, such as electronic position sensors in the handle.
Figure 12 shows a surgical probe 260 according to another embodiment of the present disclosure. Similar to the examples described above, probe 260 has a handle 262 with a series of reflectors 264 coupled to or otherwise formed thereon. Extending from the proximal end of the handle are jacks 266 for connecting the probe 260 to the energy source of the neuromonitoring system, Fig. 2. Extending from the distal end of the handle 262 is a conductive shaft 268 partially shrouded by an insulating sheath 270. The unsheathed portion of the shaft 268 is a conductive tip 272 capable of probing the pedicle or other bony structure. The handle also has an actuator 274 for selectively energizing the conductive tip 272 for the application of electrostimulation during probing.
Figure 13 is a cross-sectional view of the conductive tip 272. As shown, the conductive shaft 268 includes an anode conductive portion 274 and a cathode conductive portion 276 separated from the anode conductive portion 274 by an insulator 278. This is further illustrated in Fig. 14. With this construction, electrostimulation is applied between the anode conductive portion 274 and the electrically isolated cathode conductive portion 276 for bipolar electrostimulation.
The illustrative tools described above are designed not only to perform a surgical function, but also to apply electrostimulation to a neural structure of the patient. As described herein, with the aid of image-based navigation, a surgeon can move the instrument, visualize that movement in real-time, and apply electrostimulation (unipolar and bipolar) as desired at various instrument positions without the need for a separate stimulation instrument. Further, electrostimulation can also be applied to enhance navigation through the application of a leading electrostimulation pattern. In this regard, as the instrument is traversed through the patient anatomy, electrostimulation is automatically applied ahead of the tip of the instrument. As such, neurological information is automatically acquired as the instrument is moved and the visualization of patient anatomy is automatically updated to incorporate the neurological information. Moreover, the neurological information can be used to localize, with better specificity, the actual location and orientation of neural structures. For example, electrostimulation with a broad scope can be applied as the instrument is moved. If a neurological response is not measured, such broad electrostimulation continues. However, if a neurological response is measured, a pinpointing electrostimulation can be repeatedly applied with decreasing coverage to localize the position of the stimulated neural structure.
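The following Python sketch illustrates, under stated assumptions, the broad-then-pinpointing pattern described above. The callback stimulate(center, radius), which is assumed to fire the stimulator over the given coverage and report whether a neurological response was measured, and the numeric defaults are hypothetical placeholders rather than part of the disclosed system.

    import numpy as np

    def localize_neural_structure(stimulate, tip_position, heading,
                                  lead=15.0, start_radius=10.0, min_radius=1.0):
        # Apply a broad stimulus ahead of the instrument tip; if a response is
        # measured, repeatedly halve the coverage around whichever sub-region
        # still evokes a response to localize the stimulated structure.
        center = np.asarray(tip_position, dtype=float) + np.asarray(heading, dtype=float) * lead
        radius = start_radius
        if not stimulate(center, radius):
            return None                      # no response: stay in broad mode
        while radius > min_radius:
            half = radius / 2.0
            forward = center + np.asarray(heading) * half
            backward = center - np.asarray(heading) * half
            if stimulate(forward, half):
                center = forward
            elif stimulate(backward, half):
                center = backward
            else:
                break                        # neither half responds; keep current estimate
            radius = half
        return center, radius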
Referring now to Fig. 15, in a further example, the leading electrostimulation can also be used to signal to the surgeon that the instrument is approaching a nerve or other neural structure. The signal may be a visual identifier on the GUI or an audible warning broadcast through the audio system described herein. In this regard, the integrated system determines the instantaneous position of the instrument at 280. The system then compares the position of the instrument with information regarding the anatomical makeup of the patient to determine the proximity of the instrument to neural structures that may not be readily visible on the anatomical visualization at 282. If the instrument is not near a neural structure 282, 284, the process loops back to step 280. If the instrument is at or near a previously identified neural structure 282, 286, the neural structure is identified or classified from an anatomical framework of the patient and/or the neurological response of the structure. Once the neural structure is identified 288, an appropriate signal is output 290 signaling that the instrument is near a neural structure. It is contemplated that the intensity and character of the signal may be based on the type of neural structure identified as being proximal the instrument. For example, the volume and the pattern of an audible alarm may vary depending upon the type of neural structure. Further, in the example of audible proximity indicators, the volume and/or pattern of the audible alarm may change as the instrument moves closer to or farther away from the neural structure. Thus, the audible signals provide real-time feedback to the surgeon regarding the position of the instrument relative to a neural structure. After the appropriate signal is output, the process returns to determining the position of the instrument at 280.
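A minimal Python sketch of the Fig. 15 loop follows, assuming hypothetical callbacks for the tracking, anatomy, classification, and alert subsystems; the distance threshold and polling period are illustrative values only.

    import time

    def proximity_alert_loop(get_instrument_position, nearby_neural_structures,
                             classify_structure, emit_signal,
                             warn_distance=10.0, period_s=0.05):
        while True:
            position = get_instrument_position()                       # step 280
            hits = nearby_neural_structures(position, warn_distance)   # step 282
            if not hits:                                               # step 284
                time.sleep(period_s)
                continue
            structure, distance = min(hits, key=lambda h: h[1])        # steps 286-288
            kind = classify_structure(structure)
            # Alert urgency grows as the tip closes on the structure and can be
            # mapped to alarm volume/pattern by structure type.
            urgency = max(0.0, 1.0 - distance / warn_distance)
            emit_signal(kind=kind, urgency=urgency)                    # step 290
            time.sleep(period_s)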
As described above, the integrated system is also capable of performing measurements between trajectories or instrument positions. Thus, for example, bone measurements can be performed to determine if sufficient bone has been removed for a particular surgical procedure. For instance, the instrument can be tracked across the profile of a portion of a bone to be removed, and that path across the profile can be stored as a trajectory. Following one or more bone removal procedures, the instrument can again be tracked across the bone now having a portion thereof removed. The system can then compute the differences between those trajectories and provide a quantitative value to the surgeon, via the GUI, for example, to assist the surgeon in determining if enough bone has been removed for the particular surgical procedure.
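By way of illustration, the sketch below compares two tracked passes over the same bone profile, resampled to matching points, and reports a quantitative depth change of the kind the GUI could display; point correspondence by index is an assumption of this sketch.

    import numpy as np

    def removed_bone_depth(profile_before, profile_after):
        # Both inputs are (N, 3) arrays of tracked instrument-tip positions
        # sampled along the same path before and after bone removal.
        before = np.asarray(profile_before, dtype=float)
        after = np.asarray(profile_after, dtype=float)
        depth = np.linalg.norm(after - before, axis=1)   # removed depth per sample
        return {"per_point": depth,
                "max": float(depth.max()),
                "mean": float(depth.mean())}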
Also, the characteristics of the electrostimulation can be automatically adjusted based on the tracked instantaneous position of the instrument. That is, the integrated system, through real-time tracking of the instrument and a general understanding of the patient anatomy layout from images, atlas models, and the like, can automatically set the intensity, scope, and type of electrostimulation based on the anatomy proximal the instrument when the surgeon directs application of electrostimulation. Rather than automatically setting the electrostimulation characteristics, the system could similarly display, on the GUI, the electrostimulation values derived by the system for consideration by the surgeon. In this regard, the surgeon could adopt, through appropriate inputs to the GUI, the suggested characteristics or define values different from those suggested by the system. Also, since an instrument could be used for both bone milling or removal and electrostimulation, neurological responses could be measured during active milling or bone removal.
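The following sketch shows one way suggested electrostimulation settings could be looked up from the anatomy proximal the instrument; the lookup callback, anatomy labels, and numeric values are illustrative assumptions only and not clinical guidance.

    def suggest_stimulation_settings(instrument_position, lookup_anatomy):
        # lookup_anatomy is a hypothetical query into the image/atlas model that
        # returns a label for the tissue nearest the instrument tip.
        anatomy = lookup_anatomy(instrument_position)
        presets = {
            # illustrative values only; the surgeon may adopt or override them via the GUI
            "pedicle":    {"intensity_mA": 8.0,  "scope": "focused", "mode": "bipolar"},
            "psoas":      {"intensity_mA": 12.0, "scope": "broad",   "mode": "unipolar"},
            "paraspinal": {"intensity_mA": 10.0, "scope": "broad",   "mode": "unipolar"},
        }
        default = {"intensity_mA": 5.0, "scope": "broad", "mode": "unipolar"}
        return presets.get(anatomy, default)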
While a probe, a retractor, a screwdriver, and a tap have been shown and described, it is contemplated that other surgical tools according to the present disclosure may be used to carry out surgical functions as well as apply electrostimulation, such as blunt dilators, awls, pedicle access needles, biopsy needles, drug delivery needles, ball tip probes, interbody dilators, spinal disc removal tools, interbody spacer tools, soft tissue retractors, and others. Additionally, it is contemplated that an implant, such as a pedicle screw, when coupled to a conductive portion of a surgical tool, may also be conductive and thus used to apply electrostimulation during implantation of the implant. For example, a bone screw may also be used to apply electrostimulation when engaged with the driving and conductive end of a driver. Also, while surgical instruments having reflectors for optically determining instrument position and orientation have been illustratively shown, the surgical instruments may include circuitry such as that described with respect to Fig. 6 for electromagnetically determining instrument position and orientation and for inductively powering the electrostimulation and transmitter circuits.
The surgical instruments described herein illustrate various examples in which the present disclosure can be implemented. It is recognized that instruments other than those described can be used. Further, the instruments are preferably formed of biocompatible materials, such as stainless steel. It is recognized, however, that other biocompatible materials can be used.
Moreover, while an integrated surgical navigational and neuromonitoring system has been described, it is recognized that stand-alone systems may be communicatively linked to one another in a handshake fashion. Thus, through software modules, such as those described herein, the neuromonitoring information provided by a stand-alone neuromonitoring probe and system can be provided to a stand-alone surgical navigational system for the integrated visualization of navigational and neuromonitoring information.
As described herein, the integrated system is also capable of providing on-demand access to technical resources to a surgeon. Moreover, the integrated system is designed to provide a list of on-demand resources based on instrument position, neural structure position, or neural structure neuroresponse. As set forth in Fig. 16, the integrated system is designed to receive a user input 292 from the surgeon or other user requesting publication of a technical resource. Responsive to that input, the integrated system determines the instantaneous position of the instrument 294 when the request is made. Based on the instrument position, anatomical structures proximal the instrument are then determined 296. From the position of the instrument, the identified proximal anatomy, and, if applicable, the neurological response of a proximal neural structure, the system accesses corresponding portions of a technical resource database 298 to derive and display a list of related technical resources available for publication to the surgeon at 300. The list is preferably in the form of selectable computer data links displayed on the GUI for surgeon selection and may link to articles, publications, tutorials, maps, presentations, video, instructions, and manuals, for example. In response to a user selection on the GUI 302, the selected technical resource is uploaded from the database and published to the surgeon or other user at 304. It is contemplated that the integrated system may upload the technical resource from a local or remote database.

Another process capable of being carried out by the integrated system described herein is shown in Fig. 17. Figure 17 sets forth the steps of a predictive process for providing feedback to a surgeon or other user in assessing neural integrity. The process begins at step 306 with determining a position of the electrostimulation instrument when an electrostimulation is applied. The location of the stimulated neural structure is also determined at 308. Based on the location of the neural structure, the neural structure is identified 310. Identification of the neural structure can be determined by comparing anatomical information of the patient with previous neural maps, atlas models, anatomical maps, and the like. Based on identification of the neural structure, e.g., its class, the neurological response of the neural structure to the electrostimulation is predicted 312. The predicted neurological response is then compared to the actual, measured neurological response at 314. The results of that comparison are then conveyed at 316 to the surgeon or other user via the GUI to assist with determining the neural integrity of the stimulated neural structure. Additionally, the visualization of the stimulated and measured neural structure can be automatically updated based on the comparison, e.g., color coded or annotated to indicate that the neurological response was not in line with that expected.
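As a sketch of the Fig. 17 flow, and assuming hypothetical callbacks for localization, identification, and response prediction, the comparison step might be organized as follows; the relative-difference test and tolerance are illustrative assumptions rather than the disclosed method.

    def assess_neural_integrity(stim_position, locate_structure, identify_structure,
                                predict_response, measured_response, tolerance=0.25):
        structure_location = locate_structure(stim_position)         # step 308
        structure_id = identify_structure(structure_location)        # step 310
        predicted = predict_response(structure_id)                   # step 312
        # Relative deviation between measured and predicted responses (step 314)
        deviation = abs(measured_response - predicted) / max(abs(predicted), 1e-9)
        return {"structure": structure_id,                           # conveyed via GUI (316)
                "predicted": predicted,
                "measured": measured_response,
                "within_expected": deviation <= tolerance}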
Although only a few exemplary embodiments have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of this disclosure. Accordingly, all such modifications and alternatives are intended to be included within the scope of the invention as defined in the following claims. Those skilled in the art should also realize that such modifications and equivalent constructions or methods do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure. It is understood that all spatial references, such as "horizontal," "vertical," "top," "upper," "lower," "bottom," "left," "right," "cephalad," and "caudal," are for illustrative purposes only and can be varied within the scope of the disclosure. Further, the embodiments of the present disclosure may be adapted to work singly or in combination over multiple spinal levels and vertebral motion segments. Also, though the embodiments have been described with respect to the spine and, more particularly, to vertebral motion segments, the present disclosure has similar application to other motion segments and parts of the body. In the claims, means-plus-function clauses are intended to cover the elements described herein as performing the recited function, including not only structural equivalents but also equivalent elements.

Claims

We claim:
1. An apparatus comprising: an instrument tracking system configured to track movement of an instrument; a database containing technical information regarding a surgical procedure and patient anatomy; and a computer operatively linked with the instrument tracking system and the database, and programmed to: determine an anatomical structure proximate the instrument; determine a portion of the technical information contained on the database that relates to the anatomical structure; and generate and display identifiers for the portion of the technical information in a user-selectable manner to allow a user to selectively obtain technical information relating to one of the surgical procedure and the anatomical structure.
2. The apparatus of claim 1 wherein the technical information includes one of videos, articles, presentations, publications, and maps contained in electronic format on the database.
3. The apparatus of claim 1 further comprising a GUI, and wherein the computer is further programmed to determine an anticipated trajectory of the instrument and display the anticipated trajectory on the GUI.
4. The apparatus of claim 3 wherein the database further contains anatomical images of the patient anatomy, and wherein the computer is further programmed to display the anticipated trajectory superimposed over an anatomical image of the patient anatomy.
5. The apparatus of claim 1 further comprising a neuromonitoring system configured to induce and measure neurological response, and wherein the computer is further programmed to determine a location of a neural structure relative to the instrument.
6. The apparatus of claim 5 wherein the instrument is capable of applying an electrostimulation, and wherein the computer is further programmed to control the neuromonitoring system and application of electrostimulation by the instrument such that electrostimulation is applied forward of a movement of the instrument.
7. The apparatus of claim 6 wherein the computer is further programmed to cause publication of an indicator if the instrument is approaching a neural structure.
8. The apparatus of claim 5 wherein the computer is further programmed to automatically determine an electrostimulation pattern from the location of the neural structure and control the neuromonitoring system to electrostimulate the neural structure according to the determined electrostimulation pattern.
9. The apparatus of claim 5 wherein the computer is further programmed to predict the neurological response of the neural structure from the location of the neural structure and compare a measured neurological response with the predicted neurological response and determine an integrity of the neural structure from the comparison.
10. A method comprising: tracking a surgical instrument; applying electrostimulation at a given surgical instrument position; and determining a location of a neural structure relative to the surgical instrument position from a neurological response of the neural structure to the electrostimulation.
11. The method of claim 10 further comprising determining a trajectory of surgical instrument movement and generating a neural structure map along the trajectory by applying electrostimulation to and measuring neurological response of neural structures along the trajectory.
12. The method of claim 11 further comprising displaying the trajectory superimposed on a visualization of patient anatomy containing the neural structures.
13. The method of claim 12 further comprising developing the visualization of patient anatomy from an anatomical atlas.
14. The method of claim 12 further comprising developing the visualization of patient anatomy from an image of the patient anatomy.
15. The method of claim 10 further comprising automatically setting one of electrostimulation intensity and electrostimulation pattern based on the surgical instrument position.
16. The method of claim 10 further comprising automatically identifying the neural structure from the neurological response.
17. The method of claim 16 further comprising updating a visualization of the neural structure based on its identification.
18. The method of claim 16 further comprising displaying a list of on-demand resources available for review by a surgeon based on the identified neural structure.
19. The method of claim 18 further comprising updating the list as the surgical instrument moves from one identified neural structure to another identified neural structure.
20. The method of claim 10 further comprising varying a direction of electrostimulation based on movement of the surgical instrument.
21. An apparatus comprising a computer programmed to: determine a location of a neuromonitoring probe designed to apply electrostimulation to a patient; compare the determined location to an anatomical framework of the patient, the anatomical framework providing a general localization of a neural structure; and automatically determine one of electrostimulation intensity and electrostimulation pattern for electrostimulating the neural structure based on the position of the neuromonitoring probe and the neural structure.
22. The apparatus of claim 21 comprising a technical resource library of on-demand technical resources contained on a database, and wherein the computer is further programmed to generate a list of on-demand resources available for review by a surgeon based on the position of the neuromonitoring probe.
23. The apparatus of claim 21 including an integrated neuromonitoring and surgical navigational system comprising the computer.
24. The apparatus of claim 23 wherein the integrated neuromonitoring and surgical navigational system includes a display designed to display a GUI showing a visualization of the anatomical framework and wherein the computer causes a superimposition of a position marker on the visualization of the anatomical framework indicating the location of the neuromonitoring probe.
25. The apparatus of claim 24 wherein the computer is further programmed to indicate, on the GUI, an anticipated trajectory of surgical instrument movement.
26. A computer readable storage medium having instructions thereon that, when executed by a computer, cause the computer to: access an anatomical visualization of a patient; access neurological information acquired from the patient; and update the anatomical visualization to incorporate the neurological information.
27. The computer readable storage medium of claim 26 wherein the instructions further cause the computer to color-code the anatomical visualization based on the neurological information.
28. The computer readable storage medium of claim 26 wherein the instructions further cause the computer to derive the anatomical visualization from at least one of an anatomical image and an atlas model.
29. The computer readable storage medium of claim 26 wherein the instructions further cause the computer to update the anatomical visualization in real-time during acquisition of neurological information.
30. A surgical method comprising: translating a surgical tool relative to patient anatomy containing a neural structure; applying an electrical stimulus to the neural structure with the surgical tool; and determining a position of the neural structure relative to other anatomical structures of the patient anatomy through inspection of a GUI displaying a visualization of the patient anatomy and the surgical tool.
Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1306050A1 (en) 2001-10-24 2003-05-02 BrainLAB AG Microprobe with navigation system
WO2006084194A2 (en) 2005-02-02 2006-08-10 Nuvasive, Inc. System and methods for monitoring during anterior surgery

Family Cites Families (237)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2429968A (en) 1945-10-22 1947-10-28 Martin L Stanphill Neuro-vaso detector
US2669986A (en) 1950-02-01 1954-02-23 James B Crawley Apparatus for electronically locating nerve irritations
US2704064A (en) 1952-09-10 1955-03-15 Meditron Company Neurosurgical stimulator
BE653103A (en) * 1961-01-13
US3336916A (en) 1963-10-30 1967-08-22 Richard F Edlich Electrocautery process
US3313293A (en) 1964-01-13 1967-04-11 Hewlett Packard Co Multi-electrode needle
US3682162A (en) 1968-12-13 1972-08-08 Wellcome Found Combined electrode and hypodermic syringe needle
US3598108A (en) 1969-02-28 1971-08-10 Khosrow Jamshidi Biopsy technique and biopsy device
US3628524A (en) 1969-02-28 1971-12-21 Khosrow Jamshidi Biopsy needle
US3630192A (en) 1969-07-14 1971-12-28 Khosrow Jamshidi Instrument for internal organ biopsy
US3664329A (en) 1970-03-09 1972-05-23 Concept Nerve locator/stimulator
US3785368A (en) 1971-08-23 1974-01-15 Carthy T Mc Abnormal nerve pressure locus detector and method
US3800783A (en) 1972-06-22 1974-04-02 K Jamshidi Muscle biopsy device
US3929123A (en) 1973-02-07 1975-12-30 Khosrow Jamshidi Muscle biopsy needle
US3830226A (en) 1973-06-15 1974-08-20 Concept Variable output nerve locator
US4163446A (en) 1978-01-31 1979-08-07 Khosrow Jamshidi Biopsy needle and removable pad therefor
NL183221C (en) 1978-03-20 1988-09-01 Univ Groningen DEVICE FOR DETECTING THE ACTIVITY OF THE BREATHING AGENTS AND THE HEART OF A LIVING BEING.
US4235242A (en) 1979-04-02 1980-11-25 Med General, Inc. Electronic circuit permitting simultaneous use of stimulating and monitoring equipment
US4262676A (en) 1979-08-24 1981-04-21 Khosrow Jamshidi Biopsy needle having integral stylet locking device
US4266555A (en) 1979-11-09 1981-05-12 Khosrow Jamshidi Biopsy needle with stylet and cannula orientation
US4356828A (en) 1980-03-03 1982-11-02 Khosrow Jamshidi Bone marrow aspiration needle
US4344440A (en) 1980-04-01 1982-08-17 Trygve Aaby Microprobe for monitoring biophysical phenomena associated with cardiac and neural activity
US4493327A (en) 1982-07-20 1985-01-15 Neurometrics, Inc. Automatic evoked potential detection
DE3229466A1 (en) 1982-08-06 1984-02-09 Sterimed Gesellschaft für medizinischen Bedarf mbH, 6600 Saarbrücken POINTING AND CATHETERIZING DEVICE FOR HUMAN OR ANIMAL BODIES
US4515168A (en) 1983-07-22 1985-05-07 Chester Martin H Clamp-on nerve stimulator and locator
US4892105A (en) 1986-03-28 1990-01-09 The Cleveland Clinic Foundation Electrical stimulus probe
JPH02500410A (en) 1986-08-05 1990-02-15 ユニヴァーシティ オブ ウエイルズ カレッジ オブ メディシン proximity detector
US4934377A (en) 1987-11-24 1990-06-19 The Cleveland Clinic Foundation Intraoperative neuroelectrophysiological monitoring system
FR2627981A1 (en) 1988-03-07 1989-09-08 Levy Guy APPARATUS FOR DETERMINING THE THICKNESS OF THE DENTINE ABOVE THE PULP ORGAN
DE8803153U1 (en) 1988-03-09 1988-06-23 B. Braun Melsungen Ag, 3508 Melsungen Catheter device for plexus anesthesia
US4920979A (en) 1988-10-12 1990-05-01 Huntington Medical Research Institute Bidirectional helical electrode for nerve stimulation
US4955383A (en) 1988-12-22 1990-09-11 Biofield Corporation Discriminant function analysis method and apparatus for disease diagnosis and screening
US4962766A (en) 1989-07-19 1990-10-16 Herzon Garrett D Nerve locator and stimulator
US5024228A (en) 1989-11-29 1991-06-18 Goldstone Andrew C Electrode endotracheal tube
US5125406A (en) 1989-11-29 1992-06-30 Eet Limited Partnership (Del) Electrode endotracheal tube
US5078147A (en) 1990-01-25 1992-01-07 Vivo Corporation Method of noninvasive ultrasonic detection of nerve root inflammation
US5046506A (en) 1990-02-09 1991-09-10 Singer Medical Products, Inc. Molded needle with adhesive
US5081990A (en) 1990-05-11 1992-01-21 New York University Catheter for spinal epidural injection of drugs and measurement of evoked potentials
US6347240B1 (en) 1990-10-19 2002-02-12 St. Louis University System and method for use in displaying images of a body part
WO1992006645A1 (en) 1990-10-19 1992-04-30 St. Louis University Surgical probe locating system for head use
US5388587A (en) 1990-12-04 1995-02-14 Dorsograf Ab Method and apparatus for measuring the transport time of nerve signals excited in different dermatoms of a patient
SE467561B (en) 1990-12-04 1992-08-10 Dorsograf Ab DEVICE FOR SEATING TRANSPORT TIME OF NERV SIGNALS
US5203330A (en) 1991-02-26 1993-04-20 Vickers Plc Disposable electrodes for electromyography (EMG) and nerve conduction velocity (NCV) and kit containing same
US5255677A (en) 1991-02-26 1993-10-26 Vickers Plc Disposable electrodes for electromyography (EMG) and nerve conduction velocity (NCV) and kit containing same
US5161533A (en) 1991-09-19 1992-11-10 Xomed-Treace Inc. Break-apart needle electrode system for monitoring facial EMG
US5524338A (en) 1991-10-22 1996-06-11 Pi Medical Corporation Method of making implantable microelectrode
US5201903A (en) 1991-10-22 1993-04-13 Pi (Medical) Corporation Method of making a miniature multi-conductor electrical cable
US5630839A (en) 1991-10-22 1997-05-20 Pi Medical Corporation Multi-electrode cochlear implant and method of manufacturing the same
US5284153A (en) 1992-04-14 1994-02-08 Brigham And Women's Hospital Method for locating a nerve and for protecting nerves from injury during surgery
US5474558A (en) 1992-04-30 1995-12-12 Neubardt; Seth L. Procedure and system for spinal pedicle screw insertion
US5196015A (en) 1992-04-30 1993-03-23 Neubardt Seth L Procedure for spinal pedicle screw insertion
US5271413A (en) 1992-07-22 1993-12-21 Dalamagas Photios P Method to sense the tissue for injection from a hypodermic needle
US5532613A (en) 1993-04-16 1996-07-02 Tokyo Electron Kabushiki Kaisha Probe needle
DE9422172U1 (en) 1993-04-26 1998-08-06 St. Louis University, St. Louis, Mo. Indicating the location of a surgical probe
US5335668A (en) 1993-04-30 1994-08-09 Medical Scientific, Inc. Diagnostic impedance measuring system for an insufflation needle
US5526821A (en) 1993-06-03 1996-06-18 Medical Biopsy, Inc. Biopsy needle with sample retaining means
US5421727A (en) 1993-06-07 1995-06-06 Stevens; Barry H. Dental instrument with microwave/RF radiation and method of treating a tooth
SE500769C2 (en) 1993-06-21 1994-08-29 Televerket Procedure for locating mobile stations in digital telecommunications networks
JP2630224B2 (en) 1993-09-30 1997-07-16 日本電気株式会社 Portable radio
US5524632A (en) 1994-01-07 1996-06-11 Medtronic, Inc. Method for implanting electromyographic sensing electrodes
US5560372A (en) 1994-02-02 1996-10-01 Cory; Philip C. Non-invasive, peripheral nerve mapping device and method of use
US5540235A (en) 1994-06-30 1996-07-30 Wilson; John R. Adaptor for neurophysiological monitoring with a personal computer
US5513651A (en) 1994-08-17 1996-05-07 Cusimano; Maryrose Integrated movement analyzing system
US5462065A (en) 1994-08-17 1995-10-31 Cusimano; Maryrose Integrated movement analyzing system
US6978166B2 (en) 1994-10-07 2005-12-20 Saint Louis University System for use in displaying images of a body part
EP0869745B8 (en) 1994-10-07 2003-04-16 St. Louis University Surgical navigation systems including reference and localization frames
WO1996032156A1 (en) 1995-04-10 1996-10-17 St. Luke's-Roosevelt Hospital Peripheral nerve stimulation device for unassisted nerve blockade
US5775331A (en) 1995-06-07 1998-07-07 Uromed Corporation Apparatus and method for locating a nerve
JP2736326B2 (en) 1995-07-10 1998-04-02 工業技術院長 Single nerve action potential measurement device
US5807275A (en) 1995-07-19 1998-09-15 Medical Biopsy, Inc. Biopsy needle
US5800500A (en) 1995-08-18 1998-09-01 Pi Medical Corporation Cochlear implant with shape memory material and method for implanting the same
US5630422A (en) 1995-09-08 1997-05-20 Zanakis; Michael F. Diagnostic system for detecting and indicating cranial movements
US6351659B1 (en) 1995-09-28 2002-02-26 Brainlab Med. Computersysteme Gmbh Neuro-navigation system
US5779642A (en) 1996-01-16 1998-07-14 Nightengale; Christopher Interrogation device and method
USD387427S (en) 1996-02-12 1997-12-09 Surgical Navigation Technologies, Inc. Ventriculostomy probe
US5941876A (en) 1996-03-11 1999-08-24 Medical Scientific, Inc. Electrosurgical rotating cutting device
BR9708139A (en) 1996-03-21 2000-10-24 Jasao Corp Process for verifying the effectiveness of a manipulation therapy
US6167145A (en) 1996-03-29 2000-12-26 Surgical Navigation Technologies, Inc. Bone navigation system
US7225019B2 (en) 1996-04-30 2007-05-29 Medtronic, Inc. Method and system for nerve stimulation and cardiac sensing prior to and during a medical procedure
US6009212A (en) 1996-07-10 1999-12-28 Washington University Method and apparatus for image registration
US5853373A (en) 1996-08-05 1998-12-29 Becton, Dickinson And Company Bi-level charge pulse apparatus to facilitate nerve location during peripheral nerve block procedures
US5843148A (en) 1996-09-27 1998-12-01 Medtronic, Inc. High resolution brain stimulation lead and method of use
WO1998025513A2 (en) 1996-12-09 1998-06-18 Swee Chuan Tjin Apparatus for continuous cardiac output monitoring
US5792212A (en) 1997-03-07 1998-08-11 Medtronic, Inc. Nerve evoked potential measurement system using chaotic sequences for noise rejection
US5928158A (en) 1997-03-25 1999-07-27 Aristides; Arellano Medical instrument with nerve sensor
DE59813142D1 (en) 1997-04-01 2005-12-01 Axel Muntermann DEVICE FOR DETECTING CATHETER TISSUE CONTACT AND INTERACTION WITH THE TISSUE IN CATHETER ABLATION
US6708184B2 (en) 1997-04-11 2004-03-16 Medtronic/Surgical Navigation Technologies Method and apparatus for producing and accessing composite data using a device having a distributed communication controller interface
US5970499A (en) 1997-04-11 1999-10-19 Smith; Kurt R. Method and apparatus for producing and accessing composite data
US6004312A (en) 1997-04-15 1999-12-21 Paraspinal Diagnostic Corporation Computerized EMG diagnostic system
US6002957A (en) 1997-04-15 1999-12-14 Paraspinal Diagnostic Corporation EMG electrode array support belt
USD422706S (en) 1997-04-30 2000-04-11 Surgical Navigation Technologies Biopsy guide tube
US7628761B2 (en) 1997-07-01 2009-12-08 Neurometrix, Inc. Apparatus and method for performing nerve conduction studies with localization of evoked responses
US5851191A (en) 1997-07-01 1998-12-22 Neurometrix, Inc. Apparatus and methods for assessment of neuromuscular function
US6132386A (en) 1997-07-01 2000-10-17 Neurometrix, Inc. Methods for the assessment of neuromuscular function by F-wave latency
US6434507B1 (en) 1997-09-05 2002-08-13 Surgical Navigation Technologies, Inc. Medical instrument and method for use with computer-assisted image guided surgery
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US5987960A (en) 1997-09-26 1999-11-23 Picker International, Inc. Tool calibrator
USD420132S (en) 1997-11-03 2000-02-01 Surgical Navigation Technologies Drill guide
US6356783B1 (en) 1997-11-20 2002-03-12 David R. Hubbard, Jr. Multi-electrode and needle injection device for diagnosis and treatment of muscle injury and pain
US6021343A (en) 1997-11-20 2000-02-01 Surgical Navigation Technologies Image guided awl/tap/screwdriver
US6348058B1 (en) 1997-12-12 2002-02-19 Surgical Navigation Technologies, Inc. Image guided spinal surgery guide, system, and method for use thereof
US6181961B1 (en) 1997-12-16 2001-01-30 Richard L. Prass Method and apparatus for an automatic setup of a multi-channel nerve integrity monitoring system
US6306100B1 (en) 1997-12-16 2001-10-23 Richard L. Prass Intraoperative neurophysiological monitoring system
US6654634B1 (en) 1997-12-16 2003-11-25 Richard L. Prass Method and apparatus for connection of stimulus and recording electrodes of a multi-channel nerve integrity monitoring system
US6011996A (en) 1998-01-20 2000-01-04 Medtronic, Inc Dual electrode lead and method for brain target localization in functional stereotactic brain surgery
US6330466B1 (en) 1998-02-23 2001-12-11 California Institute Of Technology Using a multi-electrode probe in creating an electrophysiological profile during stereotactic neurosurgery
US6810281B2 (en) 2000-12-21 2004-10-26 Endovia Medical, Inc. Medical mapping system
US6391005B1 (en) 1998-03-30 2002-05-21 Agilent Technologies, Inc. Apparatus and method for penetration with shaft having a sensor for sensing penetration depth
US6298262B1 (en) 1998-04-21 2001-10-02 Neutar, Llc Instrument guidance for stereotactic surgery
US6337994B1 (en) 1998-04-30 2002-01-08 Johns Hopkins University Surgical needle probe for electrical impedance measurements
US6224603B1 (en) 1998-06-09 2001-05-01 Nuvasive, Inc. Transiliac approach to entering a patient's intervertebral space
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US6027456A (en) 1998-07-10 2000-02-22 Advanced Neuromodulation Systems, Inc. Apparatus and method for positioning spinal cord stimulation leads
US6002964A (en) 1998-07-15 1999-12-14 Feler; Claudio A. Epidural nerve root stimulation
US6173300B1 (en) 1998-08-11 2001-01-09 Advanced Micro Devices, Inc. Method and circuit for determining leading or trailing zero count
US6292701B1 (en) 1998-08-12 2001-09-18 Medtronic Xomed, Inc. Bipolar electrical stimulus probe with planar electrodes
US6104957A (en) 1998-08-21 2000-08-15 Alo; Kenneth M. Epidural nerve root stimulation with lead placement method
US6745062B1 (en) 1998-10-05 2004-06-01 Advanced Imaging Systems, Inc. Emg electrode apparatus and positioning system
WO2000021442A1 (en) 1998-10-09 2000-04-20 Surgical Navigation Technologies, Inc. Image guided vertebral distractor
US5947972A (en) 1998-10-28 1999-09-07 Midas Rex, L.P. Irrigation pressurization system
US6266558B1 (en) 1998-12-01 2001-07-24 Neurometrix, Inc. Apparatus and method for nerve conduction measurements with automatic setting of stimulus intensity
US6754374B1 (en) 1998-12-16 2004-06-22 Surgical Navigation Technologies, Inc. Method and apparatus for processing images with regions representing target objects
DE19859155C2 (en) 1998-12-21 2003-08-28 Henke Sass Wolf Gmbh Endoscope with a coupling device (video coupler) for connecting a video camera
ATE306213T1 (en) 1998-12-23 2005-10-15 Nuvasive Inc DEVICES FOR CANNULATION AND NERVE MONITORING
US6564078B1 (en) 1998-12-23 2003-05-13 Nuvasive, Inc. Nerve surveillance cannula systems
DE60038603T2 (en) 1999-01-13 2009-05-20 Cytyc Corp., Marlborough IDENTIFICATION OF DUCTAL OPENINGS USING THE CHARACTERISTIC ELECTRICAL SIGNAL
US6193715B1 (en) 1999-03-19 2001-02-27 Medical Scientific, Inc. Device for converting a mechanical cutting device to an electrosurgical cutting device
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US6224549B1 (en) 1999-04-20 2001-05-01 Nicolet Biomedical, Inc. Medical signal monitoring and display
US6491699B1 (en) 1999-04-20 2002-12-10 Surgical Navigation Technologies, Inc. Instrument guidance method and system for image guided surgery
US6190395B1 (en) 1999-04-22 2001-02-20 Surgical Navigation Technologies, Inc. Image guided universal instrument adapter and method for use with computer-assisted image guided surgery
US6259945B1 (en) 1999-04-30 2001-07-10 Uromed Corporation Method and device for locating a nerve
US6928490B1 (en) 1999-05-20 2005-08-09 St. Louis University Networking infrastructure for an operating room
US6366805B1 (en) 1999-05-26 2002-04-02 Viasys Healthcare Inc. Time frame synchronization of medical monitoring signals
FR2795624B1 (en) 1999-07-01 2001-09-28 Vanacker Gerard METHOD FOR DRILLING THE VERTEBRAL PEDICLE, PARTICULARLY FOR THE PLACEMENT OF A PEDICLE SCREW, AND AN INSTRUMENT FOR IMPLEMENTING SUCH A METHOD
US6298256B1 (en) 1999-09-10 2001-10-02 Frank-Egbert Meyer Device and method for the location and catheterization of the surroundings of a nerve
US6334068B1 (en) 1999-09-14 2001-12-25 Medtronic Xomed, Inc. Intraoperative neuroelectrophysiological monitor
US6674916B1 (en) 1999-10-18 2004-01-06 Z-Kat, Inc. Interpolation in transform space for multiple rigid object registration
US6187018B1 (en) 1999-10-27 2001-02-13 Z-Kat, Inc. Auto positioner
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US6381485B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies, Inc. Registration of human anatomy integrated for electromagnetic localization
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US6499488B1 (en) 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
GB2356051A (en) 1999-11-06 2001-05-09 Neil Meredith Measuring the vascularity within bone tissue using electrical contact impedance measurements
US6466817B1 (en) 1999-11-24 2002-10-15 Nuvasive, Inc. Nerve proximity and status detection system and method
JP4854900B2 (en) 1999-11-24 2012-01-18 ヌバシブ, インコーポレイテッド EMG measurement method
US6725078B2 (en) 2000-01-31 2004-04-20 St. Louis University System combining proton beam irradiation and magnetic resonance imaging
US6725080B2 (en) 2000-03-01 2004-04-20 Surgical Navigation Technologies, Inc. Multiple cannula image guided tool for image guided procedures
US6456874B1 (en) 2000-03-13 2002-09-24 Arrow International Inc. Instrument for delivery of anaesthetic drug
US8611993B2 (en) 2000-03-13 2013-12-17 Arrow International, Inc. Pre-loaded lockable stimulating catheter for delivery of anaesthetic drugs
US7386341B2 (en) 2000-03-13 2008-06-10 Arrow International, Inc. Instrument and method for delivery of anaesthetic drugs
AUPQ646500A0 (en) 2000-03-27 2000-04-20 Australian National University, The Method and apparatus for assessing neural function by sparse stimuli
US6312392B1 (en) 2000-04-06 2001-11-06 Garrett D. Herzon Bipolar handheld nerve locator and evaluator
US7660621B2 (en) 2000-04-07 2010-02-09 Medtronic, Inc. Medical device introducer
US6535756B1 (en) 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
WO2001087154A1 (en) 2000-05-18 2001-11-22 Nuvasive, Inc. Tissue discrimination and applications in medical procedures
US6748276B1 (en) 2000-06-05 2004-06-08 Advanced Neuromodulation Systems, Inc. Neuromodulation therapy system
AU6976801A (en) 2000-06-08 2001-12-17 Nuvasive Inc Relative nerve movement and status detection system and method
US6306403B1 (en) 2000-06-14 2001-10-23 Allergan Sales, Inc. Method for treating parkinson's disease with a botulinum toxin
US6704736B1 (en) 2000-06-28 2004-03-09 Microsoft Corporation Method and apparatus for information transformation and exchange in a relational database environment
US6671550B2 (en) 2000-09-20 2003-12-30 Medtronic, Inc. System and method for determining location and tissue contact of an implantable medical device within a body
US6487446B1 (en) 2000-09-26 2002-11-26 Medtronic, Inc. Method and system for spinal cord stimulation prior to and during a medical procedure
US6533732B1 (en) 2000-10-17 2003-03-18 William F. Urmey Nerve stimulator needle guidance system
US6560479B2 (en) 2001-01-17 2003-05-06 Viasys Healthcare Inc. Electrode disconnect system and method for medical signal monitoring system
US6725086B2 (en) 2001-01-17 2004-04-20 Draeger Medical Systems, Inc. Method and system for monitoring sedation, paralysis and neural-integrity
EP1364183B1 (en) 2001-01-30 2013-11-06 Mako Surgical Corp. Tool calibrator and tracker system
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US20030105503A1 (en) 2001-06-08 2003-06-05 Nuvasive, Inc. Relative nerve movement and status detection system and method
NZ512510A (en) 2001-06-20 2004-02-27 Assa Abloy Financial Services A latch device
DE10129912A1 (en) 2001-06-21 2003-01-02 Efmt Entwicklungs Und Forschun Needle electrode
US6685729B2 (en) 2001-06-29 2004-02-03 George Gonzalez Process for testing and treating aberrant sensory afferents and motor efferents
US6832111B2 (en) 2001-07-06 2004-12-14 Hosheng Tu Device for tumor diagnosis and methods thereof
JP4295086B2 (en) 2001-07-11 2009-07-15 ヌバシブ, インコーポレイテッド System and method for determining nerve proximity, nerve orientation, and pathology during surgery
GB2379878B (en) 2001-09-21 2004-11-10 Gyrus Medical Ltd Electrosurgical system and method
EP1435828A4 (en) 2001-09-25 2009-11-11 Nuvasive Inc System and methods for performing surgical procedures and assessments
US6965794B2 (en) 2001-10-05 2005-11-15 Fasstech, Inc. Apparatus for routing electromyography signals
US20030093007A1 (en) 2001-10-17 2003-05-15 The Government Of The U.S.A., As Represented By The Secretary, Department Of Health And Human Serv Biopsy apparatus with radio frequency cauterization and methods for its use
EP1444004B1 (en) 2001-10-18 2011-12-14 Uroplasty, Inc. Electro-nerve stimulator system and methods
EP1443859A4 (en) 2001-10-24 2006-03-22 Cutting Edge Surgical Inc Intraosteal ultrasound during surgical implantation
US7664544B2 (en) 2002-10-30 2010-02-16 Nuvasive, Inc. System and methods for performing percutaneous pedicle integrity assessments
DE50103657D1 (en) 2001-11-02 2004-10-21 Moeller Wedel Gmbh Observation device for a stereoscopic surgical microscope
US6807444B2 (en) 2001-11-05 2004-10-19 Hosheng Tu Apparatus and methods for monitoring tissue impedance
US7214197B2 (en) 2001-11-06 2007-05-08 Prass Richard L Intraoperative neurophysiological monitoring system
US6993384B2 (en) 2001-12-04 2006-01-31 Advanced Bionics Corporation Apparatus and method for determining the relative position and orientation of neurostimulation leads
US20050010262A1 (en) 2002-02-01 2005-01-13 Ali Rezai Modulation of the pain circuitry to affect chronic pain
FR2835732B1 (en) 2002-02-11 2004-11-12 Spinevision DEVICE FOR TRACKING THE PENETRATION OF A PENETRATION MEANS IN ANATOMICAL ELEMENTS
US20030167021A1 (en) 2002-03-04 2003-09-04 Shimm Peter B. Apparatus for locating and anesthetizing nerve groups
US20040176759A1 (en) 2003-03-07 2004-09-09 Subashini Krishnamurthy Radiopaque electrical needle
US7294127B2 (en) 2002-03-05 2007-11-13 Baylis Medical Company Inc. Electrosurgical tissue treatment method
US20050177209A1 (en) 2002-03-05 2005-08-11 Baylis Medical Company Inc. Bipolar tissue treatment system
US6896675B2 (en) 2002-03-05 2005-05-24 Baylis Medical Company Inc. Intradiscal lesioning device
US8882755B2 (en) 2002-03-05 2014-11-11 Kimberly-Clark Inc. Electrosurgical device for treatment of tissue
US20050277918A1 (en) 2003-03-07 2005-12-15 Baylis Medical Company Inc. Electrosurgical cannula
US7831292B2 (en) * 2002-03-06 2010-11-09 Mako Surgical Corp. Guidance system and method for surgical procedures with improved feedback
AU2003218010A1 (en) 2002-03-06 2003-09-22 Z-Kat, Inc. System and method for using a haptic device in combination with a computer-assisted surgery system
US20040010204A1 (en) 2002-03-28 2004-01-15 Pearl Technology Holdings, Llc Electronic/fiberoptic tissue differentiation instrumentation
AU2003245439A1 (en) 2002-06-11 2003-12-22 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US20040044295A1 (en) * 2002-08-19 2004-03-04 Orthosoft Inc. Graphical user interface for computer-assisted surgery
US6892090B2 (en) 2002-08-19 2005-05-10 Surgical Navigation Technologies, Inc. Method and apparatus for virtual endoscopy
US7282033B2 (en) 2002-09-04 2007-10-16 Urmey William F Positioning system for a nerve stimulator needle
US20040049121A1 (en) 2002-09-06 2004-03-11 Uri Yaron Positioning system for neurological procedures in the brain
US20040049475A1 (en) * 2002-09-06 2004-03-11 Toshiba Tec Kabushiki Kaisha System and method for globally providing document access history information
US20040122482A1 (en) 2002-12-20 2004-06-24 James Tung Nerve proximity method and device
WO2004062470A2 (en) 2003-01-03 2004-07-29 Advanced Neuromodulation Systems, Inc. System and method for stimulation of a person’s brain stem
US7216001B2 (en) 2003-01-22 2007-05-08 Medtronic Xomed, Inc. Apparatus for intraoperative neural monitoring
DE10303964A1 (en) 2003-01-31 2004-08-19 Wolfgang Prof. Dr. Oettinger Medical drilling device and medical drilling method
WO2004070573A2 (en) * 2003-02-04 2004-08-19 Z-Kat, Inc. Computer-assisted external fixation apparatus and method
US20040225228A1 (en) 2003-05-08 2004-11-11 Ferree Bret A. Neurophysiological apparatus and procedures
US6999820B2 (en) 2003-05-29 2006-02-14 Advanced Neuromodulation Systems, Inc. Winged electrode body for spinal cord stimulation
US20040243207A1 (en) 2003-05-30 2004-12-02 Olson Donald R. Medical implant systems
US20040260358A1 (en) 2003-06-17 2004-12-23 Robin Vaughan Triggered electromyographic test device and methods of use thereof
US20040260357A1 (en) 2003-06-17 2004-12-23 Robin Vaughan Triggered electromyographic test device and methods of use thereof
CA2432810A1 (en) 2003-06-19 2004-12-19 Andres M. Lozano Method of treating depression, mood disorders and anxiety disorders by brain infusion
US6960208B2 (en) 2003-06-30 2005-11-01 Boston Scientific Scimed, Inc. Apparatus and methods for delivering energy to a target site within bone
DE10333543A1 (en) * 2003-07-23 2005-02-24 Siemens Ag A method for the coupled presentation of intraoperative as well as interactively and iteratively re-registered preoperative images in medical imaging
AU2004263152B2 (en) 2003-08-05 2009-08-27 Nuvasive, Inc. Systems and methods for performing dynamic pedicle integrity assessments
US20050033393A1 (en) 2003-08-08 2005-02-10 Advanced Neuromodulation Systems, Inc. Apparatus and method for implanting an electrical stimulation system and a paddle style electrical stimulation lead
US7905840B2 (en) 2003-10-17 2011-03-15 Nuvasive, Inc. Surgical access system and related methods
WO2005051480A2 (en) 2003-11-20 2005-06-09 Advanced Neuromodulation Systems, Inc. Electrical stimulation system, lead, and method providing reduced neuroplasticity effects
US7283866B2 (en) 2004-05-18 2007-10-16 Hatch Ltd Needle having multiple electrodes
US10342452B2 (en) 2004-07-29 2019-07-09 Medtronic Xomed, Inc. Stimulator handpiece for an evoked potential monitoring system
JP5145038B2 (en) * 2004-08-09 2013-02-13 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method and apparatus for processing an image of an interventional instrument having a marker
US20060224219A1 (en) 2005-03-31 2006-10-05 Sherwood Services Ag Method of using neural stimulation during nucleoplasty procedures
US7553307B2 (en) 2004-10-15 2009-06-30 Baxano, Inc. Devices and methods for tissue modification
US20060122458A1 (en) 2004-10-15 2006-06-08 Baxano, Inc. Devices and methods for tissue access
EP1799129B1 (en) 2004-10-15 2020-11-25 Baxano, Inc. Devices for tissue removal
US7865236B2 (en) * 2004-10-20 2011-01-04 Nervonix, Inc. Active electrode, bio-impedance based, tissue discrimination system and methods of use
US7643861B2 (en) 2005-01-18 2010-01-05 John Richard Ives Technique for design, and placement, of a subdermal Ag—Ag/Cl biopotential electrode
US7643884B2 (en) 2005-01-31 2010-01-05 Warsaw Orthopedic, Inc. Electrically insulated surgical needle assembly
US20060173374A1 (en) 2005-01-31 2006-08-03 Neubardt Seth L Electrically insulated surgical probing tool
US20060178594A1 (en) 2005-02-07 2006-08-10 Neubardt Seth L Apparatus and method for locating defects in bone tissue
US8092455B2 (en) 2005-02-07 2012-01-10 Warsaw Orthopedic, Inc. Device and method for operating a tool relative to bone tissue and detecting neural elements
US20060200219A1 (en) 2005-03-01 2006-09-07 Ndi Medical, Llc Systems and methods for differentiating and/or identifying tissue regions innervated by targeted nerves for diagnostic and/or therapeutic purposes
US7896815B2 (en) 2005-03-01 2011-03-01 Checkpoint Surgical, Llc Systems and methods for intra-operative stimulation
JP2006340774A (en) * 2005-06-07 2006-12-21 Hitachi Medical Corp Surgery navigation system with nerve monitoring function
WO2007017642A1 (en) * 2005-08-05 2007-02-15 Depuy Orthopädie Gmbh Computer assisted surgery system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1306050A1 (en) 2001-10-24 2003-05-02 BrainLAB AG Microprobe with navigation system
WO2006084194A2 (en) 2005-02-02 2006-08-10 Nuvasive, Inc. System and methods for monitoring during anterior surgery

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103857349A (en) * 2011-09-26 2014-06-11 尹祥真 Intelligent surgery system
CN103857349B (en) * 2011-09-26 2016-08-17 尹祥真 Intelligent operation system
WO2014037531A1 (en) * 2012-09-06 2014-03-13 Norwegian University Of Science And Technology (Ntnu) Treatment of headache by injection of neuroinhibitory substance to sphenopalatine ganglion or otic ganglion
US9579368B2 (en) 2012-09-06 2017-02-28 Norwegian University Of Science And Technology (Ntnu) Treatment of headache by injection of neuroinhibitory substance to sphenopalatine ganglion or otic ganglion
US10716834B2 (en) 2012-09-06 2020-07-21 Norwegian University Of Science And Technology (Ntnu) Intervention device
US11712464B2 (en) 2012-09-06 2023-08-01 Norwegian University Of Science And Technology (Ntnu) Intervention device

Also Published As

Publication number Publication date
AU2008207954A1 (en) 2008-07-31
WO2008091917A4 (en) 2009-02-26
US8374673B2 (en) 2013-02-12
KR20090115162A (en) 2009-11-04
JP2010516406A (en) 2010-05-20
US20080183190A1 (en) 2008-07-31
EP2124735B1 (en) 2015-07-15
CN101677778A (en) 2010-03-24
WO2008091917A3 (en) 2008-12-18
EP2124735A2 (en) 2009-12-02

Similar Documents

Publication Publication Date Title
US8374673B2 (en) Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control
AU2008209355B2 (en) Integrated visualization of surgical navigational and neural monitoring information
US7987001B2 (en) Surgical navigational and neuromonitoring instrument
US20080183188A1 (en) Integrated Surgical Navigational and Neuromonitoring System
EP2431070A1 (en) Method and apparatus for coordinated display of anatomical and neuromonitoring information
KR101157312B1 (en) Surgical navigational and neuromonitoring instrument
US8734466B2 (en) Method and apparatus for controlled insertion and withdrawal of electrodes
EP2139418A1 (en) Method and apparatus for controlled insertion and withdrawal of electrodes
JP2018527118A (en) Illuminated endoscopic pedicle probe for dynamic real-time monitoring for proximity to the nerve

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase; Ref document number: 200880007403.1; Country of ref document: CN
121 Ep: the epo has been informed by wipo that ep was designated in this application; Ref document number: 08713917; Country of ref document: EP; Kind code of ref document: A2
WWE Wipo information: entry into national phase; Ref document number: 2008207954; Country of ref document: AU
ENP Entry into the national phase; Ref document number: 2009547389; Country of ref document: JP; Kind code of ref document: A
ENP Entry into the national phase; Ref document number: 2008207954; Country of ref document: AU; Date of ref document: 20080123; Kind code of ref document: A
NENP Non-entry into the national phase; Ref country code: DE
WWE Wipo information: entry into national phase; Ref document number: 2008713917; Country of ref document: EP
WWE Wipo information: entry into national phase; Ref document number: 1020097017591; Country of ref document: KR