USRE49094E1 - Systems and methods for performing spine surgery - Google Patents

Systems and methods for performing spine surgery

Info

Publication number
USRE49094E1
Authority
US
United States
Prior art keywords
image
instrument
surgical
target site
neurophysiological
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/211,219
Inventor
Eric Finley
Albert Kim
Thomas Scholl
Jeffrey Barnes
Bryce Nesbitt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nuvasive Inc
Original Assignee
Nuvasive Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nuvasive, Inc.
Priority to US16/211,219
Assigned to BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT: security interest (see document for details). Assignors: NUVASIVE CLINICAL SERVICES MONITORING, INC.; NUVASIVE CLINICAL SERVICES, INC.; NUVASIVE SPECIALIZED ORTHOPEDICS, INC.; NUVASIVE, INC.
Assigned to NUVASIVE, INC.: assignment of assignors interest (see document for details). Assignors: BARNES, JEFFREY; SCHOLL, THOMAS; FINLEY, ERIC; KIM, ALBERT; NESBITT, BRYCE
Application granted
Publication of USRE49094E1
Status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/06Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/06Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
    • A61B5/061Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/064Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4887Locating particular structures in or on the body
    • A61B5/4893Nerves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • A61B5/743Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • A61B2034/252User interfaces for surgical systems indicating steps of a surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • A61B2034/254User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/389Electromyography [EMG]

Definitions

  • the present application pertains to spine surgery. More particularly, the present application pertains to a surgical tracking system that monitors the locations of surgical objects and nerves within the body during spine surgery.
  • the spinal column is a highly complex system of bones and connective tissues that provide support for the body and protect the delicate spinal cord and nerves.
  • the spinal column includes a series of vertebral bodies stacked atop one another, each vertebral body including an inner or central portion of relatively weak cancellous bone and an outer portion of relatively strong cortical bone. Situated between each vertebral body is an intervertebral disc that cushions and dampens compressive forces exerted upon the spinal column.
  • a vertebral canal containing the spinal cord is located behind the vertebral bodies.
  • the spine has a natural curvature (i.e., lordosis in the lumbar and cervical regions and kyphosis in the thoracic region) such that the endplates of the upper and lower vertebrae are inclined towards one another.
  • spinal column disorders including scoliosis (abnormal lateral curvature of the spine), excess kyphosis (abnormal forward curvature of the spine), excess lordosis (abnormal backward curvature of the spine), spondylolisthesis (forward displacement of one vertebra over another), and other disorders caused by abnormalities, disease, or trauma (such as ruptured or slipped discs, degenerative disc disease, fractured vertebrae, and the like). Patients that suffer from such conditions often experience extreme and debilitating pain, as well as diminished nerve function.
  • Open surgical techniques are generally undesirable in that they typically require large incisions with high amounts of tissue displacement to gain access to the surgical target site, which produces concomitantly high amounts of pain, lengthened hospitalization (increasing health care costs), and high morbidity in the patient population.
  • Less-invasive surgical techniques, including minimal access and minimally invasive techniques, are gaining favor because they access the surgical target site via incisions of substantially smaller size with greatly reduced tissue displacement requirements. This, in turn, reduces the pain, morbidity, and cost associated with such procedures.
  • the systems and methods described herein are directed to addressing the challenges described above, and others, associated with various minimally-invasive spine procedures.
  • the present invention includes a system and methods for decreased reliance on fluoroscopic imaging while avoiding harm to neural tissue during surgery.
  • the present invention includes a position tracking system for tracking the location of surgical objects within the surgical field, a neuromonitoring system for detecting the existence of (and optionally the distance and/or direction to) neural structures during a surgical procedure, and a processing system communicatively linked to both the position tracking system and the neuromonitoring system.
  • the position tracking system includes an infrared (IR) position sensor, an IR-reflective tracking array attached to an intraoperative imaging system, and at least one IR-reflective tracking array attached to at least one surgical object.
  • the position tracking system is communicatively linked to the processing system such that the processing system may display position tracking information to a user (e.g., a surgeon or a medical professional assisting the surgeon).
  • the neuromonitoring system includes instruments capable of stimulating the peripheral nerves of a patient and additional instruments capable of recording the evoked neuromuscular responses.
  • the neuromonitoring system is communicatively linked to the processing system such that the processing unit is programmed to measure the response of nerves depolarized by the stimulation signals to indicate the existence of (and optionally the distance and/or direction to) neural structures during the surgical procedure.
  • the processing system is programmed to perform a plurality of predetermined functions using one or both of the position tracking system and neuromonitoring system.
  • the processing system may be programmed to register the position of a patient, scale virtual fluoroscopic images, track one or more surgical objects within the patient, determine the distance between two points of interest within the surgical field, determine the angle between two points of interest within the surgical field, recommend the size of one or more surgical instruments or implants, perform neurophysiologic monitoring, map neurophysiologic monitoring onto virtual fluoroscopic images, correct for translational and/or rotational offsets of virtual fluoroscopic images, and track the movement of one or more reference objects during the surgical procedure.
  • the processing system is further configured to display the results of these predetermined functions to the user in a meaningful way.
  • one or more surgical procedures may be performed using various embodiments of the system.
  • the surgical procedure is a minimally-invasive lateral lumbar surgery.
  • the surgical procedure is a posterior (open or minimally invasive) lumbar surgery.
  • FIG. 1 is an example operating room setup depicting the components of a surgical tracking system, according to an example embodiment
  • FIG. 2 is a perspective view of a C-arm fluoroscope comprising part of the system of FIG. 1 ;
  • FIG. 3 is a perspective view of various surgical objects that may be tracked with the system of FIG. 1 ;
  • FIG. 4 is a screen shot depicting an example welcome screen of the system of FIG. 1 ;
  • FIG. 5 is a screen shot depicting an example procedure information screen of the system of FIG. 1 ;
  • FIG. 6 is a screen shot depicting an example C-arm setup screen of the system of FIG. 1 ;
  • FIG. 7 is a screen shot depicting an example patient positioning screen during a first step of a patient positioning sequence
  • FIG. 8 is a screen shot depicting the example patient positioning screen during a second step of the patient positioning sequence
  • FIG. 9 is a screen shot depicting an example IR positioning sensor setup screen of the system of FIG. 1 ;
  • FIG. 10 is a flow chart indicating the steps of a patient registration and scaling sequence of the system of FIG. 1 , according to one example embodiment
  • FIG. 11 is a screen shot depicting an example registration and scaling screen during a first step in the example patient registration sequence of FIG. 10 ;
  • FIG. 12 is a screen shot depicting the example registration and scaling screen during a second step in the example patient registration sequence of FIG. 10 ;
  • FIG. 13 is a screen shot depicting the example registration and scaling screen during a third step in the example patient registration sequence of FIG. 10 ;
  • FIG. 14 is a screen shot depicting the example registration and scaling screen during a fourth step in the example patient registration sequence of FIG. 10 ;
  • FIG. 15 is a screen shot depicting the example registration and scaling screen during a first step in the example scaling sequence of FIG. 10 ;
  • FIG. 16 is a screen shot depicting the example registration and scaling screen during a second step in the example scaling sequence of FIG. 10 ;
  • FIG. 17 is a screen shot depicting the system tracking the 3D location of a surgical tool in real time
  • FIG. 18 is a screen shot of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations as an initial dilator is advanced from the incision site to a target site on a patient;
  • FIG. 19 is a screen shot of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations as the initial dilator is retreated towards the incision site;
  • FIGS. 20-23 are screen shots of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations and various rotational positions of the electrode as the initial dilator is advanced towards the target site;
  • FIG. 24 is a screen shot of the system in nerve mapping mode according to a second embodiment
  • FIG. 25 is a screen shot of the system in nerve mapping mode according to a third embodiment.
  • FIG. 26 is a screen shot of the system in offset mode according to a first embodiment.
  • the present invention includes a surgical tracking system that monitors the 3D location of objects in or near a surgical field and then conveys the position of these objects to a user relative to traditional, yet virtual anatomical views.
  • the system conveys the position of virtualized representations of these surgical objects over one or more virtual fluoroscopic images.
  • the virtual surgical objects may be generated with software (e.g., 3D CAD models) and then superimposed onto static two-dimensional (2D) images (e.g., in lateral and anterior-posterior (A/P) views).
  • the system does not eliminate the need for fluoroscopic imaging entirely, it does limit the number of fluoroscopic images required during a surgical procedure thereby minimizing the amount of ionizing radiation that the patient and user are exposed to. Furthermore, the system accomplishes this reduction in radiation exposure without requiring reference markers fixed to the patient, input of pre-operative CT images, or additional large, specialized, and costly equipment.
  • the present invention may facilitate monitoring the location and orientation of surgical access instruments which can aid in both the insertion and positioning of the surgical access instruments themselves, as well as aiding in the later insertion of instruments and/or implants through the surgical access instruments.
  • tracking the location of surgical instruments within the surgical field may be accomplished with significantly decreased reliance on intraoperative imaging.
  • the present invention may be used in conjunction with, or integrated into, a neuromonitoring system for assessing one or more of nerve proximity (and/or nerve directionality) and pedicle integrity, among other functions.
  • the present invention may facilitate safe and reproducible pedicle screw placement by monitoring the trajectory of various surgical instruments used during pilot hole formation and/or screw insertion. While the above examples are described in more detail below, it is expressly noted that they are set forth by way of example and that the present invention may be suitable for use in any number of additional surgical actions where tracking the 3D location of surgical instruments and implants within the surgical field and decreased exposure to x-ray radiation are desired. Accordingly, it will be appreciated then that while the surgical tracking system is generally discussed herein as being attached to instruments such as dilating cannulas, tissue retractors, C-arms, pedicle access tools, etc., other instruments may be substituted depending on the surgical procedure being performed and/or the needs of the surgeon.
  • a surgical tracking system 10 including a position tracking system 12 and at least one surgical object to be tracked.
  • the position tracking system 12 includes an infrared (IR) position sensor 14 (also referred to as an “IR camera”), an IR-reflective tracking array 16 mounted on an intraoperative imaging system 18 , an IR-reflective tracking array 20 mountable to each surgical object to be tracked (e.g. initial dilator 32 ) and a feedback and control device comprising a control unit 22 and a display 24 .
  • the control unit 22 has position tracking software and C-arm video import capabilities and is communicatively linked to the display 24 so that information relevant to the surgical procedure may be conveyed to the user in a meaningful manner.
  • the relevant information includes, but is not limited to, spatial positioning data (e.g., translational data in the x, y, and z axes and orientation/rotational data) acquired by the IR position tracking sensor 14 .
  • the intraoperative imaging system 18 may be any commercially available C-arm fluoroscope 26 communicatively linked to a C-arm display 28 with an IR-reflective tracking array 16 attached, for example, to the signal receiver. As illustrated in FIG. 2 , the IR-reflective tracking array 16 is preferably attached to the C-arm 26 by one or more mating points (not shown) on a reticle 30 .
  • reticle 30 may be the reticle shown and described in PCT Patent App. No. PCT/US2008/012121, entitled “Surgical Trajectory Monitoring System and Related Methods,” and filed on Oct. 24, 2008, the entire contents of which is hereby incorporated by reference as if set forth fully herein.
  • the surgical objects to be tracked may be one or more surgical instruments including initial dilator 32 , cobb 238 , rasp 240 , and intervertebral implants 244 (shown in FIG. 3 attached to an implant inserter 242 ).
  • other surgical objects to be tracked may include retractor blades and intervertebral implant trials, among others.
  • any instrument, implant, or device suitable for use in spine surgery may be outfitted with an IR-reflective tracking array 20 and tracked within the surgical field in accordance with the present disclosure.
  • Each IR-reflective tracking array 16 , 20 has IR-reflective spheres 34 arranged in a calculated manner somewhere along its length. Spatial position data about a surgical object may be acquired in its raw form relative to the IR position tracking sensor's 14 local coordinate system with each IR-reflective tracking array 16 , 20 determining its own coordinate system.
  • the system 10 further comprises a neuromonitoring system 36 communicatively linked to the position tracking system 12 via the control unit 22 .
  • the neuromonitoring system 36 may be the neuromonitoring system shown and described in U.S. Pat. No. 8,255,045, entitled “Neurophysiologic Monitoring System” and filed on Apr. 3, 2008, the entire contents of which is hereby incorporated by reference as if set forth fully herein.
  • the neuromonitoring system 36 may be further configured to perform neuromonitoring as a surgical access corridor is created (e.g. a lateral access corridor created via a sequential dilation assembly) and/or maintained (e.g. via a retractor assembly) and as one or more surgical implants are placed into the body (e.g. an intervertebral implant).
  • the neuromonitoring system 36 may also be configured to: 1) assess the depth of nearby nerves relative to a surgical object; 2) localize the relative position of the nerves once the surgical object reaches the spinal target site (e.g. a lateral aspect of the spine); 3) couple the neurophysiology data with the previously-acquired positional data of the surgical object; and 4) present the combined data to the user via a virtual fluoroscopic image.
  • the surgical tracking system 10 may be used for monitoring the approach during a lateral trans-psoas procedure shown and described in U.S. Pat. No. 7,905,840, entitled “Surgical Access System and Related Methods,” and filed on Oct. 18, 2004, the entire contents of which is hereby incorporated by reference as if set forth fully herein.
  • FIGS. 4-26 illustrate, by way of example only, one embodiment of a screen display 100 (e.g., a graphical user interface (GUI)) of the control unit 22 capable of receiving input from a user in addition to communicating feedback information to the user.
  • FIG. 4 depicts a welcome screen in accordance with one exemplary embodiment.
  • the system 10 is configured to detect the connection status of each of its required components.
  • icons 102 , 104 , 106 indicate the connection status of the IR position sensor 14 , C-arm reticle 30 , and C-arm video output, respectively.
  • the display 100 may alert the user to address the issue before proceeding via textual, audio, and/or visual means (e.g., textual messages, audible tones, colored screen, blinking screen, etc.).
  • a separate indicator may alert the user once all of the required components are properly connected. For example, an “OK” message below each of icons 102 , 104 , and 106 indicates all components are properly connected. Selecting the “Start” button 108 proceeds to the next screen.
  • FIG. 5 illustrates an example procedure input screen.
  • the user may use the screen 100 to input information into one or more selection fields regarding the patient and procedure.
  • the selection fields may be drop-down menus or check-boxes pertaining to patient positioning 110 (e.g., lateral, prone), spinal levels to be operated on 112 (e.g., L1, L2, L3, L4, L5, S1), and surgical instruments to track 114 (e.g., dilator, cobb, rasp, inserter, retractor blade, custom/other).
  • selecting the “Next” button 116 proceeds to the next screen.
  • the tracked surgical objects are preferably monitored with respect to the position of, and displayed onto the anatomy of one or more virtual fluoroscopic images from the perspective of, the intraoperative imaging system 18 (not with respect to the position of, and the perspective of, the IR position sensor 14 ).
  • a surgical object is referenced off of the location of the C-arm 26 and its positional information is determined relative to the coordinate system of the tracking array 16 of the C-arm 26 (the “working coordinate system”).
  • the coordinate system of the IR reflective tracking array 16 affixed to the C-arm 26 is oriented in the direction of the x-ray beam according to a preferred embodiment.
  • one of the axes of the IR reflective tracking array's 16 coordinate system is preferably oriented parallel to the beam axis, and the definition of this orientation is known in the position tracking software resident on the control unit 22 .
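  • By way of illustration only (this sketch is not part of the patent text), re-referencing a tracked pose from the IR sensor's own frame into the C-arm array's “working coordinate system” is a rigid-body change of frame; the function and variable names below are hypothetical:

        import numpy as np

        def to_working_frame(R_carm, t_carm, R_tool, t_tool):
            # Poses reported by the IR position sensor: 3x3 rotation
            # matrices and 3-vector positions (mm) in the sensor's frame.
            # Re-express the tool relative to the C-arm array 16.
            R_rel = R_carm.T @ R_tool
            t_rel = R_carm.T @ (t_tool - t_carm)
            return R_rel, t_rel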
  • Prior to installing the reticle 30 , the C-arm 26 is placed in the traditional lateral position (with the arc of the “C” pointing straight down (0°) and the axis representing the x-ray beam at 90°).
  • Accelerometers (not shown) installed inside the reticle 30 may be used to generate angular data such that the reticle 30 can be installed at the 12 o'clock position relative to the surface of the C-arm signal receiver.
  • the C-arm 26 can be placed in a traditional anterior/posterior (A/P) position.
  • A/P anterior/posterior
  • the user moves the C-arm 26 to this position with the aid of the accelerometer readout of the reticle 30 as described in the '121 application.
  • An exact A/P position is achieved when the accelerometer's x and y axes both read 0°.
  • the software establishes the corresponding array 16 position as 0°/0°, indicating that the axis of the x-ray beam is parallel to the desired “working coordinate system's” y axis.
  • the system 10 may then report to the user when the beam/image axis is at 90° or the “true” lateral position.
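  • As a hedged sketch of the accelerometer readout described above (hypothetical names, not the disclosed implementation), the 0°/0° A/P check could look like:

        import math

        def reticle_tilt(ax, ay, az):
            # Tilt of the reticle's x and y axes from vertical, in degrees,
            # computed from a static accelerometer reading (units of g).
            return (math.degrees(math.atan2(ax, az)),
                    math.degrees(math.atan2(ay, az)))

        def is_true_ap(ax, ay, az, tol_deg=0.5):
            # A true A/P position is declared when both readouts are ~0 deg.
            x_deg, y_deg = reticle_tilt(ax, ay, az)
            return abs(x_deg) <= tol_deg and abs(y_deg) <= tol_deg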
  • An exemplary C-arm setup and instruction screen is illustrated in FIG. 6 .
  • the screen 100 may provide helpful instructions to the user for setting up the C-arm 26 and reticle 30 via, by way of example only, instruction fields 118 , 120 , and 122 .
  • FIGS. 7 and 8 depict exemplary display screens 100 that may assist the user with patient positioning.
  • a first step to confirming patient positioning is to obtain a cross-table A/P fluoroscopic image of the patient in the lateral decubitus position.
  • the screen display 100 may show the user an ideal (sample) cross-table A/P image 124 and a C-arm status indicator field 126 to ensure that the C-arm 26 is in the true lateral position.
  • a cross-table A/P image 128 is taken (and re-taken as needed) to approximately match the sample cross-table A/P image 124 .
  • a virtual center dot 130 is superimposed on both cross-table A/P images 124 , 128 .
  • the user may be provided with one or more additional instructions via message box 132 .
  • the message box 132 may remind the user to lock the C-arm wheels before proceeding to the next step. Selecting the “Next” button 116 advances the screen display 100 to the next screen.
  • a second step to confirming patient positioning is to obtain a patient-lateral fluoroscopic image of the patient in the lateral decubitus position.
  • the screen display 100 may show the user an ideal (sample) lateral image 134 and a C-arm status indicator field 126 to ensure that the C-arm is positioned 90° from the previous cross-table position.
  • a lateral image 136 is taken (and retaken as needed) to approximately match the sample lateral image 134 .
  • a virtual center dot 138 is superimposed on both lateral images 134 , 136 .
  • the “Next” button 116 is selected to advance the display screen 100 to the next screen.
  • the IR position sensor 14 can be positioned near the surgical field.
  • the IR position sensor 14 should be positioned so that the C-arm array 16 can be tracked in both the A/P and lateral positions.
  • the instrument tracking array 20 should be visible within the tracking volume of the IR position sensor 14 .
  • FIG. 9 illustrates an exemplary IR position sensor position field screen.
  • the display screen 100 may present the user with an exemplary configuration setup and instructions (shown by way of example only in instruction panel 140 ). Instruction panel 140 instructs the user to position the IR position sensor 14 near the surgical bed so that the IR reflective tracking array 16 on the C-arm 26 is in view of the IR position sensor 14 and the surgical objects to be tracked are within the IR position sensor's 14 tracking volume.
  • the user may also input information regarding the general positioning of the patient relative to the IR position sensor 14 in positioning input field 142 .
  • the user may be instructed to select which side of the surgical bed the patient's head is resting on (e.g., side “A” or side “B”).
  • the system 10 may evaluate whether the position of the IR position sensor 14 relative to the IR reflective tracking arrays 16 , 20 is optimal for data acquisition.
  • the system 10 may provide instructions as to where to move the IR position sensor 14 to achieve optimal placement in message box 132 (e.g., towards the patient, away from the patient, towards the feet, towards the head).
  • the display screen 100 may present the status of one or more tracked objects via tracked object status field 144 , using any visual indicator (e.g., textual, graphic, color).
  • the tracked object status field 144 shows three surgical instruments being tracked: the C-arm, the initial dilator, and the rasp.
  • the tracked object status field 144 also indicates (via text and/or color indicators) that the C-arm and the initial dilator are within view of the IR position sensor 14 but that the rasp is not. Should the user wish to use the rasp, it will be necessary to move it within the tracking volume of the IR position sensor 14 .
  • the user may then select the next button to continue to the patient registration protocol. Again, various reminders may be provided to the user via message box 132 (e.g., the user may be reminded not to move the IR position sensor for the remainder of the procedure).
  • FIG. 10 depicts a flowchart of the patient registration process for a given view.
  • the user identifies a first point of interest (Point A) at Step 150 .
  • the user positions the C-arm 26 such that Point A is in the center of the fluoroscopic image at Step 152 using a virtual center marker as will be described in greater detail below.
  • the position tracking software captures the current image, the x, y, z position of the reticle 30 , and the rotational position of the reticle 30 with respect to Point A at Step 154 .
  • the user identifies a second point of interest (Point B). It is to be appreciated that the user may choose any part of the anatomy for Point B that is not Point A.
  • the user positions the C-arm 26 such that Point B is in the center of the fluoroscopic image at Step 158 .
  • the position tracking software captures the current image, the x, y, and z positions of the reticle 30 , and the rotational position of the reticle 30 with respect to Point B at Step 164 .
  • points of interest may be acquired in two or more views (e.g. lateral and A/P views). The steps described above are repeated for each view and once all desired views are registered, the images for each view (i.e. the A/P view's scale and/or the lateral view's scale) may be scaled.
  • Scaling may be accomplished in a number of ways (e.g., manually or automatically).
  • in automatic implementations, the system 10 lines up the images to create the best anatomy match via image recognition. In manual implementations, this may be done, by way of example, by presenting a user with a scaling image and a backdrop image and prompting the user to define where one or more points are located on the backdrop image (as will be described in further detail below).
  • the system 10 measures the physical distance between Point A and Point B (Step 188 ) and determines a scale factor by correlating the physical distance between Points A and B with the number of pixels between Points A and B on the display 24 (Step 190 ).
  • This “scale” factor may then be used to calculate the movements of the 3D-generated virtual instruments and implants against the 2D x-ray image in that particular view, among other things.
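  • For illustration only (the numbers and names below are hypothetical, not from the disclosure), the scale factor simply correlates the physical A-to-B travel with the pixel separation of those points:

        def scale_factor_mm_per_px(physical_mm, pixel_distance):
            # Correlate the measured A-to-B travel with its on-screen span.
            return physical_mm / pixel_distance

        # e.g., the C-arm translated 40 mm between Points A and B, and the
        # two points lie 320 px apart on the captured image:
        mm_per_px = scale_factor_mm_per_px(40.0, 320)  # 0.125 mm per pixel
        implant_px = 25.0 / mm_per_px                  # a 25 mm implant spans 200 px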
  • the surgical volume is defined for the system and the C-arm 26 may then be articulated to a primary position (e.g., a lateral position).
  • the C-arm 26 , in this position, acts as the reference object for all other objects being tracked. Effectively, the other tracked objects (i.e. instrumentation, implants, etc.) will be referenced from the C-Arm array's 16 local coordinate system, thereby allowing the movements of the 3D-generated instruments and implants to be displayed against the 2D image in each view.
  • the screen display 100 may include various features for assisting the user with the patient registration and scaling process.
  • Screen display 100 may include an image acquisition panel 162 , a patient registration task panel 164 , and an instruction panel 166 .
  • the image acquisition panel 162 preferably includes the live fluoroscopic image that is inputted into the system 10 from the intraoperative imaging system 18 via a C-arm input cable.
  • This fluoroscopic image may be the image that the user selects to be the anatomical lateral backdrop (i.e. the virtual lateral fluoroscopic image).
  • the patient registration task panel 164 may include a list of the registration steps completed and those yet to be completed and the instruction panel 166 may include specific directions to the user for what actions to perform to complete the first step in the acquisition process.
  • the screen display 100 may also include a plurality of buttons for navigating through the patient registration process. For example, selecting the “Capture” button 168 obtains the position and image information for the first lateral fluoroscopy shot, selecting the “Reset” button 170 restarts the patient registration process, selecting the “Back” button allows the user to see what was captured for a previously-acquired image, and selecting the “Next” button 174 advances to the next step in the patient registration process.
  • FIGS. 11-14 depict the screen display 100 during the steps of the patient registration process according to one exemplary embodiment.
  • Point A 176 is defined in the anatomical lateral view of the patient with the C-arm 26 in the A/P position.
  • the chosen anatomy for Point A 176 is the anterior border of the disc space at the superior vertebra.
  • An image is taken with Point A 176 in the center of the fluoroscopic image as shown in image acquisition panel 162 .
  • the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point A 176 .
  • This lateral image will be used for the “virtual” patient-lateral 3D backdrop 178 .
  • the user can select the “Next” button 174 to proceed to the next registration step.
  • Point B 180 is then defined in the anatomical lateral view of the patient with the C-arm 26 in the A/P position. It is to be appreciated that the user may choose any part of the anatomy for Point B 180 that is not Point A 176 .
  • Point B 180 is any point on the anatomy that is either anterior or posterior to Point A 176 .
  • this movement from Point A 176 to Point B 180 is constrained to one axis (i.e., the line formed between Point A 176 and Point B 180 is parallel to the anterior-posterior axis).
  • the chosen anatomy for Point B 180 is the posterior border of the disc space at the superior vertebra.
  • the system 10 may provide the user with instructions for taking the second image.
  • An image is then taken with Point B 180 in the center of the fluoroscopic image as shown in image acquisition panel 162 .
  • the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point B 180 .
  • the user can select the “Next” button 174 to proceed to the next registration step.
  • Point A′ 182 is then defined in the anatomical A/P view of the patient with the C-arm 26 in the lateral position.
  • the chosen anatomy for Point A′ 182 is the ipsilateral border of the disc space.
  • An image is taken with Point A′ 182 in the center of the fluoroscopic image as shown in image acquisition panel 162 .
  • the software captures the following data points: 1) the current x-ray image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point A′ 182 .
  • This A/P image will be used for the “virtual” patient A/P 3D backdrop 184 (i.e., the virtual A/P fluoroscopic image).
  • the user can select the “Next” button 174 to proceed to the next registration step.
  • Point B′ 186 is then defined in the anatomical A/P view of the patient with the C-arm 26 in the lateral position.
  • the chosen anatomy for Point B′ 186 is the contralateral border of the disc space.
  • An image is taken with Point B′ 186 in the center of the fluoroscopic image as shown in image acquisition panel 162 .
  • the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point B′ 186 .
  • the user can select the “Next” button 174 to complete the patient registration process.
  • the user may be advised via message box 132 or a pop-up window that the C-arm 26 must be kept in this last registration position for the duration of the surgical procedure.
  • the user may choose between the first, second, and merged image in both the A/P and lateral views for display.
  • the image may be manipulated to orient the image according to user preference.
  • Point A 176 and Point B 180 are then chosen and entered into the software and the physical distance between Point A 176 and Point B 180 is calculated.
  • the user specifies where Point B 180 is located on backdrop image 178 .
  • this can be done manually (as shown in FIG. 15 ) by presenting the user with the backdrop image 178 alongside the scaling image 192 and prompting the user to define where Point B 180 is on the backdrop image 178 (e.g., using a touch-screen or a mouse).
  • Point A′ 182 and Point B′ 186 are then chosen and entered into the software and the physical distance between Point A′ 182 and Point B′ 186 is calculated.
  • the user specifies where Point B′ 186 is located on backdrop image 184 .
  • this can be done manually (as shown in FIG. 16 ) by presenting the user with the backdrop image 184 alongside the scaling image 194 and prompting to define where Point B′ 186 is on the backdrop image 184 .
  • These actions allow the software to correlate the physical distance between points with pixel distances between points on the x-ray images.
  • This “scale” factor is used to properly calculate the movements of the 3D generated instrumentation against the 2D x-ray image in the A/P and lateral views, respectively. With both of the A/P and lateral views registered and scaled, the surgical volume is defined for the system 10 . Selecting the “Next” button 174 allows the user to proceed to the main tracking screen 200 .
  • FIG. 17 depicts a main tracking screen 200 according to one exemplary embodiment.
  • Main tracking screen 200 includes left tracking panel 202 and right tracking panel 204 .
  • Left tracking panel 202 includes backdrop image 178 , data box 206 , and a plurality of image orientation tabs 208 .
  • Right tracking panel 204 includes backdrop image 184 , data box 206 , and a plurality of image orientation tabs 208 .
  • Data box 206 preferably includes the spatial position data acquired regarding the backdrop image 178 or 184 .
  • the plurality of image orientation tabs 208 may be three buttons for reorienting the backdrop image 178 or 184 to suit the user's preference—for example, selecting the “Flip Horizontal” button will invert the backdrop image horizontally, selecting the “Flip Vertical” button will invert the backdrop image vertically, and selecting the “Reset” button will reset the backdrop image 178 or 184 back to its original configuration. As shown by way of illustration in FIG. 17 , image 184 is flipped horizontally.
  • the system 10 may perform various functions that facilitate the surgical procedure depending on the user's preferences or the patient's requirements.
  • the main tracking screen 200 may include numerous features (e.g., represented by tabs 212 , 214 , 216 , 218 , and 220 shown within navigation panel 210 ) for use during the surgical procedure.
  • the system 10 may identify an incision location on the skin, determine the distance from the skin to a desired location within the body, project one or more angles between anatomical points of interest, select the ideal direction for directing or docking a surgical object within the patient, choose the optimal size of implants based on the patient's anatomy, and map neurophysiologic data onto virtual anatomic locations of the patient (each of which will be explained in detail below with reference to a lateral trans-psoas procedure).
  • the user may wish to verify one or more ideal incision locations. This may be accomplished using the initial dilator 32 with an IR-reflective tracking array 20 attached. To do so, the user selects the “Loaded Tools” tab 218 from the main tracking screen 200 . Once the “Loaded Tools” tab 218 is selected, the display screen 100 displays a surgical object menu table of pre-loaded 3D CAD renderings of each instrument and implant available during the procedure in a variety of sizes. The user may use the display 24 to select the surgical object to be tracked (here, an initial dilator 32 ) and then select the “Instrument Tracking” tab 222 .
  • the user may place the distal tip 224 of the initial dilator 32 laterally over an area of the patient and maneuver it as necessary until a lateral incision location that is substantially lateral to the surgical target site (e.g., the intervertebral disc space) is identified.
  • the user may also wish to determine the distance from the incision site at the skin to a spinal target site (e.g., the intervertebral disc).
  • the user may select the “Distance” tab 212 from the navigation panel 210 on the main tracking screen 200 .
  • the screen 200 will then instruct the user to place the distal tip 224 of the initial dilator 32 on the patient's skin and select a point of interest (e.g. the ipsilateral annulus of the intervertebral disc); the system will indicate the distance from the distal tip 228 of the virtual initial dilator 226 to the point identified on the virtual fluoroscopic image(s).
  • This allows the user to know, for example, the length of retractor blades required for a particular patient.
  • This feature has the added benefit of decreasing surgical time because a surgical assistant may assemble the retractor with the correct length of retractor blades while the user-surgeon proceeds with the surgical procedure.
  • the “Distance” function is not limited to the implementation explained above. Indeed, using the “Distance” tab 212 , the system 10 can determine the distance from the tip of any actively tracked surgical object to any user-selected location on the virtual fluoroscopic image.
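  • A minimal sketch of the “Distance” computation (assuming both points are already expressed in the working coordinate system; names and numbers are hypothetical):

        import math

        def tip_to_point_mm(tip, point):
            # Straight-line distance between the tracked distal tip and a
            # user-selected point, both in working-coordinate mm.
            return math.dist(tip, point)

        # e.g., skin-level tip to a selected ipsilateral annulus point:
        depth_mm = tip_to_point_mm((0.0, 0.0, 0.0), (12.0, 8.0, 95.0))  # ~96.1 mm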
  • the screen 200 may instruct the user to select clinically significant boundaries for the surgical procedure on the virtual fluoroscopic images.
  • the user may select the lateral borders of the superior and inferior endplates in the A/P view and the anterior and posterior aspects of the vertebral body in the lateral view.
  • the system 10 will provide the user with the distances along the x, y, and z axes between these locations via the screen 200 .
  • the system 10 may also provide the user with an idealized axial view (not shown) of the disc space via an appropriately-scaled 3D rendering of a disc space. Armed with this view, the user may track instruments within the disc space and redirect them as necessary based on this third projected plane.
  • the user may also wish to determine the angle between two anatomical points on the patient.
  • the user may select the “Angle” tab 214 from the navigation panel 210 on the main tracking screen 200 .
  • the screen 200 will instruct the user to place the distal tip 224 of the initial dilator 32 on the patient's skin and select a point of interest on the patient (e.g. the iliac crest at L4-5).
  • the system 10 will then project the angle between the virtual initial dilator 226 and the patient's iliac crest at L4-5. This allows the user to know whether the patient's anatomy will accommodate a substantially lateral, retroperitoneal approach to the spine prior to making an incision.
  • This feature has the added benefit of decreasing surgical time and/or preparing for contingency procedures necessitated by challenging anatomy (e.g. a high iliac crest). It is to be appreciated that the Angle function is not limited to the implementation explained above. Indeed, using the “Angle” tab 214 , the system 10 can determine the angle between the tip of any actively tracked surgical object and any user-selected location on the virtual fluoroscopic image.
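  • A hedged sketch of the “Angle” computation (the disclosure does not specify the math; this assumes tracked points in the working coordinate system, with hypothetical names):

        import math

        def approach_angle_deg(tip, shaft_point, target):
            # Angle at the dilator tip between the shaft direction
            # (tip -> shaft_point) and the ray to a selected target
            # (tip -> target), e.g., the iliac crest at L4-5.
            u = [a - t for a, t in zip(shaft_point, tip)]
            v = [b - t for b, t in zip(target, tip)]
            dot = sum(x * y for x, y in zip(u, v))
            norm = math.hypot(*u) * math.hypot(*v)
            return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))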
  • the user may also wish to determine the appropriate size of one or more surgical instruments to be used during the procedure and/or spinal implants to be implanted.
  • the user may select the “Sizing” tab 216 from the navigation panel on the main tracking screen 200 .
  • the change in distance along the y axis may indicate implant width
  • the change in distance along the x axis may indicate implant height
  • the change in distance along the z axis may indicate the implant depth as well as instrument depth.
  • the system 10 may provide the user with suggestions for instrument size, implant size, etc. prior to commencing the surgical procedure based on patient size, pathology, and the like.
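  • The axis-to-dimension mapping described above might be sketched as follows (illustrative only; the convention is taken from the lateral-approach description, and the function name is hypothetical):

        def suggested_implant_mm(p1, p2):
            # p1, p2: two selected points in working-coordinate mm.
            # Lateral convention: y -> width, x -> height, z -> depth.
            dx, dy, dz = (abs(b - a) for a, b in zip(p1, p2))
            return {"width": dy, "height": dx, "depth": dz}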
  • a scalpel or other instrument is used to make an incision of sufficient size to receive a distal end 224 of the initial dilator 32 at a lateral incision location.
  • the distal end 224 of the initial dilator 32 may be guided through the retroperitoneal space toward the psoas muscle using a fingertip (for example), to protect the peritoneum.
  • the distal end of the initial dilator is then advanced in a substantially lateral direction through the psoas muscle toward the intervertebral disc space at or near the surgical target site.
  • the fibers of the psoas muscle are split using blunt dissection until the spinal target site is reached.
  • the user may track the path of the initial dilator 32 in real time from the incision site to the target spinal site via the instrument tracking mode of the system 10 from the main tracking screen 200 .
  • the main tracking screen 200 will track the A/P and lateral views of the virtual initial dilator 226 as it advances through the patient to ensure the initial dilator 32 will dock on an ideal location of the psoas muscle prior to psoas muscle splitting.
  • the user need not verify the location of the distal tip of the initial dilator 224 using fluoroscopic imaging, thereby decreasing the exposure to x-ray radiation.
  • the neuromonitoring system 36 may be used to allow for safe passage of the initial dilator 32 through the psoas muscle to the spinal target site.
  • initial dilator 32 has a proximal end configured to attach to a stimulation connector (e.g. a clip cable) in electrical communication with the neuromonitoring system (not shown) and at least one electrode (not shown) at the distal tip 224 of the initial dilator 32 .
  • the stimulation connector is coupled to both the initial dilator 32 and the neuromonitoring system 36 to provide a stimulation signal as the initial dilator 32 is advanced through the psoas muscle.
  • the electrode located at the distal tip 224 of the initial dilator 32 emits one or more stimulation pulses, and recording electrodes positioned over muscles innervated by nerves traversing through the psoas register the presence or absence of a significant EMG response.
  • the neuromonitoring system will continuously search for the stimulus threshold that elicits an EMG response in the myotomes monitored and then report such thresholds on the display 24 . According to preferred embodiments, this may be done with a hunting algorithm that quickly and automatically determines stimulation thresholds.
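  • The actual hunting algorithm is that of U.S. Pat. No. 8,255,045, incorporated by reference above; as a generic, non-authoritative stand-in, a bracketing/bisection threshold search could be sketched as:

        def hunt_threshold(evokes_emg, lo=0.0, hi=100.0, tol_mA=1.0):
            # evokes_emg(mA) delivers one stimulus and returns True if a
            # significant EMG response was recorded at that intensity.
            if not evokes_emg(hi):
                return None              # no response even at max stimulus
            while hi - lo > tol_mA:
                mid = (lo + hi) / 2.0
                if evokes_emg(mid):
                    hi = mid             # threshold is at or below mid
                else:
                    lo = mid             # threshold is above mid
            return hi                    # lowest intensity known to evoke EMG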
  • the stimulus necessary (i.e., threshold intensity) to elicit an EMG response will vary with distance from one or more nerves.
  • the neuromonitoring system 36 may then report the relative nerve distance indicated by these threshold intensities to the user by any number of color, graphic, or alpha-numeric indicia.
  • the user may be presented with the threshold intensity in units of mA and/or a color associated with the predetermined range in which the determined threshold lies.
  • threshold intensities between 1 mA and 6 mA may also display a red color
  • threshold intensities between 7 mA and 10 mA may also display a yellow color
  • threshold intensities greater than 10 mA may display a green color.
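  • Encoded directly, the example ranges above reduce to a small lookup (illustrative sketch only):

        def threshold_color(threshold_mA):
            if threshold_mA <= 6:
                return "red"     # nerve likely very close
            if threshold_mA <= 10:
                return "yellow"  # nerve in proximity
            return "green"       # no nearby nerve detected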
  • the system 10 is configured to simultaneously track the movement of the initial dilator 32 via virtual initial dilator 226 and display neurophysiologic data as the initial dilator 32 is being advanced through the psoas muscle to the spinal target site.
  • EMG threshold intensity information can be captured and displayed on the virtual fluoroscopic images 178 , 184 corresponding to the position and orientation the initial dilator 32 occupied when said threshold intensity was elicited.
  • the system can map out the location of nearby nerves as the initial dilator 32 is advanced through the psoas muscle to the spinal target site. This may be accomplished by selecting the “Show Nerve Mapping” tab 220 on the main tracking screen 200 .
  • the position of the initial dilator 32 within the surgical corridor may be tracked while the relative distance to nearby nerves is monitored, indicated via a color (e.g. red/yellow/green) and a number based on the threshold intensities required to elicit a significant threshold response as the initial dilator 32 is advanced to the surgical site.
  • the location of the initial dilator 32 (preferably, the distal tip 224 ) may be tracked on the virtual fluoroscopic images 178 , 184 as set forth above (via the virtual distal tip 228 of the virtual initial dilator 226 ).
  • the system 10 may then map each neurophysiologic response to the precise spatial orientation where each neurophysiologic response was obtained onto the virtual fluoroscopic images 178 , 184 as the initial dilator 32 moves towards the spinal target site.
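  • A minimal sketch of this mapping (a hypothetical structure; the disclosure does not specify one) pairs each threshold with the tip position at which it was elicited and replays the pairs as colored markers over a view:

        def color_for(mA):
            return "red" if mA <= 6 else ("yellow" if mA <= 10 else "green")

        class NerveMap:
            def __init__(self):
                self.samples = []  # (tip_xyz, threshold_mA, color) tuples

            def record(self, tip_xyz, threshold_mA):
                self.samples.append(
                    (tip_xyz, threshold_mA, color_for(threshold_mA)))

            def markers_2d(self, mm_per_px, origin_px):
                # Project each 3D sample into one view's pixel coordinates.
                for (x, y, _z), mA, color in self.samples:
                    px = (origin_px[0] + x / mm_per_px,
                          origin_px[1] + y / mm_per_px)
                    yield px, f"{mA:g} mA", color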
  • the main tracking screen 200 shows that the neuromonitoring system 36 registered a significant EMG response with 19 mA as the virtual initial dilator 226 was positioned at location 230 (depicted as green on the screen 200 ), a significant EMG response at 8 mA as the virtual initial dilator 226 advanced to location 232 (depicted as yellow on the screen 200 ), a significant EMG response at 4 mA as the virtual initial dilator 226 advanced to location 234 (depicted as red on the screen 200 ), and a significant EMG response at 22 mA as the virtual initial dilator 226 advanced to location 236 which is the spinal target site (depicted as green on the screen 200 ).
  • the user knows that at least one nerve lies near location 234
  • the system 10 not only possesses the capability to register spatial positioning and neurophysiologic responses during advancement of the initial dilator 32 , but also spatial positioning and neurophysiologic responses during retreat and/or repositioning of the initial dilator 32 .
  • the main tracking screen 200 shows the virtual initial dilator 226 retreating from the spinal target site (location 236 ).
  • the neurophysiologic responses obtained at locations 236 , 234 , and 232 are removed (as shown in FIG. 20 , the numeric results may be grayed-out and/or shown with dashed lines).
  • the initial dilator 32 may be repositioned and the repositioned path will be shown with virtual initial dilator 226 along with new neurophysiologic responses as the initial dilator 32 is advanced again.
  • Neurophysiologic data may also be used to track the orientation of one or more nerves relative to the distal tip 224 of the initial dilator 32 .
  • the user can track the radial position of the distal electrode (not shown) with neurophysiologic data indicative of the relative radial position of one or more nearby nerves mapped onto the virtual fluoroscopic images 178 , 184 .
  • the surgeon may then rotate the initial dilator 32 (shown as virtual initial dilator 226 ) about its longitudinal axis through any number of discrete positions and allow the neuromonitoring system 36 to capture neurophysiologic data for each discrete point.
  • FIGS. 20-23 illustrate this concept at one discrete position, two discrete positions, and four discrete positions.
  • FIG. 20 shows the neurophysiologic data at one discrete point: with no rotation of the initial dilator 32 , a first significant neurophysiologic response was obtained at 27 mA (depicted as green on screen 200 ), which may convey to the user that no nerve lies close to the electrode on the initial dilator 32 at the first discrete position.
  • FIG. 21 shows the initial dilator 32 rotated about its longitudinal axis to four radial positions.
  • a second neurophysiologic response was obtained at 7 mA (depicted as yellow on screen 200 ).
  • a third neurophysiologic response was obtained at 4 mA (depicted as red on screen 200 ).
  • a fourth neurophysiologic response was obtained at 9 mA (depicted as yellow on screen 200 ).
  • the system may be used as frequently as the surgeon wants, capturing discrete neurophysiologic data at each discrete position (e.g., a minimum of two discrete points, with the maximum left to surgeon preference). Additionally, it is to be appreciated that the system may give the orientation data at multiple depth locations as shown in FIG. 23 .
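  • The radial capture just described can be reduced to a small lookup for display purposes (illustrative only; the angles and thresholds below are hypothetical):

        def nearest_nerve_direction(thresholds_by_angle):
            # {rotation_deg: threshold_mA} captured at one depth; the
            # lowest threshold points toward the nearest nerve.
            angle, mA = min(thresholds_by_angle.items(), key=lambda kv: kv[1])
            return angle, mA

        # e.g., a four-position capture like FIG. 21:
        direction = nearest_nerve_direction({0: 27, 90: 7, 180: 4, 270: 9})  # (180, 4)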
  • the system 10 may represent the data differently (e.g. an intensity chart over anatomy ( FIG. 24 ) or a color map ( FIG. 25 )). Furthermore, the system 10 may incorporate more data points (e.g., time of the data capture, trend data) that assist the user in integrating the positioning and neurophysiologic mapping into the surgical procedure. While the foregoing was explained with respect to an initial dilator 32 , it is contemplated that the virtual image/mapping may be used for subsequent dilators, one or more retractor blades, etc.
  • the position of the distal end of the dilator may be confirmed by selecting the “Instrument Tracking” tab 222 and verifying the position of the distal tip 224 of the initial dilator 32 (via the distal tip 228 of the virtual initial dilator 226 ) is at the spinal target site (thereby obviating the need to confirm this position using fluoroscopic imaging).
  • a K-wire of the initial dilator 32 is then introduced into the targeted disc space after the initial dilator 32 is passed through the psoas muscle.
  • a sequential dilation system including one or more supplemental dilators may be guided over the initial dilator for the purpose of further dilating the tissue down to the surgical target site.
  • each component of the sequential dilation system may be outfitted with an IR-reflective tracking array 20 and its location mapped onto the virtual fluoroscopic image.
  • each component may be outfitted with one or more electrodes such that neurophysiologic data obtained while each component is advanced to the spinal target site may be mapped onto the virtual fluoroscopic image as well.
  • the retractor blades of the access system are introduced over the supplemental dilator (or the initial dilator if the sequential dilation system is not employed) toward the disc space.
  • the neuromonitoring system 36 may be used to perform neuromonitoring as the blades are positioned, helping to provide safe passage through the psoas muscle.
  • a posterior shim element and/or retractor extenders (not shown) are engaged with the retractor blades.
  • the surgical tracking system 10 may be used to confirm the position of the blades proximal to the disc space (again, thereby obviating the need to confirm the position of the blades with fluoroscopic imaging).
  • the blades may be used to retract tissue, expanding the distraction corridor into an operative corridor. Tracking may be used to verify the position of the distal ends of the blades without the need for fluoroscopic imaging.
  • Various instruments may be inserted through the operative corridor to prepare the targeted disc space.
  • At least one preparation tool such as a disc cutter, pituitary, scraper, curette, or the like is inserted through the operative corridor to prepare the disc space. Any one of these may be outfitted with an IR-reflective tracking array 20 and its location tracked during the procedure, to verify for the user, for example, the extent of discectomy completed and the extent of discectomy left to be done.
  • one or more sizers are inserted into the disc space to provide appropriate disc height restoration. This can also be monitored via position tracking, obviating the need for fluoroscopy.
  • An appropriately-sized implant (preferably determined by using the “Sizing” tab 216 as set forth above) is then advanced into the disc space with an inserter tool.
  • the implant is releasably secured to the inserter tool such that the surgeon may release the implant when it is properly positioned in the disc space.
  • the system 10 of the present invention may decrease the need for additional fluoroscopic images when transitioning from one spinal level to the next.
  • the system 10 may allow the C-arm 26 to be moved during the procedure.
  • using the virtual marker (e.g., dot 130), the user may then manipulate the C-arm 26 such that the dot 130 aligns with a point of interest at the next spinal level. It is to be appreciated that while this may not obviate the need for ionizing radiation entirely, it allows the user a close approximation of the location of the C-arm 26 without the use of localizing fluoroscopy.
  • the system 10 may also provide an image rotation feature which verifies and/or corrects the images displayed so that they are accurately displayed with respect to gravity.
  • the user takes a fluoroscopic image with an instrument that is known to be positioned vertically with respect to gravity (e.g., a plumb bob, a probe, or an instrument with an attached orientation sensor, etc.), which is imported into the system 10.
  • alternatively, the user takes a fluoroscopic image of the instrument with angular feedback (e.g., an instrument with an attached orientation sensor); the image on the system 10 may then need to be rotated relative to the screen 100 until the instrument in the image appears properly positioned with respect to gravity.
  • the system 10 of the present invention also provides ways in which a user may verify that the virtual surgical objects are tracking “true” to the actual surgical objects.
  • a virtual surgical object may not track “true” if the patient has moved on the surgical table or the image is distorted due to external factors.
  • the user may select the “Offset” mode 230 from the main tracking screen 200, take a fluoroscopic image with a surgical object positioned within the disc space, and compare the fluoroscopic image with the virtual instrument overlaid onto the same fluoroscopic image. If the positions of the actual and virtual surgical objects do not align adequately, modifications may be made to the position of the virtual surgical object on the display screen 100. As shown by way of example in FIG. 26, a fluoroscopic image may be taken with the initial dilator 32 positioned within the disc space and the virtual initial dilator 226 projected onto the fluoroscopic image. In this illustration, it can be seen that the two instruments do not line up exactly.
  • the user may instruct the system 10 to make adjustments to the x and y position of the virtual initial dilator 226 to align the virtual tip 228 of the virtual initial dilator 226 with the actual tip 224 of the actual initial dilator 32 using the x and y position adjustment fields 230 , 232 .
  • an upward correction in the y direction and a correction to the right in the x direction would realign the virtual initial dilator 226 to the actual initial dilator 32 .
  • the Offset feature of the surgical tracking system 10 may be used without having to take additional fluoroscopic images during the procedure.
  • the user may repeatedly touch a tracked surgical object to a home position (located within the surgical corridor) that is easily identified both by visual inspection and on the virtual fluoroscopic image, and compare the offset (if any) that the virtual surgical object has relative to the home location. If, for example, the virtual instrument tracks “true,” the position of the distal end of the tracked surgical object will appear directly on top of the home position. If, however, the virtual instrument no longer tracks “true,” the position of the distal end of the tracked instrument may appear offset from the home position. A simplified sketch of this home-position check appears following this list.
  • the user may make adjustments to the virtual surgical object in the x and y directions using the x and y position adjustment fields 230 , 232 as explained above.
  • the user can return to the recorded “home position” as the surgery progresses and verify accurate tracking through the same virtual-to-real-world comparison multiple times during the surgery if so desired.
  • the home position may be an anatomical landmark (e.g., an ipsilateral osteophyte); a radiolucent marker positioned within the surgical corridor (e.g., a radiolucent sphere), such that a tracked surgical object can be rotated around the sphere's surface and the user can confirm that the movements correlate to the sphere's location and diameter; or a radiodense marker that can be used to produce a definite mark on the virtual fluoroscopic image to confirm that the location of the instrument is on the “home position.”
  • the Offset feature may detect patient movement using accelerometers (such as those disclosed in the '121 application).
  • a two- or three-axis accelerometer affixed to the patient in a known position and orientation can be used to alert the user and system 10 of possible patient movement.
  • the accelerometers' measurement of static acceleration provides tilt and orientation information, and their measurement of dynamic acceleration provides vibration and other movement information. If significant changes in patient positioning are detected by the accelerometers, the system 10 may display a warning that tracking accuracy may be decreased such that re-registration and scaling may be advisable. A simplified sketch of this movement check appears following this list.
  • Details of the surgical tracking system 10 are now discussed in conjunction with a second exemplary use thereof: monitoring the trajectory of a surgical instrument used during pedicle fixation to ensure proper placement of pedicle screws.
  • the system 10 may be used in conjunction with, or further integrated with, the surgical trajectory monitoring system of the '121 application.
  • Prior to forming a pedicle pilot hole or placing pedicle screws (preferably prior to starting the surgical procedure), it may be of interest to capture the starting point and stopping point of the pedicle cannulation using the surgical tracking system 10.
  • the control unit 22 captures the spatial information of the C-arm 26 at a first spinal level of interest via the IR position sensor 14 .
  • a first lateral fluoroscopic image is taken with the lateral start points centered in the image.
  • a second lateral fluoroscopic image is then taken with the lateral stopping points centered in the image.
  • the first lateral image may be used as the lateral virtual backdrop image for that spinal level and scaled as set forth above with reference to Points A and B in the lateral trans-psoas access procedure.
  • a virtual protractor may be projected off of this lateral backdrop image such that the cranial-caudal angles may be measured and inputted into the system 10 as set forth in the '121 application.
  • a first A/P fluoroscopic image is taken with the left start point centered in the image (for example, the lateral border of the left pedicle).
  • a second A/P fluoroscopic image is taken with the right start point centered in the image (for example, the lateral border of the right pedicle).
  • the first A/P image may be used as the A/P virtual backdrop image for that spinal level and scaled as set forth above with reference to Points A′ and B′ in the lateral trans-psoas access procedure. It is to be appreciated that selecting the ideal start and stop points for pedicle cannulation in this way is advantageous for at least two reasons. First, fluoroscopic radiation to transition to this spinal level during the procedure is eliminated because the system 10 will alert the user when the C-arm 26 has been brought into the correct position (a simplified sketch of this position-match alert appears following this list). Second, fluoroscopic radiation is reduced when transitioning between lateral and A/P images. Aside from acquiring the starting points, no further imaging is required.
  • a surgical instrument may be advanced to the target site and positioned on the lateral margin of the pedicle, the preferred starting point according to the embodiment set forth above.
  • the surgical instrument is the pedicle access probe including the orientation sensor described in the '121 application and further outfitted with an IR tracking array 20 .
  • the location of the pedicle access probe may be projected onto the screen 100 in both the lateral and A/P views via a virtual pedicle access probe (not shown).
  • the neuromonitoring system 36 may provide neurophysiologic data and/or mapping during pedicle cannulation.
  • the system 10 can show the dynamic relation of the distal tip of the pedicle access probe relative to the medial pedicle wall and the depth within the vertebral body via the virtual pedicle access probe instead of relying on fluoroscopic imaging.
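By way of illustration only, the radial nerve-mapping concept in the list above (stimulation thresholds captured at discrete rotational positions of the dilator electrode, color-coded, with the lowest threshold suggesting the bearing of a nearby nerve) might be sketched as follows. This is a minimal sketch rather than the actual implementation of the system 10; the function names and sample values are hypothetical, and the color bands follow the red/yellow/green threshold ranges described elsewhere in this disclosure.

```python
# Minimal sketch (hypothetical names/values): color-code stimulation thresholds
# captured at discrete rotational positions of the dilator electrode and report
# the direction of the lowest threshold, i.e., the likely bearing of the nerve.

def color_for_threshold(threshold_ma):
    """Color bands per the convention described in this disclosure."""
    if threshold_ma <= 6.0:
        return "red"      # nerve likely very close
    if threshold_ma <= 10.0:
        return "yellow"   # nerve moderately close
    return "green"        # no nerve believed to lie nearby

def map_radial_samples(samples):
    """samples: (angle_degrees, threshold_mA) pairs captured at one depth."""
    colored = [(angle, ma, color_for_threshold(ma)) for angle, ma in samples]
    nearest_angle = min(samples, key=lambda s: s[1])[0]
    return colored, nearest_angle

# The four thresholds discussed above: 27, 7, 4, and 9 mA.
colored, bearing = map_radial_samples([(0, 27.0), (90, 7.0), (180, 4.0), (270, 9.0)])
for angle, ma, color in colored:
    print(f"{angle:3d} deg: {ma:4.1f} mA -> {color}")
print(f"lowest threshold at {bearing} deg; nerve most likely in that direction")
```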
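Similarly, and by way of illustration only, the home-position verification described above might be sketched as follows; the tolerance value and function names are illustrative assumptions, not features of the system 10.

```python
# Minimal sketch (hypothetical names/tolerance): compare the virtual tip against
# the recorded home position and report the x/y correction, if any, that would
# realign the virtual surgical object with the actual one.
import math

def offset_from_home(tip_xy, home_xy, tolerance_mm=2.0):
    """Return None if tracking still appears true within tolerance,
    otherwise the (dx, dy) correction for the x/y adjustment fields."""
    dx = home_xy[0] - tip_xy[0]
    dy = home_xy[1] - tip_xy[1]
    if math.hypot(dx, dy) <= tolerance_mm:
        return None
    return (dx, dy)

home = (41.5, -12.0)                      # recorded early in the procedure
correction = offset_from_home((43.0, -14.5), home)
print("tracks true" if correction is None else f"apply offset {correction}")
```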
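By way of illustration only, the accelerometer-based movement check might be sketched as follows, assuming a three-axis accelerometer that reports static acceleration in units of g; the drift tolerance is an illustrative assumption.

```python
# Minimal sketch (hypothetical tolerance): derive tilt from static acceleration
# and warn when the patient's orientation drifts from the registered baseline.
import math

def tilt_from_static_g(ax, ay, az):
    """Static acceleration (in g) yields roll/pitch relative to gravity."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

def movement_warning(baseline, current, max_drift_deg=3.0):
    drift = max(abs(b - c) for b, c in zip(baseline, current))
    if drift > max_drift_deg:
        return f"WARNING: ~{drift:.1f} deg of movement; re-registration advisable"
    return "patient orientation unchanged"

baseline = tilt_from_static_g(0.00, 0.00, 1.00)   # pose at registration
current = tilt_from_static_g(0.05, 0.09, 0.99)    # later reading
print(movement_warning(baseline, current))
```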
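And by way of illustration only, the alert that the C-arm 26 has been returned to a registered position might be sketched as follows; the pose representation and tolerances are illustrative assumptions.

```python
# Minimal sketch (hypothetical representation/tolerances): compare the reticle
# pose saved for a spinal level against the live pose streamed from the IR
# position sensor, and alert when the C-arm is back in the registered position.
import math

def pose_matches(saved, live, pos_tol_mm=5.0, rot_tol_deg=2.0):
    """saved/live: ((x, y, z) position in mm, rotation in degrees)."""
    (saved_pos, saved_rot), (live_pos, live_rot) = saved, live
    return (math.dist(saved_pos, live_pos) <= pos_tol_mm
            and abs(saved_rot - live_rot) <= rot_tol_deg)

saved_level_pose = ((120.0, 40.0, 980.0), 90.0)   # captured at registration
live_pose = ((121.5, 41.0, 982.0), 89.3)          # current reading
if pose_matches(saved_level_pose, live_pose):
    print("C-arm back in the registered position for this level")
else:
    print("keep adjusting the C-arm")
```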

Abstract

The present application includes a position tracking system for tracking the location of surgical objects within the surgical field, a neuromonitoring system for detecting the existence of (and optionally the distance and/or direction to) neural structures during a surgical procedure, and a processing system communicatively linked to both the position tracking system and the neuromonitoring system.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This patent application claims priority to U.S. Provisional Application Ser. No. 61/552,466 filed on Oct. 28, 2011 and U.S. Provisional Application Ser. No. 61/671,537 filed on Jul. 13, 2012, the complete disclosures of which are hereby incorporated by reference into this application as if set forth fully herein.
FIELD
The present application pertains to spine surgery. More particularly, the present application pertains to a surgical tracking system that monitors the locations of surgical objects and nerves within the body during spine surgery.
BACKGROUND
The spinal column is a highly complex system of bones and connective tissues that provide support for the body and protect the delicate spinal cord and nerves. The spinal column includes a series of vertebral bodies stacked atop one another, each vertebral body including an inner or central portion of relatively weak cancellous bone and an outer portion of relatively strong cortical bone. Situated between each vertebral body is an intervertebral disc that cushions and dampens compressive forces exerted upon the spinal column. A vertebral canal containing the spinal cord is located behind the vertebral bodies. The spine has a natural curvature (i.e., lordosis in the lumbar and cervical regions and kyphosis in the thoracic region) such that the endplates of the upper and lower vertebrae are inclined towards one another.
There are many types of spinal column disorders including scoliosis (abnormal lateral curvature of the spine), excess kyphosis (abnormal forward curvature of the spine), excess lordosis (abnormal backward curvature of the spine), spondylolisthesis (forward displacement of one vertebra over another), and other disorders caused by abnormalities, disease, or trauma (such as ruptured or slipped discs, degenerative disc disease, fractured vertebrae, and the like). Patients that suffer from such conditions often experience extreme and debilitating pain, as well as diminished nerve function.
A noteworthy trend in the medical community is the move away from performing surgery via traditional “open” techniques in favor of so-called “minimally invasive” or “minimal access” techniques. Open surgical techniques are generally undesirable in that they typically require large incisions with high amounts of tissue displacement to gain access to the surgical target site, which produces concomitantly high amounts of pain, lengthened hospitalization (increasing health care costs), and high morbidity in the patient population. Less-invasive surgical techniques (including minimal access and minimally invasive techniques) are gaining favor due to the fact that they involve accessing the surgical target site via incisions of substantially smaller size with greatly reduced tissue displacement requirements. This, in turn, reduces the pain, morbidity, and cost associated with such procedures.
One disadvantage to performing minimally invasive surgery is the increased reliance on radiographic imaging to “see” the spinal target site, instruments, and implants during the surgery. While this increased exposure is generally negligible for the patient, over time and over multiple procedures on different patients, this increased exposure adds up for the surgeon and other operating room personnel. Systems and methods have been developed to reduce reliance on radiographic imaging during spine surgery. One such system and method involves three-dimensional (3D) navigation systems that use positional tracking systems to track the position of implants and instruments relative to the spinal target site and present the surgeon with a representative image of the instrument superimposed on an image of the target site to indicate the position of the implant or instrument relative to the anatomical structures depicted in the image (e.g., the spinal target site). However, these systems have the disadvantages of generally requiring that reference markers be somehow fixed to the patient (e.g. anchoring a pin or other instrument into the patient's spine, thus causing additional trauma to the patient's anatomy), requiring input of pre-operative CT images into the system before or during the procedure, and/or requiring large, specialized, and expensive equipment that may not be available in certain instances or operating rooms. Furthermore, even though 3D navigation provides spatial information regarding a target surgical site, instruments, and implants during surgery, it does not provide neurophysiologic information regarding the nerves lying near and around the operative corridor.
A need exists for systems and methods that provide information regarding the spinal target site, surgical implants and instruments, and the nerves surrounding the operative corridor during minimally-invasive spine surgeries. The systems and methods described herein are directed to addressing the challenges described above, and others, associated with various minimally-invasive spine procedures.
SUMMARY
The present invention includes a system and methods for decreased reliance on fluoroscopic imaging while avoiding harm to neural tissue during surgery. According to a broad aspect, the present invention includes a position tracking system for tracking the location of surgical objects within the surgical field, a neuromonitoring system for detecting the existence of (and optionally the distance and/or direction to) neural structures during a surgical procedure, and a processing system communicatively linked to both the position tracking system and the neuromonitoring system.
According to another aspect of the present invention, the position tracking system includes an infrared (IR) position sensor, an IR-reflective tracking array attached to an intraoperative imaging system, and at least one IR-reflective tracking array attached to at least one surgical object. The position tracking system is communicatively linked to the processing system such that the processing system may display position tracking information to a user (e.g., a surgeon or a medical professional assisting the surgeon).
According to another aspect of the present invention, the neuromonitoring system includes instruments capable of stimulating the peripheral nerves of a patient and additional instruments capable of recording the evoked neuromuscular responses. The neuromonitoring system is communicatively linked to the processing system such that the processing unit is programmed to measure the response of nerves depolarized by the stimulation signals to indicate the existence of (and optionally the distance and/or direction to) neural structures during the surgical procedure.
According to another aspect of the present invention, the processing system is programmed to perform a plurality of predetermined functions using one or both of the position tracking system and neuromonitoring system. For example, the processing system may be programmed to register the position of a patient, scale virtual fluoroscopic images, track one or more surgical objects within the patient, determine the distance between two points of interest within the surgical field, determine the angle between two points of interest within the surgical field, recommend the size of one or more surgical instruments or implants, perform neurophysiologic monitoring, map neurophysiologic monitoring onto virtual fluoroscopic images, correct for translational and/or rotational offsets of virtual fluoroscopic images, and track the movement of one or more reference objects during the surgical procedure. The processing system is further configured to display the results of these predetermined functions to the user in a meaningful way.
According to another aspect of the invention, one or more surgical procedures may be performed using various embodiments of the system. According to one embodiment, the surgical procedure is a minimally-invasive lateral lumbar surgery. According to another embodiment, the surgical procedure is a posterior (open or minimally invasive) lumbar surgery.
BRIEF DESCRIPTION OF THE DRAWINGS
Many advantages of the present invention will be apparent to those skilled in the art with a reading of this specification in conjunction with the attached drawings, wherein like reference numerals are applied to like elements and wherein:
FIG. 1 is an example operating room setup depicting the components of a surgical tracking system, according to an example embodiment;
FIG. 2 is a perspective view of a C-arm fluoroscope comprising part of the system of FIG. 1;
FIG. 3 is a perspective view of various surgical objects that may be tracked with the system of FIG. 1;
FIG. 4 is a screen shot depicting an example welcome screen of the system of FIG. 1;
FIG. 5 is a screen shot depicting an example procedure information screen of the system of FIG. 1;
FIG. 6 is a screen shot depicting an example C-arm setup screen of the system of FIG. 1;
FIG. 7 is a screen shot depicting an example patient positioning screen during a first step of a patient positioning sequence;
FIG. 8 is a screen shot depicting the example patient positioning screen during a second step of the patient positioning sequence;
FIG. 9 is a screen shot depicting an example IR positioning sensor setup screen of the system of FIG. 1;
FIG. 10 is a flow chart indicating the steps of a patient registration and scaling sequence of the system of FIG. 1, according to one example embodiment;
FIG. 11 is a screen shot depicting an example registration and scaling screen during a first step in the example patient registration sequence of FIG. 10;
FIG. 12 is a screen shot depicting the example registration and scaling screen during a second step in the example patient registration sequence of FIG. 10;
FIG. 13 is a screen shot depicting the example registration and scaling screen during a third step in the example patient registration sequence of FIG. 10;
FIG. 14 is a screen shot depicting the example registration and scaling screen during a fourth step in the example patient registration sequence of FIG. 10;
FIG. 15 is a screen shot depicting the example registration and scaling screen during a first step in the example scaling sequence of FIG. 10;
FIG. 16 is a screen shot depicting the example registration and scaling screen during a second step in the example scaling sequence of FIG. 10;
FIG. 17 is a screen shot depicting the system tracking the 3D location of a surgical tool in real time;
FIG. 18 is a screen shot of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations as an initial dilator is advanced from the incision site to a target site on a patient;
FIG. 19 is a screen shot of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations as the initial dilator is retreated towards the incision site;
FIGS. 20-23 are screen shots of the system in nerve mapping mode according to a first embodiment showing neurophysiologic recordings at various depth locations and various rotational positions of the electrode as the initial dilator is advanced towards the target site;
FIG. 24 is a screen shot of the system in nerve mapping mode according to a second embodiment;
FIG. 25 is a screen shot of the system in nerve mapping mode according to a third embodiment; and
FIG. 26 is a screen shot of the system in offset mode according to a first embodiment.
DESCRIPTION OF THE PREFERRED EMBODIMENT
Illustrative embodiments of the invention are described below. In the interest of clarity, not all features of an actual implementation are described in this specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure. The systems and methods disclosed herein boast a variety of inventive features and components that warrant patent protection, both individually and in combination.
The present invention includes a surgical tracking system that monitors the 3D location of objects in or near a surgical field and then conveys the position of these objects to a user relative to traditional, yet virtual anatomical views. The system conveys the position of virtualized representations of these surgical objects over one or more virtual fluoroscopic images. The virtual surgical objects may be generated with software (e.g., 3D CAD models) and then superimposed onto static two-dimensional (2D) images (e.g., in lateral and anterior-posterior (A/P) views). The user may rely on these images to locate the surgical objects relative to the patient's anatomy instead of taking repeated x-ray images. While the system does not eliminate the need for fluoroscopic imaging entirely, it does limit the number of fluoroscopic images required during a surgical procedure thereby minimizing the amount of ionizing radiation that the patient and user are exposed to. Furthermore, the system accomplishes this reduction in radiation exposure without requiring reference markers fixed to the patient, input of pre-operative CT images, or additional large, specialized, and costly equipment.
Various embodiments are described of the surgical tracking system and surgical uses thereof for enhancing the safety and efficiency of surgical procedures. In one example, the present invention may facilitate monitoring the location and orientation of surgical access instruments which can aid in both the insertion and positioning of the surgical access instruments themselves, as well as aiding in the later insertion of instruments and/or implants through the surgical access instruments. In another example, tracking the location of surgical instruments within the surgical field may be accomplished with significantly decreased reliance on intraoperative imaging. In yet another example, the present invention may be used in conjunction with, or integrated into, a neuromonitoring system for assessing one or more of nerve proximity (and/or nerve directionality) and pedicle integrity, among other functions. In still another example, the present invention may facilitate safe and reproducible pedicle screw placement by monitoring the trajectory of various surgical instruments used during pilot hole formation and/or screw insertion. While the above examples are described in more detail below, it is expressly noted that they are set forth by way of example and that the present invention may be suitable for use in any number of additional surgical actions where tracking the 3D location of surgical instruments and implants within the surgical field and decreased exposure to x-ray radiation are desired. Accordingly, it will be appreciated then that while the surgical tracking system is generally discussed herein as being attached to instruments such as dilating cannulas, tissue retractors, C-arms, pedicle access tools, etc., other instruments may be substituted depending on the surgical procedure being performed and/or the needs of the surgeon.
With reference now to FIG. 1, there is shown, by way of example, one embodiment of a surgical tracking system 10 including a position tracking system 12 and at least one surgical object to be tracked. Preferably, the position tracking system 12 includes an infrared (IR) position sensor 14 (also referred to as an “IR camera”), an IR-reflective tracking array 16 mounted on an intraoperative imaging system 18, an IR-reflective tracking array 20 mountable to each surgical object to be tracked (e.g. initial dilator 32) and a feedback and control device comprising a control unit 22 and a display 24. The control unit 22 has position tracking software and C-arm video import capabilities and is communicatively linked to the display 24 so that information relevant to the surgical procedure may be conveyed to the user in a meaningful manner. By way of example, the relevant information includes, but is not limited to, spatial positioning data (e.g., translational data in the x, y, and z axes and orientation/rotational data) acquired by the IR position tracking sensor 14.
The intraoperative imaging system 18 may be any commercially available C-arm fluoroscope 26 communicatively linked to a C-arm display 28 with an IR-reflective tracking array 16 attached, for example, to the signal receiver. As illustrated in FIG. 2, the IR-reflective tracking array 16 is preferably attached to the C-arm 26 by one or more mating points (not shown) on a reticle 30. By way of example, reticle 30 may be the reticle shown and described in PCT Patent App. No. PCT/US2008/012121, entitled “Surgical Trajectory Monitoring System and Related Methods,” and filed on Oct. 24, 2008, the entire contents of which is hereby incorporated by reference as if set forth fully herein.
As depicted in FIG. 3, the surgical objects to be tracked may be one or more surgical instruments including initial dilator 32, cobb 238, rasp 240, and intervertebral implants 244 (shown in FIG. 3 attached to an implant inserter 242). Though not shown, and by way of example only, other surgical objects to be tracked may include retractor blades and intervertebral implant trials, among others. Indeed, any instrument, implant, or device suitable for use in spine surgery may be outfitted with an IR-reflective tracking array 20 and tracked within the surgical field in accordance with the present disclosure. FIG. 1 illustrates an IR-reflective tracking array 16 positioned on an intraoperative imaging device 18 and an IR-reflective tracking array 20 positioned on an initial dilator 32. Each IR-reflective tracking array 16, 20 has IR-reflective spheres 34 arranged in a calculated manner somewhere along its length. Spatial position data about a surgical object may be acquired in its raw form relative to the IR position tracking sensor's 14 local coordinate system, with each IR-reflective tracking array 16, 20 determining its own coordinate system.
The system 10 further comprises a neuromonitoring system 36 communicatively linked to the position tracking system 12 via the control unit 22. By way of example only, the neuromonitoring system 36 may be the neuromonitoring system shown and described in U.S. Pat. No. 8,255,045, entitled “Neurophysiologic Monitoring System” and filed on Apr. 3, 2008, the entire contents of which is hereby incorporated by reference as if set forth fully herein. The neuromonitoring system 36 may be further configured to perform neuromonitoring as a surgical access corridor is created (e.g. a lateral access corridor created via a sequential dilation assembly) and/or maintained (e.g. via a retractor assembly) and as one or more surgical implants are placed into the body (e.g. a pedicle screw). The neuromonitoring system 36 may also be configured to: 1) assess the depth of nearby nerves relative to a surgical object; 2) localize the relative position of the nerves once the surgical object reaches the spinal target site (e.g. a lateral aspect of the spine); 3) couple the neurophysiology data with the previously-acquired positional data of the surgical object; and 4) present the combined data to the user via a virtual fluoroscopic image.
Details of the surgical tracking system 10 are discussed in conjunction with a first exemplary use thereof for monitoring a minimally-invasive, lateral trans-psoas approach to the spine during a lateral lumbar interbody fusion. By way of example, the system 10 may be used for monitoring the approach during a lateral trans-psoas procedure shown and described in U.S. Pat. No. 7,905,840, entitled “Surgical Access System and Related Methods,” and filed on Oct. 18, 2004, the entire contents of which is hereby incorporated by reference as if set forth fully herein.
FIGS. 4-26 illustrate, by way of example only, one embodiment of a screen display 100 of the control unit 22 capable of receiving input from a user in addition to communicating feedback information to the user. In this example (though it is not a necessity), a graphical user interface (GUI) is utilized to enter data directly from the screen display 100. FIG. 4 depicts a welcome screen in accordance with one exemplary embodiment. The system 10 is configured to detect the connection status of each of its required components. By way of example only, icons 102, 104, 106 indicate the connection status of the IR position sensor 14, C-arm reticle 30, and C-arm video output, respectively. If one or more required components are not connected or are connected improperly, the display 100 may alert the user to address the issue before proceeding via textual, audio, and/or visual means (e.g., textual messages, audible tones, colored screen, blinking screen, etc.). A separate indicator may alert the user once all of the required components are properly connected. For example, an “OK” message below each of icons 102, 104, and 106 indicates all components are properly connected. Selecting the “Start” button 108 proceeds to the next screen.
FIG. 5 illustrates an example procedure input screen. The user may use the screen 100 to input information into one or more selection fields regarding the patient and procedure. The selection fields may be drop-down menus or check-boxes pertaining to patient positioning 110 (e.g., lateral, prone), spinal levels to be operated on 112 (e.g., L1, L2, L3, L4, L5, S1), and surgical instruments to track 114 (e.g., dilator, cobb, rasp, inserter, retractor blade, custom/other). Once the user has provided the requested information, selecting the “Next” button 116 proceeds to the next screen.
The tracked surgical objects are preferably monitored with respect to the position of, and displayed onto the anatomy of one or more virtual fluoroscopic images from the perspective of, the intraoperative imaging system 18 (not with respect to the position of, and the perspective of, the IR position sensor 14). Thus a surgical object is referenced off of the location of the C-arm 26 and its positional information is determined relative to the coordinate system of the tracking array 16 of the C-arm 26 (the “working coordinate system”).
To track objects with respect to the C-arm 26, the coordinate system of the IR-reflective tracking array 16 affixed to the C-arm 26 is oriented in the direction of the x-ray beam according to a preferred embodiment. By way of example, one of the axes of the IR-reflective tracking array's 16 coordinate system is preferably oriented parallel to the beam axis, and the definition of this orientation is known in the position tracking software resident on the control unit 22. Prior to installing the reticle 30, the C-arm 26 is placed in the traditional lateral position (with the arc of the “C” pointing straight down (0°) and the axis representing the x-ray beam at 90°). Accelerometers (not shown) installed inside the reticle 30 may be used to generate angular data such that the reticle 30 can be installed at the 12 o'clock position relative to the surface of the C-arm signal receiver. After properly installing the reticle 30, the C-arm 26 can be placed in a traditional anterior/posterior (A/P) position. Preferably, the user moves the C-arm 26 to this position with the aid of the accelerometer readout of the reticle 30 as described in the '121 application. An exact A/P position is achieved when the accelerometer's x and y axes both read 0°. This will also verify that the axis of gravity (and the axis of the x-ray beam) is normal to the surface of the C-arm receiver, which corresponds to the plane of the fluoroscopic image. Once this position is verified, the software establishes the corresponding array 16 position as 0°/0° and that the axis of the x-ray beam is parallel to the desired “working coordinate system's” y axis. The system 10 may then report to the user when the beam/image axis is at 90°, i.e., the “true” lateral position.
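By way of illustration only, referencing a tracked object off of the C-arm array 16 might be sketched as follows, under the assumption that the IR position sensor 14 reports each array's pose as a 4x4 homogeneous matrix; the matrix names are hypothetical.

```python
# Minimal sketch (hypothetical matrix names): re-express a tracked instrument in
# the C-arm array's "working coordinate system" given both poses as reported in
# the IR position sensor's own coordinate system.
import numpy as np

def invert_pose(T):
    """Invert a rigid-body pose [R | t] using R^T rather than a general inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def tool_in_working_frame(T_cam_carm, T_cam_tool):
    """Pose of the tool relative to the C-arm array's coordinate system."""
    return invert_pose(T_cam_carm) @ T_cam_tool

# Example: C-arm array 1 m in front of the sensor, tool 50 mm lateral to it.
T_cam_carm = np.eye(4); T_cam_carm[:3, 3] = [0.0, 0.0, 1000.0]
T_cam_tool = np.eye(4); T_cam_tool[:3, 3] = [50.0, 0.0, 1000.0]
print(tool_in_working_frame(T_cam_carm, T_cam_tool)[:3, 3])   # -> [50. 0. 0.]
```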
An exemplary C-arm setup and instruction screen is illustrated in FIG. 6. To assist the user in setting up the working coordinate system of the C-arm 26, the screen 100 may provide helpful instructions for the user for setting up the C-arm 26 and reticle 30 via, by way of example only, instruction fields 118, 120, and 122. Once the user has acknowledged the movement restrictions and confirmed the proper reticle 30 setup, selecting the “Next” button 116 advances to the next screen.
Following setup, the user should ensure that the patient is properly positioned. FIGS. 7 and 8 depict exemplary display screens 100 that may assist the user with patient positioning. A first step to confirming patient positioning is to obtain a cross-table A/P fluoroscopic image of the patient in the lateral decubitus position. The screen display 100 may show the user an ideal (sample) cross-table A/P image 124 and a C-arm status indicator field 126 to ensure that the C-arm 26 is in the true lateral position. A cross-table A/P image 128 is taken (and re-taken as needed) to approximately match the sample cross-table A/P image 124. A virtual center dot 130 is superimposed on both cross-table A/P images 124, 128. Once image 128 matches sample 124, the user may be provided with one or more additional instructions via message box 132. For example, the message box 132 may remind the user to lock the C-arm wheels before proceeding to the next step. Selecting the “Next” button 116 advances the screen display 100 to the next screen.
A second step to confirming patient positioning is to obtain a patient-lateral fluoroscopic image of the patient in the lateral decubitus position. The screen display 100 may show the user an ideal (sample) lateral image 134 and a C-arm status indicator field 126 to ensure that the C-arm is positioned 90° from the previous cross-table position. A lateral image 136 is taken (and retaken as needed) to approximately match the sample lateral image 134. A virtual center dot 138 is superimposed on both lateral images 134, 136. When the user is finished confirming this view, the “Next” button 116 is selected to advance the display screen 100 to the next screen.
After patient positioning is complete, the IR position sensor 14 can be positioned near the surgical field. The IR position sensor 14 should be positioned so that the C-arm array 16 can be tracked in both the A/P and lateral positions. Also, the instrument tracking array 20 should be visible within the tracking volume of the IR position sensor 14. FIG. 9 illustrates an exemplary IR position sensor setup screen. The display screen 100 may present the user with an exemplary configuration setup and instructions (shown by way of example only in instruction panel 140). Instruction panel 140 instructs the user to position the IR position sensor 14 near the surgical bed so that the IR-reflective tracking array 16 on the C-arm 26 is in view of the IR position sensor 14 and the surgical objects to be tracked are within the IR position sensor's 14 tracking volume. The user may also input information regarding the general positioning of the patient relative to the IR position sensor 14 in positioning input field 142. By way of example only, the user may be instructed to select which side of the surgical bed the patient's head is resting on (e.g., side “A” or side “B”). After the general position of the IR position sensor 14 has been inputted, the system 10 may evaluate whether the position of the IR position sensor 14 relative to the IR-reflective tracking arrays 16, 20 is optimal for data acquisition. According to some implementations, the system 10 may provide instructions in message box 132 as to where to move the IR position sensor 14 to achieve optimal placement (e.g., towards the patient, away from the patient, towards the feet, towards the head). According to some implementations, the display screen 100 may present the status of one or more tracked objects via tracked object status field 144. It is to be appreciated that any visual indicator (textual, graphic, color) may be used to indicate the status of the tracked object or objects relative to the IR position sensor 14. As shown by way of example in FIG. 9, the tracked object status field 144 shows three surgical instruments being tracked: the C-arm, the initial dilator, and the rasp. The tracked object status field 144 also indicates (via text and/or color indicators) that the C-arm and the initial dilator are within view of the IR position sensor 14 but that the rasp is not. Should the user wish to use the rasp, it will be necessary to move it within the tracking volume of the IR position sensor 14. Once the optimum IR position sensor 14 placement has been achieved, the user may then select the “Next” button to continue to the patient registration protocol. Again, various reminders may be provided to the user via message box 132 (e.g., the user may be reminded not to move the IR position sensor for the remainder of the procedure).
With the C-arm 26, patient, and IR position sensor 14 properly positioned, a user may proceed to the registration and scaling process. In some implementations, the patient registration portion of the process involves defining two or more points of interest for each fluoroscopy view. FIG. 10 depicts a flowchart of the patient registration process for a given view. First, the user identifies a first point of interest (Point A) at Step 150. Next, the user positions the C-arm 26 such that Point A is in the center of the fluoroscopic image at Step 152, using a virtual center marker as will be described in greater detail below. Once the user indicates that this point is satisfactory, the position tracking software captures the current image, the x, y, and z position of the reticle 30, and the rotational position of the reticle 30 with respect to Point A at Step 154. At Step 156, the user identifies a second point of interest (Point B). It is to be appreciated that the user may choose any part of the anatomy for Point B that is not Point A. Next, the user positions the C-arm 26 such that Point B is in the center of the fluoroscopic image at Step 158. Once the user indicates that this point is satisfactory, the position tracking software captures the current image, the x, y, and z positions of the reticle 30, and the rotational position of the reticle 30 with respect to Point B at Step 164. In some implementations, points of interest may be acquired in two or more views (e.g. lateral and A/P views). The steps described above are repeated for each view, and once all desired views are registered, the images for each view (i.e. the A/P view and/or the lateral view) may be scaled.
Scaling may be accomplished in a number of ways (e.g., manually or automatically). In some automatic implementations, the system 10 lines up the images to create the best anatomy match via image recognition. In manual implementations, this may be done, by way of example, by presenting a user with a scaling image and a backdrop image and prompting the user to define where one or more points are located on a backdrop image (as will be described in further detail below). The system 10 measures the physical distance between Point A and Point B (Step 188) and determines a scale factor by correlating the physical distance between Points A and B with the number of pixels between Points A and B on the display 24 (Step 190). This “scale” factor may then be used to calculate the movements of the 3D-generated virtual instruments and implants against the 2D x-ray image in that particular view, among other things. With both of the A/P and lateral views registered and scaled, the surgical volume is defined for the system and the C-arm 26 may then be articulated to a primary position (e.g., a lateral position). The C-arm 26, in this position, acts as the reference object for all other objects being tracked. Effectively, the other tracked objects (i.e. instrumentation, implants, etc.) will be referenced from the C-Arm array's 16 local coordinate system, thereby allowing the movements of the 3D-generated instruments and implants against the 2D image in each view.
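By way of illustration only, the scale factor and its use in drawing a virtual object on a 2D backdrop might be sketched as follows; the coordinate conventions and numbers are illustrative assumptions, not the actual computation of the system 10.

```python
# Minimal sketch (hypothetical conventions/values): derive a mm-per-pixel scale
# from the known physical and pixel distances between Points A and B, then use
# it to place a tracked 3D position onto the 2D backdrop image.

def scale_factor(physical_mm, pixel_distance):
    """Millimeters represented by one pixel in this fluoroscopic view."""
    return physical_mm / pixel_distance

def to_pixels(point_mm, origin_mm, origin_px, mm_per_px):
    """Map working coordinates (mm) to backdrop pixels, with a registered
    anatomical point serving as the common origin of both spaces."""
    dx = (point_mm[0] - origin_mm[0]) / mm_per_px
    dy = (point_mm[1] - origin_mm[1]) / mm_per_px
    return (origin_px[0] + dx, origin_px[1] + dy)

mm_per_px = scale_factor(40.0, 160.0)   # A and B: 40 mm apart, 160 px apart
tip_px = to_pixels((12.0, -8.0), (0.0, 0.0), (256.0, 256.0), mm_per_px)
print(f"{mm_per_px:.2f} mm/px; draw the virtual tip at pixel {tip_px}")
```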
The screen display 100 (explained, for illustrative purposes, with reference to FIG. 11) may include various features for assisting the user with the patient registration and scaling process. Screen display 100 may include an image acquisition panel 162, a patient registration task panel 164, and an instruction panel 166. By way of example only, the image acquisition panel 162 preferably includes the live fluoroscopic image that is inputted into the system 10 from the intraoperative imaging system 18 via a C-arm input cable. This fluoroscopic image may be the image that the user selects to be the anatomical lateral backdrop (i.e. the virtual lateral fluoroscopic image). By way of further example, the patient registration task panel 164 may include a list of the registration steps completed and those yet to be completed, and the instruction panel 166 may include specific directions to the user for what actions to perform to complete the first step in the acquisition process. The screen display 100 may also include a plurality of buttons for navigating through the patient registration process. For example, selecting the “Capture” button 168 obtains the position and image information for the first lateral fluoroscopy shot, selecting the “Reset” button 170 restarts the patient registration process, selecting the “Back” button allows the user to see what was captured for a previously-acquired image, and selecting the “Next” button 174 advances to the next step in the patient registration process.
FIGS. 11-14 depict the screen display 100 during the steps of the patient registration process according to one exemplary embodiment. As shown in FIG. 11, Point A 176 is defined in the anatomical lateral view of the patient with the C-arm 26 in the A/P position. By way of example only, the chosen anatomy for Point A 176 is the anterior border of the disc space at the superior vertebra. An image is taken with Point A 176 in the center of the fluoroscopic image as shown in image acquisition panel 162. Once the user indicates that this point is satisfactory by selecting the “Capture” button 168, the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point A 176. This lateral image will be used for the “virtual” patient-lateral 3D backdrop 178. Once the fluoroscopic image has been taken and the positional data captured, the user can select the “Next” button 174 to proceed to the next registration step.
As shown in FIG. 12, Point B 180 is then defined in the anatomical lateral view of the patient with the C-arm 26 in the A/P position. It is to be appreciated that the user may choose any part of the anatomy for Point B 180 that is not Point A 176. Preferably, Point B 180 is any point on the anatomy that is either anterior or posterior to Point A 176. For convenience, this movement from Point A 176 to Point B 180 is constrained to one axis (i.e., the line formed between Point A 176 and Point B 180 is parallel to the anterior-posterior axis). By way of example only, the chosen anatomy for Point B 180 is the posterior border of the disc space at the superior vertebra. As shown in instruction panel 166, the system 10 may provide the user with instructions for taking the second image. An image is then taken with Point B 180 in the center of the fluoroscopic image as shown in image acquisition panel 162. Once the user indicates that Point B 180 is satisfactory by selecting the “Capture” button 168, the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point B 180. Once the image has been taken and the positional data captured, the user can select the “Next” button 174 to proceed to the next registration step.
As shown in FIG. 13, Point A′ 182 is then defined in the anatomical A/P view of the patient with the C-arm 26 in the lateral position. By way of example only, the chosen anatomy for Point A′ 182 is the ipsilateral border of the disc space. An image is taken with Point A′ 182 in the center of the fluoroscopic image as shown in image acquisition panel 162. Once the user indicates that this point is satisfactory by selecting the “Capture” button 168, the software captures the following data points: 1) the current x-ray image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point A′ 182. This A/P image will be used for the patient-A/P 3D backdrop 184 (i.e., the virtual A/P fluoroscopic image). Once the image has been taken and the positional data captured, the user can select the “Next” button 174 to proceed to the next registration step.
As shown in FIG. 14, Point B′ 186 is then defined in the anatomical A/P view of the patient with the C-arm 26 in the lateral position. By way of example only, the chosen anatomy for Point B′ 186 is the contralateral border of the disc space. An image is taken with Point B′ 186 in the center of the fluoroscopic image as shown in image acquisition panel 162. Once the user indicates that Point B′ 186 is satisfactory by selecting the “Capture” button 168, the software captures the following data points: 1) the current image and 2) the x, y, z, and rotational position of the reticle 30 with respect to Point B′ 186. Once the image has been taken and the positional data captured, the user can select the “Next” button 174 to complete the patient registration process. In some implementations, the user may be advised via message box 132 or a pop-up window that the C-arm 26 must be kept in this last registration position for the duration of the surgical procedure. In other implementations, the user may choose between the first, second, and merged image in both the A/P and lateral views for display. In yet other implementations, the image may be manipulated to orient the image according to user preference.
As shown in FIG. 15, Point A 176 and Point B 180 are then chosen and entered into the software and the physical distance between Point A 176 and Point B 180 is calculated. Next, the user specifies where Point B 180 is located on backdrop image 178. By way of example only, this can be done manually (as shown in FIG. 15) by presenting the user with the backdrop image 178 alongside the scaling image 192 and prompting the user to define where Point B 180 is on the backdrop image 178 (e.g., using a touch-screen or a mouse). Next, as shown in FIG. 16, Point A′ 182 and Point B′ 186 are then chosen and entered into the software and the physical distance between Point A′ 182 and Point B′ 186 is calculated. Next, the user specifies where Point B′ 186 is located on backdrop image 184. By way of example only, this can be done manually (as shown in FIG. 16) by presenting the user with the backdrop image 184 alongside the scaling image 194 and prompting the user to define where Point B′ 186 is on the backdrop image 184. These actions allow the software to correlate the physical distance between points with pixel distances between points on the x-ray images. This “scale” factor is used to properly calculate the movements of the 3D-generated instrumentation against the 2D x-ray image in the A/P and lateral views, respectively. With both of the A/P and lateral views registered and scaled, the surgical volume is defined for the system 10. Selecting the “Next” button 174 allows the user to proceed to the main tracking screen 200.
According to some implementations, the main tracking screen 200 serves as a starting point for all other available features and functions. FIG. 17 depicts a main tracking screen 200 according to one exemplary embodiment. Main tracking screen 200 includes left tracking panel 202 and right tracking panel 204. Left tracking panel 202 includes backdrop image 178, data box 206, and a plurality of image orientation tabs 208. Right tracking panel 204 includes backdrop image 184, data box 206, and a plurality of image orientation tabs 208. Data box 206 preferably includes the spatial position data acquired regarding the backdrop image 178 or 184. The plurality of image orientation tabs 208 may be three buttons for reorienting the backdrop image 178 or 184 to suit the user's preference: for example, selecting the “Flip Horizontal” button will invert the backdrop image horizontally, selecting the “Flip Vertical” button will invert the backdrop image vertically, and selecting the “Reset” button will reset the backdrop image 178 or 184 back to its original configuration. As shown by way of illustration in FIG. 17, image 184 is flipped horizontally.
The system 10 may perform various functions that facilitate the surgical procedure depending on the user's preferences or the patient's requirements. The main tracking screen 200 may include numerous features (e.g., represented by tabs 212, 214, 216, 218, and 220 shown within navigation panel 210) for use during the surgical procedure. By way of example only, the system 10 may identify an incision location on the skin, determine the distance from the skin to a desired location within the body, project one or more angles between anatomical points of interest, select the ideal direction for directing or docking a surgical object within the patient, choose the optimal size of implants based on the patient's anatomy, and map neurophysiologic data onto virtual anatomic locations of the patient (each of which will be explained in detail below with reference to a lateral trans-psoas procedure).
Prior to commencing the surgical procedure, the user may wish to verify one or more ideal incision locations. This may be accomplished using the initial dilator 32 with an IR-reflective tracking array 20 attached. To do so, the user selects the “Loaded Tools” tab 218 from the main tracking screen 200. Once the “Loaded Tools” tab 218 is selected, the display screen 100 displays a surgical object menu table of pre-loaded 3D CAD renderings of each instrument and implant available during the procedure in a variety of sizes. The user may use the display 24 to select the surgical object to be tracked (here, an initial dilator 32) and then select the “Instrument Tracking” tab 222. Next, the user may place the distal tip 224 of the initial dilator 32 laterally over an area of the patient and maneuver it as necessary until a lateral incision location that is substantially lateral to the surgical target site (e.g., the intervertebral disc space) is identified.
The user may also wish to determine the distance between the incision site at the skin and a spinal target site (e.g., the intervertebral disc). The user may select the “Distance” tab 212 from the navigation panel 210 on the main tracking screen 200. The screen 200 will then instruct the user to place the distal tip 224 of the initial dilator 32 on the patient's skin and select a point of interest (e.g., the ipsilateral annulus of the intervertebral disc); the system 10 will indicate the distance from the distal tip 228 of the virtual initial dilator 226 to the ipsilateral annulus identified on the virtual fluoroscopic image(s). This allows the user to know, for example, the length of retractor blades required for a particular patient. This feature has the added benefit of decreasing surgical time because a surgical assistant may assemble the retractor with the correct length of retractor blades while the user-surgeon proceeds with the surgical procedure. It is to be appreciated that the “Distance” function is not limited to the implementation explained above. Indeed, using the “Distance” tab 212, the system 10 can determine the distance from the tip of any actively tracked surgical object to any user-selected location on the virtual fluoroscopic image.
According to other implementations, the screen 200 may instruct the user to select clinically significant boundaries for the surgical procedure on the virtual fluoroscopic images. By way of example, the user may select the lateral borders of the superior and inferior endplates in the A/P view and the anterior and posterior aspects of the vertebral body in the lateral view. The system 10 will provide the user the distance between the x, y, and z coordinates between these locations via the screen 200. Based on these clinical boundaries, the system 10 may also provide the user with an idealized axial view (not shown) of the disc space via an appropriately-scaled 3D rendering of a disc space. Armed with this view, the user may track instruments within the disc space and redirect them as necessary based on this third projected plane.
The user may also wish to determine the angle between two anatomical points on the patient. The user may select the “Angle” tab 214 from the navigation panel 210 on the main tracking screen 200. The screen 200 will instruct the user to place the distal tip 224 of the initial dilator 32 on the patient's skin and select a point of interest on the patient (e.g. the iliac crest at L4-5). Via the screen 200, the system 10 will then project the angle between the virtual initial dilator 226 and the patient's iliac crest at L4-5. This allows the user to know whether the patient's anatomy will accommodate a substantially lateral, retroperitoneal approach to the spine prior to making an incision. This feature has the added benefit of decreasing surgical time and/or preparing for contingency procedures necessitated by challenging anatomy (e.g. a high iliac crest). It is to be appreciated that the “Angle” function is not limited to the implementation explained above. Indeed, using the “Angle” tab 214, the system 10 can determine the angle between the tip of any actively tracked surgical object and any user-selected location on the virtual fluoroscopic image.
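By way of illustration only, the “Distance” and “Angle” computations might be sketched as follows, assuming the tracked tip and the user-selected point are both available in scaled working coordinates (in mm); the names and values are illustrative.

```python
# Minimal sketch (hypothetical values): straight-line distance from a tracked
# tip to a selected point, and the angle of that direction against a reference.
import math

def distance_mm(tip, point):
    return math.dist(tip, point)

def angle_deg(tip, point, reference_axis=(0.0, 0.0, 1.0)):
    """Angle between the tip-to-point direction and a unit reference axis."""
    v = [p - t for t, p in zip(tip, point)]
    norm = math.sqrt(sum(c * c for c in v))
    dot = sum(a * b for a, b in zip(v, reference_axis))
    return math.degrees(math.acos(dot / norm))

tip = (0.0, 0.0, 0.0)                    # distal tip placed on the skin
annulus = (10.0, 5.0, 90.0)              # selected ipsilateral annulus
print(f"{distance_mm(tip, annulus):.1f} mm from skin to target")
print(f"{angle_deg(tip, annulus):.1f} deg off the reference axis")
```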
The user may also wish to determine the appropriate size of one or more surgical instruments to be used during the procedure and/or spinal implants to be implanted. The user may select the “Sizing” tab 216 from the navigation panel 210 on the main tracking screen 200. Based on the clinically significant boundary information explained above, the change in distance along the y axis may indicate implant width, the change in distance along the x axis may indicate implant height, and the change in distance along the z axis may indicate the implant depth as well as instrument depth. The system 10 may provide the user with suggestions for instrument size, implant size, etc. prior to commencing the surgical procedure based on patient size, pathology, and the like.
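By way of example only, the axis-to-dimension convention just described might be applied as in this sketch (Python; the boundary structure and key names are illustrative assumptions):

```python
def suggest_implant_dimensions(b):
    # Map the clinically significant boundary coordinates (in mm) to
    # candidate implant dimensions using the stated convention:
    # change in y -> width, change in x -> height, change in z -> depth.
    return {
        "width_mm":  abs(b["y2"] - b["y1"]),
        "height_mm": abs(b["x2"] - b["x1"]),
        "depth_mm":  abs(b["z2"] - b["z1"]),
    }

print(suggest_implant_dimensions(
    {"x1": 0, "x2": 10, "y1": 0, "y2": 45, "z1": 0, "z2": 22}))
# {'width_mm': 45, 'height_mm': 10, 'depth_mm': 22}
```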
A scalpel or other instrument is used to make an incision of sufficient size to receive the distal end 224 of the initial dilator 32 at a lateral incision location. The distal end 224 of the initial dilator 32 may be guided through the retroperitoneal space toward the psoas muscle using a fingertip (for example) to protect the peritoneum. The distal end of the initial dilator is then advanced in a substantially lateral direction through the psoas muscle toward the intervertebral disc space at or near the surgical target site. The fibers of the psoas muscle are split using blunt dissection until the spinal target site is reached.
According to some embodiments, the user may track the path of the initial dilator 32 in real time from the incision site to the target spinal site via the instrument tracking mode of the system 10 from the main tracking screen 200. As shown in FIG. 17, the main tracking screen 200 will track the A/P and lateral views of the virtual initial dilator 226 as it advances through the patient to ensure the initial dilator 32 will dock on an ideal location of the psoas muscle prior to splitting the psoas muscle. In such embodiments, it is to be appreciated that the user need not verify the location of the distal tip 224 of the initial dilator 32 using fluoroscopic imaging, thereby decreasing the exposure to x-ray radiation.
According to some embodiments, the neuromonitoring system 36 may be used to allow for safe passage of the initial dilator 32 through the psoas muscle to the spinal target site. In such embodiments, the initial dilator 32 has a proximal end configured to attach to a stimulation connector (e.g., a clip cable) in electrical communication with the neuromonitoring system (not shown) and at least one electrode (not shown) at the distal tip 224 of the initial dilator 32. The stimulation connector is coupled to both the initial dilator 32 and the neuromonitoring system 36 to provide a stimulation signal as the initial dilator 32 is advanced through the psoas muscle.
As the initial dilator 32 is advanced through the psoas muscle to the surgical target site, the electrode located at the distal tip 224 of the initial dilator 32 emits one or more stimulation pulses, and recording electrodes positioned over muscles innervated by nerves traversing the psoas register the presence or absence of a significant EMG response. The neuromonitoring system will continuously search for the stimulus threshold that elicits an EMG response in the monitored myotomes and then report such thresholds on the display 24. According to preferred embodiments, this may be done with a hunting algorithm that quickly and automatically determines stimulation thresholds. As the initial dilator 32 is advanced through the psoas muscle, the stimulus necessary to elicit an EMG response (i.e., the threshold intensity) will vary with distance from one or more nerves. The neuromonitoring system 36 may then report the relative nerve distance indicated by these threshold intensities to the user by any number of color, graphic, or alpha-numeric indicia. By way of example, the user may be presented with the threshold intensity in units of mA and/or a color associated with the predetermined range in which the determined threshold lies. By way of example only, threshold intensities between 1 mA and 6 mA may be displayed with a red color, threshold intensities between 7 mA and 10 mA may be displayed with a yellow color, and threshold intensities greater than 10 mA may be displayed with a green color.
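For illustration only, such a range-to-color classification might look like the following sketch (Python). The cutoffs mirror the example values above; the function name and structure are assumptions, not the actual system:

```python
def threshold_color(threshold_ma):
    # Classify an EMG stimulation threshold (mA) into the example
    # red/yellow/green ranges above; a higher threshold implies a
    # greater distance from nerve tissue. Range values are illustrative.
    if threshold_ma <= 6:
        return "red"      # nerve likely close to the stimulating electrode
    if threshold_ma <= 10:
        return "yellow"   # intermediate proximity
    return "green"        # nerve likely distant

for t in (4, 8, 19):
    print(t, "mA ->", threshold_color(t))  # red, yellow, green
```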
According to one or more preferred embodiments, the system 10 is configured to simultaneously track the movement of the initial dilator 32 via the virtual initial dilator 226 and display neurophysiologic data as the initial dilator 32 is being advanced through the psoas muscle to the spinal target site. For any given position and orientation of the initial dilator 32, EMG threshold intensity information can be captured and displayed on the virtual fluoroscopic images 178, 184 corresponding to the position and orientation the initial dilator 32 occupied when said threshold intensity was elicited. Thus, the system can map out the location of nearby nerves as the initial dilator 32 is advanced through the psoas muscle to the spinal target site. This may be accomplished by selecting the “Show Nerve Mapping” tab 220 on the main tracking screen 200. Various illustrative embodiments are explained in detail below.
In some implementations (illustrated, by way of example, in FIG. 18), the position of the initial dilator 32 within the surgical corridor may be tracked while the relative distance to nearby nerves is monitored and indicated via a color (e.g., red/yellow/green) and a number based on the threshold intensity required to elicit a significant EMG response as the initial dilator 32 is advanced to the surgical site. At the same time, the location of the initial dilator 32 (preferably, the distal tip 224) may be tracked on the virtual fluoroscopic images 178, 184 as set forth above (via the virtual distal tip 228 of the virtual initial dilator 226). The system 10 may then map each neurophysiologic response onto the virtual fluoroscopic images 178, 184 at the precise spatial location where it was obtained as the initial dilator 32 moves towards the spinal target site. As illustrated in FIG. 18, the main tracking screen 200 shows that the neuromonitoring system 36 registered a significant EMG response at 19 mA as the virtual initial dilator 226 was positioned at location 230 (depicted as green on the screen 200), a significant EMG response at 8 mA as the virtual initial dilator 226 advanced to location 232 (depicted as yellow on the screen 200), a significant EMG response at 4 mA as the virtual initial dilator 226 advanced to location 234 (depicted as red on the screen 200), and a significant EMG response at 22 mA as the virtual initial dilator 226 advanced to location 236, which is the spinal target site (depicted as green on the screen 200). Armed with this information, the user knows that at least one nerve lies near location 234.
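A minimal sketch of how such position/threshold pairs might be captured and turned into on-image annotations, assuming a simple list-based structure and reusing the illustrative color ranges above (Python; all names and coordinates are hypothetical):

```python
def threshold_color(ma):  # same illustrative ranges as the sketch above
    return "red" if ma <= 6 else ("yellow" if ma <= 10 else "green")

class NerveMap:
    # Hypothetical container pairing each EMG threshold with the tracked
    # position at which it was elicited, so each response can be drawn
    # on the virtual fluoroscopic images at that exact location.
    def __init__(self):
        self.samples = []  # list of ((x, y) pixel position, threshold mA)

    def record(self, tip_px, threshold_ma):
        self.samples.append((tip_px, threshold_ma))

    def overlay_annotations(self):
        # One (position, label, color) annotation per captured response.
        return [(pos, f"{ma:g} mA", threshold_color(ma))
                for pos, ma in self.samples]

nerve_map = NerveMap()
for pos, ma in [((150, 110), 19), ((160, 150), 8),
                ((170, 190), 4), ((180, 230), 22)]:  # thresholds per FIG. 18
    nerve_map.record(pos, ma)
print(nerve_map.overlay_annotations())
```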
According to one or more implementations, the system 10 not only possesses the capability to register spatial positioning and neurophysiologic responses during advancement of the initial dilator 32, but also during retreat and/or repositioning of the initial dilator 32. As illustrated in FIG. 19, the main tracking screen 200 shows the virtual initial dilator 226 retreating from the spinal target site (location 236). As the virtual initial dilator 226 retreats, the neurophysiologic responses obtained at locations 236, 234, and 232 are removed (as shown in FIG. 20, the numeric results may be grayed out and/or rendered with dashed lines). From here, the initial dilator 32 may be repositioned, and the repositioned path will be shown with the virtual initial dilator 226 along with new neurophysiologic responses as the initial dilator 32 is advanced again.
Neurophysiologic data may also be used to track the orientation of one or more nerves relative to the distal tip 224 of the initial dilator 32. Preferably, once the depth of the nerves has been ascertained (as explained above with respect to FIG. 19), the user can track the radial position of the distal electrode (not shown), with neurophysiologic data indicative of the relative radial position of one or more nearby nerves mapped onto the virtual fluoroscopic images 178, 184.
In some implementations, once the depth has been ascertained (a first discrete position), the surgeon may then rotate the initial dilator 32 (shown as virtual initial dilator 226) about its longitudinal axis through any number of discrete positions and allow the neuromonitoring system 36 to capture neurophysiologic data for each discrete point. FIGS. 20-22 illustrate this concept at one, two, and four discrete positions, respectively. FIG. 20 shows the neurophysiologic data at one discrete point: with no rotation of the initial dilator 32, a first significant neurophysiologic response was obtained at 27 mA (depicted as green on screen 200), which may convey to the user that no nerve lies close to the electrode on the initial dilator 32 at the first discrete position. As shown in FIG. 21, as the user rotates the initial dilator 32 about its longitudinal axis to a second radial position (shown here as 180 degrees from the first discrete point), a second significant neurophysiologic response was obtained at 4 mA (depicted as red on the screen 200), which may convey to the user that at least one nerve lies close to the second discrete position. FIG. 22 shows the initial dilator 32 rotated about its longitudinal axis to four radial positions. Here, as the user rotates the initial dilator 32 about its longitudinal axis approximately 90 degrees from the first discrete position, a second neurophysiologic response was obtained at 7 mA (depicted as yellow on screen 200). As the user rotates the initial dilator 32 about its longitudinal axis approximately 90 degrees from its second discrete position, a third neurophysiologic response was obtained at 4 mA (depicted as red on screen 200). As the user rotates the initial dilator 32 about its longitudinal axis approximately another 90 degrees from its third discrete position, a fourth neurophysiologic response was obtained at 9 mA (depicted as yellow on screen 200). As the user rotates the initial dilator 32 to more radial positions, more refined information as to the exact position of a nearby nerve may be ascertained. It is to be appreciated that the system may capture neurophysiologic data at as many discrete positions as the surgeon desires (e.g., a minimum of two discrete points, with the maximum left to surgeon preference). Additionally, it is to be appreciated that the system may provide the orientation data at multiple depth locations, as shown in FIG. 23.
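By way of example only, thresholds captured at discrete radial positions might be summarized as in the following sketch (Python; the dictionary structure and names are assumptions, with the example values taken from the figures described above):

```python
def radial_nerve_profile(thresholds_by_deg):
    # Given EMG thresholds (mA) captured at discrete radial positions
    # of the distal electrode (keyed by degrees of rotation about the
    # dilator's longitudinal axis), report the rotations most and least
    # likely to face a nerve: the lowest threshold implies the nearest.
    nearest = min(thresholds_by_deg, key=thresholds_by_deg.get)
    farthest = max(thresholds_by_deg, key=thresholds_by_deg.get)
    return nearest, farthest

# Four discrete positions, matching the example values above.
profile = {0: 27, 90: 7, 180: 4, 270: 9}
near_deg, far_deg = radial_nerve_profile(profile)
print(f"lowest threshold {profile[near_deg]} mA at {near_deg} deg: nerve likely nearby")
print(f"highest threshold {profile[far_deg]} mA at {far_deg} deg")
```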
The examples set forth above are illustrative, and the system 10 may represent the data differently (e.g., as an intensity chart over anatomy (FIG. 24) or a color map (FIG. 25), etc.). Furthermore, the system 10 may incorporate more data points (e.g., time of the data capture, trend data) that assist the user in integrating the positioning and neurophysiologic mapping into the surgical procedure. While the foregoing was explained with respect to an initial dilator 32, it is contemplated that the virtual image/mapping may be used for subsequent dilators, one or more retractor blades, etc.
After the initial dilator 32 has docked at the spinal target site, the position of the distal end of the dilator may be confirmed by selecting the “Instrument Tracking” tab 222 and verifying that the position of the distal tip 224 of the initial dilator 32 (via the distal tip 228 of the virtual initial dilator 226) is at the spinal target site (thereby obviating the need to confirm this position using fluoroscopic imaging). A K-wire of the initial dilator 32 is then introduced into the targeted disc space after the initial dilator 32 is passed through the psoas muscle. A sequential dilation system including one or more supplemental dilators may be guided over the initial dilator for the purpose of further dilating the tissue down to the surgical target site. It is to be appreciated that each component of the sequential dilation system may be outfitted with an IR-reflective array 16 and its location mapped onto the virtual fluoroscopic image. Also, each component may be outfitted with one or more electrodes such that neurophysiologic data obtained while each component is advanced to the spinal target site may be mapped onto the virtual fluoroscopic image as well.
The retractor blades of the access system are introduced over the supplemental dilator (or the initial dilator if the sequential dilation system is not employed) toward the disc space. Again, the neuromonitoring system 36 may be used to perform neuromonitoring as the blades are positioned to help provide safe passage through the psoas muscle. In some embodiments, a posterior shim element and/or retractor extenders (not shown) are engaged with the retractor blades. After the retractor blades are introduced along the distraction corridor, the surgical tracking system 10 may be used to confirm the position of the blades proximal to the disc space (again, thereby obviating the need to confirm the position of the blades with fluoroscopic imaging). Once the retractor assembly is fully assembled, the blades may be used to retract the distraction corridor so as to form an operative corridor. Tracking may be used to verify the position of the distal ends of the blades without the need for fluoroscopic imaging.
Various instruments may be inserted through the operative corridor to prepare the targeted disc space. At least one preparation tool such as a disc cutter, pituitary, scraper, curette, or the like is inserted through the operative corridor to prepare the disc space. Any one of these may be outfitted with an IR-reflective tracking array 20 and its location tracked during the procedure to show the user, for example, the extent of discectomy completed and the extent of discectomy left to be done. Following disc preparation, one or more sizers are inserted into the disc space to provide appropriate disc height restoration. This can also be monitored via position tracking, obviating the need for fluoroscopy.
An appropriately-sized implant (preferably determined by using the “Sizing” tab 216 as set forth above) is then advanced into the disc space with an inserter tool. The implant is releasably secured to the inserter tool such that the surgeon may release the implant when it is properly positioned in the disc space.
Particularly in surgical procedures in which more than one spinal level is being operated on, additional fluoroscopic images are traditionally needed to localize the next spinal level (e.g., the superior or inferior spinal level) and to align subsequent fluoroscopic images used in tracking surgical instruments. The system 10 of the present invention may decrease the need for additional fluoroscopic images when transitioning from one spinal level to the next. According to one embodiment, the system 10 may allow the C-arm 26 to be moved during the procedure. As the C-arm 26 is moved from one spinal level to another, the virtual marker (e.g., dot 130) captures the current position of the C-arm 26 as it moves up and down or left and right relative to the virtual A/P and lateral fluoroscopic images. The user may then manipulate the C-arm 26 such that the dot 130 aligns with a point of interest at the next spinal level. It is to be appreciated that while this may not obviate the need for ionizing radiation entirely, it allows the user a close approximation of the location of the C-arm 26 without the use of localizing fluoroscopy.
It may also be advantageous for a user to verify that the C-arm 26 is displaying images accurately (i.e., that the images are not rotated clockwise or counterclockwise) such that they are inputted into the system 10 accurately. Thus, the system 10 also provides an image rotation feature which verifies and/or corrects the images displayed so that they are accurately displayed with respect to gravity. First, the user takes a fluoroscopic image with an instrument that is known to be positioned vertically with respect to gravity (e.g., a plumb bob, a probe, or an instrument with an attached orientation sensor, etc.), which is imported into the system 10. Next, the user takes a fluoroscopic image of the instrument with angular feedback (e.g., a plumb bob, a probe, or an instrument with an attached orientation sensor, etc.) registering at 0°/0°, which is also imported into the system 10. If the second image does not appear properly positioned, the image on the system 10 may need to be rotated relative to the screen 100 until the instrument in the image does appear properly positioned.
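By way of a non-limiting illustration, a rotation error could be estimated from the on-image endpoints of such a gravity-true instrument, as in the following sketch (Python; the two-point method and all names are assumptions, not the system's actual procedure):

```python
import math

def image_rotation_error_deg(tip_px, tail_px):
    # Rotation of the fluoroscopic image away from true, estimated from
    # the pixel endpoints of an instrument known to hang vertical with
    # respect to gravity (e.g., a plumb bob). Rotating the image by the
    # negative of this value would restore a gravity-true display.
    dx = tail_px[0] - tip_px[0]
    dy = tail_px[1] - tip_px[1]
    return math.degrees(math.atan2(dx, dy))  # 0 when vertical on screen

print(image_rotation_error_deg((256, 100), (270, 400)))  # ~2.7 deg tilt
```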
The system 10 of the present invention also provides manners in which a user may verify that the virtual surgical objects are tracking “true” to the actual surgical objects. In some instances, a virtual surgical object may not track “true” if the patient has moved on the surgical table or if the image is distorted due to external factors.
According to a first implementation, the user may select the “Offset” mode 230 from the main tracking screen 200, take a fluoroscopic image with a surgical object positioned within the disc space, and compare the fluoroscopic image with the virtual instrument overlaid onto the same fluoroscopic image. If the positions of the actual and virtual surgical objects do not align adequately, modifications may be made to the position of the virtual surgical object on the display screen 100. As shown by way of example in FIG. 26, a fluoroscopic image may be taken with the initial dilator 32 positioned within the disc space and the virtual initial dilator 226 projected onto the fluoroscopic image. In this illustration, it can be seen that the two instruments do not line up exactly. The user may instruct the system 10 to make adjustments to the x and y position of the virtual initial dilator 226 to align the virtual tip 228 of the virtual initial dilator 226 with the actual tip 224 of the actual initial dilator 32 using the x and y position adjustment fields 230, 232. In this example, an upward correction in the y direction and a correction to the right in the x direction would realign the virtual initial dilator 226 to the actual initial dilator 32.
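By way of example only, such an x/y realignment reduces to a simple per-axis correction, as in this sketch (Python; the function names and the coordinate convention, with y increasing downward in image space, are assumptions):

```python
def compute_offset(actual_tip_px, virtual_tip_px):
    # Per-axis correction needed to realign the virtual tip with the
    # actual tip seen on the confirmation fluoroscopic image; mirrors
    # the x and y position adjustment fields described above.
    return (actual_tip_px[0] - virtual_tip_px[0],
            actual_tip_px[1] - virtual_tip_px[1])

def apply_offset(point_px, offset):
    return (point_px[0] + offset[0], point_px[1] + offset[1])

dx_dy = compute_offset(actual_tip_px=(310, 205), virtual_tip_px=(304, 212))
print(dx_dy)                            # (6, -7): right and, in image terms, up
print(apply_offset((304, 212), dx_dy))  # (310, 205) -- back on the actual tip
```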
According to another implementation, the Offset feature of the surgical tracking system 10 may be used without having to take additional fluoroscopic images during the procedure. By way of example, the user may repeatedly touch a tracked surgical object to a home position (located within the surgical corridor) that is easily identified both by visual inspection and on the virtual fluoroscopic image, and compare the offset (if any) that the virtual surgical object has relative to the home position. If, for example, the virtual instrument tracks “true,” the position of the distal end of the tracked surgical object will appear directly on top of the home position. If, however, the virtual instrument no longer tracks “true,” the position of the distal end of the tracked instrument may appear offset from the home position. In that case, the user may make adjustments to the virtual surgical object in the x and y directions using the x and y position adjustment fields 230, 232 as explained above. The user can come back to the recorded home position as the surgery progresses and verify accurate tracking through the same virtual-to-real-world comparison multiple times during the surgery if so desired. The home position may be an anatomical landmark (e.g., an ipsilateral osteophyte); a radiolucent marker positioned within the surgical corridor (e.g., a radiolucent sphere), such that a tracked surgical object can be rotated around the sphere's surface and the user can confirm that the movements correlate to the sphere's location and diameter; or a radiodense marker that can be used to produce a definite mark on the virtual fluoroscopic image to confirm that the location of the instrument is on the “home position.”
According to yet another implementation, the Offset feature may detect patient movement using accelerometers (such as those disclosed in the '121 application). By way of example, a two- or three-axis accelerometer affixed to the patient in a known position and orientation can be used to alert the user and the system 10 of possible patient movement. The accelerometers' measurement of static acceleration provides tilt and orientation information, and the accelerometers' measurement of dynamic acceleration provides vibration and other movement information. If significant changes in patient positioning are detected by the accelerometers, the system 10 may display a warning that the tracking accuracy may be decreased such that re-registration and scaling may be advisable.
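A minimal sketch of such a static-acceleration check, assuming gravity-referenced tilt and an illustrative 2° warning limit (Python; all names and values are hypothetical, not the actual system's thresholds):

```python
import math

def tilt_deg(ax, ay, az):
    # Tilt of a patient-mounted accelerometer from vertical, derived
    # from static (gravity) acceleration measured in g units.
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, az / norm))))

def movement_warning(baseline_g, current_g, limit_deg=2.0):
    # Flag possible patient movement when orientation drifts beyond a
    # limit at which re-registration and re-scaling might be advised.
    return abs(tilt_deg(*current_g) - tilt_deg(*baseline_g)) > limit_deg

print(movement_warning((0.0, 0.0, 1.0), (0.03, 0.0, 0.9995)))  # False (~1.7 deg)
print(movement_warning((0.0, 0.0, 1.0), (0.20, 0.0, 0.9798)))  # True  (~11.5 deg)
```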
Details of the surgical tracking system 10 are now discussed in conjunction with a second exemplary use thereof for monitoring the trajectory of a surgical instrument used during pedicle fixation to ensure proper placement of pedicle screws. According to one embodiment, the system 10 may be used in conjunction with, or further integrated with, the surgical trajectory monitoring system of the '121 application.
Prior to forming a pedicle pilot hole or placing pedicle screws (preferably prior to starting the surgical procedure), it may be of interest to capture the starting point and stopping point of the pedicle cannulation using the surgical tracking system 10. With the IR tracking array 16 positioned on the reticle 30, the control unit 22 captures the spatial information of the C-arm 26 at a first spinal level of interest via the IR position sensor 14. A first lateral fluoroscopic image is taken with the lateral start points centered in the image. A second lateral fluoroscopic image is then taken with the lateral stopping points centered in the image. The first lateral image may be used as the lateral virtual backdrop image for that spinal level and scaled as set forth above, with reference to the Points A and B in the lateral, trans-psoas access procedure. According to some implementations, a virtual protractor may be projected off of this lateral backdrop image such that the cranial-caudal angles may be measured and inputted into the system 10 as set forth in the '121 application. Next, a first A/P fluoroscopic image is taken with the left start point centered in the image (for example, the lateral border of the left pedicle). A second A/P fluoroscopic image is taken with the right start point centered in the image (for example, the lateral border of the right pedicle). The first A/P image may be used as the A/P virtual backdrop image for that spinal level and scaled as set forth above, with reference to the Points A′ and B′ in the lateral, trans-psoas access procedure. It is to be appreciated that selecting the ideal start and stop points for pedicle cannulation in this way is advantageous for at least two reasons. First, fluoroscopic radiation to transition to this spinal level during the procedure is eliminated because the system 10 will alert the user when the C-arm 26 has been brought into the correct position. Second, fluoroscopic radiation is reduced when transitioning between lateral and A/P images. Aside from acquiring the starting points, no further imaging is required.
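For illustration, the scaling step referenced here, correlating a physical distance between two centered points with their on-screen pixel separation, might reduce to a single ratio, as in this sketch (Python; the names are hypothetical, and the physical distance is assumed to come from the tracked C-arm translation between the two centered shots):

```python
import math

def mm_per_pixel(p1_px, p2_px, physical_distance_mm):
    # Scale factor correlating a physical distance between two
    # user-defined point locations with their pixel separation on the
    # display unit; a sketch of the scaling step, not the actual method.
    return physical_distance_mm / math.dist(p1_px, p2_px)

# E.g., two centered points 40 mm apart on the patient, 145 px apart on screen.
scale = mm_per_pixel((256, 100), (256, 245), 40.0)
print(round(scale, 3), "mm/px")  # 0.276 mm/px
```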
During the procedure, a surgical instrument may be advanced to the target site and positioned on the lateral margin of the pedicle, the preferred starting point according to the embodiment set forth above. By way of example only, the surgical instrument is the pedicle access probe including the orientation sensor described in the '121 application and further outfitted with an IR tracking array 20. As the pedicle access probe penetrates deeper into the pedicle, its location does not need to be checked via fluoroscopic radiation. Instead, the location of the pedicle access probe may be projected onto the screen 100 in both the lateral and A/P views via a virtual pedicle access probe (not shown). According to some implementations, the neuromonitoring system 36 may provide neurophysiologic data and/or mapping during pedicle cannulation. Thus, the system 10 can show the dynamic relation of the distal tip of the pedicle access probe relative to the medial pedicle wall and the depth within the vertebral body via the virtual pedicle access probe instead of relying on fluoroscopic imaging.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown, by way of example only, in the drawings and are herein described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the invention to the particular forms disclosed. On the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined herein.

Claims (49)

What is claimed is:
1. A method for tracking a position of an instrument relative to a surgical target site, said method comprising the steps of:
using an imaging device to take a first image of a patient's body in a first anatomical view, wherein said imaging device displays a virtual center location when taking said first image and said first image has a user defined first point location at said virtual center location, and wherein said imaging device is in a first position;
obtaining three-dimensional positional data of the location of said imaging device in said first position while taking said first image;
using said imaging device to take a second image of said patient's body in said first anatomical view, wherein said imaging device displays a second virtual center location when taking said second image and said second image has a user defined second point location at said second virtual center location, and wherein said imaging device is in a second position, and wherein movement between said first and second positions of said imaging device is constrained to one axis;
obtaining three-dimensional positional data of the location of said imaging device in said second position while taking said second image;
scaling the first and second images to create a virtual backdrop image in said first anatomical view for viewing on a display unit, wherein said scaling comprises selecting the user defined first and second point locations to calculate a physical distance between said user defined first and second point locations and selecting at least one said user defined first and second point locations from at least one of said first and second images to correlate said physical distance between said user defined first and second point locations on said patient's body and a number of pixels between said user defined first and second point locations as represented on said display unit; and
monitoring three-dimensional position data of at least one surgical instrument via said display unit as said surgical instrument is moved within said patient's body by tracking a virtual representation of said surgical instrument overlaid onto said virtual backdrop image.
2. The method of claim 1, wherein said imaging device is a c-arm fluoroscope.
3. The method of claim 1, wherein said user defined first point location is on the spine of said patient.
4. The method of claim 1, wherein said first anatomical view is at least one of a lateral view and an anterior-posterior view.
5. The method of claim 1, wherein said three-dimensional positional data of the first and second positions of the imaging device is registered using an infrared position tracking system.
6. The method of claim 1, wherein said user defined second point location is on the spine of said patient.
7. The method of claim 1, wherein the scaling is accomplished manually.
8. The method of claim 1, wherein the at least one surgical instrument comprises at least one of a cannula, a dilator, a retractor blade, a k-wire, a cobb, a rasp, a drill, a tap, and a screw driver.
9. The method of claim 1, wherein the position of said at least one surgical instrument is monitored during a spine surgery procedure.
10. The method of claim 9, wherein the spinal surgery procedure is a lateral lumbar spinal surgery procedure.
11. A method for tracking a position of a surgical instrument during a surgical procedure, said method comprising the steps of:
using an intraoperative imaging device in a first position to generate a first radiographic image of a patient's body in a first anatomical view, wherein said first radiographic image displays a first user defined point location in the center of the first image;
using said intraoperative imaging device in a second position to generate a second radiographic image of the patient's body in said first anatomical view, wherein said second radiographic image displays a second user defined point location in the center of the second image;
obtaining three-dimensional positional data of said first position of said imaging device during generation of said first image and said second position of said imaging device during generation of said second image;
scaling the first and second images to create a first virtual backdrop image in said first anatomical view for viewing on a display unit, said scaling step comprising correlating a physical distance between the first and second user defined point locations on said patient's body and a number of pixels between said first and second user defined point locations as represented on said display unit;
using said imaging device in a third position to generate a third radiographic image of the patient's body in a second anatomical view, wherein said third radiographic image displays a third user defined point location in the center of the third image;
using said imaging device in a fourth position to generate a fourth radiographic image of the patient's body in said second anatomical view, wherein said fourth radiographic image displays a fourth user defined point location in the center of the fourth image;
obtaining three-dimensional positional data of said third position of said imaging device during generation of said third image and said fourth position of said imaging device during generation of said fourth image;
scaling the third and fourth images to create a second virtual backdrop image in said second anatomical view for viewing on said display unit, said scaling the third and fourth images step comprising correlating a physical distance between the third and fourth user defined point locations on said patient's body and a number of pixels between said third and fourth user defined point locations as represented on said display unit; and
tracking a virtual representation of a surgical instrument against said first and second virtual backdrop images as the actual surgical instrument is moved within said patient's body.
12. The method of claim 11, wherein said imaging device is a c-arm fluoroscope.
13. The method of claim 11, wherein said three-dimensional positional data of the first, second, third, and fourth positions of the imaging device is registered using an infrared position tracking system.
14. The method of claim 11, wherein said first and second anatomical views are anterior-posterior and lateral views.
15. The method of claim 11, wherein the surgical instrument is at least one of an intervertebral implant, an intervertebral trial, a rasp, a cobb, a retractor, a dilator, a cannula, a k-wire, a drill, a tap, and a screw driver.
16. The method of claim 11, wherein each said scaling step is accomplished manually.
17. The method of claim 11, wherein the position of said surgical instrument is monitored during a spine surgery procedure.
18. The method of claim 17, wherein the spinal surgery procedure is a lateral lumbar spinal surgery procedure.
19. A method comprising, with one or more processors:
receiving movement data of an instrument as the instrument is advanced to a surgical target site;
receiving neurophysiological data, wherein the neurophysiological data corresponds to the movement data;
determining a virtual location on an image of the surgical target site corresponding to a location of the instrument relative to the surgical target site when a neurophysiological response was elicited by the instrument, the determining being based on the movement data and the neurophysiological data; and
overlaying a graphic representative of the neurophysiological data onto the image of the surgical target site in association with the determined virtual location.
20. The method of claim 19, wherein the neurophysiological data comprises one or more electromyography responses received at a distal end of the instrument.
21. The method of claim 20, further comprising:
determining one or more threshold ranges based on a plurality of predetermined electromyography responses; and
based on an electromyography response of the neurophysiological data being within a range of the one or more threshold ranges, determining at least one or more visual effects associated with the graphic to be displayed on the image of the surgical target site.
22. The method of claim 19, further comprising:
overlaying a virtual representation of the instrument onto the image of the surgical target site as the instrument is advanced to the surgical target site.
23. The method of claim 19, wherein the image of the surgical target site comprises at least one of an anterior-posterior view and a lateral view.
24. The method of claim 19, wherein the instrument comprises at least one of a cannula, a dilator, a retractor blade, a k-wire, a cobb, a rasp, a drill, a tap, and a screw driver.
25. The method of claim 19, wherein the instrument is advanced through the psoas muscle to the surgical target site.
26. The method of claim 19, wherein overlaying the graphic representative of the neurophysiological data at the determined virtual location includes:
providing, as part of the graphic representation, a number based on a threshold intensity required to elicit the neurophysiological response.
27. The method of claim 19, wherein overlaying the graphic representative of the neurophysiological data at the determined virtual location includes:
providing, as part of the graphic, a color associated with a relative distance to nearby nerve tissue.
28. The method of claim 19, wherein overlaying the graphic representative of the neurophysiological data at the determined virtual location includes:
providing, as part of the graphic, an indication of a radial position of the instrument corresponding to the neurophysiological response.
29. The method of claim 19, further comprising:
causing a neuromonitoring system to provide an electrical stimulation signal to tissue of a patient having the surgical target site via at least one electrode of the initial dilator.
30. The method of claim 19, further comprising:
simultaneously displaying onto the image each respective graphic of a plurality of graphics at a respective virtual location of where a respective neurophysiological response was detected while the instrument was at that respective virtual location.
31. The method of claim 30, further comprising:
as the instrument is retreated or repositioned, modifying one or more of the plurality of graphics.
32. A method comprising:
receiving movement data of a dilator as the dilator is advanced through a patient's psoas muscle to a surgical target site;
overlaying a virtual representation of the dilator onto an image of the surgical target site as the dilator is advanced to the surgical target site;
receiving neurophysiological data, wherein the neurophysiological data corresponds to the movement data;
determining a virtual location on an image of the surgical target site corresponding to a location of the instrument relative to the surgical target site when a neurophysiological response was elicited by the instrument, the determining being based on the movement data and the neurophysiological data; and
overlaying a graphic representative of the neurophysiological data onto the image of the surgical target site in association with the determined virtual location,
wherein overlaying the graphic representative of the neurophysiological data at the determined virtual location includes:
providing, as part of the graphic representation, a number based on a threshold intensity required to elicit the neurophysiological response;
providing, as part of the graphic, a color associated with a relative distance to nearby nerve tissue; or
providing, as part of the graphic, an indication of a radial position of the instrument corresponding to the neurophysiological response.
33. The method of claim 30, wherein overlaying the graphic representative of the neurophysiological response at the determined virtual location includes:
providing, as part of the graphic representation, a number based on a threshold intensity required to elicit the neurophysiological response;
providing, as part of the graphic, a color associated with a relative distance to nearby nerve tissue; and
providing, as part of the graphic, an indication of a radial position of the instrument corresponding to the neurophysiological response.
34. A system comprising:
a position tracking system capable of tracking one or more movements of an instrument during a surgical procedure;
a display unit; and
a control unit in communication with the position tracking system and the display unit, wherein the control unit is configured to:
receive, from the position tracking system, movement data of the instrument as the instrument is advanced to a surgical target site;
receive neurophysiological data, wherein the neurophysiological data corresponds to the movement data;
determine a virtual location on an image of the surgical target site corresponding to a location of the instrument when a neurophysiological response was elicited, the determining being based on the neurophysiological data and the movement data;
overlay a graphic representative of the neurophysiological response onto the image of the surgical target site in association with the determined virtual location; and
display the overlaid graphic via the display unit.
35. The system of claim 34, wherein the neurophysiological data comprises one or more electromyography responses received at a distal end of the instrument.
36. The system of claim 35, wherein the control unit is further configured to:
determine one or more threshold ranges based on a plurality of predetermined electromyography responses; and
based on an electromyography response of the neurophysiological data being within a range of the one or more threshold ranges, determine at least one or more visual effects associated with the graphic to be displayed on the image of the surgical target site.
37. The system of claim 34, wherein the control unit is further configured to:
overlay a virtual representation of the instrument onto the image of the surgical target site as the instrument is advanced to the surgical target site; and
display the overlaid virtual representation onto the image via the display unit.
38. The system of claim 34, wherein the image of the surgical target site comprises at least one of an anterior-posterior view and a lateral view.
39. The system of claim 34, wherein the instrument comprises at least one of a cannula, a dilator, a retractor blade, a k-wire, a cobb, a rasp, a drill, a tap, and a screw driver.
40. The system of claim 34, wherein the instrument is advanced through the psoas muscle to the surgical target site.
41. A computer-implemented method comprising:
receiving a first neurophysiological response associated with an instrument being in a first position at a surgical target site;
receiving a second neurophysiological response associated with the instrument being in a second position at the surgical target site;
determining first and second virtual locations corresponding to the first and second positions; and
overlaying a first graphic representative of the first neurophysiological response at the first virtual location onto the image of the surgical target site;
overlaying a second graphic representative of the second neurophysiological response at the second virtual location onto the image of the surgical target site; and
displaying the overlaid first and second graphics.
42. The method of claim 41, wherein the second position is based on a given rotational movement of the instrument.
43. The method of claim 41, wherein the first and second neurophysiological responses comprise electromyography responses, the method further comprising:
determining one or more threshold ranges based on a plurality of predetermined electromyography responses; and
based on a given electromyography response being within a given range of the one or more threshold ranges, determining at least one or more visual effects associated with at least one of the first or second graphic to be displayed on the image of the surgical target site.
44. The method of claim 41, further comprising:
overlaying a virtual representation of the instrument onto the image of the surgical target site.
45. The method of claim 41, wherein the image of the surgical target site comprises at least one of an anterior-posterior view and a lateral view.
46. The method of claim 41, wherein the instrument comprises at least one of a cannula, a dilator, a retractor blade, a k-wire, a cobb, a rasp, a drill, a tap, and a screw driver.
47. The method of claim 41, wherein the overlaying includes:
providing, as part of the first graphic, a first number based on a threshold intensity required to elicit the first neurophysiological response; and
providing, as part of the second graphic, a second number based on a threshold intensity required to elicit the second neurophysiological response.
48. The method of claim 41, wherein the overlaying includes:
providing, as part of the first graphic, a color associated with a relative distance to nearby nerve tissue.
49. The method of claim 41, wherein the overlaying includes:
providing, as part of the first graphic, a first indication of a radial position of the instrument corresponding to the first neurophysiological response; and
providing, as part of the second graphic, a second indication of a radial position of the instrument corresponding to the second neurophysiological response.
US16/211,219 2011-10-28 2018-12-05 Systems and methods for performing spine surgery Active USRE49094E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/211,219 USRE49094E1 (en) 2011-10-28 2018-12-05 Systems and methods for performing spine surgery

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201161552466P 2011-10-28 2011-10-28
US201261671537P 2012-07-13 2012-07-13
US13/663,459 US9510771B1 (en) 2011-10-28 2012-10-29 Systems and methods for performing spine surgery
US16/211,219 USRE49094E1 (en) 2011-10-28 2018-12-05 Systems and methods for performing spine surgery

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/663,459 Reissue US9510771B1 (en) 2011-10-28 2012-10-29 Systems and methods for performing spine surgery

Publications (1)

Publication Number Publication Date
USRE49094E1 true USRE49094E1 (en) 2022-06-07

Family

ID=57399764

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/663,459 Ceased US9510771B1 (en) 2011-10-28 2012-10-29 Systems and methods for performing spine surgery
US16/211,219 Active USRE49094E1 (en) 2011-10-28 2018-12-05 Systems and methods for performing spine surgery

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/663,459 Ceased US9510771B1 (en) 2011-10-28 2012-10-29 Systems and methods for performing spine surgery

Country Status (1)

Country Link
US (2) US9510771B1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011078212B4 (en) 2011-06-28 2017-06-29 Scopis Gmbh Method and device for displaying an object
CN104994805B (en) 2013-03-13 2018-04-27 史赛克公司 System and method for establishing virtual constraint boundaries
FR3010628B1 (en) 2013-09-18 2015-10-16 Medicrea International METHOD FOR REALIZING THE IDEAL CURVATURE OF A ROD OF A VERTEBRAL OSTEOSYNTHESIS EQUIPMENT FOR STRENGTHENING THE VERTEBRAL COLUMN OF A PATIENT
FR3012030B1 (en) 2013-10-18 2015-12-25 Medicrea International METHOD FOR REALIZING THE IDEAL CURVATURE OF A ROD OF A VERTEBRAL OSTEOSYNTHESIS EQUIPMENT FOR STRENGTHENING THE VERTEBRAL COLUMN OF A PATIENT
CN107072591B (en) * 2014-09-05 2021-10-26 普罗赛普特生物机器人公司 Physician-controlled tissue ablation in conjunction with treatment mapping of target organ images
US20160354161A1 (en) * 2015-06-05 2016-12-08 Ortho Kinematics, Inc. Methods for data processing for intra-operative navigation systems
EP3777749A3 (en) 2015-12-31 2021-07-21 Stryker Corporation System and method for preparing surgery on a patient at a target site defined by a virtual object
US11369436B2 (en) * 2016-01-15 2022-06-28 7D Surgical Ulc Systems and methods for displaying guidance images with spatial annotations during a guided medical procedure
US11064904B2 (en) * 2016-02-29 2021-07-20 Extremity Development Company, Llc Smart drill, jig, and method of orthopedic surgery
US11707203B2 (en) 2016-10-11 2023-07-25 Wenzel Spine, Inc. Systems for generating image-based measurements during diagnosis
WO2018109556A1 (en) 2016-12-12 2018-06-21 Medicrea International Systems and methods for patient-specific spinal implants
US10449006B2 (en) * 2017-04-05 2019-10-22 Warsaw Orthopedic, Inc. Surgical instrument and method
AU2018255892A1 (en) 2017-04-21 2019-11-07 Medicrea International A system for providing intraoperative tracking to assist spinal surgery
EP3445048A1 (en) 2017-08-15 2019-02-20 Holo Surgical Inc. A graphical user interface for a surgical navigation system for providing an augmented reality image during operation
US10918422B2 (en) 2017-12-01 2021-02-16 Medicrea International Method and apparatus for inhibiting proximal junctional failure
CN111970986A (en) 2018-04-09 2020-11-20 7D外科有限公司 System and method for performing intraoperative guidance
FR3092748A1 (en) * 2019-02-18 2020-08-21 Sylorus Robotics Image processing methods and systems
JP2020156825A (en) * 2019-03-27 2020-10-01 富士フイルム株式会社 Position information display device, method, and program, and radiography apparatus
JP2020156824A (en) * 2019-03-27 2020-10-01 富士フイルム株式会社 Position information acquisition device, method, and program, and radiography apparatus
US11925417B2 (en) 2019-04-02 2024-03-12 Medicrea International Systems, methods, and devices for developing patient-specific spinal implants, treatments, operations, and/or procedures
WO2020201353A1 (en) 2019-04-02 2020-10-08 Medicrea International Systems, methods, and devices for developing patient-specific spinal implants, treatments, operations, and/or procedures
US11399965B2 (en) * 2019-09-09 2022-08-02 Warsaw Orthopedic, Inc. Spinal implant system and methods of use
WO2021062064A1 (en) * 2019-09-24 2021-04-01 Nuvasive, Inc. Systems and methods for adjusting appearance of objects in medical images
US11769251B2 (en) 2019-12-26 2023-09-26 Medicrea International Systems and methods for medical image analysis
USD921655S1 (en) * 2020-01-13 2021-06-08 Stryker European Operations Limited Display screen with animated graphical user interface
US20220183755A1 (en) 2020-12-11 2022-06-16 Nuvasive, Inc. Robotic Surgery
US20220296326A1 (en) 2021-03-22 2022-09-22 Nuvasive, Inc. Multi-user surgical cart
CA3213787A1 (en) * 2021-04-14 2022-10-20 Arthrex, Inc. System and method for using detectable radiation in surgery

Citations (155)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5189690A (en) 1991-09-09 1993-02-23 Ronald Samuel Fluoroscopy orientation device
US5283808A (en) 1992-07-01 1994-02-01 Diasonics, Inc. X-ray device having a co-axial laser aiming system in an opposed configuration
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US5823960A (en) 1995-07-27 1998-10-20 Picker International, Inc. Imaging systems
US5835562A (en) 1993-11-22 1998-11-10 Hologic, Inc. Medical radiological apparatus including optical crosshair device for patient positioning and forearm and spinal positioning aides
US6050724A (en) 1997-01-31 2000-04-18 U. S. Philips Corporation Method of and device for position detection in X-ray imaging
US6079876A (en) 1997-10-17 2000-06-27 Siemens Aktiengesellschaft X-ray exposure system for 3D imaging
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US6236875B1 (en) 1994-10-07 2001-05-22 Surgical Navigation Technologies Surgical navigation systems including reference and localization frames
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US6266394B1 (en) 1998-06-09 2001-07-24 Nuvasive, Inc. Image intensifier reticle system
US6267502B1 (en) 1998-04-10 2001-07-31 Minrad Inc. Alignment verification device and method of using the same with a visual light beam and an x-ray
US6285902B1 (en) 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery
US20010053204A1 (en) 2000-02-10 2001-12-20 Nassir Navab Method and apparatus for relative calibration of a mobile X-ray C-arm and an external pose tracking system
US6340363B1 (en) 1998-10-09 2002-01-22 Surgical Navigation Technologies, Inc. Image guided vertebral distractor and method for tracking the position of vertebrae
US6347240B1 (en) 1990-10-19 2002-02-12 St. Louis University System and method for use in displaying images of a body part
US20020032375A1 (en) 2000-09-11 2002-03-14 Brainlab Ag Method and system for visualizing a body volume and computer program product
US6379041B1 (en) 1998-11-02 2002-04-30 Siemens Aktiengesellschaft X-ray apparatus for producing a 3D image from a set of 2D projections
US20020077543A1 (en) 2000-06-27 2002-06-20 Robert Grzeszczuk Method and apparatus for tracking a medical instrument based on image registration
US20020085681A1 (en) 2000-12-28 2002-07-04 Jensen Vernon Thomas Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US20020172328A1 (en) 2001-05-17 2002-11-21 Doron Dekel 3-D Navigation for X-ray imaging system
US6490477B1 (en) 1998-05-05 2002-12-03 Koninklijke Philips Electronics N.V. Imaging modality for image guided surgery
US6519319B1 (en) 1999-02-19 2003-02-11 Nuvasive, Inc. Image intensifier reticle system
US6527443B1 (en) 1999-04-20 2003-03-04 Brainlab Ag Process and apparatus for image guided treatment with an integration of X-ray detection and navigation system
US6533455B2 (en) 2000-08-31 2003-03-18 Siemens Aktiengesellschaft Method for determining a coordinate transformation for use in navigating an object
US6542770B2 (en) 2000-02-03 2003-04-01 Koninklijke Philips Electronics N.V. Method of determining the position of a medical instrument
US6608884B1 (en) 1999-07-20 2003-08-19 Lunar Corporation Fluoroscopy machine with head mounted display
US20030181809A1 (en) 2002-03-11 2003-09-25 Hall Andrew F. 3D imaging for catheter interventions by use of 2D/3D image fusion
US6701174B1 (en) 2000-04-07 2004-03-02 Carnegie Mellon University Computer-aided bone distraction
US6714629B2 (en) 2000-05-09 2004-03-30 Brainlab Ag Method for registering a patient data set obtained by an imaging process in navigation-supported surgical operations by means of an x-ray image assignment
US6714810B2 (en) 2000-09-07 2004-03-30 Cbyon, Inc. Fluoroscopic registration system and method
US20040077939A1 (en) 2001-02-22 2004-04-22 Rainer Graumann Device and method for controlling surgical instruments
US6776526B2 (en) 2002-02-22 2004-08-17 Brainlab Ag Method for navigation-calibrating x-ray image data and a height-reduced calibration instrument
US20040171924A1 (en) 2003-01-30 2004-09-02 Mire David A. Method and apparatus for preplanning a surgical procedure
US6804547B2 (en) 2000-09-25 2004-10-12 Carl-Zeiss-Stiftung Medical therapeutic and/or diagnostic apparatus having a position detecting device
US6817762B2 (en) 2001-04-10 2004-11-16 Koninklijke Philips Electronics N.V. Fluoroscopy intervention method with a cone-beam
US20050004449A1 (en) 2003-05-20 2005-01-06 Matthias Mitschke Method for marker-less navigation in preoperative 3D images using an intraoperatively acquired 3D C-arm image
US20050027193A1 (en) 2003-05-21 2005-02-03 Matthias Mitschke Method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image with one-time use of navigation markers

Patent Citations (177)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6347240B1 (en) 1990-10-19 2002-02-12 St. Louis University System and method for use in displaying images of a body part
US5189690A (en) 1991-09-09 1993-02-23 Ronald Samuel Fluoroscopy orientation device
US5283808A (en) 1992-07-01 1994-02-01 Diasonics, Inc. X-ray device having a co-axial laser aiming system in an opposed configuration
US5661775A (en) 1992-07-01 1997-08-26 Oec, Inc. X-ray device having a co-axial laser aiming system in an opposed configuration
US7139601B2 (en) 1993-04-26 2006-11-21 Surgical Navigation Technologies, Inc. Surgical navigation systems including reference and localization frames
US5835562A (en) 1993-11-22 1998-11-10 Hologic, Inc. Medical radiological apparatus including optical crosshair device for patient positioning and forearm and spinal positioning aides
US6236875B1 (en) 1994-10-07 2001-05-22 Surgical Navigation Technologies Surgical navigation systems including reference and localization frames
US20060122483A1 (en) 1994-10-07 2006-06-08 Surgical Navigation Technologies, Inc. System for use in displaying images of a body part
US6978166B2 (en) 1994-10-07 2005-12-20 Saint Louis University System for use in displaying images of a body part
US5823960A (en) 1995-07-27 1998-10-20 Picker International, Inc. Imaging systems
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
USRE40176E1 (en) 1996-05-15 2008-03-25 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6050724A (en) 1997-01-31 2000-04-18 U. S. Philips Corporation Method of and device for position detection in X-ray imaging
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
USRE42194E1 (en) 1997-09-24 2011-03-01 Medtronic Navigation, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US6079876A (en) 1997-10-17 2000-06-27 Siemens Aktiengesellschaft X-ray exposure system for 3D imaging
US6267502B1 (en) 1998-04-10 2001-07-31 Minrad Inc. Alignment verification device and method of using the same with a visual light beam and an x-ray
US6490477B1 (en) 1998-05-05 2002-12-03 Koninklijke Philips Electronics N.V. Imaging modality for image guided surgery
US6266394B1 (en) 1998-06-09 2001-07-24 Nuvasive, Inc. Image intensifier reticle system
US20050059886A1 (en) 1998-07-24 2005-03-17 Webber Richard L. Method and system for creating task-dependent three-dimensional images
US6340363B1 (en) 1998-10-09 2002-01-22 Surgical Navigation Technologies, Inc. Image guided vertebral distractor and method for tracking the position of vertebrae
US6379041B1 (en) 1998-11-02 2002-04-30 Siemens Aktiengesellschaft X-ray apparatus for producing a 3D image from a set of 2D projections
US6285902B1 (en) 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery
US6697664B2 (en) 1999-02-10 2004-02-24 Ge Medical Systems Global Technology Company, Llc Computer assisted targeting device for use in orthopaedic surgery
US6519319B1 (en) 1999-02-19 2003-02-11 Nuvasive, Inc. Image intensifier reticle system
US6925319B2 (en) 1999-03-15 2005-08-02 General Electric Company Integrated multi-modality imaging system
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US7996064B2 (en) 1999-03-23 2011-08-09 Medtronic Navigation, Inc. System and method for placing and determining an appropriately sized surgical implant
US20100041985A1 (en) 1999-03-23 2010-02-18 Surgical Navigation Technologies, Inc. Navigational Guidance Via Computer-Assisted Fluoroscopic Imaging
US20110268248A1 (en) 1999-03-23 2011-11-03 Medtronic Navigation, Inc. System and Method for Placing and Determining an Appropriately Sized Surgical Implant
US6527443B1 (en) 1999-04-20 2003-03-04 Brainlab Ag Process and apparatus for image guided treatment with an integration of X-ray detection and navigation system
US20110054308A1 (en) 1999-05-18 2011-03-03 Amit Cohen Method and system for superimposing virtual anatomical landmarks on an image
US6608884B1 (en) 1999-07-20 2003-08-19 Lunar Corporation Fluoroscopy machine with head mounted display
US20010011175A1 (en) 1999-10-28 2001-08-02 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US7804991B2 (en) 2000-01-18 2010-09-28 Mako Surgical Corp. Apparatus and method for measuring anatomical objects using coordinated fluoroscopy
US20060025681A1 (en) 2000-01-18 2006-02-02 Abovitz Rony A Apparatus and method for measuring anatomical objects using coordinated fluoroscopy
US7689014B2 (en) 2000-01-18 2010-03-30 Z-Kat Inc Apparatus and method for measuring anatomical objects using coordinated fluoroscopy
US6542770B2 (en) 2000-02-03 2003-04-01 Koninklijke Philips Electronics N.V. Method of determining the position of a medical instrument
US20010053204A1 (en) 2000-02-10 2001-12-20 Nassir Navab Method and apparatus for relative calibration of a mobile X-ray C-arm and an external pose tracking system
US6701174B1 (en) 2000-04-07 2004-03-02 Carnegie Mellon University Computer-aided bone distraction
US6714629B2 (en) 2000-05-09 2004-03-30 Brainlab Ag Method for registering a patient data set obtained by an imaging process in navigation-supported surgical operations by means of an x-ray image assignment
US20020077543A1 (en) 2000-06-27 2002-06-20 Robert Grzeszczuk Method and apparatus for tracking a medical instrument based on image registration
US7167738B2 (en) 2000-08-01 2007-01-23 Stryker Leibinger Gmbh & Co., Kg Method for navigating in the interior of the body using three-dimensionally visualized structures
US6533455B2 (en) 2000-08-31 2003-03-18 Siemens Aktiengesellschaft Method for determining a coordinate transformation for use in navigating an object
US6714810B2 (en) 2000-09-07 2004-03-30 Cbyon, Inc. Fluoroscopic registration system and method
US20020032375A1 (en) 2000-09-11 2002-03-14 Brainlab Ag Method and system for visualizing a body volume and computer program product
US6804547B2 (en) 2000-09-25 2004-10-12 Carl-Zeiss-Stiftung Medical therapeutic and/or diagnostic apparatus having a position detecting device
US7391846B2 (en) 2000-10-02 2008-06-24 Koninklijke Philips N.V. Method and X-ray apparatus for optimally imaging anatomical parts of the human anatomy
US20050119561A1 (en) 2000-11-17 2005-06-02 Ge Medical Systems Global Technology Company Enhanced graphics features for computer assisted surgery system
US20020085681A1 (en) 2000-12-28 2002-07-04 Jensen Vernon Thomas Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system
US7117027B2 (en) 2001-02-07 2006-10-03 Synthes (Usa) Method for establishing a three-dimensional representation of a bone from image data
US20040077939A1 (en) 2001-02-22 2004-04-22 Rainer Graumann Device and method for controlling surgical instruments
US6817762B2 (en) 2001-04-10 2004-11-16 Koninklijke Philips Electronics N.V. Fluoroscopy intervention method with a cone-beam
US20020172328A1 (en) 2001-05-17 2002-11-21 Doron Dekel 3-D Navigation for X-ray imaging system
US7587235B2 (en) 2002-01-18 2009-09-08 Brainlab Ag Method for assigning digital image information to the navigational data of a medical navigation system
US6776526B2 (en) 2002-02-22 2004-08-17 Brainlab Ag Method for navigation-calibrating x-ray image data and a height-reduced calibration instrument
US20090262111A1 (en) 2002-02-28 2009-10-22 Surgical Navigation Technologies, Inc. Method and Apparatus for Perspective Inversion
US7630753B2 (en) 2002-02-28 2009-12-08 Medtronic Navigation, Inc. Method and apparatus for perspective inversion
US6947786B2 (en) 2002-02-28 2005-09-20 Surgical Navigation Technologies, Inc. Method and apparatus for perspective inversion
US6932506B2 (en) 2002-03-08 2005-08-23 Siemens Aktiengesellschaft Registration method and apparatus for navigation-guided medical interventions, without the use of patient-associated markers
US20030181809A1 (en) 2002-03-11 2003-09-25 Hall Andrew F. 3D imaging for catheter interventions by use of 2D/3D image fusion
US20050165292A1 (en) 2002-04-04 2005-07-28 Simon David A. Method and apparatus for virtual digital subtraction angiography
US6851855B2 (en) 2002-04-10 2005-02-08 Siemens Aktiengesellschaft Registration method for navigation-guided medical interventions
US7787932B2 (en) 2002-04-26 2010-08-31 Brainlab Ag Planning and navigation assistance using two-dimensionally adapted generic and detected patient data
US7251522B2 (en) 2002-09-12 2007-07-31 Brainlab Ag X-ray image-assisted navigation using original, two-dimensional x-ray images
US7125165B2 (en) 2002-12-11 2006-10-24 Koninklijke Philips Electronics, N.V. C-arm X-ray apparatus having means of calibration
US20090234217A1 (en) 2003-01-30 2009-09-17 Surgical Navigation Technologies, Inc. Method And Apparatus For Preplanning A Surgical Procedure
US7974677B2 (en) 2003-01-30 2011-07-05 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
US20040171924A1 (en) 2003-01-30 2004-09-02 Mire David A. Method and apparatus for preplanning a surgical procedure
US7570987B2 (en) 2003-04-04 2009-08-04 Brainlab Ag Perspective registration and visualization of internal areas of the body
US7570791B2 (en) 2003-04-25 2009-08-04 Medtronic Navigation, Inc. Method and apparatus for performing 2D to 3D registration
US20090290771A1 (en) 2003-04-25 2009-11-26 Surgical Navigation Technologies, Inc. Method and Apparatus for Performing 2D to 3D Registration
US7010080B2 (en) 2003-05-20 2006-03-07 Siemens Aktiengesellschaft Method for marker-free automatic fusion of 2-D fluoroscopic C-arm images with preoperative 3D images using an intraoperatively obtained 3D data record
US20050004449A1 (en) 2003-05-20 2005-01-06 Matthias Mitschke Method for marker-less navigation in preoperative 3D images using an intraoperatively acquired 3D C-arm image
US20070016005A1 (en) 2003-05-21 2007-01-18 Koninklijke Philips Electronics N.V. Apparatus and method for recording the movement of organs of the body
US20050027193A1 (en) 2003-05-21 2005-02-03 Matthias Mitschke Method for automatically merging a 2D fluoroscopic C-arm image with a preoperative 3D image with one-time use of navigation markers
US7764984B2 (en) 2003-07-10 2010-07-27 Koninklijke Philips Electronics N.V. Apparatus and method for navigating an instrument through an anatomical structure
US7873403B2 (en) 2003-07-15 2011-01-18 Brainlab Ag Method and device for determining a three-dimensional form of a body from two-dimensional projection images
US7756567B2 (en) 2003-08-29 2010-07-13 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US7835778B2 (en) * 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US7603163B2 (en) 2003-10-31 2009-10-13 Minrad Inc. Targeting system and method of targeting
US7519415B2 (en) 2003-12-19 2009-04-14 Siemens Aktiengesellschaft Method and apparatus for image support of an operative procedure implemented with a medical instrument
US7450743B2 (en) 2004-01-21 2008-11-11 Siemens Medical Solutions Usa, Inc. Method and system of affine registration of inter-operative two dimensional images and pre-operative three dimensional images
US8244064B2 (en) 2004-01-29 2012-08-14 Siemens Aktiengesellschaft Method for registering and merging medical image data
US20060036162A1 (en) 2004-02-02 2006-02-16 Ramin Shahidi Method and apparatus for guiding a medical instrument to a subsurface target site in a patient
US20050203386A1 (en) 2004-03-11 2005-09-15 Siemens Aktiengesellschaft Method of calibrating an X-ray imaging device
US7035371B2 (en) 2004-03-22 2006-04-25 Siemens Aktiengesellschaft Method and device for medical imaging
US7567834B2 (en) 2004-05-03 2009-07-28 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7953471B2 (en) 2004-05-03 2011-05-31 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7324626B2 (en) 2004-08-06 2008-01-29 Brainlab Ag Volumetric imaging on a radiotherapy apparatus
US7587076B2 (en) 2004-08-31 2009-09-08 Brainlab Ag Fluoroscopy image verification
US7831294B2 (en) 2004-10-07 2010-11-09 Stereotaxis, Inc. System and method of surgical imagining with anatomical overlay for navigation of surgical devices
US7590442B2 (en) 2005-02-21 2009-09-15 Siemens Aktiengesellschaft Method for determining the position of an instrument with an x-ray system
US7657075B2 (en) 2005-05-06 2010-02-02 Stereotaxis, Inc. Registration of three dimensional image data with X-ray imaging system
US7508388B2 (en) 2005-05-19 2009-03-24 Siemens Aktiengesellschaft Method for extending the display of a 2D image of an object region
US7689019B2 (en) 2005-05-19 2010-03-30 Siemens Aktiengesellschaft Method and device for registering 2D projection images relative to a 3D image data record
US7603155B2 (en) 2005-05-24 2009-10-13 General Electric Company Method and system of acquiring images with a medical imaging device
US20060293582A1 (en) 2005-05-24 2006-12-28 General Electric Company Method and system of acquiring images with a medical imaging device
US7801342B2 (en) 2005-06-21 2010-09-21 Siemens Aktiengesellschaft Method for determining the position and orientation of an object, especially of a catheter, from two-dimensional X-ray images
US7689042B2 (en) 2005-06-30 2010-03-30 Siemens Aktiengesellschaft Method for contour visualization of regions of interest in 2D fluoroscopy images
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US7590440B2 (en) 2005-11-14 2009-09-15 General Electric Company System and method for anatomy labeling on a PACS
US20070173717A1 (en) 2006-01-23 2007-07-26 Siemens Aktiengesellschaft Medical apparatus with a multi-modality interface
US20070238986A1 (en) 2006-02-21 2007-10-11 Rainer Graumann Medical apparatus with image acquisition device and position determination device combined in the medical apparatus
US8374678B2 (en) 2006-02-21 2013-02-12 Siemens Aktiengesellschaft Medical apparatus with image acquisition device and position determination device combined in the medical apparatus
US7693565B2 (en) 2006-03-31 2010-04-06 General Electric Company Method and apparatus for automatically positioning a structure within a field of view
US8090174B2 (en) 2006-04-12 2012-01-03 Nassir Navab Virtual penetrating mirror device for visualizing virtual objects in angiographic applications
US20070242869A1 (en) 2006-04-12 2007-10-18 Eastman Kodak Company Processing and measuring the spine in radiographs
US7778690B2 (en) 2006-05-24 2010-08-17 Siemens Aktiengesellschaft Method for locating a medical instrument during an intervention performed on the human body
US8055046B2 (en) 2006-06-14 2011-11-08 Brainlab Ag Shape reconstruction using X-ray images
US7970190B2 (en) 2006-09-01 2011-06-28 Brainlab Ag Method and device for determining the location of pelvic planes
US8045677B2 (en) 2006-09-25 2011-10-25 Koninklijke Philips Electronics N V Eindhoven Shifting an object for complete trajectories in rotational X-ray imaging
US20100106010A1 (en) 2006-09-25 2010-04-29 Joseph Rubner Ct-free spinal surgical imaging system
US20080177176A1 (en) 2006-09-27 2008-07-24 Juan Manuel Casso Basterrechea Medical system comprising a detection device for detecting an object and comprising a storage device and method thereof
US8301220B2 (en) 2006-09-27 2012-10-30 Siemens Aktiengesellschaft Medical system comprising a detection device for detecting an object and comprising a storage device and method thereof
US20080147086A1 (en) 2006-10-05 2008-06-19 Marcus Pfister Integrating 3D images into interventional procedures
US7995827B2 (en) 2006-12-19 2011-08-09 Brainlab Ag Artefact elimination for a medical pelvic registration using a tracked pelvic support known to the system
US8600478B2 (en) 2007-02-19 2013-12-03 Medtronic Navigation, Inc. Automatic identification of instruments used with a surgical navigation system
US20100016712A1 (en) 2007-02-27 2010-01-21 Meir Bartal Method and Device for Visually Assisting a Catheter Application
US8010177B2 (en) 2007-04-24 2011-08-30 Medtronic, Inc. Intraoperative image registration
US8275445B2 (en) 2007-04-26 2012-09-25 Siemens Aktiengesellschaft System and method for determining the position of an instrument
US20080285724A1 (en) 2007-05-05 2008-11-20 Ziehm Imaging Gmbh X-ray diagnostic imaging system with a plurality of coded markers
US20080306378A1 (en) 2007-06-05 2008-12-11 Yves Lucien Trousset Method and system for images registration
US20080312528A1 (en) 2007-06-15 2008-12-18 Bertolina James A Guidance of medical instrument using flouroscopy scanner with multple x-ray sources
US20090135191A1 (en) 2007-07-12 2009-05-28 Siemens Corporate Research, Inc. Coregistration and analysis of multi-modal images obtained in different geometries
US20090074139A1 (en) 2007-08-03 2009-03-19 Eckhard Hempel Method and an apparatus for detecting and localizing a metabolic marker
US20090143788A1 (en) 2007-12-04 2009-06-04 National Cheng Kung University Navigation method and system for drilling operation in spinal surgery
US8126111B2 (en) 2008-01-22 2012-02-28 Brainlab Ag Displaying recordings in a superimposed or oriented way
US8106905B2 (en) 2008-04-18 2012-01-31 Medtronic, Inc. Illustrating a three-dimensional nature of a data set on a two-dimensional display
US8165660B2 (en) 2008-05-02 2012-04-24 Siemens Aktiengesellschaft System and method for selecting a guidance mode for performing a percutaneous procedure
US7922391B2 (en) 2008-05-15 2011-04-12 Brainlab Ag Determining calibration information for an x-ray apparatus
US7677801B2 (en) 2008-06-09 2010-03-16 Peyman Pakzaban Non-invasive method and apparatus to locate incision site for spinal surgery
US8104958B2 (en) 2008-08-22 2012-01-31 Brainlab Ag Assigning X-ray markers to image markers imaged in the X-ray image
US20100067773A1 (en) 2008-09-16 2010-03-18 Fujifilm Corporation Method and device for detecting placement error of an imaging plane of a radiographic image detector, as well as method and device for correcting images
US20110276179A1 (en) * 2008-10-14 2011-11-10 University Of Florida Research Foundation, Inc. Imaging platform to provide integrated navigation capabilities for surgical guidance
US20120008741A1 (en) 2008-12-11 2012-01-12 Koninklijke Philips Electronics N.V. System and method for generating images of a patient's interior and exterior
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8463005B2 (en) 2009-03-03 2013-06-11 Brainlab Ag Stent and method for determining the position of a stent
US8457373B2 (en) 2009-03-16 2013-06-04 Siemens Aktiengesellschaft System and method for robust 2D-3D image registration
US20100239152A1 (en) 2009-03-18 2010-09-23 Furst Armin Method for ascertaining the position of a structure in a body
US20100295931A1 (en) 2009-03-31 2010-11-25 Robert Schmidt Medical navigation image output comprising virtual primary images and actual secondary images
US8238631B2 (en) 2009-05-13 2012-08-07 Medtronic Navigation, Inc. System and method for automatic registration between an image and a subject
US20100290690A1 (en) 2009-05-13 2010-11-18 Medtronic Navigation, Inc. System And Method For Automatic Registration Between An Image And A Subject
US20110071389A1 (en) 2009-05-13 2011-03-24 Medtronic Navigation, Inc. System and Method for Automatic Registration Between an Image and a Subject
US8503745B2 (en) 2009-05-13 2013-08-06 Medtronic Navigation, Inc. System and method for automatic registration between an image and a subject
US20100292565A1 (en) 2009-05-18 2010-11-18 Andreas Meyer Medical imaging medical device navigation from at least two 2d projections from different angles
US20110054293A1 (en) 2009-08-31 2011-03-03 Medtronic, Inc. Combination Localization System
US20120188352A1 (en) 2009-09-07 2012-07-26 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Concept of superimposing an intraoperative live image of an operating field with a preoperative image of the operating field
US8483434B2 (en) 2009-09-21 2013-07-09 Stryker Leibinger Gmbh & Co. Kg Technique for registering image data of an object
US8180130B2 (en) 2009-11-25 2012-05-15 Imaging Sciences International Llc Method for X-ray marker localization in 3D space in the presence of motion
US8694075B2 (en) 2009-12-21 2014-04-08 General Electric Company Intra-operative registration for navigated surgical procedures
US20110213379A1 (en) 2010-03-01 2011-09-01 Stryker Trauma Gmbh Computer assisted surgery system
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US20110282189A1 (en) 2010-05-12 2011-11-17 Rainer Graumann Method and system for determination of 3d positions and orientations of surgical objects from 2d x-ray images
US20130066196A1 (en) 2010-05-18 2013-03-14 Siemens Aktiengesellschaft Determining and verifying the coordinate transformation between an x-ray system and a surgery navigation system
US8600138B2 (en) 2010-05-21 2013-12-03 General Electric Company Method for processing radiological images to determine a 3D position of a needle
US8634896B2 (en) 2010-09-20 2014-01-21 Apn Health, Llc 3D model creation of anatomic structures using single-plane fluoroscopy
US20130253312A1 (en) 2010-12-02 2013-09-26 National University Corporation Kochi University Medical tool that emits near infrared fluorescence and medical tool position-confirming system
DE102011006562A1 (en) 2011-03-31 2012-10-04 Siemens Aktiengesellschaft Method for supporting navigation of medical instrument e.g. catheter in medical treatment device, involves displaying three-dimensional representation of position and direction of medical instrument on display device
US20120259204A1 (en) 2011-04-08 2012-10-11 Imactis Device and method for determining the position of an instrument in relation to medical images
US20140049629A1 (en) 2011-04-29 2014-02-20 The Johns Hopkins University Sytem and method for tracking and navigation
US20140114180A1 (en) 2011-06-27 2014-04-24 Koninklijke Philips N.V. Live 3d angiogram using registration of a surgical tool curve to an x-ray image
US20130051647A1 (en) 2011-08-23 2013-02-28 Siemens Corporation Automatic Initialization for 2D/3D Registration
US20130245477A1 (en) 2011-09-08 2013-09-19 Apn Health, Llc R-Wave Detection Method
DE102011114332A1 (en) 2011-09-24 2013-03-28 Ziehm Imaging Gmbh Method for registering X-ray volume of calibrated C-arm X-ray device utilized for medical interventions at living object i.e. patient, involves determining coordinate transformation, and storing transformation in memory of computing unit
DE102011114333A1 (en) 2011-09-24 2013-03-28 Ziehm Imaging Gmbh Method for registering C-arm X-ray device suitable for three-dimensional reconstruction of X-ray volume, involves producing X-ray projection of X-ray marks with C-arm X-ray unit
DE102012200536A1 (en) 2011-12-21 2013-06-27 Siemens Aktiengesellschaft Method for superimposed display of medical images, involves superimposing fluoroscopy image and reference image and visualizing contour of three-dimensional structures of object in reference image
WO2013102827A1 (en) 2012-01-03 2013-07-11 Koninklijke Philips Electronics N.V. Position determining apparatus
US20130172732A1 (en) 2012-01-04 2013-07-04 Siemens Aktiengesellschaft Method for performing dynamic registration, overlays, and 3d views with fluoroscopic images
WO2013156893A1 (en) 2012-04-19 2013-10-24 Koninklijke Philips N.V. Guidance tools to manually steer endoscope using pre-operative and intra-operative 3d images
US20130314440A1 (en) 2012-05-23 2013-11-28 Stryker Trauma Gmbh Virtual 3d overlay as reduction aid for complex fractures
US20130324839A1 (en) 2012-06-05 2013-12-05 Synthes Usa, Llc Methods and apparatus for estimating the position and orientation of an implant using a mobile device
US20140140598A1 (en) 2012-11-21 2014-05-22 General Electric Company Systems and methods for 2d and 3d image integration and synchronization
US20140171962A1 (en) 2012-12-13 2014-06-19 Mako Surgical Corp. Registration and navigation using a three-dimensional tracking sensor
US20140180062A1 (en) 2012-12-26 2014-06-26 Biosense Webster (Israel), Ltd. Reduced x-ray exposure by simulating images

Also Published As

Publication number Publication date
US9510771B1 (en) 2016-12-06

Similar Documents

Publication Title
USRE49094E1 (en) Systems and methods for performing spine surgery
US20210186617A1 (en) Surgical Monitoring System and Related Methods For Spinal Pedicle Screw Alignment
JP7170631B2 (en) Surgical navigation system and related methods
US20220409308A1 (en) Surgical robot platform
CN110475509B (en) Systems, devices, and methods for improving surgical accuracy using inertial measurement units
US7835778B2 (en) Method and apparatus for surgical navigation of a multiple piece construct for implantation
US9125556B2 (en) Robotic guided endoscope
EP2023811B1 (en) Surgical trajectory monitoring system
CN108601629A (en) 3D visualization with reduced radiation exposure during surgery
JP7086977B2 (en) Alignment device used in surgery
US20160015469A1 (en) Surgical tissue recognition and navigation apparatus and method
JP2008119472A (en) System and method for measuring distance between implants
EP3906879A1 (en) Spinal surgery system
US20230097125A1 (en) Robotically guiding the trajectory of a second surgical device
McLaughlin et al. Overview of Spinal Navigation

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NORTH CAROLINA

Free format text: SECURITY INTEREST;ASSIGNORS:NUVASIVE, INC.;NUVASIVE CLINICAL SERVICES MONITORING, INC.;NUVASIVE CLINICAL SERVICES, INC.;AND OTHERS;REEL/FRAME:052918/0595

Effective date: 20200224

AS Assignment

Owner name: NUVASIVE, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FINLEY, ERIC;KIM, ALBERT;SCHOLL, THOMAS;AND OTHERS;SIGNING DATES FROM 20140814 TO 20140820;REEL/FRAME:059121/0821

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8