US20230368418A1 - Accuracy check and automatic calibration of tracked instruments - Google Patents

Accuracy check and automatic calibration of tracked instruments

Info

Publication number
US20230368418A1
US20230368418A1 (application US17/663,024)
Authority
US
United States
Prior art keywords
tracked instrument
determining
virtual position
virtual
instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/663,024
Inventor
Sanjay M. Joshi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Globus Medical Inc
Original Assignee
Globus Medical Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Globus Medical Inc filed Critical Globus Medical Inc
Priority to US17/663,024
Assigned to GLOBUS MEDICAL, INC. (ASSIGNMENT OF ASSIGNORS INTEREST; SEE DOCUMENT FOR DETAILS). Assignors: JOSHI, SANJAY M.
Publication of US20230368418A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00725Calibration or performance testing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2061Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B2034/207Divots for calibration
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2072Reference field transducer attached to an instrument or patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/374NMR or MRI
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • A61B2090/3764Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT] with a rotating C-arm having a cone beam emitting source
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983Reference marker arrangements for use with image guided surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30021Catheter; Guide wire
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Definitions

  • the present disclosure relates to medical devices and systems, and more particularly, to checking the accuracy of, and performing automatic calibration of, tracked instruments in a camera tracking system used for computer assisted navigation during surgery.
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, medical imaging devices (e.g., computerized tomography (“CT”) scanners, fluoroscopy imaging, etc.), and surgical robots.
  • a computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient’s anatomy.
  • Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track pose of a reference array on a surgical tool, which is being positioned by a surgeon during surgery, relative to a patient reference array (also “dynamic reference base” (“DRB”)) attached to a patient.
  • the reference arrays allow the camera tracking system to determine a pose of the surgical tool relative to anatomical structure imaged by a medical image of the patient and relative to the patient. The surgeon can thereby use real-time visual feedback of the pose to navigate the surgical tool during a surgical procedure on the patient.
  • FIG. 10 illustrates an example of a trackable instrument 1010 .
  • the CAD model of an instrument 1010 is associated with a reference element 1020 , so that the CAD model can be overlaid on registered images of the patient's anatomy.
  • the accuracy of the instrument 1010 needs to be verified prior to use.
  • the accuracy check is typically done by bringing the tip 1040 of the tracked instrument into a divot 1050 associated with another reference element.
  • the divot 1050 is typically a cone-shaped depression ending in an apex.
  • the theoretical position of the tip 1040 is then compared with the theoretical position of the divot 1050 . Assuming the user has properly positioned the instrument 1010 in the divot 1050 , the distance between the two positions determines the accuracy of the tracked instrument 1010 . If the accuracy check does not pass, that instrument 1010 may not be used.
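  • As a rough illustration (not the disclosed implementation) of how such a tip-to-divot comparison could be computed, the sketch below expresses both the tracked tip and the divot apex in the camera coordinate system and compares their distance against an assumed 2 mm threshold; the poses, offsets, and threshold are hypothetical placeholders.

```python
import numpy as np

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T to a 3D point p."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical poses reported by the tracking cameras (camera coordinates).
T_cam_instrument = np.eye(4)   # pose of the instrument's reference array
T_cam_divot_array = np.eye(4)  # pose of the array that carries the divot

# Hypothetical geometry from the instrument CAD model / array definition (mm).
tip_in_instrument = np.array([0.0, 0.0, 150.0])  # tip offset from its array
divot_in_array = np.array([20.0, 0.0, 5.0])      # divot apex offset from its array

tip_cam = transform_point(T_cam_instrument, tip_in_instrument)
divot_cam = transform_point(T_cam_divot_array, divot_in_array)

error_mm = np.linalg.norm(tip_cam - divot_cam)
ACCURACY_THRESHOLD_MM = 2.0  # assumed pass/fail limit
print(f"tip-to-divot error: {error_mm:.2f} mm, pass: {error_mm <= ACCURACY_THRESHOLD_MM}")
```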
  • a source of inaccuracy during the accuracy check is that it is challenging for a user to place an instrument accurately in the divot.
  • the ideal position for a sharp instrument is along the normal from the apex to the base of the cone of the divot. Any deviation from this angle introduces small errors.
  • a bad-acting user may move the position of the instrument to produce a false accuracy number (that appears more accurate).
  • a source of inaccuracy during the accuracy check arises due to inaccuracy in tracking of the two reference elements (one associated with the tracked instrument and one associated with the divot).
  • the reference element arrays are typically small (e.g., only a few centimeters wide) to minimize obstruction of the surgical area.
  • the number of markers is also usually limited to optimize costs and workflow. A larger array with more markers can improve the accuracy of divot position.
  • a source of inaccuracy during the accuracy check arises due to a shape of the instrument tip.
  • Blunt tip instruments may not fit well inside the divot and instruments with angled tips or a hook shape can make it even more difficult to properly place the instrument tip in the divot.
  • a source of inaccuracy during the accuracy check is a deformed instrument.
  • the source of inaccuracies includes a deformed reference element. Note that a slight angular shift in the reference element can result in a very small error for tracking of the reference element, but may result in a much larger error at the instrument tip.
  • the sources of inaccuracy also include defects in the optical markers due to manufacturing flaws, smudges, or inaccurate mounting of the optical markers on their mounting posts. All of these problems are solvable, though. If an instrument can be calibrated at the time of use, the fidelity of tracking can be improved so that the physical tip matches the estimated tip.
  • Some embodiments of the present disclosure are directed to performing an accuracy check and calibrating tracked instruments used in surgical procedures.
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a virtual position within a virtual space of a display device.
  • the operations further include determining a virtual position within the virtual space of the tracked instrument.
  • the operations further include determining a point of contact on the display device between the tracked instrument and the display device.
  • the operations further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument.
  • the operations further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a first virtual position within a virtual space of an emitter of an imaging device.
  • the operations further include determining a first virtual position within the virtual space of a detector of the imaging device.
  • the operations further include determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector.
  • the operations further include determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument.
  • the operations further include obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector.
  • the operations further include determining a second virtual position within the virtual space of the emitter of the imaging device.
  • the operations further include determining a second virtual position within the virtual space of the detector of the imaging device.
  • the operations further include determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector.
  • the operations further include determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument.
  • the operations further include obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image.
  • the operations further include determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
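  • One way to picture how an expected image location could follow from these virtual positions is a simple point projection: the instrument tip is projected onto the detector plane along the ray from the emitter through the tip. The sketch below is an illustrative assumption, not the disclosed image-comparison method, and all geometry values are placeholders.

```python
import numpy as np

def project_to_detector(emitter, tip, detector_origin, detector_normal):
    """Intersect the emitter->tip ray with the detector plane (all in mm)."""
    direction = tip - emitter
    denom = np.dot(detector_normal, direction)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the detector plane")
    t = np.dot(detector_normal, detector_origin - emitter) / denom
    return emitter + t * direction

# Hypothetical virtual positions from the tracking system (camera coordinates).
emitter = np.array([0.0, 0.0, 0.0])
tip = np.array([10.0, 5.0, 500.0])
detector_origin = np.array([0.0, 0.0, 1000.0])   # a point on the detector plane
detector_normal = np.array([0.0, 0.0, 1.0])      # plane orientation

expected_tip_on_detector = project_to_detector(emitter, tip, detector_origin, detector_normal)
# Comparing this expected location against where the tip actually appears in the
# acquired x-ray image, at two different positions, indicates tracking accuracy.
```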
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a virtual position within a virtual space of the tracked instrument relative to a display device.
  • the operations further include displaying an indication of the virtual position of the tracked instrument on the display device.
  • the operations further include receiving an indication of an actual position of the tracked instrument relative to the display device.
  • the operations further include determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and which may further include a surgical robot for robotic assistance according to some embodiments;
  • FIG. 2 illustrates the camera tracking system and the surgical robot positioned relative to a patient according to some embodiments
  • FIG. 3 further illustrates the camera tracking system and the surgical robot configured according to some embodiments
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset, a computer platform, imaging devices, and a surgical robot which are configured to operate according to some embodiments;
  • FIG. 5 illustrates a patient reference array (“DRB”) and a surveillance marker
  • FIGS. 6 A-C respectively illustrate a surgical robot with an end-effector, an expanded view of the end-effector, and a surgical tool in accordance with some embodiments;
  • FIGS. 7 A-B are schematic diagrams illustrating examples of imaging devices according to some embodiments.
  • FIG. 8 is a block diagram illustrating an example of an imaging system according to some embodiments.
  • FIG. 9 is a block diagram illustrating an example of an accuracy and calibration module according to some embodiments.
  • FIG. 10 is a schematic diagram illustrating an example of a tracked instrument according to some embodiments.
  • FIG. 11 is a schematic diagram illustrating an example of a set of display devices configured to interact with a tracked instrument according to some embodiments
  • FIG. 12 is a schematic diagram illustrating an example of the set of display devices of FIG. 11 being contacted by a tracked instrument according to some embodiments;
  • FIG. 13 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on contact with a display device according to some embodiments
  • FIG. 14 is a schematic diagram illustrating an example of a C-arm imaging device according to some embodiments.
  • FIGS. 15 A-B are schematic diagrams illustrating images taken of a tracked instrument using the C-arm imaging device at two different positions according to some embodiments
  • FIG. 16 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on images taken of the tracked instrument according to some embodiments;
  • FIG. 17 is a schematic diagram of a display device configured to show an expected position of a tracked instrument according to some embodiments.
  • FIGS. 18 - 20 are flowcharts of operations performed by a system to perform an accuracy check of tracked instruments according to some embodiments.
  • Various embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of candidate markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras.
  • Various components that may be used for performing embodiments in a navigated surgery system are described with reference to FIGS. 1 - 9 .
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets 150 during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery during a surgical procedure and which may further include a surgical robot 100 for robotic assistance, according to some embodiments.
  • FIG. 2 illustrates the camera tracking system 200 and the surgical robot 100 positioned relative to a patient, according to some embodiments.
  • FIG. 3 further illustrates the camera tracking system 200 and the surgical robot 100 configured according to some embodiments.
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150 , a computer platform 400 , imaging devices 420 , and the surgical robot 100 which are configured to operate according to some embodiments.
  • FIG. 5 illustrates a patient reference array 116 (also “dynamic reference base” (DRB)) and a surveillance marker 500 .
  • the XR headset 150 may be configured to augment a real-world scene with computer generated XR images.
  • the XR headset 150 may be configured to provide an augmented reality (“AR”) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user.
  • the XR headset 150 may be configured to provide a virtual reality (“VR”) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer-generated AR images on a display screen.
  • the XR headset 150 can be configured to provide both AR and VR viewing environments.
  • the term XR headset can be referred to as an AR headset or a VR headset.
  • the surgical robot 100 may include, for example, one or more robot arms 104 , a display 110 , an end-effector 112 , for example, including a guide tube 114 , and an end effector reference array which can include one or more tracking markers.
  • a patient reference array 116 (“DRB”) has a plurality of tracking markers 117 and is secured directly to the patient 210 (e.g., to a bone of the patient 210 ).
  • a spaced apart surveillance marker 500 ( FIG. 5 ) has a single marker 502 connected to a shaft that is secured directly to the patient 210 at a spaced apart location from the patient reference array 116 .
  • Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.
  • the camera tracking system 200 includes tracking cameras 204 which may be spaced apart stereo cameras configured with partially overlapping field-of-views.
  • the camera tracking system 200 can have any suitable configuration of arm(s) 202 to move, orient, and support the tracking cameras 204 in a desired location, and may contain at least one processor operable to track location of an individual marker and pose of an array of markers.
  • the term “pose” refers to the location (e.g., along 3 orthogonal axes) and/or the rotation angle (e.g., about the 3 orthogonal axes) of markers (e.g., DRB) relative to another marker (e.g., surveillance marker) and/or to a defined coordinate system (e.g., camera coordinate system).
  • a pose may therefore be defined based on only the multidimensional location of the markers relative to another marker and/or relative to the defined coordinate system, based on only the multidimensional rotational angles of the markers relative to the other marker and/or to the defined coordinate system, or based on a combination of the multidimensional location and the multidimensional rotational angles.
  • the term “pose” therefore is used to refer to location, rotational angle, or combination thereof.
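  • For illustration, one common software representation of such a pose (an assumed convention, not one mandated by the disclosure) is a 4x4 homogeneous transform that combines the rotation about three axes with the location along them; composing transforms then expresses one tracked frame relative to another. The poses below are placeholders.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def make_pose(translation_mm, euler_deg):
    """Build a 4x4 pose from a translation and roll/pitch/yaw angles."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("xyz", euler_deg, degrees=True).as_matrix()
    T[:3, 3] = translation_mm
    return T

# Hypothetical pose of a DRB in the camera coordinate system, and of a
# surveillance marker relative to that DRB.
T_cam_drb = make_pose([100.0, -50.0, 900.0], [0.0, 10.0, 45.0])
T_drb_marker = make_pose([80.0, 0.0, 0.0], [0.0, 0.0, 0.0])

# Composing the transforms expresses the marker pose in camera coordinates.
T_cam_marker = T_cam_drb @ T_drb_marker
print(T_cam_marker[:3, 3])  # marker location along the 3 orthogonal camera axes
```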
  • the tracking cameras 204 may include, e.g., infrared cameras (e.g., bifocal or stereophotogrammetric cameras), operable to identify, for example, active and passive tracking markers for single markers (e.g., surveillance marker 500 ) and reference arrays which can be formed on or attached to the patient 210 (e.g., patient reference array, DRB), end effector 112 (e.g., end effector reference array), XR headset(s) 150 worn by a surgeon 120 and/or a surgical assistant 126 , etc. in a given measurement volume of a camera coordinate system while viewable from the perspective of the tracking cameras 204 .
  • the tracking cameras 204 may scan the given measurement volume and detect light that is emitted or reflected from the markers in order to identify and determine locations of individual markers and poses of the reference arrays in three-dimensions.
  • active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (“LEDs”)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking cameras 204 or other suitable device.
  • the XR headsets 150 may each include tracking cameras (e.g., spaced apart stereo cameras) that can track location of a surveillance marker and poses of reference arrays within the XR camera headset field-of-views (“FOVs”) 152 and 154 , respectively. Accordingly, as illustrated in FIG. 1 , the location of the surveillance marker and the poses of reference arrays on various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 and/or a FOV 600 of the tracking cameras 204 .
  • FIGS. 1 - 2 illustrate a potential configuration for the placement of the camera tracking system 200 and the surgical robot 100 in an operating room environment.
  • Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 and/or other displays 34 , 36 , and 110 to display surgical procedure navigation information.
  • the surgical robot 100 is optional during computer-aided navigated surgery.
  • the camera tracking system 200 may operate using tracking information and other information provided by multiple XR headsets 150 such as inertial tracking information and optical tracking information (frames of tracking data).
  • the XR headsets 150 operate to display visual information and may play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 100 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment.
  • the camera tracking system 200 may track markers in 6 degrees-of-freedom (“6DOF”) relative to three axes of a 3D coordinate system and rotational angles about each axis.
  • the XR headsets 150 may also operate to track hand poses and gestures to enable gesture-based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 may have a 1-10x magnification digital color camera sensor called a digital loupe. In some embodiments, one or more of the XR headsets 150 are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • An “outside-in” machine vision navigation bar supports the tracking cameras 204 and may include a color camera.
  • the machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 while positioned on wearers’ heads.
  • the patient reference array 116 (DRB) is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112 , instrument reference array 170 , and reference arrays on the XR headsets 150 .
  • the surveillance marker 500 is affixed to the patient to provide information on whether the patient reference array 116 has shifted. For example, during a spinal fusion procedure with planned placement of pedicle screw fixation, two small incisions are made over the posterior superior iliac spine bilaterally. The DRB and the surveillance marker are then affixed to the posterior superior iliac spine bilaterally. If the surveillance marker’s 500 location changes relative to the patient reference array 116 , the camera tracking system 200 may display a meter indicating the amount of movement and/or may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation which is off target.
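  • A minimal sketch of such a shift check, assuming the tracking cameras report the DRB pose and the surveillance marker location each frame, and assuming a hypothetical 1 mm warning threshold:

```python
import numpy as np

WARNING_THRESHOLD_MM = 1.0  # assumed limit before the user is warned

def marker_in_drb(T_cam_drb, marker_cam):
    """Express the surveillance marker location in the DRB coordinate frame."""
    return (np.linalg.inv(T_cam_drb) @ np.append(marker_cam, 1.0))[:3]

def check_surveillance_marker(T_cam_drb_ref, marker_cam_ref, T_cam_drb_now, marker_cam_now):
    """Return the apparent DRB shift in mm and whether a warning should be shown."""
    ref = marker_in_drb(T_cam_drb_ref, marker_cam_ref)  # at registration time
    now = marker_in_drb(T_cam_drb_now, marker_cam_now)  # current frame
    shift_mm = float(np.linalg.norm(now - ref))
    return shift_mm, shift_mm > WARNING_THRESHOLD_MM
```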
  • the surgical robot (also “robot”) may be positioned near or next to patient 210 .
  • the robot 100 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the surgical procedure.
  • the camera tracking system 200 may be separated from the robot system 100 and positioned at the foot of patient 210 . This location allows the tracking camera 200 to have a direct visual line of sight to the surgical area 208 .
  • the surgeon 120 may be positioned across from the robot 100 , but is still able to manipulate the end-effector 112 and the display 110 .
  • a surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110 . If desired, the locations of the surgeon 120 and the assistant 126 may be reversed.
  • An anesthesiologist 122 , nurse or scrub tech can operate equipment which may be connected to display information from the camera tracking system 200 on a display 34 .
  • the display 110 can be attached to the surgical robot 100 or in a remote location.
  • End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor.
  • end-effector 112 can comprise a guide tube 114 , which is configured to receive and orient a surgical instrument, tool, or implant used to perform a surgical procedure on the patient 210 .
  • the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.”
  • the term “instrument” is used in a nonlimiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein.
  • Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc.
  • end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery.
  • end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument in a desired manner.
  • the surgical robot 100 is operable to control the translation and orientation of the end-effector 112 .
  • the robot 100 may move the end-effector 112 under computer control along x-, y-, and z-axes, for example.
  • the end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis, such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled.
  • selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a 6DOF robot arm comprising only rotational axes.
  • the surgical robot 100 may be used to operate on patient 210 , and robot arm 104 can be positioned above the body of patient 210 , with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210 .
  • the XR headsets 150 can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • surgical robot 100 can be operable to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory.
  • the surgical robot 100 can be operable to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument.
  • a surgeon or other user can use the surgical robot 100 as part of computer assisted navigated surgery, and has the option to stop, modify, or manually control the autonomous or semi-autonomous movement of the end-effector 112 and/or the surgical instrument.
  • Reference arrays of markers can be formed on or connected to robot arms 102 and/or 104 , the end-effector 112 (e.g., end-effector array 114 in FIG. 2 ), and/or a surgical instrument (e.g., instrument array 170 ) to track poses in 6DOF along 3 orthogonal axes and rotation about the axes.
  • the reference arrays enable each of the marked objects (e.g., the end-effector 112 , the patient 210 , and the surgical instruments) to be tracked by the tracking camera 200 , and the tracked poses can be used to provide navigated guidance during a surgical procedure and/or used to control movement of the surgical robot 100 for guiding the end-effector 112 and/or an instrument manipulated by the end-effector 112 .
  • the surgical robot 100 may include a display 110 , upper arm 102 , lower arm 104 , end-effector 112 , vertical column 312 , casters 314 , a table 318 , and ring 324 which uses lights to indicate statuses and other information.
  • Cabinet 106 may house electrical components of surgical robot 100 including, but not limited, to a battery, a power distribution module, a platform interface board module, and a computer.
  • the camera tracking system 200 may include a display 36 , tracking cameras 204 , arm(s) 202 , a computer housed in cabinet 330 , and other components.
  • perpendicular 2D scan slices such as axial, sagittal, and/or coronal views, of patient anatomical structure are displayed to enable user visualization of the patient’s anatomy alongside the relative poses of surgical instruments.
  • An XR headset or other display can be controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy.
  • the 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which isn’t necessarily formed from a scan of the patient.
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150 , a computer platform 400 , imaging devices 420 , and a surgical robot 100 which are configured to operate according to some embodiments.
  • the imaging devices 420 may include a C-arm imaging device, an O-arm imaging device, and/or a patient image database.
  • the XR headset 150 provides an improved human interface for performing navigated surgical procedures.
  • the XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 400 , that include without limitation any one or more of: identification of hand gesture based commands and display of XR graphical objects on a display device 438 of the XR headset 150 and/or another display device.
  • the display device 438 may include a video projector, flat panel display, etc.
  • the user may view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen.
  • the XR headset 150 may additionally or alternatively be configured to display on the display device 438 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 430 , a microphone 432 , a gesture sensor 434 , a pose sensor (e.g., inertial measurement unit (“IMU”)) 436 , the display device 438 , and a wireless/wired communication interface 440 .
  • the cameras 430 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • the cameras 430 may be configured to operate as the gesture sensor 434 by tracking, for identification, user hand gestures performed within the field of view of the camera(s) 430 .
  • the gesture sensor 434 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 434 and/or senses physical contact, e.g., tapping on the sensor 434 or its enclosure.
  • the pose sensor 436 e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • a surgical system includes the camera tracking system 200 which may be connected to a computer platform 400 for operational processing and which may provide other operational functionality including a navigation controller 404 and/or of an XR headset controller 410 .
  • the surgical system may include the surgical robot 100 .
  • the navigation controller 404 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 200 .
  • the navigation controller 404 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 100 , where the steering information is displayed through the display device 438 of the XR headset 150 and/or another display device to indicate where the surgical tool and/or the end effector of the surgical robot 100 should be moved to perform the surgical plan.
  • the electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 400 through the wired/wireless interface 440 .
  • the electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 400 or directly connected, to various imaging devices 420 , e.g., the C-arm imaging device, the O-arm imaging device, the patient image database, and/or to other medical equipment through the wired/wireless interface 440 .
  • the surgical system may include a XR headset controller 410 that may at least partially reside in the XR headset 150 , the computer platform 400 , and/or in another system component connected via wired cables and/or wireless communication links.
  • Various functionality is provided by software executed by the XR headset controller 410 .
  • the XR headset controller 410 is configured to receive information from the camera tracking system 200 and the navigation controller 404 , and to generate an XR image based on the information for display on the display device 438 .
  • the XR headset controller 410 can be configured to operationally process frames of tracking data from the cameras 430 (tracking cameras), signals from the microphone 432 , and/or information from the pose sensor 436 and the gesture sensor 434 , to generate information for display as XR images on the display device 438 and/or as other information for display on other display devices for user viewing.
  • the XR headset controller 410 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user.
  • the XR headset controller 410 may reside within the computer platform 400 which, in turn, may reside within the cabinet 330 of the camera tracking system 200 , the cabinet 106 of the surgical robot 100 , etc.
  • the surgical robot system 100 relies on accurate positioning of the end-effector 112 , surgical instruments 608 , and/or the patient 210 (e.g., patient reference array 116 ) relative to the desired surgical area.
  • the reference arrays include tracking markers 118 , 804 which are rigidly attached to a portion of the instrument 608 and/or end-effector 112 .
  • FIG. 6 A depicts part of the surgical robot system 100 with the robot 102 including base 106 , robot arm 104 , and end-effector 112 .
  • the other elements, not illustrated, such as the display, marker tracking cameras, etc. may also be present as described herein.
  • FIG. 6 B depicts a close-up view of the end-effector 112 with guide tube 114 and a reference array that includes a plurality of tracking markers 118 rigidly affixed to the end-effector 112 .
  • the plurality of tracking markers 118 are attached to the end-effector 112 configured as a guide tube.
  • FIG. 6 C depicts an instrument 608 (in this case, a probe) with a plurality of tracking markers 804 rigidly affixed to the instrument 608 .
  • the instrument 608 could include any suitable surgical instrument, such as, but not limited to, guide wire, cannula, a retractor, a drill, a reamer, a screwdriver, an insertion instrument, a removal instrument, or the like.
  • the reference array 612 functions as the handle 620 of the instrument 608 .
  • Four markers 804 are attached to the handle 620 in a manner that is out of the way of the shaft 622 and tip 624 .
  • Stereophotogrammetric tracking by the tracking camera 200 of these four markers 804 allows the instrument 608 to be tracked as a rigid body and for the system 100 to precisely determine the location of the tip 624 and the orientation of the shaft 622 while the instrument 608 is moved within view of tracking camera 200 .
  • the markers 118 , 804 on each instrument 608 , end-effector 112 , or the like may be arranged asymmetrically with a known inter-marker spacing.
  • the reason for asymmetric alignment is so that it is unambiguous which marker 118 , 804 corresponds to a particular pose on the rigid body and whether markers 118 , 804 are being viewed from the front or back, i.e., mirrored.
  • each array 612 and thus each instrument 608 , end-effector 112 , or other object to be tracked should have a unique marker pattern to allow it to be distinguished from other instruments 608 or other objects being tracked.
  • Asymmetry and unique marker patterns allow the tracking camera 200 and system 100 to detect individual markers 118 , 804 then to check the marker spacing against a stored template to determine which instrument 608 , end-effector 112 , or another object they represent. Detected markers 118 , 804 can then be sorted automatically and assigned to each tracked object in the correct order. Without this information, rigid body calculations could not then be performed to extract key geometric information, for example, such as instrument tip 624 and alignment of the shaft 622 , unless the user manually specified which detected marker 118 , 804 corresponded to which position on each rigid body.
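  • For illustration only, a simplified version of this template check can compare the sorted inter-marker distances of the detected markers against each stored array definition; a real system would additionally resolve marker ordering and mirroring. The template names and tolerance below are hypothetical.

```python
import numpy as np
from itertools import combinations

def pairwise_distances(points):
    """Sorted distances between every pair of 3D marker positions."""
    return sorted(np.linalg.norm(a - b) for a, b in combinations(points, 2))

def identify_array(detected_markers, templates, tol_mm=0.5):
    """Match detected markers to a stored template by inter-marker spacing."""
    detected = pairwise_distances([np.asarray(p, float) for p in detected_markers])
    for name, template_points in templates.items():
        expected = pairwise_distances([np.asarray(p, float) for p in template_points])
        if len(expected) == len(detected) and all(
            abs(d - e) <= tol_mm for d, e in zip(detected, expected)
        ):
            return name  # e.g., "probe_608" or "end_effector_112" (hypothetical names)
    return None
```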
  • FIGS. 7 A-B illustrate medical imaging systems 1304 that may be used in conjunction with robot system 100 and/or navigation systems to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210 .
  • Any appropriate subject matter may be imaged for any appropriate procedure using the imaging system 1304 .
  • the imaging system 1304 may be any imaging device such as a C-arm 1308 device, an O-arm 1306 device, a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for the frequent manual repositioning of patient 210 which may be required in an x-ray system.
  • As illustrated in FIG. 7 A, the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape.
  • C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316 .
  • the space within C-arm 1308 of the arm may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318 .
  • the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328 , such as a wheeled mobile cart 1330 with wheels 1332 , which may enclose an image capturing portion, not illustrated.
  • the image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion.
  • the image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition.
  • the image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes.
  • FIG. 8 illustrates a block diagram of components of a medical imaging system configured in accordance with some embodiments of the present disclosure.
  • the medical imaging system includes a controller 3200 , an imaging arm 3240 (e.g., a C-arm or an O-arm), and a linear actuator and/or rotary actuator 3250 connected to an X-ray beam emitter or collector 3260 .
  • the controller 3200 includes an image processor 3210 , a general processor 3220 , and an I/O interface 3230 .
  • the image processor 3210 performs image processing to combine sets of images to generate a three-dimensional image of the scanned volume.
  • the general processor 3220 is used to perform various embodiments of the present disclosure.
  • the I/O interface 3230 communicatively couples the controller 3200 to other components of the medical imaging system.
  • the imaging arm 3240 includes motors 3245 used to move the collector and emitter along an arc, e.g., three hundred and sixty degrees, during image acquisition. The motors 3245 are controlled by the controller 3200 .
  • the controller 3200 can also control movement of the linear actuator and/or rotary actuator 3250 .
  • FIG. 9 illustrates an example of an accuracy and calibration module 3300 .
  • the accuracy and calibration module 3300 can include an interface 3310 , processing circuitry 3320 , and a memory 3330 .
  • the accuracy and calibration module is part of a system (e.g., an imaging system or a camera tracking system).
  • the memory 3330 can include instructions stored therein that are executable by the processing circuitry to perform operations according to some embodiments herein.
  • Embodiments that include performing an accuracy check and/or calibrating of a tracked instrument based on contact with a touch sensor are described below.
  • multiple points of contact can be detected by one or more touchpads that are themselves tracked by a navigation camera.
  • the instruments and the pressure touchpads can each have associated reference elements that are tracked by the navigation camera.
  • the touchpads are sensitive to pressure, capacitance, or resistance.
  • FIG. 11 illustrates an example of a set of touchpads 1110 coupled together to create an opening for accepting a tip of the tracked instrument.
  • the associated reference element 1120 is coupled to the touchpads.
  • the touchpads and reference arrays are securely housed in a supporting structure 1130 to reduce movement.
  • the touchpads 1110 can capture the location of pressure points. Resistive touchpads are especially useful, since they do not rely on the capacitance of the object.
  • When an instrument is brought into the wedge, it touches at least two points on the touchpads 1110 .
  • the touchpads 1110 then send the location of the sensed points to the system.
  • the system also receives the position and pose of the touchpads and instruments via their associated reference elements 1120 . Thus, the system can calculate the theoretical position of the tip of the instrument under test. It can then compare the tip location to the location reported by the three touchpads 1110 .
  • the bottom touchpad would report the position of a sharp or semi-sharp instrument tip.
  • a broader instrument such as an Osteotome
  • the approximate position of the CAD model with respect to the touchpads is already known to the system based on the tracking information reported by the camera. Thus, the accuracy of the physical model can be calculated.
  • FIG. 12 illustrates an example of a tip of a tracked instrument 1240 contacting the touchpads 1110 .
  • the wedge shape of the opening between the touchpads 1110 allows an accuracy check of instruments with tips that are too big to fit in a typical divot used in navigation arrays.
  • FIG. 13 illustrates an example of operations to perform an accuracy check and calibrate a tracked instrument based on contact between the tracked instrument and the display devices.
  • the reported touchpad points are compared against the theoretical model.
  • the user touches the instrument tip on all three touchpads in such a way that the reference elements of both the instrument and the touchpad structure are visible to the tracking camera.
  • the theoretical position of the instrument tip with respect to the touchpads is then calculated. This serves as the initial position estimate of the instrument tip. Since the relative position of the three touchpads is known, the theoretical touchpoints of the CAD model for each touchpad are then calculated.
  • the optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the theoretical touchpoints and the actual ones, as shown in the algorithm below.
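  • The referenced algorithm is not reproduced in this excerpt; the sketch below is a hedged stand-in that illustrates the general idea with a least-squares fit of a small rigid correction (three translations, three rotations) that brings the CAD-model touchpoints into agreement with the touchpad-reported points. All numeric values are placeholders.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def apply_correction(params, points):
    """Apply a small rigid correction (tx, ty, tz, rx, ry, rz) to Nx3 points."""
    t, angles = params[:3], params[3:]
    R = Rotation.from_euler("xyz", angles, degrees=True).as_matrix()
    return points @ R.T + t

def residuals(params, theoretical_pts, measured_pts):
    return (apply_correction(params, theoretical_pts) - measured_pts).ravel()

# Hypothetical touchpoints: where the CAD model says the tip should touch each
# of the three touchpads, and where the touchpads actually reported contact (mm).
theoretical = np.array([[0.0, 0.0, 0.0], [5.0, 2.0, 0.0], [2.5, 4.0, 1.0]])
measured = np.array([[0.4, -0.1, 0.2], [5.3, 1.8, 0.1], [2.9, 3.8, 1.3]])

fit = least_squares(residuals, x0=np.zeros(6), args=(theoretical, measured))
correction = fit.x  # could be applied to re-calibrate the tracked tip location
print("per-point residual error (mm):", np.linalg.norm(fit.fun.reshape(-1, 3), axis=1))
```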
  • these operations improve accuracy checks for instruments without a sharp tip or instruments that are too wide to fit in a traditional divot. In additional or alternative embodiments, these operations allow re-calibration or correction of the theoretical instrument tip location based on actual measurements.
  • FIG. 18 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a point of contact between the tracked instrument and a touch sensor.
  • although the operations are described below as being performed by the accuracy and calibration module 3300 , any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a virtual position of the touch sensor.
  • the term virtual position is used herein to describe a virtual location and a virtual pose of an object.
  • the system includes a camera. Determining the virtual position of the touch sensor includes: determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor; capturing, via the camera, an image of the reference element coupled to the touch sensor; determining a virtual position of the reference element coupled to the touch sensor relative to a dynamic reference base (“DRB”) based on the image of the reference element coupled to the touch sensor; and determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor.
  • processing circuitry 3320 determines a virtual position of the tracked instrument.
  • the virtual position of the touch sensor and the virtual position of the tracked instrument are within the same virtual space (e.g., relative to a common reference point).
  • the system includes a camera.
  • determining the virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument; and determining the virtual position of the tracked instrument based on the shape of the tracked instrument and the virtual position of the reference element coupled to the tracked instrument.
  • processing circuitry 3320 determines a point of contact on the touch sensor between the tracked instrument and the touch sensor.
  • the system includes the touch sensor and the touch sensor includes a touchscreen (e.g., a pressure sensitive, resistance sensitive, or capacitance sensitive touchscreen).
  • the touch sensor is part of a display device. Determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
  • the touch sensor includes a plurality of touch sensors coupled together to form an opening. Determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
  • processing circuitry 3320 determines an expected point of contact on the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • information about the shape of the tracked instrument is determined, and the information indicates an intended position of a tip of the tracked instrument relative to a reference element coupled to the tracked instrument.
  • Determining the point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor.
  • Determining the expected point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
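  • A minimal sketch of how the expected point of contact could be computed, assuming the expected contact is taken as the virtual tip position projected onto the touch sensor plane and expressed in the sensor's own 2D coordinates; the names are illustrative.

```python
# Sketch: expected contact as the virtual tip projected onto the sensor plane.
import numpy as np


def expected_contact_2d(tip_pos, sensor_origin, sensor_x_axis, sensor_y_axis):
    """All vectors in the same (e.g., DRB) frame; axes are unit and orthogonal."""
    tip_pos = np.asarray(tip_pos, dtype=float)
    offset = tip_pos - np.asarray(sensor_origin, dtype=float)
    u = float(np.dot(offset, sensor_x_axis))   # expected touch coordinate along x (mm)
    v = float(np.dot(offset, sensor_y_axis))   # expected touch coordinate along y (mm)
    return u, v


if __name__ == "__main__":
    tip = [120.0, 80.0, 2.0]
    origin = [100.0, 50.0, 0.0]
    print(expected_contact_2d(tip, origin, [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))
```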
  • processing circuitry 3320 displays an indication of the expected point of contact.
  • the system includes a display device that includes the touch sensor. Determining the point of contact on the touch sensor between the tracked instrument and the touch sensor includes receiving an indication of the point of contact on the touch sensor from a user in response to displaying the indication of the expected point of contact.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
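  • The sketch below illustrates the decision step, assuming a hypothetical 2 mm threshold and placeholder notification and calibration hooks; the threshold value is not from the original description.

```python
# Sketch of the accuracy decision and resulting action.
import numpy as np

ACCURACY_THRESHOLD_MM = 2.0   # assumed value for illustration


def check_and_act(contact_xy, expected_xy):
    difference = float(np.linalg.norm(np.subtract(contact_xy, expected_xy)))
    if difference > ACCURACY_THRESHOLD_MM:
        # e.g., warn the user and/or feed the offset back into calibration
        print(f"Instrument not suitable for use (error {difference:.2f} mm)")
        return False
    print(f"Instrument passes accuracy check (error {difference:.2f} mm)")
    return True


if __name__ == "__main__":
    check_and_act((20.0, 30.0), (21.5, 30.4))
```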
  • Various operations of FIG. 18 may be optional. For example, blocks 1850 and 1870 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on an image taken by a tracked imaging device are described below.
  • multiple x-ray views of one or more tracked instruments are taken with a fluoroscope that is tracked by a navigation camera using an attached registration fixture.
  • registration fixtures are commonly used for surgical navigation using fluoroscopy.
  • FIG. 14 illustrates an example of an imaging device 1410 including an x-ray emitter 1420 and an x-ray detector 1430.
  • the registration fixture 1440 is coupled to a predetermined portion of the imaging device 1410.
  • the registration fixture 1440 typically includes fiducials in two planes at known positions. These fiducials are then detected in the captured x-ray images. Using the known positions, the relative position of the emitter 1420 is then computed. The position of the detector 1430 is tracked using the attached registration fixture 1440 via a navigation camera. When an instrument tracked with a reference element is brought between the emitter and detector, its relative position with respect to the registration fixture 1440 is calculated.
  • the CAD model of the associated instrument tip can then be projected onto the fluoroscopy image to achieve navigation. Since the registration fixture can move after the x-ray image is captured, often a different reference element, called a DRB, is rigidly attached to the patient so that all tracked positions are relative to the fixed DRB.
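  • As an illustration of the projection described above, the sketch below casts a ray from the emitter through the tracked tip and intersects it with the detector plane, assuming all positions are already expressed in a common frame (e.g., the DRB frame); the names are illustrative.

```python
# Illustrative point-source projection of a tracked tip onto the detector plane.
import numpy as np


def project_onto_detector(emitter, point, detector_origin, detector_normal):
    """Cast a ray from the emitter through `point` and intersect the
    detector plane; returns the 3D intersection point."""
    emitter = np.asarray(emitter, float)
    direction = np.asarray(point, float) - emitter
    denom = np.dot(direction, detector_normal)
    if abs(denom) < 1e-9:
        raise ValueError("ray parallel to detector plane")
    t = np.dot(np.asarray(detector_origin, float) - emitter, detector_normal) / denom
    return emitter + t * direction


if __name__ == "__main__":
    hit = project_onto_detector(
        emitter=[0.0, 0.0, 1000.0],
        point=[10.0, 5.0, 500.0],          # tracked instrument tip
        detector_origin=[0.0, 0.0, 0.0],
        detector_normal=[0.0, 0.0, 1.0])
    print(hit)   # where the tip should appear on the fluoro image
```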
  • FIGS. 15 A-B illustrate an example in which a wedge-shaped tracked instrument is placed between the emitter 1420 and detector 1430 , such that its views are captured by the fluoroscope in two positions.
  • the corresponding images 1570a-b below the fluoroscope show the instrument profile from different angles. Note that most instruments are solid and made of metal, which absorbs most x-rays and shows up dark on an x-ray image.
  • the actual projection can be compared to the theoretical projection by detecting the dark instrument shape against the bright image background. Thus, the accuracy can be calculated without needing a divot.
  • FIG. 16 illustrates an example of operations for performing an accuracy check and/or calibrating a tracked instrument using images of the tracked instrument.
  • the x-ray views of an instrument are obtained as described above.
  • the theoretical position of the instrument tip projected in the views is then calculated. This serves as the initial position estimate of the instrument tip.
  • the theoretical view of the CAD model in each x-ray is then calculated.
  • the optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the CAD view and actual image as shown in the algorithm below.
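  • The algorithm itself is not reproduced in this text; the sketch below shows one simplified version of the refinement over two views, assuming the dark tip has already been detected in each x-ray image and optimizing only a translation correction of the tip estimate for brevity (the full problem would also adjust the pose of the CAD model).

```python
# Simplified sketch, not the document's exact algorithm: refine the tip
# estimate so its projections match the detected tip locations in two views.
import numpy as np
from scipy.optimize import least_squares


def project(emitter, point, det_origin, det_normal):
    """Intersect the emitter-to-point ray with the detector plane."""
    d = point - emitter
    t = np.dot(det_origin - emitter, det_normal) / np.dot(d, det_normal)
    return emitter + t * d


def refine_tip(tip_estimate, views, detected_points):
    """views: list of (emitter, det_origin, det_normal) per x-ray shot;
    detected_points: 3D detector-plane points where the tip appears."""
    tip_estimate = np.asarray(tip_estimate, float)

    def residuals(x):
        tip = tip_estimate + x          # x: translation correction only
        res = []
        for (emitter, origin, normal), detected in zip(views, detected_points):
            res.extend(project(np.asarray(emitter, float), tip,
                               np.asarray(origin, float),
                               np.asarray(normal, float)) - detected)
        return res

    return tip_estimate + least_squares(residuals, x0=np.zeros(3)).x


if __name__ == "__main__":
    views = [([0.0, 0.0, 1000.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]),
             ([1000.0, 0.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0])]
    detected = [np.array([22.0, 10.0, 0.0]), np.array([0.0, 6.0, 500.0])]
    print(refine_tip([10.0, 5.0, 500.0], views, detected))
```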
  • this is the same problem as matching a CT scan to multiple fluoroscopy images in CTFluoro registration, except in this case a CAD model is used instead of a CT scan to compute the dynamically rendered radiograph (“DRR”).
  • these operations do not rely on a sharp tipped instrument fitting snugly in a divot, and can be used for accuracy checks of all types of instrument tips.
  • these operations improve accuracy checks for instruments without a sharp or straight tip.
  • these operations allow re-calibration or correction of theoretical instrument tip location based on actual measurements.
  • these operations enable accuracy checks and re-calibration of multiple instruments simultaneously.
  • FIG. 19 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a pair of images taken by an imaging device.
  • although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a first virtual position of an emitter.
  • the system includes a tracking camera and an imaging device including the emitter and a detector. Determining the first virtual position of the emitter includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a dynamic reference base (“DRB”)) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • the virtual position of the emitter is determined based on predetermined information indicating a position of the emitter relative to the detector and a virtual position of the detector.
  • processing circuitry 3320 determines a first virtual position of a detector.
  • the system includes a tracking camera and an imaging device including the emitter and the detector. Determining the first virtual position of the detector includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a DRB) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • processing circuitry 3320 determines a first virtual position of a tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector.
  • the system includes a tracking camera. Determining the first virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument (e.g., relative to the DRB) based on the image of the reference element coupled to the tracked instrument; and determining the first virtual position of the tracked instrument based on the shape of the tracked instrument and the virtual position of the reference element coupled to the tracked instrument.
  • determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument.
  • processing circuitry 3320 determines a first expected image of the tracked instrument.
  • the first expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the first virtual position of the emitter, the first virtual position of the detector, the first virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
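  • A hedged sketch of how an expected image could be simulated: sample points on the instrument's CAD model, project each from the emitter onto the detector plane, and rasterize the hits into a binary silhouette; the pixel size and detector geometry below are illustrative assumptions.

```python
# Illustrative simulation of an expected (silhouette-only) image.
import numpy as np


def simulate_expected_image(emitter, cad_points, det_origin, det_u, det_v,
                            shape=(256, 256), pixel_mm=1.0):
    emitter = np.asarray(emitter, float)
    det_origin = np.asarray(det_origin, float)
    normal = np.cross(det_u, det_v)
    image = np.zeros(shape, dtype=np.uint8)
    for p in np.asarray(cad_points, float):
        d = p - emitter
        denom = np.dot(d, normal)
        if abs(denom) < 1e-9:
            continue
        t = np.dot(det_origin - emitter, normal) / denom
        hit = emitter + t * d - det_origin
        col = int(round(np.dot(hit, det_u) / pixel_mm)) + shape[1] // 2
        row = int(round(np.dot(hit, det_v) / pixel_mm)) + shape[0] // 2
        if 0 <= row < shape[0] and 0 <= col < shape[1]:
            image[row, col] = 1   # instrument absorbs x-rays: mark as "dark"
    return image


if __name__ == "__main__":
    # a short straight shaft of sample points standing in for the CAD model
    shaft = [[10.0, 5.0, z] for z in np.linspace(400.0, 600.0, 50)]
    img = simulate_expected_image([0, 0, 1000.0], shaft, [0, 0, 0.0],
                                  [1.0, 0, 0], [0, 1.0, 0])
    print("silhouette pixels:", int(img.sum()))
```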
  • processing circuitry 3320 obtains a first image of the tracked instrument.
  • obtaining the first image of the tracked instrument includes receiving the first image from the imaging device.
  • processing circuitry 3320 rotates the imaging device (including the emitter and the detector).
  • the imaging device includes a C-arm or an O-arm imaging device.
  • processing circuitry 3320 determines a second virtual position of the emitter. In some embodiments, determining the second virtual position of the emitter includes receiving the second virtual position from a tracking system.
  • processing circuitry 3320 determines a second virtual position of the detector. In some embodiments, determining the second virtual position of the detector includes receiving the second virtual position from a tracking system.
  • processing circuitry 3320 determines a second virtual position of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. In some embodiments, determining the second virtual position of the tracked instrument includes receiving the second virtual position from a tracking system.
  • the first virtual position of the tracked instrument is the second virtual position of the tracked instrument.
  • the imaging device can include at least one of a C-arm and an O-arm, and responsive to obtaining the first image, the imaging device can be rotated (block 1935) such that the second virtual position of the emitter is different than the first virtual position of the emitter and the second virtual position of the detector is different than the first virtual position of the detector.
  • the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument.
  • the first virtual position of the emitter is the second virtual position of the emitter.
  • the first virtual position of the detector is the second virtual position of the detector. For example, without rotating the imaging device, an image of the tracked instrument can be taken from a different perspective by moving the tracked instrument.
  • processing circuitry 3320 determines a second expected image of the tracked instrument.
  • the second expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the second virtual position of the emitter, the second virtual position of the detector, the second virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • processing circuitry 3320 obtains a second image of the tracked instrument.
  • obtaining the second image of the tracked instrument includes receiving the second image from the imaging device.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
  • the first expected image, the second expected image, the first image, and the second image each include an image of the tip of the tracked instrument.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold.
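  • One possible difference metric is sketched below (1 minus the Dice overlap of the expected and actual instrument silhouettes in each view); the 0.2 threshold is an assumed illustrative value, not taken from the original description.

```python
# Sketch of an image-comparison metric for the accuracy decision.
import numpy as np

MISMATCH_THRESHOLD = 0.2   # assumed illustrative value


def silhouette_mismatch(expected_mask, actual_mask):
    """1 - Dice overlap between two binary instrument silhouettes."""
    expected = np.asarray(expected_mask, bool)
    actual = np.asarray(actual_mask, bool)
    intersection = np.logical_and(expected, actual).sum()
    total = expected.sum() + actual.sum()
    if total == 0:
        return 0.0
    return 1.0 - (2.0 * intersection / total)


def instrument_accurate(expected_1, actual_1, expected_2, actual_2):
    """Flag the instrument as inaccurate if either view mismatches too much."""
    worst = max(silhouette_mismatch(expected_1, actual_1),
                silhouette_mismatch(expected_2, actual_2))
    return worst <= MISMATCH_THRESHOLD


if __name__ == "__main__":
    a = np.zeros((8, 8), dtype=bool)
    a[2:6, 3] = True
    b = np.zeros((8, 8), dtype=bool)
    b[2:6, 4] = True        # silhouette shifted by one pixel column
    print(instrument_accurate(a, b, a, a))   # False: first view mismatches
```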
  • performing the action includes outputting an indication that the tracked instrument is not suitable for use.
  • performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
  • Various operations of FIG. 19 may be optional. For example, blocks 1935, 1940, 1945, and 1970 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on comparison of an actual position with an expected position on a display device are described below.
  • a display screen is available to show tracked instruments.
  • the display screen is near the surgical area and is already covered with sterile drape.
  • the screen may be of a large size (e.g., 22 inches or larger).
  • a reference element can be coupled to the display screen to allow it to be tracked by a navigation camera.
  • a large reference element array can yield improved accuracy of tracking and, in some examples, due to the large physical size, more than four optical markers can be used to improve the fidelity of tracking.
  • when a user brings a navigated instrument near the display screen, its position with respect to the reference element on the display screen is calculated.
  • the theoretical position of the tracked tip of the instrument CAD model is then shown on the display screen.
  • the user can visually compare the accuracy of the physical position of the instrument tip with the position displayed on the screen. With the aid of a virtual measurement tool, the user can then assess the accuracy.
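  • A minimal sketch of how the theoretical tip could be mapped to screen pixels for a front view and to a height-above-screen readout for a side view, assuming a hypothetical display pixel pitch and that the tip position is already expressed in the display's reference element frame.

```python
# Sketch: map a tracked tip into display coordinates for visual comparison.
import numpy as np

PIXEL_PITCH_MM = 0.25   # assumed display pixel pitch


def tip_on_screen(tip_pos, screen_origin, screen_x_axis, screen_y_axis):
    """All vectors expressed in the display's reference element frame.
    Returns (col, row) pixel coordinates for the front view and the height
    above the glass in mm for the side view."""
    offset = np.asarray(tip_pos, float) - np.asarray(screen_origin, float)
    col = int(round(np.dot(offset, screen_x_axis) / PIXEL_PITCH_MM))
    row = int(round(np.dot(offset, screen_y_axis) / PIXEL_PITCH_MM))
    height = float(np.dot(offset, np.cross(screen_x_axis, screen_y_axis)))
    return (col, row), height


if __name__ == "__main__":
    (c, r), h = tip_on_screen([60.0, 40.0, 3.0], [0.0, 0.0, 0.0],
                              [1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
    print(f"front view pixel: ({c}, {r}); height above screen: {h:.1f} mm")
```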
  • FIG. 17 illustrates an example of a display device 1710 displaying a theoretical position (front view 1730 and side view 1740 ) of the tip of a tracked instrument 1750 .
  • the display device 1710 has reference elements 1720 and the tracked instrument 1750 has reference elements 1760 for being tracked by a navigation camera.
  • the front view 1730 of the theoretical position of the tip of the tracked instrument 1750 is shown as a hollow triangle on the right half of the screen.
  • the left half of the screen shows a side view 1740 of the theoretical position of the tip of the tracked instrument 1750 , allowing assessment of theoretical height above the screen of the tracked instrument 1750 .
  • the display device can be used for performing an accuracy check of any shape of tracked instrument tip. Even unconventional tips, such as a hook, can be easily visualized on the screen.
  • the same display screen can be used for an accuracy check of multiple instruments.
  • the screen array is unlikely to be damaged during surgery due to splatter of blood or other smudges, since it is typically much farther from the surgical field compared to tracked instruments.
  • if the surface of the display screen can sense the touch of the instrument tip, the accuracy can be calculated as well, instead of relying on visual assessment alone.
  • using the display device to perform an accuracy check of a tracked instrument can improve the fidelity of the reference element array used for the accuracy check and the consistency of accuracy checks.
  • using the display device to perform an accuracy check of a tracked instrument can improve the accuracy check workflow for instruments without a sharp, straight tip.
  • using the display device to perform an accuracy check of a tracked instrument can allow the user to visually inspect and assess the accuracy.
  • FIG. 20 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on displaying a virtual position of the tracked instrument on a display device.
  • although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a virtual position of a tracked instrument relative to a display device.
  • processing circuitry 3320 displays an indication of the virtual position of the tracked instrument on the display device.
  • the processing circuitry determines an intended shape of the tracked instrument, for example, an accurate and/or undamaged shape of the tracked instrument.
  • Displaying the indication of the virtual position of the tracked instrument includes: displaying, on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and displaying, on a second part of the display device, a second portion of the intended shape of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
  • processing circuitry 3320 receives an indication of an actual position of the tracked instrument relative to the display device.
  • receiving the actual position of the tracked instrument includes receiving an indication from a user.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • performing the action includes, responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use.
  • performing the action includes, responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
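  • The sketch below illustrates one simple way such a re-calibration step could work under stated assumptions: fold the measured error back into the stored tip offset of the instrument, expressed in its reference element frame; the data layout and function name are illustrative, not the system's actual calibration procedure.

```python
# Hedged sketch of a tip-offset re-calibration step.
import numpy as np


def recalibrate_tip_offset(tip_offset_ref, expected_pos, actual_pos,
                           ref_rotation_3x3):
    """tip_offset_ref: nominal tip offset in the instrument's reference
    element frame; expected/actual positions in the tracking frame."""
    error_tracking = np.asarray(actual_pos, float) - np.asarray(expected_pos, float)
    # express the correction in the reference element frame before applying it
    error_ref = np.asarray(ref_rotation_3x3, float).T @ error_tracking
    return np.asarray(tip_offset_ref, float) + error_ref


if __name__ == "__main__":
    new_offset = recalibrate_tip_offset(
        tip_offset_ref=[0.0, 0.0, 150.0],
        expected_pos=[10.0, 20.0, 30.0],
        actual_pos=[10.5, 19.8, 30.2],
        ref_rotation_3x3=np.eye(3))
    print(new_offset)
```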
  • block 2050 may be optional in some embodiments.
  • the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components, or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions, or groups thereof.
  • the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
  • the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • embodiments of the present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, microcode, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.

Abstract

A system configured to perform an accuracy check of a tracked instrument can include processing circuitry and memory coupled to the processing circuitry. The memory can include instructions to cause the system to perform operations. The operations can include determining a virtual position of a display device. The operations can further include determining a virtual position of the tracked instrument. The operations can further include determining a point of contact on the display device between the tracked instrument and the display device. The operations can further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument. The operations can further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. Pat. Application No. 17/662,666, filed May 10, 2022, which is incorporated herein by reference.
  • FIELD
  • The present disclosure relates to medical devices and systems, and more particularly, to checking accuracy and performing automatic calibration of tracked instruments in camera tracking systems used for computer assisted navigation during surgery.
  • BACKGROUND
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, medical imaging devices (e.g., computerized tomography (“CT”) scanners, fluoroscopy imaging, etc.), and surgical robots.
  • A computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient’s anatomy. Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track pose of a reference array on a surgical tool, which is being positioned by a surgeon during surgery, relative to a patient reference array (also “dynamic reference base” (“DRB”)) attached to a patient. The reference arrays allow the camera tracking system to determine a pose of the surgical tool relative to anatomical structure imaged by a medical image of the patient and relative to the patient. The surgeon can thereby use real-time visual feedback of the pose to navigate the surgical tool during a surgical procedure on the patient.
  • Surgical navigation of instruments using reference elements has become a well-established technique in the operating room. FIG. 10 illustrates an example of a trackable instrument 1010. The CAD model of an instrument 1010 is associated with a reference element 1020, so that the CAD model can be overlaid on registered images of patient’s anatomy. To ensure fidelity of the overlay, accuracy of the instrument 1010 needs to be verified prior to use. The accuracy check is typically done via bringing the tip 1040 of the tracked instrument into a divot 1050 associated with another reference element. The divot 1050 is typically a cone-shaped depression ending in an apex.
  • The theoretical position of the tip 1040 is then compared with theoretical position of the divot 1050. Assuming the user has properly positioned the instrument 1010 in the divot 1050, the distance between the two positions determines the accuracy of tracked instrument 1010. If the accuracy check does not pass, that instrument 1010 may not be used.
  • In some examples, a source of inaccuracy during the accuracy check arises due to it being challenging for a user to place an instrument accurately in the divot. The ideal position for a sharp instrument is along the normal from the apex to the base of the cone of the divot. Any deviation from this angle introduces small errors. Furthermore, a bad-acting user may move the position of the instrument to produce a false accuracy number (one that appears more accurate).
  • In additional or alternative examples, a source of inaccuracy during the accuracy check arises due to inaccuracy in tracking of the two reference elements (one associated with the tracked instrument and one associated with the divot). The reference element arrays are typically small in size (e.g., only a few centimeters wide) to minimize obstruction of the surgical area. The number of markers is also usually limited to optimize costs and workflow. A larger array with more markers can improve the accuracy of the divot position.
  • In additional or alternative examples, a source of inaccuracy during the accuracy check arises due to a shape of the instrument tip. Blunt tip instruments may not fit well inside the divot and instruments with angled tips or a hook shape can make it even more difficult to properly place the instrument tip in the divot.
  • In additional or alternative examples, a source of inaccuracy during the accuracy check includes a deformed instrument. In additional or alternative examples, the source of inaccuracy includes a deformed reference element. Note that a slight angular shift in the reference element can result in a very small error for tracking of the reference element, but may result in a much larger error at the instrument tip. In additional or alternative examples, the sources of inaccuracy include inaccuracies in optical markers due to manufacturing defects, smudges, or inaccurate mounting of optical markers on mounting posts. All of these are solvable problems, though. If an instrument can be calibrated at the time of use, the fidelity of tracking can be improved so that the physical tip matches the estimated tip.
  • SUMMARY
  • Some embodiments of the present disclosure are directed to performing an accuracy check and calibrating tracked instruments used in surgical procedures.
  • In some embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a virtual position within a virtual space of a display device. The operations further include determining a virtual position within the virtual space of the tracked instrument. The operations further include determining a point of contact on the display device between the tracked instrument and the display device. The operations further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument. The operations further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • In other embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a first virtual position within a virtual space of an emitter of an imaging device. The operations further include determining a first virtual position within the virtual space of a detector of the imaging device. The operations further include determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector. The operations further include determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument. The operations further include obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector. The operations further include determining a second virtual position within the virtual space of the emitter of the imaging device. The operations further include determining a second virtual position within the virtual space of the detector of the imaging device. The operations further include determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. The operations further include determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument. The operations further include obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image. The operations further include determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
  • In other embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a virtual position within a virtual space of the tracked instrument relative to a display device. The operations further include displaying an indication of the virtual position of the tracked instrument on the display device. The operations further include receiving an indication of an actual position of the tracked instrument relative to the display device. The operations further include determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • Other systems and corresponding methods and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional camera tracking systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
  • DESCRIPTION OF THE DRAWINGS
  • Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and which may further include a surgical robot for robotic assistance according to some embodiments;
  • FIG. 2 illustrates the camera tracking system and the surgical robot positioned relative to a patient according to some embodiments;
  • FIG. 3 further illustrates the camera tracking system and the surgical robot configured according to some embodiments;
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset, a computer platform, imaging devices, and a surgical robot which are configured to operate according to some embodiments;
  • FIG. 5 illustrates a patient reference array (“DRB”) and a surveillance marker;
  • FIGS. 6A-C respectively illustrate a surgical robot with an end-effector, an expanded view of the end-effector, and a surgical tool in accordance with some embodiments;
  • FIGS. 7A-B are schematic diagrams illustrating examples of imaging devices according to some embodiments;
  • FIG. 8 is a block diagram illustrating an example of an imaging system according to some embodiments;
  • FIG. 9 is a block diagram illustrating an example of an accuracy and calibration module according to some embodiments;
  • FIG. 10 is a schematic diagram illustrating an example of a tracked instrument according to some embodiments;
  • FIG. 11 is a schematic diagram illustrating an example of a set of display devices configured to interact with a tracked instrument according to some embodiments;
  • FIG. 12 is a schematic diagram illustrating an example of the set of display devices of FIG. 11 being contacted by a tracked instrument according to some embodiments;
  • FIG. 13 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on contact with a display device according to some embodiments;
  • FIG. 14 is a schematic diagram illustrating an example of a C-arm imaging device according to some embodiments;
  • FIGS. 15A-B are schematic diagrams illustrating images taken of a tracked instrument using the C-arm imaging device at two different positions according to some embodiments;
  • FIG. 16 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on images taken of the tracked instrument according to some embodiments;
  • FIG. 17 is a schematic diagram of a display device configured to show an expected position of a tracked instrument according to some embodiments; and
  • FIGS. 18-20 are flowcharts of operations performed by a system to perform an accuracy check of tracked instruments according to some embodiments.
  • DETAILED DESCRIPTION
  • It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
  • The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.
  • Various embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of candidate markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras. Before describing these embodiments in detail, various components that may be used for performing embodiments in a navigated surgery system are described with reference to FIGS. 1-9 .
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets 150 during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery during a surgical procedure and which may further include a surgical robot 100 for robotic assistance, according to some embodiments. FIG. 2 illustrates the camera tracking system 200 and the surgical robot 100 positioned relative to a patient, according to some embodiments. FIG. 3 further illustrates the camera tracking system 200 and the surgical robot 100 configured according to some embodiments. FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and the surgical robot 100 which are configured to operate according to some embodiments. FIG. 5 illustrates a patient reference array 116 (also “dynamic reference base” (DRB)) and a surveillance marker 500.
  • The XR headset 150 may be configured to augment a real-world scene with computer generated XR images. The XR headset 150 may be configured to provide an augmented reality (“AR”) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user. Alternatively, the XR headset 150 may be configured to provide a virtual reality (“VR”) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer-generated AR images on a display screen. The XR headset 150 can be configured to provide both AR and VR viewing environments. Thus, the term XR headset can refer to an AR headset or a VR headset.
  • Referring to FIGS. 1-5 , the surgical robot 100 may include, for example, one or more robot arms 104, a display 110, an end-effector 112, for example, including a guide tube 114, and an end effector reference array which can include one or more tracking markers. A patient reference array 116 (“DRB”) has a plurality of tracking markers 117 and is secured directly to the patient 210 (e.g., to a bone of the patient 210). A spaced apart surveillance marker 500 (FIG. 5 ) has a single marker 502 connected to a shaft that is secured directly to the patient 210 at a spaced apart location from the patient reference array 116. Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.
  • The camera tracking system 200 includes tracking cameras 204 which may be spaced apart stereo cameras configured with partially overlapping field-of-views. The camera tracking system 200 can have any suitable configuration of arm(s) 202 to move, orient, and support the tracking cameras 204 in a desired location, and may contain at least one processor operable to track location of an individual marker and pose of an array of markers. As used herein, the term “pose” refers to the location (e.g., along 3 orthogonal axes) and/or the rotation angle (e.g., about the 3 orthogonal axes) of markers (e.g., DRB) relative to another marker (e.g., surveillance marker) and/or to a defined coordinate system (e.g., camera coordinate system). A pose may therefore be defined based on only the multidimensional location of the markers relative to another marker and/or relative to the defined coordinate system, based on only the multidimensional rotational angles of the markers relative to the other marker and/or to the defined coordinate system, or based on a combination of the multidimensional location and the multidimensional rotational angles. The term “pose” therefore is used to refer to location, rotational angle, or combination thereof.
  • The tracking cameras 204 may include, e.g., infrared cameras (e.g., bifocal or stereophotogrammetric cameras), operable to identify, for example, active and passive tracking markers for single markers (e.g., surveillance marker 500) and reference arrays which can be formed on or attached to the patient 210 (e.g., patient reference array, DRB), end effector 112 (e.g., end effector reference array), XR headset(s) 150 worn by a surgeon 120 and/or a surgical assistant 126, etc. in a given measurement volume of a camera coordinate system while viewable from the perspective of the tracking cameras 204. The tracking cameras 204 may scan the given measurement volume and detect light that is emitted or reflected from the markers in order to identify and determine locations of individual markers and poses of the reference arrays in three-dimensions. For example, active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (“LEDs”)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking cameras 204 or other suitable device.
  • The XR headsets 150 may each include tracking cameras (e.g., spaced apart stereo cameras) that can track location of a surveillance marker and poses of reference arrays within the XR camera headset field-of-views (“FOVs”) 152 and 154, respectively. Accordingly, as illustrated in FIG. 1 , the location of the surveillance marker and the poses of reference arrays on various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 and/or a FOV 600 of the tracking cameras 204.
  • FIGS. 1-2 illustrate a potential configuration for the placement of the camera tracking system 200 and the surgical robot 100 in an operating room environment. Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 and/or other displays 34, 36, and 110 to display surgical procedure navigation information. The surgical robot 100 is optional during computer-aided navigated surgery.
  • The camera tracking system 200 may operate using tracking information and other information provided by multiple XR headsets 150 such as inertial tracking information and optical tracking information (frames of tracking data). The XR headsets 150 operate to display visual information and may play out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 100 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment. The camera tracking system 200 may track markers in 6 degrees-of-freedom (“6DOF”) relative to three axes of a 3D coordinate system and rotational angles about each axis. The XR headsets 150 may also operate to track hand poses and gestures to enable gesture-based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 may have a 1-10x magnification digital color camera sensor called a digital loupe. In some embodiments, one or more of the XR headsets 150 are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • An “outside-in” machine vision navigation bar supports the tracking cameras 204 and may include a color camera. The machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 while positioned on wearers’ heads. The patient reference array 116 (DRB) is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112, instrument reference array 170, and reference arrays on the XR headsets 150.
  • During a surgical procedure using surgical navigation, the surveillance marker 500 is affixed to the patient to provide information on whether the patient reference array 116 has shifted. For example, during a spinal fusion procedure with planned placement of pedicle screw fixation, two small incisions are made over the posterior superior iliac spine bilaterally. The DRB and the surveillance marker are then affixed to the posterior superior iliac spine bilaterally. If the surveillance marker’s 500 location changes relative to the patient reference array 116, the camera tracking system 200 may display a meter indicating the amount of movement and/or may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation which is off target.
  • When present, the surgical robot (also “robot”) may be positioned near or next to patient 210. The robot 100 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the surgical procedure. The camera tracking system 200 may be separated from the robot system 100 and positioned at the foot of patient 210. This location allows the tracking camera 200 to have a direct visual line of sight to the surgical area 208. In the configuration shown, the surgeon 120 may be positioned across from the robot 100, but is still able to manipulate the end-effector 112 and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. An anesthesiologist 122, nurse or scrub tech can operate equipment which may be connected to display information from the camera tracking system 200 on a display 34.
  • With respect to the other components of the robot 100, the display 110 can be attached to the surgical robot 100 or in a remote location. End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor. In some embodiments, end-effector 112 can comprise a guide tube 114, which is configured to receive and orient a surgical instrument, tool, or implant used to perform a surgical procedure on the patient 210.
  • As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” The term “instrument” is used in a nonlimiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein. Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc. Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument in a desired manner.
  • The surgical robot 100 is operable to control the translation and orientation of the end-effector 112. The robot 100 may move the end-effector 112 under computer control along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis, such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled. In some embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a 6DOF robot arm comprising only rotational axes. For example, the surgical robot 100 may be used to operate on patient 210, and robot arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.
  • In some example embodiments, the XR headsets 150 can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • In some further embodiments, surgical robot 100 can be operable to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory. The surgical robot 100 can be operable to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument. Thus, in use, a surgeon or other user can use the surgical robot 100 as part of computer assisted navigated surgery, and has the option to stop, modify, or manually control the autonomous or semi-autonomous movement of the end-effector 112 and/or the surgical instrument.
  • Reference arrays of markers can be formed on or connected to robot arms 102 and/or 104, the end-effector 112 (e.g., end-effector array 114 in FIG. 2 ), and/or a surgical instrument (e.g., instrument array 170) to track poses in 6DOF along 3 orthogonal axes and rotation about the axes. The reference arrays enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments) to be tracked by the tracking camera 200, and the tracked poses can be used to provide navigated guidance during a surgical procedure and/or used to control movement of the surgical robot 100 for guiding the end-effector 112 and/or an instrument manipulated by the end-effector 112.
  • Referring to FIG. 3 , the surgical robot 100 may include a display 110, upper arm 102, lower arm 104, end-effector 112, vertical column 312, casters 314, a table 318, and ring 324 which uses lights to indicate statuses and other information. Cabinet 106 may house electrical components of surgical robot 100 including, but not limited to, a battery, a power distribution module, a platform interface board module, and a computer. The camera tracking system 200 may include a display 36, tracking cameras 204, arm(s) 202, a computer housed in cabinet 330, and other components.
  • In computer-assisted navigated surgeries, perpendicular 2D scan slices, such as axial, sagittal, and/or coronal views, of patient anatomical structure are displayed to enable user visualization of the patient’s anatomy alongside the relative poses of surgical instruments. An XR headset or other display can be controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy. The 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which isn’t necessarily formed from a scan of the patient.
  • Example Surgical System
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and a surgical robot 100 which are configured to operate according to some embodiments.
  • The imaging devices 420 may include a C-arm imaging device, an O-arm imaging device, and/or a patient image database. The XR headset 150 provides an improved human interface for performing navigated surgical procedures. The XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 400, that include without limitation any one or more of: identification of hand gesture based commands and display of XR graphical objects on a display device 438 of the XR headset 150 and/or another display device. The display device 438 may include a video projector, flat panel display, etc. The user may view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen. The XR headset 150 may additionally or alternatively be configured to display on the display device 438 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 430, a microphone 432, a gesture sensor 434, a pose sensor (e.g., inertial measurement unit (“IMU”)) 436, the display device 438, and a wireless/wired communication interface 440. The cameras 430 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • The cameras 430 may be configured to operate as the gesture sensor 434 by tracking for identification user hand gestures performed within the field of view of the camera(s) 430. Alternatively, the gesture sensor 434 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 434 and/or senses physical contact, e.g., tapping on the sensor 434 or its enclosure. The pose sensor 436, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • As explained above, a surgical system includes the camera tracking system 200 which may be connected to a computer platform 400 for operational processing and which may provide other operational functionality including a navigation controller 404 and/or an XR headset controller 410. The surgical system may include the surgical robot 100. The navigation controller 404 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 200. The navigation controller 404 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 100, where the navigation information is displayed through the display device 438 of the XR headset 150 and/or another display device to indicate where the surgical tool and/or the end effector of the surgical robot 100 should be moved to perform the surgical plan.
  • The electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 400 through the wired/wireless interface 440. The electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 400 or directly connected, to various imaging devices 420, e.g., the C-arm imaging device, the O-arm imaging device, the patient image database, and/or to other medical equipment through the wired/wireless interface 440.
  • The surgical system may include an XR headset controller 410 that may at least partially reside in the XR headset 150, the computer platform 400, and/or in another system component connected via wired cables and/or wireless communication links. Various functionality is provided by software executed by the XR headset controller 410. The XR headset controller 410 is configured to receive information from the camera tracking system 200 and the navigation controller 404, and to generate an XR image based on the information for display on the display device 438.
  • The XR headset controller 410 can be configured to operationally process frames of tracking data from the cameras 430 (tracking cameras), signals from the microphone 432, and/or information from the pose sensor 436 and the gesture sensor 434, to generate information for display as XR images on the display device 438 and/or for display on other display devices for user viewing. Thus, the XR headset controller 410 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user. For example, the XR headset controller 410 may reside within the computer platform 400 which, in turn, may reside within the cabinet 330 of the camera tracking system 200, the cabinet 106 of the surgical robot 100, etc.
  • Turning now to FIGS. 6A-6C, the surgical robot system 100 relies on accurate positioning of the end-effector 112, surgical instruments 608, and/or the patient 210 (e.g., patient reference array 116) relative to the desired surgical area. In the embodiments shown in FIGS. 6A-6C, the reference arrays include tracking markers 118, 804 which are rigidly attached to a portion of the instrument 608 and/or end-effector 112.
  • FIG. 6A depicts part of the surgical robot system 100 with the robot 102 including base 106, robot arm 104, and end-effector 112. The other elements, not illustrated, such as the display, marker tracking cameras, etc., may also be present as described herein. FIG. 6B depicts a close-up view of the end-effector 112 with guide tube 114 and a reference array that includes a plurality of tracking markers 118 rigidly affixed to the end-effector 112. In this embodiment, the plurality of tracking markers 118 are attached to the end-effector 112 configured as a guide tube. FIG. 6C depicts an instrument 608 (in this case, a probe) with a plurality of tracking markers 804 rigidly affixed to the instrument 608. As described elsewhere herein, the instrument 608 could include any suitable surgical instrument, such as, but not limited to, a guide wire, a cannula, a retractor, a drill, a reamer, a screwdriver, an insertion instrument, a removal instrument, or the like.
  • In FIG. 6C, the reference array 612 functions as the handle 620 of the instrument 608. Four markers 804 are attached to the handle 620 in a manner that is out of the way of the shaft 622 and tip 624. Stereophotogrammetric tracking by the tracking camera 200 of these four markers 804 allows the instrument 608 to be tracked as a rigid body and for the system 100 to precisely determine the location of the tip 624 and the orientation of the shaft 622 while the instrument 608 is moved within view of tracking camera 200.
  • To enable automatic tracking of one or more instruments 608, end-effector 112, or other object to be tracked in 3D (e.g., multiple rigid bodies), the markers 118, 804 on each instrument 608, end-effector 112, or the like, may be arranged asymmetrically with a known inter-marker spacing. The reason for asymmetric alignment is so that it is unambiguous which marker 118, 804 corresponds to a particular location on the rigid body and whether markers 118, 804 are being viewed from the front or back, i.e., mirrored. For example, if the markers 118, 804 were arranged in a square on the instrument 608 or end-effector 112, it would be unclear to the system 100, 300, 600 which marker 118, 804 corresponded to which corner of the square. For example, for the instrument 608, it would be unclear which marker 804 was closest to the shaft 622. Thus, it would be unknown which way the shaft 622 was extending from the array 612. Accordingly, each array 612 and thus each instrument 608, end-effector 112, or other object to be tracked should have a unique marker pattern to allow it to be distinguished from other instruments 608 or other objects being tracked.
  • Asymmetry and unique marker patterns allow the tracking camera 200 and system 100 to detect individual markers 118, 804 and then to check the marker spacing against a stored template to determine which instrument 608, end-effector 112, or other object they represent. Detected markers 118, 804 can then be sorted automatically and assigned to each tracked object in the correct order. Without this information, rigid body calculations could not be performed to extract key geometric information, such as the location of the instrument tip 624 and the alignment of the shaft 622, unless the user manually specified which detected marker 118, 804 corresponded to which position on each rigid body.
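  • As an illustration of the rigid body calculation described above, the following sketch (Python with NumPy) matches sorted marker detections against a stored template using a standard Kabsch/Procrustes fit and recovers a tip location and shaft direction. The template coordinates, detected marker positions, and tip offset are hypothetical placeholders, not values from this disclosure.
      import numpy as np

      def fit_rigid_transform(template_pts, detected_pts):
          # Kabsch/Procrustes fit: rotation R and translation t mapping template points to detected points.
          ct, cd = template_pts.mean(axis=0), detected_pts.mean(axis=0)
          H = (template_pts - ct).T @ (detected_pts - cd)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:          # guard against a reflection solution
              Vt[-1, :] *= -1
              R = Vt.T @ U.T
          t = cd - R @ ct
          return R, t

      # Asymmetric marker template (instrument coordinates, mm) and a hypothetical tip offset.
      template = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 35.0, 0.0], [65.0, 55.0, 0.0]])
      tip_offset = np.array([120.0, 0.0, -10.0])

      # Detected marker centroids in camera coordinates, already sorted to template order
      # by comparing inter-marker distances against the stored template.
      detected = np.array([[210.0, 11.0, 900.0], [259.0, 17.0, 905.0],
                           [207.0, 46.0, 902.0], [272.0, 70.0, 908.0]])

      R, t = fit_rigid_transform(template, detected)
      tip_in_camera = R @ tip_offset + t                 # tip expressed in camera coordinates
      shaft_dir = R @ np.array([1.0, 0.0, 0.0])          # shaft axis expressed in camera coordinates
      print(tip_in_camera, shaft_dir)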
  • FIGS. 7A-B illustrate medical imaging systems 1304 that may be used in conjunction with robot system 100 and/or navigation systems to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210. Any appropriate subject matter may be imaged for any appropriate procedure using the imaging system 1304. The imaging system 1304 may be any imaging device such as a C-arm 1308 device, an O-arm 1306 device, a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for frequent manual repositioning of patient 210 which may be required in an x-ray system. As illustrated in FIG. 7A, the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape. C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316. The space within the C-arm 1308 may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318. As illustrated in FIG. 7B, the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328, such as a wheeled mobile cart 1330 with wheels 1332, which may enclose an image capturing portion, not illustrated. The image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion. The image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition. The image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes. Although certain imaging systems 1304 are exemplified herein, it will be appreciated that any suitable imaging system may be selected by one of ordinary skill in the art.
  • FIG. 8 illustrates a block diagram of components of a medical imaging system configured in accordance with some embodiments of the present disclosure. The medical imaging system includes a controller 3200, an imaging arm 3240 (e.g., a C-arm or an O-arm), and a linear actuator and/or rotary actuator 3250 connected to an X-ray beam emitter or collector 3260. The controller 3200 includes an image processor 3210, a general processor 3220, and an I/O interface 3230. The image processor 3210 performs image processing to combine sets of images to generate a three-dimensional image of the scanned volume. The general processor 3220 is used to perform various embodiments of the present disclosure. The I/O interface 3230 communicatively couples the controller 3200 to other components of the medical imaging system. The imaging arm 3240 includes motors 3245 used to move the collector and emitter along an arc, e.g., three hundred and sixty degrees, during image acquisition. The motors 3245 are controlled by the controller 3200. The controller 3200 can also control movement of the linear actuator and/or rotary actuator 3250.
  • FIG. 9 illustrates an example of an accuracy and calibration module 3300. The accuracy and calibration module 3300 can include an interface 3310, processing circuitry 3320, and a memory 3330. In some examples, the accuracy and calibration module 3300 is part of a system (e.g., an imaging system or a camera tracking system). The memory 3330 can include instructions stored therein that are executable by the processing circuitry 3320 to perform operations according to some embodiments herein.
  • Embodiments that include performing an accuracy check and/or calibrating of a tracked instrument based on contact with a touch sensor (e.g., a touchscreen of a display device) are described below.
  • In some embodiments, multiple points of contact (e.g., touch positions from the tip of a tracked instrument) can be detected by one or more touchpads that are themselves tracked by a navigation camera. The instruments and the pressure touchpads can each have associated reference elements that are tracked by the navigation camera. In some examples, the touchpads are sensitive to pressure, capacitance, or resistance.
  • FIG. 11 illustrates an example of a set of touchpads 1110 coupled together to create an opening for accepting a tip of the tracked instrument. The associated reference element 1120 is coupled to the touchpads. In this example, the touchpads and reference arrays are securely housed in a supporting structure 1130 to reduce movement.
  • The touchpads 1110 can capture the location of pressure points. Resistive touchpads are especially useful, since they do not rely on the capacitance of the object. When an instrument is brought into the wedge-shaped opening, it touches at least two points on the touchpads 1110. The touchpads 1110 then send the locations of the sensed points to the system. The system also receives the position and pose of the touchpads and instruments via their associated reference elements 1120. Thus, the system can calculate the theoretical position of the tip of the instrument under test. It can then compare the tip location to the locations reported by the three touchpads 1110.
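  • A minimal sketch of this comparison, assuming the tracking camera reports 4x4 homogeneous poses for the touchpad structure and the instrument; the poses, tip offset, and reported contact points below are placeholder values, not values from this disclosure.
      import numpy as np

      def to_frame(T_world_frame, p_world):
          # Express a world-space point in the local frame described by T_world_frame.
          return (np.linalg.inv(T_world_frame) @ np.append(p_world, 1.0))[:3]

      T_cam_pads = np.eye(4)                                   # pose of the touchpad reference element
      T_cam_instr = np.eye(4); T_cam_instr[:3, 3] = [12.0, 8.0, 150.0]
      tip_in_instr = np.array([0.0, 0.0, 180.0])               # calibrated tip offset in instrument coordinates

      # Theoretical tip location expressed in the touchpad frame.
      tip_world = (T_cam_instr @ np.append(tip_in_instr, 1.0))[:3]
      tip_in_pads = to_frame(T_cam_pads, tip_world)

      # Contact locations reported by the touchpads, mapped into the same frame (placeholders).
      reported = np.array([[12.1, 8.2, 329.7], [11.8, 8.0, 330.2]])
      errors = np.linalg.norm(reported - tip_in_pads, axis=1)
      print("per-point error (mm):", errors)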
  • Typically, the bottom touchpad would report the position of a sharp or semi-sharp instrument tip. For a broader instrument, such as an osteotome, there will be multiple touch-points on the bottom touchpad while the side touchpads will report straight lines of touch-points. The approximate position of the CAD model with respect to the touchpads is already known to the system based on the tracking information reported by the camera. Thus, the accuracy of the physical model can be calculated.
  • FIG. 12 illustrates an example of a tip of a tracked instrument 1240 contacting the touchpads 1110. The wedge shape of the opening between the touchpads 1110 allows an accuracy check of instruments with tips that are too big to fit in a typical divot used in navigation arrays.
  • FIG. 13 illustrates an example of operations to perform an accuracy check and calibrate a tracked instrument based on contact between the tracked instrument and the touchpads 1110. To calibrate an instrument, the reported touchpad points are compared against the theoretical model. First, the user touches the instrument tip to all three touchpads in a way that the reference elements of both the instrument and the touchpad structure are visible to the tracking camera. The theoretical position of the instrument tip with respect to the touchpads is then calculated. This serves as the initial position estimate of the instrument tip. Since the relative position of the three touchpads is known, the theoretical touchpoints of the CAD model for each touchpad are then calculated. The optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the theoretical touchpoints and the actual ones as shown in the algorithm below.
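  • The referenced algorithm is not reproduced here; the following is one possible formulation of the pose refinement step, sketched in Python with SciPy, in which a small rotation and translation applied to the CAD model's predicted touchpoints is optimized so that they match the measured touchpoints. All touchpoint coordinates are illustrative.
      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      model_touchpoints = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 1.5], [-3.0, 0.0, 1.5]])      # predicted by the CAD model
      measured_touchpoints = np.array([[0.3, 0.1, 0.2], [3.2, 0.0, 1.8], [-2.8, 0.2, 1.6]])   # reported by the touchpads

      def residuals(params):
          # params[:3] is a rotation vector, params[3:] a translation, applied to the CAD prediction.
          adjusted = Rotation.from_rotvec(params[:3]).apply(model_touchpoints) + params[3:]
          return (adjusted - measured_touchpoints).ravel()

      result = least_squares(residuals, x0=np.zeros(6))
      print("pose correction (rotvec, trans):", result.x)
      print("residual RMS (mm):", np.sqrt(np.mean(result.fun ** 2)))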
  • In some embodiments, these operations improve accuracy checks for instruments without a sharp tip or instruments that are too wide to fit in a traditional divot. In additional or alternative embodiments, these operations allow re-calibration or correction of the theoretical instrument tip location based on actual measurements.
  • FIG. 18 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a point of contact between the tracked instrument and a touch sensor. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 1810, processing circuitry 3320 determines a virtual position of the touch sensor. In some examples, the term virtual position is used herein to describe a virtual location and a virtual pose of an object. In some embodiments, the system includes a camera. Determining the virtual position of the touch sensor includes: determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor; capturing, via the camera, an image of the reference element coupled to the touch sensor; determining a virtual position of the reference element coupled to the touch sensor relative to a dynamic reference base (“DRB”) based on the image of the reference element coupled to the touch sensor; and determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor.
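  • A minimal sketch of the transform chain in block 1810, assuming the tracking camera reports 4x4 poses for the DRB and for the reference element coupled to the touch sensor, and that the sensor geometry relative to its reference element is known; all numeric values are placeholders.
      import numpy as np

      T_cam_drb = np.eye(4)                                                 # DRB pose in camera coordinates
      T_cam_ref = np.eye(4); T_cam_ref[:3, 3] = [100.0, 20.0, 400.0]        # touch sensor reference element pose
      T_ref_sensor = np.eye(4); T_ref_sensor[:3, 3] = [0.0, -15.0, 5.0]     # sensor origin w.r.t. its reference element

      # Virtual position of the reference element relative to the DRB, then of the touch sensor itself.
      T_drb_ref = np.linalg.inv(T_cam_drb) @ T_cam_ref
      T_drb_sensor = T_drb_ref @ T_ref_sensor
      print(T_drb_sensor[:3, 3])                                            # sensor origin expressed in DRB coordinates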
  • At block 1820, processing circuitry 3320 determines a virtual position of the tracked instrument. In some embodiments, the virtual position of the touch sensor and the virtual position of the tracked instrument are within the same virtual space (e.g., relative to a common reference point).
  • In additional or alternative embodiments, the system includes a camera. Determining the virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument; and determining the virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument.
  • At block 1830, processing circuitry 3320 determines a point of contact on a touch sensor between the tracked instrument and the touch sensor. In some embodiments, the system includes the touch sensor and the touch sensor includes a touchscreen (e.g., a pressure sensitive, resistance sensitive, or capacitance sensitive touchscreen). In some examples, the touch sensor is part of a display device. Determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
  • In additional or alternative embodiments, the touch sensor includes a plurality of touch sensors coupled together to form an opening. Determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
  • At block 1840, processing circuitry 3320 determines an expected point of contact on the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • In some embodiments, information about the shape of the tracked instrument is determined and the information includes an intended position of a tip of the tracked instrument relative to a reference element coupled to the tracked instrument. Determining the point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor. Determining the expected point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
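  • One plausible way to compute the expected point of contact of block 1840, sketched under the assumption that the touch sensor is modeled as a plane in the shared virtual space and that the instrument axis through the virtual tip is intersected with that plane; all values are placeholders.
      import numpy as np

      plane_point = np.array([0.0, 0.0, 0.0])        # a point on the touch sensor surface (virtual space)
      plane_normal = np.array([0.0, 0.0, 1.0])       # touch sensor surface normal

      tip = np.array([4.0, 6.0, 2.5])                # virtual tip position of the tracked instrument
      shaft_dir = np.array([0.0, 0.2, -1.0])         # virtual shaft direction, pointing toward the sensor
      shaft_dir = shaft_dir / np.linalg.norm(shaft_dir)

      # Ray/plane intersection gives the expected contact point on the sensor surface.
      s = np.dot(plane_point - tip, plane_normal) / np.dot(shaft_dir, plane_normal)
      expected_contact = tip + s * shaft_dir
      print(expected_contact)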
  • At block 1850, processing circuitry 3320 displays an indication of the expected point of contact. In some embodiments, the system includes a display device that includes the touch sensor. Determining the point of contact on the touch sensor between the tracked instrument and the touch sensor includes receiving an indication of the point of contact on the touch sensor from a user in response to displaying the indication of the expected point of contact.
  • At block 1860, processing circuitry 3320 determines whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • In some embodiments, determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor. Determining the expected point of contact on the touch sensor includes determining a plurality of expected points of contact between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument. Determining whether the tracked instrument is accurate includes determining whether the tracked instrument is accurate based on a difference between the plurality of points of contact and the plurality of expected points of contact.
  • At block 1870, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • In some embodiments, determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
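  • A minimal sketch of the decision in blocks 1860-1870, assuming a millimeter error metric; the threshold value and the calibration call are hypothetical placeholders, not values or interfaces from this disclosure.
      import numpy as np

      ACCURACY_THRESHOLD_MM = 2.0        # illustrative threshold only

      def check_and_act(contact, expected_contact, tracking_system=None):
          # Compare the measured and expected contact points and act on the result.
          difference = float(np.linalg.norm(np.asarray(contact) - np.asarray(expected_contact)))
          if difference <= ACCURACY_THRESHOLD_MM:
              return {"accurate": True, "difference_mm": difference}
          if tracking_system is not None:
              tracking_system.apply_tip_correction(contact, expected_contact)   # hypothetical calibration API
          return {"accurate": False, "difference_mm": difference,
                  "message": "tracked instrument not suitable for use without recalibration"}

      print(check_and_act([4.0, 6.5, 0.0], [4.3, 6.4, 0.0]))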
  • Various operations of FIG. 18 may be optional. For example, blocks 1850 and 1870 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on an image taken by a tracked imaging device are described below.
  • In some embodiments, multiple x-ray views of one or more tracked instruments are taken with a fluoroscope that is tracked by a navigation camera using an attached registration fixture. Such registration fixtures are commonly used for surgical navigation using fluoroscopy.
  • FIG. 14 illustrates an example of an imaging device 1410 including an x-ray emitter 1420 and an x-ray detector 1430. The registration fixture 1440 is coupled to a predetermined portion of the imaging device 1410.
  • The registration fixture 1440 typically includes fiducials in two planes at known positions. These fiducials are then detected in the captured x-ray images. Using the known positions, the relative position of the emitter 1420 is then computed. The position of the detector 1430 is tracked using the attached registration fixture 1440 via the navigation camera. When an instrument tracked with a reference element is brought between the emitter and detector, its relative position with respect to the registration fixture 1440 is calculated.
  • The CAD model of the associated instrument tip can then be projected on the fluoroscopy image to achieve navigation. Since the registration fixture can move after the x-ray image is captured, often a different reference element, called a DRB, is rigidly attached to the patient, so that all tracked positions are relative to the fixed DRB.
  • Since the rendered position of an instrument is only in 2D, at least two views, roughly orthogonal to each other, are used to track the instrument and obtain pseudo-3D navigation.
  • FIGS. 15A-B illustrate an example in which a wedge-shaped tracked instrument is placed between the emitter 1420 and detector 1430, such that its views are captured by the fluoroscope in two positions. The corresponding images 1570a-b below the fluoroscope show the instrument profile at different angles. Note that most instruments are solid and are made of metal, which absorbs most x-rays and shows up dark on an x-ray image.
  • Since the theoretical position of the tip of the instrument 1550 is known via the attached reference element 1560, the actual projection can be compared to the theoretical projection by detecting the dark instrument shape in the bright image. Thus, the accuracy can be calculated without needing a divot.
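  • A minimal sketch of this comparison: segment the dark instrument silhouette in a bright x-ray image by thresholding and compare its tip-most pixel to the theoretically projected tip pixel. The synthetic image, threshold, and projected tip coordinates are placeholders, and the tip is assumed to point toward increasing row numbers.
      import numpy as np

      image = np.full((480, 640), 230, dtype=np.uint8)      # bright background
      image[100:300, 318:322] = 20                          # dark, roughly vertical instrument shaft
      projected_tip_px = np.array([301.0, 320.0])           # theoretical tip (row, col) from tracking

      mask = image < 100                                    # dark pixels = instrument silhouette
      rows, cols = np.nonzero(mask)
      tip_row = rows.max()                                  # tip-most extent of the silhouette
      tip_col = cols[rows == tip_row].mean()
      detected_tip_px = np.array([float(tip_row), float(tip_col)])

      pixel_error = np.linalg.norm(detected_tip_px - projected_tip_px)
      print("tip projection error (pixels):", pixel_error)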
  • If multiple instruments can be placed within the field of view of the x-ray image, accuracy of all of them can be calculated simultaneously.
  • FIG. 16 illustrates an example of operations for performing an accuracy check and/or calibrating a tracked instrument using images of the tracked instrument. The x-ray views of an instrument are obtained as described above. The theoretical position of the instrument tip projected in the views is then calculated. This serves as the initial position estimate of the instrument tip. Using the projection matrix, the theoretical view of the CAD model in each x-ray is then calculated. The optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the CAD view and the actual image as shown in the algorithm below.
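  • The referenced algorithm is not reproduced here; the following sketches one possible multi-view refinement in which the CAD model pose is adjusted so that its landmark points, projected through each view's 3x4 projection matrix, match the points detected in the x-ray images. The projection matrices, landmark points, and simulated detections are illustrative placeholders.
      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      def project(P, pts3d):
          # Project Nx3 points through a 3x4 projection matrix to Nx2 pixel coordinates.
          homog = np.hstack([pts3d, np.ones((len(pts3d), 1))])
          uvw = (P @ homog.T).T
          return uvw[:, :2] / uvw[:, 2:3]

      # Two roughly orthogonal views (placeholder intrinsics/extrinsics).
      K = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
      P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
      R2 = Rotation.from_euler("y", 90, degrees=True).as_matrix()
      P2 = K @ np.hstack([R2, np.array([[-500.0], [0.0], [500.0]])])

      cad_landmarks = np.array([[0.0, 0.0, 500.0], [0.0, 0.0, 540.0], [10.0, 0.0, 520.0]])
      true_offset = np.array([1.0, -0.5, 2.0])              # simulated calibration error
      detected_v1 = project(P1, cad_landmarks + true_offset)
      detected_v2 = project(P2, cad_landmarks + true_offset)

      def residuals(params):
          # params[:3] is a rotation vector, params[3:] a translation, applied to the CAD model.
          moved = Rotation.from_rotvec(params[:3]).apply(cad_landmarks) + params[3:]
          return np.concatenate([(project(P1, moved) - detected_v1).ravel(),
                                 (project(P2, moved) - detected_v2).ravel()])

      fit = least_squares(residuals, x0=np.zeros(6))
      print("recovered translation correction (mm):", fit.x[3:])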
  • In some examples, this is the same problem as matching a CT scan to multiple fluoroscopy images in CT-fluoro registration, except in this case a CAD model is used instead of a CT scan to compute a digitally reconstructed radiograph (“DRR”).
  • In some embodiments, these operations do not rely on a sharp tipped instrument fitting snugly in a divot, and can be used for accuracy checks of all types of instrument tips.
  • In additional or alternative embodiments, these operations improve accuracy checks for instruments without a sharp or straight tip.
  • In additional or alternative embodiments, these operations allow re-calibration or correction of theoretical instrument tip location based on actual measurements.
  • In additional or alternative embodiments, these operations enable accuracy checks and re-calibration of multiple instruments simultaneously.
  • FIG. 19 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a pair of images taken by an imaging device. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 1910, processing circuitry 3320 determines a first virtual position of an emitter. In some embodiments, the system includes a tracking camera and an imaging device including the emitter and a detector. Determining the first virtual position of the emitter includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a dynamic reference base (“DRB”)) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device. In additional or alternative embodiments, the virtual position of the emitter is determined based on predetermined information indicating a position of the emitter relative to the detector and a virtual position of the detector.
  • At block 1915, processing circuitry 3320 determines a first virtual position of a detector. In some embodiments, the system includes a tracking camera and an imaging device including the emitter and the detector. Determining the first virtual position of the detector includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a DRB) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • At block 1920, processing circuitry 3320 determines a first virtual position of a tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector. In some embodiments, the system includes a tracking camera. Determining the first virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument (e.g., relative to the DRB) based on the image of the reference element coupled to the tracked instrument; and determining the first virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument.
  • In additional or alternative embodiments, determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument.
  • At block 1925, processing circuitry 3320 determines a first expected image of the tracked instrument. In some embodiments, the first expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the first virtual position of the emitter, the first virtual position of the detector, the first virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
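  • A minimal sketch of block 1925 under simplifying assumptions: the expected image is approximated by casting rays from the virtual emitter through sample points of the instrument's predetermined shape onto the virtual detector plane. The geometry, pixel pitch, and sample points are placeholders.
      import numpy as np

      emitter = np.array([0.0, 0.0, 0.0])                    # virtual emitter position
      detector_origin = np.array([-150.0, -150.0, 1000.0])   # corner of the virtual detector plane
      u_axis = np.array([1.0, 0.0, 0.0])                     # detector row direction
      v_axis = np.array([0.0, 1.0, 0.0])                     # detector column direction
      normal = np.cross(u_axis, v_axis)
      pixel_pitch = 0.5                                      # mm per pixel

      instrument_pts = np.array([[0.0, 0.0, 500.0], [5.0, 0.0, 520.0], [0.0, 8.0, 540.0]])

      def project_to_detector(p):
          # Intersect the emitter-to-point ray with the detector plane and convert to pixels.
          ray = p - emitter
          s = np.dot(detector_origin - emitter, normal) / np.dot(ray, normal)
          hit = emitter + s * ray
          return np.array([np.dot(hit - detector_origin, u_axis),
                           np.dot(hit - detector_origin, v_axis)]) / pixel_pitch

      expected_pixels = np.array([project_to_detector(p) for p in instrument_pts])
      print(expected_pixels)                                 # expected silhouette sample points (pixels)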
  • At block 1930, processing circuitry 3320 obtains a first image of the tracked instrument. In some embodiments, obtaining the first image of the tracked instrument includes receiving the first image from the imaging device.
  • At block 1935, processing circuitry 3320 rotates the imaging device (including the emitter and the detector). In some examples, the imaging device includes a C-arm or an O-arm imaging device.
  • At block 1940, processing circuitry 3320 determines a second virtual position of the emitter. In some embodiments, determining the second virtual position of the emitter includes receiving the second virtual position from a tracking system.
  • At block 1945, processing circuitry 3320 determines a second virtual position of the detector. In some embodiments, determining the second virtual position of the detector includes receiving the second virtual position from a tracking system.
  • At block 1950, processing circuitry 3320 determines a second virtual position of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. In some embodiments, determining the second virtual position of the tracked instrument includes receiving the second virtual position from a tracking system.
  • In additional or alternative embodiments, the first virtual position of the tracked instrument is the second virtual position of the tracked instrument. For example, the imaging device can include at least one of a C-arm and an O-arm and, responsive to obtaining the first image, the imaging device can be rotated (block 1935) such that the second virtual position of the emitter is different than the first virtual position of the emitter and the second virtual position of the detector is different than the first virtual position of the detector. As a result, an image of the tracked instrument from a different perspective can be taken without moving the tracked instrument.
  • In additional or alternative embodiments, the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument. The first virtual position of the emitter is the second virtual position of the emitter, and the first virtual position of the detector is the second virtual position of the detector. For example, without rotating the imaging device, an image of the tracked instrument can be taken from a different perspective by moving the tracked instrument.
  • At block 1955, processing circuitry 3320 determines a second expected image of the tracked instrument. In some embodiments, the second expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the second virtual position of the emitter, the second virtual position of the detector, the second virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • At block 1960, processing circuitry 3320 obtains a second image of the tracked instrument. In some embodiments, obtaining the second image of the tracked instrument includes receiving the second image from the imaging device.
  • At block 1965, processing circuitry 3320 determines whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image. In some embodiments, the first expected image, the second expected image, the first image, and the second image each include an image of the tip of the tracked instrument.
  • At block 1970, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate. In some embodiments, determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
  • Various operations of FIG. 19 may be optional. For example, blocks 1935, 1940, 1945, and 1970 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on comparison of an actual position with an expected position on a display device are described below.
  • In some embodiments, a display screen is available to show tracked instruments. In some examples, the display screen is near the surgical area and is already covered with a sterile drape. The screen may be large (e.g., 22 inches or larger). A reference element can be coupled to the display screen to allow it to be tracked by a navigation camera. A large reference element array can yield improved tracking accuracy and, due to the large physical size, more than four optical markers can be used in some examples to improve the fidelity of tracking.
  • In additional or alternative embodiments, when a user brings a navigated instrument near the display screen, its position with respect to the reference element on the display screen is calculated. The theoretical position of the tracked tip of the instrument CAD model is then shown on the display screen. The user can visually compare the physical position of the instrument tip with the position displayed on the screen. With the aid of a virtual measurement tool, the user can then assess the accuracy.
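  • A minimal sketch of this display-based check, assuming the tracking camera reports 4x4 poses for the screen's reference element and for the instrument, and that the screen's pixel pitch is known; all numeric values are placeholders.
      import numpy as np

      T_cam_screen = np.eye(4)                                 # pose of the screen reference element
      T_cam_instr = np.eye(4); T_cam_instr[:3, 3] = [120.0, 80.0, 600.0]
      tip_in_instr = np.array([0.0, 0.0, 200.0])               # calibrated tip offset in instrument coordinates
      mm_per_pixel = 0.25                                      # placeholder pixel pitch for a large screen

      tip_cam = (T_cam_instr @ np.append(tip_in_instr, 1.0))[:3]
      tip_screen = (np.linalg.inv(T_cam_screen) @ np.append(tip_cam, 1.0))[:3]

      front_view_px = tip_screen[:2] / mm_per_pixel            # where to draw the theoretical tip marker
      height_above_screen_mm = tip_screen[2]                   # shown in the side view
      print(front_view_px, height_above_screen_mm)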
  • FIG. 17 illustrates an example of a display device 1710 displaying a theoretical position (front view 1730 and side view 1740) of the tip of a tracked instrument 1750. The display device 1710 has reference elements 1720 and the tracked instrument 1750 has reference elements 1760 for being tracked by a navigation camera.
  • In this example, the front view 1730 of the theoretical position of the tip of the tracked instrument 1750 is shown as a hollow triangle on the right half of the screen. The left half of the screen shows a side view 1740 of the theoretical position of the tip of the tracked instrument 1750, allowing assessment of the theoretical height of the tracked instrument 1750 above the screen.
  • In some embodiments, the display device can be used for performing an accuracy check of any shape of tracked instrument tip. Even unconventional tips, such as a hook, can be easily visualized on the screen.
  • In additional or alternative embodiments, the same display screen can be used for an accuracy check of multiple instruments. In additional or alternative embodiments, the screen array is unlikely to be damaged during surgery due to splatter of blood or other smudges, since it is typically much farther from the surgical field compared to tracked instruments.
  • In additional or alternative embodiments, if the surface of the display screen can sense the touch of the instrument tip, the accuracy can be calculated directly instead of relying on visual assessment.
  • In some embodiments, using the display device to perform an accuracy check of a tracked instrument can improve the fidelity of the reference element array used for the accuracy check and the consistency of accuracy checks.
  • In additional or alternative embodiments, using the display device to perform an accuracy check of a tracked instrument can improve accuracy check workflow for instruments without a sharp, straight tip.
  • In additional or alternative embodiments, using the display device to perform an accuracy check of a tracked instrument can allow the user to visually inspect and assess accuracy.
  • FIG. 20 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on displaying a virtual position of the tracked instrument on a display device. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 2010, processing circuitry 3320 determines a virtual position of a tracked instrument relative to a display device.
  • At block 2020, processing circuitry 3320 displays an indication of the virtual position of the tracked instrument on the display device. In some embodiments, the processing circuitry determines an intended shape of the tracked instrument (e.g., an accurate and/or undamaged shape of the tracked instrument). Displaying the indication of the virtual position of the tracked instrument includes: displaying, on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and displaying, on a second part of the display device, a second portion of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
  • At block 2030, processing circuitry 3320 receives an indication of an actual position of the tracked instrument relative to the display device. In some embodiments, receiving the actual position of the tracked instrument includes receiving an indication from a user.
  • At block 2040, processing circuitry 3320 determines whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • At block 2050, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate. In some embodiments, performing the action includes, responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use. In additional or alternative embodiments, performing the action includes, responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
  • Various operations of FIG. 20 may be optional. For example, block 2050 may be optional in some embodiments.
  • Further Definitions and Embodiments:
  • In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
  • It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
  • As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but does not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, microcode, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
  • It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
  • Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims (20)

What is claimed is:
1. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a virtual position within a virtual space of a touch sensor;
determining a virtual position within the virtual space of the tracked instrument;
determining a point of contact on the touch sensor between the tracked instrument and the touch sensor;
determining an expected point of contact on the touch sensor between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument; and
determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
2. The method of claim 1, wherein determining the virtual position of the touch sensor includes:
determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor;
capturing an image of the reference element coupled to the touch sensor;
determining a virtual position of the reference element coupled to the touch sensor based on the image of the reference element coupled to the touch sensor, the virtual position of the reference element coupled to the touch sensor including a virtual location and a virtual pose of the reference element coupled to the touch sensor; and
determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor, the virtual position of the touch sensor including a virtual location and a virtual pose of the touch sensor, and
wherein determining the virtual position of the tracked instrument includes:
determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument;
capturing an image of the reference element coupled to the tracked instrument;
determining a virtual position of the reference element coupled to the tracked instrument based on the image of the reference element coupled to the tracked instrument, the virtual position of the reference element coupled to the tracked instrument including a virtual location and a virtual pose of the reference element coupled to the tracked instrument; and
determining the virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument, the virtual position of the tracked instrument including a virtual location and a virtual pose of the tracked instrument.
3. The method of claim 2, wherein determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument,
wherein determining the point of contact on the touch sensor includes determining a point of contact between the tip of the tracked instrument and the touch sensor, and
wherein determining the expected point of contact on the touch sensor includes determining a point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
4. The method of claim 1, wherein the touch sensor includes a touchscreen, and
wherein determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
5. The method of claim 4, wherein the touch sensor includes a plurality of touch sensors coupled together to form an opening, and
wherein determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
6. The method of claim 1, wherein determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor,
wherein determining the expected point of contact on the touch sensor includes determining a plurality of expected points of contact between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument, and
wherein determining whether the tracked instrument is accurate includes determining whether the tracked instrument is accurate based on a difference between the plurality of points of contact and the plurality of expected points of contact.
7. The method of claim 1, wherein determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold,
the method further comprising:
outputting an indication that the tracked instrument is not suitable for use.
8. The method of claim 1, wherein determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold,
the method further comprising:
calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
9. The method of claim 1, wherein a display device includes the touch sensor,
the method further comprising:
displaying, via the display device, an indication of the expected point of contact,
wherein determining the point of contact on the display device between the tracked instrument and the display device includes receiving an indication of the point of contact on the display device from a user.
10. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a first virtual position within a virtual space of an emitter of an imaging device;
determining a first virtual position within the virtual space of a detector of the imaging device;
determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector;
determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument;
obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector;
determining a second virtual position within the virtual space of the emitter of the imaging device;
determining a second virtual position within the virtual space of the detector of the imaging device;
determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector;
determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument;
obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image; and
determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
11. The method of claim 10, wherein the imaging device includes the emitter and the detector,
wherein determining the first virtual position of the detector includes:
capturing an image of a reference element coupled to the imaging device;
determining a virtual position of the reference element coupled to the imaging device based on the image of the reference element coupled to the imaging device, the virtual position of the reference element coupled to the imaging device including a virtual location and a virtual pose of the reference element coupled to the imaging device; and
determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device, the virtual position of the detector including a virtual location and a virtual pose of the detector,
wherein determining the first virtual position of the emitter includes:
determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the detector and the virtual position of the detector, the virtual position of the emitter including a virtual location and a virtual pose of the emitter, and
wherein determining the first virtual position of the tracked instrument includes:
determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument;
capturing an image of the reference element coupled to the tracked instrument;
determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument, the virtual position of the reference element coupled to the tracked instrument including a virtual location and a virtual pose of the reference element coupled to the tracked instrument; and
determining the first virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument, the virtual position of the tracked instrument including a virtual location and a virtual pose of the tracked instrument.
12. The method of claim 11, wherein determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument, and
wherein the first expected image, the second expected image, the first image, and the second image each include an image of the tip.
13. The method of claim 11, wherein the first virtual position of the tracked instrument is the second virtual position of the tracked instrument, and
wherein the imaging device includes at least one of a C-arm and an O-arm,
the method further comprising:
responsive to obtaining the first image, rotating the imaging device such that the second virtual position of the emitter is different than the first virtual position of the emitter and that the second virtual position of the detector is different than the first virtual position of the detector.
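In claim 13 the instrument stays put and only the imaging device moves, so the second emitter and detector positions can be modeled as the first positions rotated about the gantry axis. A brief sketch of that rotation is given below; the gantry frame, the 90-degree angle, and the numeric poses are illustrative assumptions.

import numpy as np

def rotate_about_gantry_axis(world_T_part, world_T_gantry, angle_deg):
    # Re-express a part's pose after the gantry rotates by angle_deg
    # about its own z axis: map into the gantry frame, rotate, map back.
    c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
    rz = np.eye(4)
    rz[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    return world_T_gantry @ rz @ np.linalg.inv(world_T_gantry) @ world_T_part

# Second-shot emitter and detector poses differ from the first shot even
# though the tracked instrument has not moved (all values illustrative).
world_T_gantry = np.eye(4)
world_T_detector = np.eye(4)
world_T_detector[:3, 3] = (0.0, -500.0, 0.0)
world_T_emitter = np.eye(4)
world_T_emitter[:3, 3] = (0.0, 500.0, 0.0)
world_T_detector_2 = rotate_about_gantry_axis(world_T_detector, world_T_gantry, 90.0)
world_T_emitter_2 = rotate_about_gantry_axis(world_T_emitter, world_T_gantry, 90.0)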
14. The method of claim 10, wherein the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument,
wherein the first virtual position of the emitter is the second virtual position of the emitter, and
wherein the first virtual position of the detector is the second virtual position of the detector.
15. The method of claim 10, wherein determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and the first image, and/or a difference between the second expected image and the second image, exceeds a predetermined threshold,
the method further comprising:
outputting an indication that the tracked instrument is not suitable for use.
16. The method of claim 10, wherein determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and the first image, and/or a difference between the second expected image and the second image, exceeds a predetermined threshold,
the method further comprising:
calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
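Claim 16 uses the same image pairs to recalibrate rather than merely to reject the instrument. One possible, purely illustrative way to turn the observed discrepancy into a correction of the stored tip offset is sketched below; the pixel pitch, magnification factor, and numeric coordinates are assumptions, not the claimed implementation.

import numpy as np

def tip_correction_mm(expected_uv, measured_uv, pixel_pitch_mm, magnification):
    # Convert the in-image discrepancy between where the tracked tip was
    # expected and where it was detected into an in-plane tip-offset
    # correction in millimetres; magnification is the source-to-detector
    # distance divided by the source-to-tip distance for that shot.
    delta_px = np.asarray(measured_uv, float) - np.asarray(expected_uv, float)
    return delta_px * pixel_pitch_mm / magnification

# One correction per shot; combining the corrections from two differently
# oriented shots resolves the tip offset along all three axes.
correction_shot_1 = tip_correction_mm((512.0, 400.0), (518.0, 405.0), 0.2, 2.0)
correction_shot_2 = tip_correction_mm((600.0, 300.0), (600.0, 309.0), 0.2, 2.0)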
17. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a virtual position within a virtual space of the tracked instrument relative to a display device;
displaying an indication of the virtual position of the tracked instrument on the display device;
receiving an indication of an actual position of the tracked instrument relative to the display device; and
determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
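Claim 17 performs the check against a display device instead of a radiographic image: the system draws where it believes the instrument is, and the indication of the actual position may come from, for example, the tip contacting a touchscreen or a user marking the contact point. A minimal sketch of the comparison follows; the display resolution, tolerance, and coordinates are illustrative assumptions.

import numpy as np

def display_check(virtual_tip_px, actual_tip_px, display_dpi=96.0, tolerance_mm=2.0):
    # Compare where the navigation system drew the tracked tip on the
    # display with where the tip was actually indicated on the screen.
    error_px = np.linalg.norm(np.subtract(virtual_tip_px, actual_tip_px))
    error_mm = error_px * 25.4 / display_dpi  # convert pixels to millimetres
    return error_mm <= tolerance_mm, error_mm

accurate, error_mm = display_check((640.0, 360.0), (645.0, 357.0))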
18. The method of claim 17, further comprising:
determining an intended shape of the tracked instrument,
wherein displaying the indication of the virtual position of the tracked instrument includes:
displaying on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and
displaying on a second part of the display device, a second portion of the intended shape of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
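The two panels of claim 18 can be produced by projecting the instrument's intended shape, expressed in the display frame, onto two orthogonal planes. A short sketch follows; the choice of axes and the straight-shaft example are illustrative assumptions.

import numpy as np

def front_and_side_views(shape_points_display_frame):
    # Orthographic projections of the intended shape: the front view
    # drops the depth axis (z), the side view drops the horizontal axis (x).
    pts = np.asarray(shape_points_display_frame, float)
    front_view = pts[:, [0, 1]]  # (x, y) seen head-on
    side_view = pts[:, [2, 1]]   # (z, y) seen from the side
    return front_view, side_view

# Illustrative straight shaft with the tip at the origin of the shape.
shaft = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 150.0]])
front, side = front_and_side_views(shaft)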
19. The method of claim 17, wherein receiving the indication of the actual position of the tracked instrument includes receiving the indication from a user.
20. The method of claim 17, further comprising at least one of:
responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use; and
responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
US17/663,024 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments Pending US20230368418A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/663,024 US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/662,666 US20230363827A1 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments
US17/663,024 US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/662,666 Continuation US20230363827A1 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments

Publications (1)

Publication Number Publication Date
US20230368418A1 (en) 2023-11-16

Family

ID=88699268

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/662,666 Pending US20230363827A1 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments
US17/663,024 Pending US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/662,666 Pending US20230363827A1 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments

Country Status (1)

Country Link
US (2) US20230363827A1 (en)

Also Published As

Publication number Publication date
US20230363827A1 (en) 2023-11-16

Similar Documents

Publication Publication Date Title
US11737850B2 (en) Methods and systems for display of patient data in computer-assisted surgery
US11819365B2 (en) System and method for measuring depth of instrumentation
US20210022809A1 (en) Robotic fluoroscopic navigation
EP3711700B1 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US20180325610A1 (en) Methods for indicating and confirming a point of interest using surgical navigation systems
US11737831B2 (en) Surgical object tracking template generation for computer assisted navigation during surgical procedure
JP7071078B2 (en) Robot X-ray fluoroscopic navigation
JP7216764B2 (en) Alignment of Surgical Instruments with Reference Arrays Tracked by Cameras in Augmented Reality Headsets for Intraoperative Assisted Navigation
JP7082090B2 (en) How to tune virtual implants and related surgical navigation systems
JP2021194538A (en) Surgical object tracking in visible light via fiducial seeding and synthetic image registration
JP2021146218A (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
JP6979049B2 (en) Robot systems and related methods that provide co-registration using natural standards
US20240108417A1 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
CN113208727A (en) Surgical system
JP2020182842A (en) Systems for robotic trajectory guidance for navigated biopsy needle, and related methods and devices
JP7323672B2 (en) Computer-assisted surgical navigation system for spinal procedures
US20230368418A1 (en) Accuracy check and automatic calibration of tracked instruments
US20230083605A1 (en) Extended reality systems for visualizing and controlling operating room equipment
US20230165640A1 (en) Extended reality systems with three-dimensional visualizations of medical image scan slices
US20240164844A1 (en) Bone landmarks extraction by bone surface palpation using ball tip stylus for computer assisted surgery navigation
US20230310086A1 (en) Camera tracking system identifying phantom markers during computer assisted surgery navigation
US20240115325A1 (en) Camera tracking system for computer assisted surgery navigation
US20200297451A1 (en) System for robotic trajectory guidance for navigated biopsy needle, and related methods and devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: GLOBUS MEDICAL, INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOSHI, SANJAY M.;REEL/FRAME:059954/0789

Effective date: 20220510

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION