US20230263586A1 - Systems and methods for surgical navigation, including image-guided navigation of a patient's head - Google Patents
- Publication number: US20230263586A1
- Authority: United States (US)
- Legal status: Pending
Classifications
- A61B46/00—Surgical drapes
- A61B46/10—Surgical drapes specially adapted for instruments, e.g. microscopes
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B90/14—Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B90/57—Accessory clamps
- A61B50/13—Trolleys, e.g. carts
- A61B2017/00477—Coupling
- A61B2017/00907—Material properties transparent or translucent for light
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
- A61B2034/2072—Reference field transducer attached to an instrument or patient
- A61B2090/366—Correlation of different images or relation of image positions in respect to the body using projection of images directly onto the body
- A61B2090/367—Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
- A61B2090/3991—Markers having specific anchoring means to fixate the marker to the tissue, e.g. hooks
Definitions
- the present application relates to intra-operative localization systems for use in sterile and non-sterile surgical environments, and more particularly to systems, methods and devices that track the pose of instruments relative to patient anatomy; that move a camera of an intra-operative localization system from its original mounting position while maintaining a registration between the camera's coordinate frame and the patient anatomy; that drape an intra-operative localization system to enable use in a sterile environment; and that display a visualization of a medical image of the patient anatomy in such environments.
- surgical treatment may include localizing physical lesions identified on a pre-operative image (e.g. MRI) within a patient's brain to perform biopsies, excisions, ablations, etc.
- FESS: Functional Endoscopic Sinus Surgery
- DBS: Deep Brain Stimulation
- a camera may be used to determine a registration of the camera coordinate-frame to the patient anatomy or optionally a tracker in relation to the patient anatomy.
- a drape may be applied to permit use in a sterile surgical environment.
- the camera may be moved from its original position to enable access to patient anatomy while maintaining a registration of the camera coordinate-frame with the patient anatomy.
- the camera may be used in a hand-held or head-mounted manner.
- a visualization of the patient anatomy may be displayed on a computing unit, with visualization reference planes defined by the pose of an instrument or the camera. The visualization may be presented on a display of a computing unit or as part of a head mounted augmented reality system.
- a method comprising: releasably coupling a proximal end of a non-sterile camera mounting arm to a surgical clamp immobilizing a patient's anatomy; releasably coupling a non-sterile camera to a distal end of the non-sterile camera mounting arm; following a registration between a coordinate frame of the non-sterile camera and the patient's anatomy in a computing unit: draping the non-sterile camera and the non-sterile camera mounting arm with a camera drape to provide a sterile barrier between the patient's anatomy and the non-sterile camera and non-sterile camera mounting arm; wherein draping is performed without moving a position of the non-sterile camera relative to the patient's anatomy.
- the camera drape may be configured to permit the computing unit to use pose data received from the non-sterile camera after the draping with the registration performed before the draping.
- the camera drape may be configured to permit a transmission of optical signals to the non-sterile camera without distortion.
- Draping the non-sterile camera may comprise enclosing a closed end of a tube-like camera drape over the non-sterile camera, the camera drape extending to cover the non-sterile camera mounting arm.
- the method may comprise using a holding mechanism to hold the camera drape in place over the non-sterile camera.
- the holding mechanism may comprise a shroud that mechanically clips onto the non-sterile camera.
- the camera drape may comprise a drape optical window and the holding mechanism holds the optical window in a fixed and correct alignment with optics of the non-sterile camera.
- the method may further comprise sealing an interface of an open end of the camera drape to a patient drape to maintain a continuous sterile barrier.
- the patient drape may comprise an opening through which the non-sterile camera mounting arm extends and the interface may be defined by the opening of the patient drape and the open end of the camera drape.
- the camera drape may comprise a sterile elastic band or an adhesive and the method comprises using the sterile elastic band or adhesive when sealing the interface.
- the non-sterile camera may be coupled to the computing unit via a cable and the camera drape encloses a portion of the cable.
- the method may comprise rigidly fixing a non-sterile tracker relative to the patient's anatomy to perform the registration; and following the draping, rigidly fixing a sterile tracker relative to the patient's anatomy to perform surgical navigation without performing a second registration.
- the non-sterile tracker and sterile tracker may be affixed to the same tracker mounting arm in the same position. If the geometries of the non-sterile tracker and the sterile tracker differ, the difference may be factored into the pose calculations of the computing unit when the respective non-sterile tracker and sterile tracker are used.
- a computer implemented method comprising: performing, in a computing unit, a registration between a coordinate frame of a non-sterile camera and a patient's anatomy, where the non-sterile camera is releasably coupled to a distal end of a non-sterile camera mounting arm and a proximal end of the non-sterile camera mounting arm is releasably coupled to a surgical clamp immobilizing the patient's anatomy and the non-sterile camera communicates pose data to the computing unit; and, following a draping of the non-sterile camera and the non-sterile camera mounting arm by a camera drape to provide a sterile barrier between the patient's anatomy and the non-sterile camera and non-sterile camera mounting arm, where the draping is performed without moving a position of the camera relative to the patient's anatomy: calculating, by the computing unit, poses of sterile instruments relative to the patient's anatomy using the registration to provide surgical navigation during a surgical procedure.
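The pose chain underlying such navigation can be sketched as a composition of rigid transforms. The function and frame names below are illustrative assumptions (not part of the claimed system), and poses are represented as 4x4 homogeneous transforms:

```python
import numpy as np

def make_pose(rotation, translation):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def instrument_in_anatomy(T_anatomy_from_camera, T_camera_from_instrument):
    """Chain the registration (anatomy <- camera) with the camera's live
    measurement (camera <- instrument) to express the instrument pose in
    the anatomy frame."""
    return T_anatomy_from_camera @ T_camera_from_instrument

# Illustration: the registration places the camera 50 mm above the anatomy
# origin, and the camera sees the instrument 10 mm along its optical axis,
# so the instrument sits 60 mm above the anatomy origin.
T_reg = make_pose(np.eye(3), [0.0, 0.0, 50.0])
T_obs = make_pose(np.eye(3), [0.0, 0.0, 10.0])
T_instr = instrument_in_anatomy(T_reg, T_obs)
```

Because the registration is captured before draping and the camera is not moved during draping, the same chain remains valid with sterile instruments afterward.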
- the camera drape may be configured to permit a transmission of optical signals to the non-sterile camera without distortion.
- the registration may be performed using pose data of non-sterile instruments.
- the camera drape may comprise a drape optical window and a holding mechanism may hold the optical window in a fixed and correct alignment with optics of the non-sterile camera.
- the method may comprise: receiving at the computing unit pose data from the non-sterile camera of a non-sterile tracker rigidly fixed relative to the patient's anatomy to perform the registration; and following the draping, receiving at the computing unit pose data from the non-sterile camera of a sterile tracker rigidly fixed relative to the patient's anatomy to provide the surgical navigation without performing a second registration.
- the non-sterile tracker and sterile tracker may be affixed to the same tracker mounting arm in the same position. If the geometries of the non-sterile tracker and the sterile tracker differ, the difference is factored into the pose calculations of the computing unit when the respective non-sterile tracker and sterile tracker are used.
- the patient's anatomy may be a cranium.
- a system to drape a patient, a camera mounting arm and a camera attached thereto to provide a sterile barrier for performing a navigated surgical procedure comprising: a sterile camera drape to cover the camera mounting arm and the camera mounted on a distal end of the camera mounting arm, the sterile camera drape comprising a closed end adjacent the camera when draped and an open end distal from the closed end; and a sealing mechanism to seal the camera drape to a sterile patient drape that covers the patient to maintain a continuous sterile barrier.
- the sterile patient drape may provide an opening to receive a proximal end of the camera mounting arm.
- the open end of the sterile camera drape and the opening of the sterile patient drape may form an interface which is substantially sealed by the sealing mechanism.
- the sealing mechanism may comprise a sterile elastic band, the sterile elastic band engaging the camera drape near or at the open end and the sterile patient drape near or at the opening to form the interface.
- the sterile elastic band may be affixed to the sterile camera drape near or at the open end.
- the sealing mechanism may comprise an adhesive affixed near or at the open end of the sterile camera drape.
- the adhesive may comprise one or more circumferential adhesive rings comprising an adhesive side and a side affixed near or at the open end of the sterile camera drape, the adhesive side of the one or more circumferential adhesive rings engaging the sterile patient drape near or at the opening to form the interface.
- the sterile patient drape may further comprise: a tubular protrusion with a closed end distal from the sterile patient drape; an adapter near or at the closed end of the tubular protrusion, the adapter comprising a non-sterile connector on a non-sterile side of the sterile patient drape and a sterile connector on a sterile side of the sterile patient drape; where the non-sterile connector is configured to attach to a tracker mounting arm and the sterile connector is configured to attach to a sterile tracker or a sterile tracker mount.
- the sterile patient drape may further comprise a tubular protrusion with a closed end distal from the sterile patient drape; and the closed end of the tubular protrusion may be engaged between the sterile tracker and a tracker mounting arm, and the closed end of the tubular protrusion may be sufficiently thin to not significantly affect the position of the sterile tracker attached to the tracker mounting arm.
- a computer implemented method comprising the steps of: storing, by a computing unit, the differences between geometries of a non-sterile tracker and a sterile tracker, the non-sterile tracker for use during a non-sterile stage of a surgery for a patient and the sterile tracker for use in place of the non-sterile tracker during a sterile stage of the surgery for the patient; calculating, by the computing unit, a registration of a camera with respect to the non-sterile tracker during the non-sterile stage; during the sterile stage where the sterile tracker is used in place of the non-sterile tracker and the patient is draped with a patient drape, using the registration and differences stored by the computing unit when calculating poses.
- the method may further comprise, before the step of calculating the updated registration, the steps of: storing, by the computing unit, a relative position between the non-sterile tracker when mounted to a mounting arm before the patient drape is applied and the sterile tracker when mounted to the mounting arm via the sterile tracker adapter of the patient drape after the patient drape is applied, based on the geometrical properties of the adapter; during the sterile stage where the sterile tracker is used in place of the non-sterile tracker and the patient is draped with the patient drape, using the relative position stored by the computing unit when calculating poses.
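One way the stored geometry difference might be applied is as a fixed rigid transform between the sterile and non-sterile tracker frames, so that sterile-stage observations can be interpreted against the original registration. The sketch below uses hypothetical names and 4x4 homogeneous transforms; it is not the claimed implementation:

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def pose_via_nonsterile_frame(T_camera_from_sterile, T_nonsterile_from_sterile):
    """Express a sterile-tracker observation in the non-sterile tracker
    frame used for the original registration, by composing with the stored
    geometry difference between the two trackers."""
    # camera <- nonsterile = (camera <- sterile) @ (sterile <- nonsterile)
    return T_camera_from_sterile @ np.linalg.inv(T_nonsterile_from_sterile)

# Illustration: the sterile tracker's frame is offset 5 mm along z from the
# non-sterile tracker's frame; the camera currently observes the sterile
# tracker coincident with the camera origin.
T_diff = translation(0.0, 0.0, 5.0)   # nonsterile <- sterile (stored difference)
T_obs = np.eye(4)                     # camera <- sterile (live measurement)
T_equiv = pose_via_nonsterile_frame(T_obs, T_diff)
```

With this composition, no second registration is required after the tracker swap.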
- a system for performing a navigated medical procedure comprising: a computer readable storage medium storing instructions which, when executed on a computing unit, configure the computing unit to: capture a first pose between a camera having a first positional relationship with respect to the anatomy of a patient and a tracker having a second positional relationship with respect to the anatomy of the patient; capture a second pose between the camera having a third positional relationship with respect to the anatomy of the patient and the tracker having the second positional relationship; and calculate an updated registration of the camera with respect to the patient's anatomy based on a first registration, the first pose and the second pose.
- the system may comprise a camera mounting arm configured to hold the camera in the first positional relationship, and further configured to be positionally adjustable such that the camera is moveable to the third positional relationship.
- the system may further comprise the tracker for mounting to have the second positional relationship.
- the system may further comprise a tracker mounting arm configured to hold the tracker in the second positional relationship.
- the instructions may configure the computing unit to calculate the first registration based on pose data and a medical image of the patient's anatomy.
- the instructions may configure the computing unit to provide graphical instructions to a display unit instructing a user to perform at least one of the functions of moving the camera or capturing a pose.
- the instructions may configure the computing unit to, after capturing the first pose and before capturing the second pose, provide to a display unit a graphical indication representing the alignment of a working volume of the camera with a surgical site.
- a computer implemented method comprising the steps: capturing, by the computing unit, a first pose between a camera having a first positional relationship with respect to the anatomy of a patient and a tracker having a second positional relationship with respect to the anatomy of the patient; capturing, by the computing unit, a second pose between the camera having a third positional relationship with respect to the anatomy of the patient and the tracker having the second positional relationship; and calculating, by the computing unit, an updated registration of the camera with respect to the patient's anatomy based on a first registration, the first pose and the second pose.
- the method may further comprise the step of calculating, by the computing unit, the first registration based on pose data and a medical image of the patient's anatomy.
- the method may further comprise the step of providing, by the computing unit, graphical instructions to a display unit instructing a user to perform at least one of the functions of moving the camera or capturing a pose.
- the method may further comprise, after capturing the first pose and before capturing the second pose, the step of providing, by the computing unit, a graphical indication to a display unit representing the alignment of a working volume of the camera with a surgical site.
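The updated-registration calculation described above can be illustrated as a composition of rigid transforms: because the tracker keeps its fixed relationship to the anatomy while the camera moves, the two tracker observations relate the old and new camera frames. Names and frame conventions below are illustrative assumptions:

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def updated_registration(T_anatomy_from_cam_old, T_cam_old_from_tracker, T_cam_new_from_tracker):
    """Re-register a moved camera using a fixed tracker:
       anatomy <- cam_new =
         (anatomy <- cam_old) @ (cam_old <- tracker) @ (tracker <- cam_new)."""
    return (T_anatomy_from_cam_old
            @ T_cam_old_from_tracker
            @ np.linalg.inv(T_cam_new_from_tracker))

# Illustration: the camera is slid 20 mm along x. The tracker, seen 100 mm
# straight ahead from the original camera position, now appears 20 mm
# off-axis, so the new registration places the camera at x = 20 mm.
T_reg_old = np.eye(4)
T_new = updated_registration(T_reg_old,
                             translation(0.0, 0.0, 100.0),
                             translation(-20.0, 0.0, 100.0))
```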
- a system for visualization of a patient's anatomy comprising: a camera configured to receive optical signals from a tracker having a fixed positional relationship to a patient's anatomy, the optical signals comprising pose data, and to communicate the optical signals to a computing unit; a computer-readable storage device storing instructions which, when executed on the computing unit, configure the computing unit to: provide at least one view of the patient's anatomy based on a medical image of the patient's anatomy to a display unit for display to a user; receive from the camera the optical signals comprising the pose data; calculate, based on the pose data and registration data, the pose of the camera with respect to the patient's anatomy; and modify the view based on camera-based reference planes and the pose of the camera.
- the tracker may be non-invasively attached to the patient.
- the camera may be handheld and movements of the camera may cause the computing unit to modify the view.
- the camera may be head-mounted and head movements may cause the computing unit to modify the view.
- the display unit may comprise a head-mounted display and the view provided to the display may be overlaid on the user's view of the patient.
- the display unit may be transparent to permit a user to see through the display unit.
- the view provided to the display unit may be partially transparent.
- the system may further comprise a projector for projecting a visible pattern onto the patient's anatomy, the visible pattern corresponding to the camera-based reference planes.
- the camera based reference planes may comprise one plane perpendicular to an optical axis of the camera and displaced from the camera by a positive distance, d, along the optical axis such that when an area of interest of the patient's anatomy is located at the distance, d, from the camera, the tracker is in view of the camera.
- the distance, d, may be modified.
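Such a camera-based reference plane can be sketched as follows, assuming the camera's optical axis is its local z-axis and its pose is a 4x4 homogeneous transform (illustrative names, not the claimed implementation):

```python
import numpy as np

def camera_reference_plane(T_anatomy_from_camera, d):
    """Return (point, normal) of a plane perpendicular to the camera's
    optical (z) axis, displaced by a positive distance d along that axis,
    expressed in anatomy coordinates."""
    R = T_anatomy_from_camera[:3, :3]
    camera_origin = T_anatomy_from_camera[:3, 3]
    optical_axis = R @ np.array([0.0, 0.0, 1.0])  # camera z-axis in anatomy frame
    return camera_origin + d * optical_axis, optical_axis

# Illustration: camera at the anatomy origin looking along +z, with the
# reference plane placed 150 mm in front of it.
point, normal = camera_reference_plane(np.eye(4), 150.0)
```

Placing the plane at the working distance d keeps an area of interest located there within the camera's view of the tracker, consistent with the description above.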
- a computer implemented method comprising the steps of: providing, by a computing unit, at least one view of a patient's anatomy based on a medical image of the patient's anatomy to a display unit for display to a user; receiving, by the computing unit, optical signals from a camera comprising the pose data of a tracker having a fixed positional relationship with respect to the patient's anatomy; calculating, by the computing unit, based on the pose data and registration data, the pose of the camera with respect to the patient's anatomy; and modifying, by the computing unit, the view based on camera-based reference planes and the pose of the camera.
- FIG. 1 depicts an intra-operative localization system in the context of a navigated cranial procedure using a camera attached to a head-clamp and a tracker attached to a probe.
- FIG. 2 depicts a head-clamp with a starburst mechanism connector.
- FIG. 3 depicts a mounting arm with a starburst mechanism connector.
- FIG. 4 depicts an alternative mounting arm with a gooseneck mechanism.
- FIG. 5 depicts a camera-head-clamp assembly where a camera is mounted to a head-clamp.
- FIG. 6 shows a camera-mounting arm assembly where a camera is mounted to a camera mounting arm.
- FIG. 7 depicts a camera-tracker-head-clamp assembly where a camera and a tracker are mounted to a head-clamp.
- FIG. 8 depicts an intra-operative localization system where a tracker is mounted to a head-clamp and a camera is separately mounted to a cart.
- FIG. 9 depicts a sterile draped intra-operative localization system with a camera drape.
- FIG. 10 depicts a sterile draped intra-operative localization system with a camera drape and patient drape.
- FIG. 11 depicts a sterile draped intra-operative localization system with a camera drape, a patient drape and a tracker mounted to a mounting arm under the patient drape without breaking the sterile barrier.
- FIG. 12 depicts a computer implemented method for storing differences in geometries of a sterile tracker and a non-sterile tracker and using a registration between a camera and the non-sterile tracker and the stored differences when calculating poses during a sterile stage of a surgery for a patient.
- FIG. 13 depicts a computer implemented method for updating a registration of a camera with respect to a patient's anatomy when the camera is moved from a first pose to a second pose, based on the camera's pose relative to a tracker and the patient's anatomy.
- FIG. 14 depicts a 4-up view of the visualization of medical images of a patient's anatomy.
- FIG. 15 depicts orthogonal cut-planes defined relative to a probe.
- FIG. 16 depicts a handheld intra-operative localization system wherein a camera is moveable in a user's hand and a tracker is mounted to a structure on the patient's head.
- FIG. 17 depicts orthogonal reference planes defined by the pose of the camera coordinate frame.
- FIG. 18 depicts a computer implemented method for modifying a view of a patient's anatomy in a medical image based on camera reference planes and the pose of a camera.
- FIG. 19 depicts a head-mounted intra-operative localization system wherein a camera is mounted to a structure on a user's head and a tracker is mounted to a structure on the patient's head.
- FIG. 20 depicts a probe and corresponding virtual probe.
- Described herein are systems and methods for performing a navigated surgical procedure involving a patient's anatomy.
- an image-guided cranial neurosurgical treatment is provided; however, it should be evident that the systems, devices, apparatuses, methods and computer-implemented methods described herein may be applied to any anatomy requiring treatment (e.g. a cranium, a spine, a pelvis, a femur, a tibia, a hip, a knee, a shoulder, or an ankle).
- a computing unit may comprise a laptop, workstation, or other computing device having at least one processing unit and at least one storage device such as memory storing software (instructions and/or data) as further described herein to configure the execution of the computing unit.
- FIG. 1 illustrates an exemplary intra-operative localization system 100 , in the context of a navigated cranial (brain) procedure.
- a camera 102 is shown attached to a mounting arm 104 rigidly mounted to a head-clamp 108 , with its field of view oriented towards the site of the surgery.
- a tracker 106 is attached to an instrument 110 (e.g. a probe), the tracker 106 providing optically detectable features for detection by the camera 102 .
- An instrument may be any type of instrument used in a surgical environment, such as a probe, a tool for cutting or resecting tissue, a tool for retracting tissue, or an imaging tool such as an ultrasound probe.
- the camera 102 transmits camera data (including image data or pose data associated with the tracker 106 ) to a computing unit 114 .
- the computing unit 114 performs the necessary processing to calculate the position and orientation (pose) of the instrument 110 with respect to the patient's anatomy 112 (i.e. brain), and to display clinically relevant information to the surgeon.
- the computing unit 114 may also have access to medical image data (such as a magnetic resonance (MRI) image and/or a computed tomography (CT) image of the patient's anatomy, i.e. head/brain), and may further display the navigational information relative to this medical image.
- the intra-operative localization system 100 is registered to the patient's anatomy 112 ; that is, the patient's anatomical planes/axes/features have a relationship, known to the computing unit 114 , with respect to the intra-operative localization system 100 coordinate frame.
- the intra-operative localization system is used in three stages: (1) pre-operative set-up, (2) patient set-up, registration and planning, and (3) surgical navigation.
- Pre-operative setup using the intra-operative localization system 100 includes the following exemplary steps: (a) Non-sterile instruments 110 to be used for registration and planning are calibrated, if necessary; (b) medical image data (e.g. MRI data, CT data) of the patient's anatomy 112 is loaded onto the computing unit 114 ; (c) registration landmarks are selected, if necessary depending on the method of registration.
- the pre-operative set-up steps may be performed in advance of entering the operating room. Given the small, portable nature of the exemplary intra-operative localization system 100 , the pre-operative steps may be performed by a trained user off-site at any convenient time or location.
- Calibrating an instrument 110 generally refers to determining or confirming the spatial relationship between the “effector” of the instrument 110 and the tracker 106 associated with that instrument 110 .
- Various tools/jigs/software routines may be used for instrument 110 calibration.
- the “effector” of an instrument 110 refers to the aspect of the instrument 110 for which the navigational information is useful. For example: the tip of a biopsy needle; the shaft axis of a probe; the axis, plane, or pattern of a laser; the position and/or orientation of an implantable device.
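- The document does not fix a particular calibration routine; as one illustrative software sketch, a common approach is pivot calibration, in which the instrument tip is held stationary in a divot while tracker poses are recorded, and the fixed tip offset is solved for in the least-squares sense. The function name `pivot_calibration` and the NumPy pose representation below are assumptions for illustration, not the described implementation.

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Estimate the tip offset (in the tracker frame) and the fixed pivot
    point (in the camera frame) from tracker poses recorded while the
    instrument tip rests on a stationary divot.

    For each pose (R_i, p_i), the tip in camera coordinates is
    R_i @ t_tip + p_i, which must equal the fixed pivot b, giving the
    stacked linear system  R_i @ t_tip - b = -p_i  solved by least squares.
    """
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    rhs = np.zeros(3 * n)
    for i, (R, p) in enumerate(zip(rotations, translations)):
        A[3 * i:3 * i + 3, :3] = R          # coefficient of the tip offset
        A[3 * i:3 * i + 3, 3:] = -np.eye(3)  # coefficient of the pivot point
        rhs[3 * i:3 * i + 3] = -np.asarray(p, float)
    x, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return x[:3], x[3:]  # tip offset (tracker frame), pivot (camera frame)
```

At least three well-separated rotations are needed for the system to be full rank; in practice many poses are averaged through the least-squares fit.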
- Patient set-up, registration and planning using the intra-operative localization system 100 includes the following exemplary steps: (a) The patient and intra-operative localization system 100 are brought into the operating room; (b) the patient's anatomy 112 (i.e. head) is immobilized via a head-clamp 108 ; (c) the camera 102 is mounted to the mounting arm 104 which is in turn connected to the head-clamp 108 ; (d) landmarks and/or features are localized to generate a registration between the camera coordinate-frame and the patient's anatomy 112 (i.e. head), referred to as Localizer System Registration, and optionally a registration between the camera 102 and a medical image of the patient's anatomy 112 (i.e. head), referred to as Image Registration.
- Herein, the term "registration" generally refers to at least Localizer System Registration, but may also include Image Registration when required; (e) the registration is verified (e.g. a check to ensure that a virtual spatial representation of a tracked instrument 110 relative to the patient's anatomy 112 (i.e. head) matches the physical spatial relationship between the tracked instrument 110 and the patient's anatomy 112 (i.e. head)); (f) the surgical intervention is planned by, for example, identifying the location of a craniotomy that would provide an optimal path towards a lesion to be resected and/or biopsied.
- Surgical navigation using the intra-operative localization system 100 includes the following exemplary steps during sterile surgical procedures: (a) the patient is prepped and draped; (b) the camera 102 is draped without moving its position relative to the patient's anatomy 112 (i.e. head); (c) the sterile instruments 110 (e.g. a probe connected to a tracker 106 ) are calibrated.
- Surgical navigation, regardless of whether it is sterile or non-sterile, includes the step of calculating the pose of at least one instrument 110 and displaying a representation of that instrument 110 on the computing unit 114 in relation to a medical image of the patient's anatomy 112 (i.e. head).
- further registration verifications may be performed to check if the patient's position relative to the camera 102 of the intra-operative localization system 100 is accurate.
- a head-clamp 108 may be any type of clamp for restricting the movement of the patient's head.
- FIG. 2 illustrates an exemplary head-clamp 200 in the form of a Mayfield-style clamp, which immobilizes a patient's skull by clamping it with three points of fixation 204 .
- the head-clamp 200 may include a connector 202 to connect the head-clamp 200 to an operating table and optionally other mounting arms.
- the connector 202 of the head-clamp 200 may be any form of connector.
- the exemplary connector depicted in FIG. 2 is a starburst mechanism.
- a starburst mechanism is advantageous for rigid connections, since it allows rotational adjustment at the time of connection, but when secured, provides a highly rigid structure that is unlikely to shift when forces or impacts are applied.
- FIG. 3 depicts a mounting arm 300 configured to mount to a head-clamp 108 , and thus provide a rigid reference to the patient's anatomy 112 (i.e. the skull, head-clamp 108 and mounting arm 300 have a rigid and fixed positional relationship).
- the mounting arm 300 comprises a camera mount 306 connected to the arm mechanism allowing for positional alignment of the camera 102 .
- the mounting arm 300 may provide a complementary connector 302 to the head-clamp connector 202 .
- the complementary connector 302 may be any form of connector.
- the exemplary complementary connector 302 depicted in FIG. 3 is a complementary starburst mechanism.
- the connector 302 on the mounting arm 300 is used to rigidly attach to the head-clamp 108 , and may optionally provide an additional connector interface, such as the same starburst interface, for further attachment thereto (e.g. to attach to the operating table, or to attach a second mounting arm 300 ).
- the mounting arm 300 is preferably adjustable, such that the camera's working volume may be aligned with the working volume of the surgical instruments 110 , referred to as the surgical site. This may be achieved by providing up to 6 Degrees of Freedom (DOF) positional adjustment in the mounting arm 300 with respect to the patient's anatomy 112 (i.e. head).
- the mounting arm 300 may incorporate any mechanism to achieve positional alignment.
- FIG. 3 depicts a starburst-style connector 302 which provides 1 DOF; an angular joint 308 provides another DOF; two ball joints 304 provide multiple degrees of freedom each; in aggregate, this exemplary embodiment of a mounting arm 300 provides 6DOF of positional alignment.
- the mounting arm 300 must be rigidly locked in place. This may be accomplished via various mechanisms, including lockable ball joints 304 , lockable angular joints 308 , lockable gooseneck mechanisms, or any other form of moveable and rigidly lockable mechanism. Knobs may be provided so that the user can lock the mounting arm in place.
- FIG. 4 depicts an alternate goose-neck mounting arm 400 that is also capable of 6DOF of positional alignment.
- the exemplary embodiment of the goose-neck mounting arm 400 comprises a camera and/or tracker mount 406 connected to a “goose-neck” arm mechanism 404 that allows 6DOF positional alignment, which is in turn connected to a clamp 402 to rigidly connect the goose-neck mounting arm to the operating table, the head-clamp 108 , or any other surface of the surgical environment.
- the exemplary mounting arm 300 and alternative goose-neck mounting arm 400 may be adapted to attach to a tracker instead of a camera.
- FIG. 5 depicts an exemplary camera-head-clamp assembly 500 comprising the head-clamp 108 holding a patient's anatomy 112 (i.e. head), mounting arm 104 and camera 102 , where the camera 102 is generally oriented toward the patient's anatomy 112 (signifying the alignment of the camera's working volume with the surgical site).
- the camera 102 may be rigidly fixed to the mounting arm 104 via a camera mount 306 , which provides a rigid attachment mechanism fixing the camera 102 to the mounting arm 104 .
- Rigid mechanisms connecting the camera 102 to the mounting arm 104 may include camera 102 being permanently attached to the mounting arm 104 (e.g. by being integrally formed, via welding, adhesion, or fastening), cam locks, threaded connectors, dovetail connections, or magnetic connectors.
- the rigid mechanism of the camera mount 306 holds the camera 102 in a fixed position relative to the mounting arm 104 at least for the duration of the surgically navigated portion of the procedure.
- a releasable and repeatable connection is also contemplated, such that the camera 102 may be removed from the camera mount 306 of the mounting arm 104 (e.g. for convenience), and re-attached in the exact same position.
- Any highly repeatable connection mechanism is contemplated for this purpose, including magnetic kinematic mounts.
- the camera mounting arm 104 is connected to the head-clamp 108 via the connector 202 of the head-clamp 108 being connected to the complementary connector 302 of the mounting arm 104 .
- FIG. 6 shows an exemplary camera-mounting arm assembly 600 (similar to the assembly 500 of FIG. 5 ) but where a camera 102 is mounted to a mounting arm 104 via a camera clamp 602 (the clamp 602 having an upper portion and a lower portion, a threaded adjustment mechanism and a hinge joint, such that the threaded adjustment mechanism can apply sufficient force to the camera 102 to rigidly hold it in place, the clamp 602 further having a camera mount 608 comprised of a force-applying feature (e.g. magnet(s)) and kinematic locating features (in this case, mating hemispherical features and v-slots)).
- the mounting arm 104 provides a complementary and mating camera mount 306 .
- the result is that (a) the camera 102 /clamp 602 can be attached to the mounting arm 104 via camera mount 306 ; (b) the camera 102 /clamp 602 can be removed from the mounting arm 104 by pulling it off the camera mount 306 , and (c) the camera 102 /clamp 602 can be reattached with the exact same positional relationship to the mounting arm 104 via the camera mount 306 .
- the camera 102 includes features 604 configured to optionally connect a shroud.
- Features 604 may be a raised surface from the shape of the camera body (such as is shown) or an indentation (not shown) or a combination of same for receiving a cooperating surface of the shroud.
- the system may include a tracker with a fixed and rigid positional relationship with respect to the head-clamp 108 , for example, via a mounting arm for the tracker.
- FIG. 7 depicts a camera-tracker-head-clamp assembly 700 showing both a camera 102 and tracker 708 mounted to a head-clamp 108 restricting the movement of the patient's anatomy 112 (i.e. head).
- the camera 102 may be mounted to the camera mounting arm 104 via the camera mount 306 .
- the camera mounting arm 104 may be connected via a connector 302 to a complementary head-clamp connector 202 .
- the camera 102 may be oriented toward the surgical site, and the tracker 708 is preferably fixed in a position that is within the working volume of the camera 102 (i.e. the tracker 708 is in close proximity to the surgical site, and generally oriented such that it is within the working volume of the camera 102 ).
- the tracker 708 may be rigidly attached to the head-clamp 108 in a permanent manner (for the duration of the procedure).
- the tracker 708 may be integrally formed with its own mounting arm 702 which is in turn connected to the head-clamp 108 via, for example, a clamp 704 .
- the tracker 708 may also be releasably and repeatably coupled to the tracker mounting arm 702 (with a high degree of positional repeatability) via a tracker mount 706 .
- the tracker 708 serves as a rigid reference for the patient's anatomy 112 , and the camera 102 need not maintain a rigid and fixed position with respect to the patient's anatomy 112 .
- software in computing unit 114 may be configured to perform a registration so as to register the positional relationship between the tracker 708 and the patient's anatomy 112 and thereafter track the pose of the patient's anatomy 112 in the field of view (working volume) of the camera 102 even if the camera 102 is in a position that is different from the camera's position at registration.
- Exemplary benefits of this configuration include the ability to adjust the camera's 102 position mid-procedure (e.g. to attain a better viewing angle of the surgical site) and the ability to mount the camera 102 on other objects other than the patient's anatomy 112 such as a cart, the operating table (via standard operating table attachment rails), or using the camera 102 in a handheld or surgeon-mounted (e.g. head mounted) manner.
- FIG. 8 illustrates an exemplary intra-operative localization system 800 where the camera 102 is rigidly attached to a cart 802 providing the computing unit 114 .
- the camera 102 is mounted to a mounting arm 104 which is in turn rigidly connected to the cart 802 .
- a head-clamp 108 restricts the movement of the patient's anatomy 112 .
- a tracker 708 is mounted via a tracker mount 706 to a tracker mounting arm 702 , which is in turn rigidly connected to the head-clamp 108 via, for example, a clamp 704 .
- FIG. 9 depicts an exemplary sterile draped intra-operative localization system 900 .
- a camera 102 is attached to a mounting arm 104 via a camera mount 306 .
- Camera 102 is enclosed in a sterile drape 904 affixed to the camera 102 via a shroud 902 .
- the drape 904 extends to the base of the mounting arm 104 near a mounting arm connector 302 which attaches the mounting arm 104 to the head-clamp 108 at a head-clamp connector 202 .
- the head-clamp 108 restricts the motion of the patient's anatomy 112 (i.e. head).
- the camera 102 is connected via a cable 908 to the computing unit 114 .
- the intra-operative localization system 900 includes a sterile drape 904 with an optical window 906 at its distal end.
- the drape 904 includes a long tube-like body that encloses the camera 102 , a proximate portion of the cable 908 connecting the camera 102 to the computing unit 114 and the mounting arm 104 .
- the drape 904 may terminate at the base of the mounting arm 104 , which may be located at its connector 302 (i.e. the attachment point to the connector 202 of the head-clamp 108 ).
- the proximate portion of the cable is typically at least the portion of the cable that extends from the camera along (more or less) the length of the mounting arm 104 .
- the drape 904 is intended to cover the portion of the cable that may come into contact with personnel or tools (during the surgery when sterility is to be maintained) at or about the location of the surgery near the patient's head.
- the exemplary intra-operative localization system 900 provides a mechanism to hold the drape 904 in place on the camera 102 . This is desired in order to hold the drape window 906 in alignment with the optics of camera 102 (e.g. glass or other material about the lens opening (not shown)), or the optical path.
- This mechanism may be a shroud 902 , that mechanically clips, or otherwise connects, to the camera 102 via the shroud connecting features 604 of the camera 102 while holding the drape window 906 in a fixed and correct alignment with the camera's optics, without puncturing or compromising the sterile barrier (i.e. by sandwiching the sterile drape 904 between the body of the shroud 902 and the camera 102 ).
- the shroud 902 may be capable of deflecting, such that the shroud 902 provides a spring force to hold the camera 102 and the drape 904 assembly together.
- the shroud 902 may provide locating features intended to mate with complementary locating features on the camera 102 , to enforce the correct alignment between the shroud 902 and the camera 102 .
- the drape window 906 is designed to allow for undistorted transmission of optical signals; it may be constructed of a rigid, thin and flat optically transparent material.
- FIG. 10 depicts a sterile draped intra-operative localization system 1000 that is similar to system 900 and is used where the patient is also draped for the surgical procedure.
- the camera 102 , shroud 902 , mounting arm 104 , head-clamp 108 (not shown), camera drape 904 and drape window 906 may be arranged as described above for the exemplary localization system 900 depicted in FIG. 9 .
- a patient drape 1002 covers the patient and the head-clamp 108 .
- the patient drape 1002 has an opening 1006 to expose the surgical site of the patient's anatomy 112 (i.e. head).
- the patient drape 1002 further provides a mounting arm opening 1004 such that the camera 102 and mounting arm 104 , covered by a camera drape 904 , can protrude through the patient drape 1002 .
- the draping procedure is performed according to aseptic techniques.
- the interface between the patient drape 1002 and the camera drape 904 may be substantially sealed to maintain a continuous sterile barrier.
- the sterile patient drape 1002 and camera drape 904 may be configured in the following ways.
- a sterile elastic band may be used to hold the patient drape 1002 opening tightly around the camera drape 904 .
- the elastic band may be provided in the sterile packaging of the sterile patient drape 1002 and camera drape 904 as a separate unit. Alternatively, the elastic band may be pre-attached to the camera drape 904 or patient drape 1002 .
- the camera drape 904 may comprise one or multiple circumferential rings (or a spiral) with adhesive at or near the end of the camera drape 904 (distal from the window end) such that the patient drape 1002 can be adhered to the camera drape 904 along the outer circumference of drape 904 .
- the adhesive rings may be covered by removable strips, and exposed for use by peeling off the strip covering. Multiple circumferential adhesive rings may be provided so that the interface between the patient drape 1002 and the camera drape 904 may be made at a desired location along the length of the camera drape 904 .
- the camera drape 904 may provide an adhesive strip, either partially or removably attached thereto. The adhesive strip may be used to secure the camera drape 904 to the patient drape 1002 .
- Other fasteners may be contemplated, including hook and eye, pull fasteners, etc. configured to maintain the sterile barrier.
- the intra-operative localization system 100 may be used in non-sterile as well as sterile environments in the same procedure while maintaining the registration (i.e. localization system registration and/or image registration).
- FIG. 11 depicts a sterile draped intra-operative localization system 1100 .
- the exemplary intra-operative localization system 1100 is conducive to sterile and non-sterile use within the same procedure, since it is configured to be used with or without the sterile camera drape 904 .
- the sterile camera drape 904 may be applied to the camera 102 and/or mounting arm 104 without moving the camera 102 relative to the patient's anatomy 112 such that the position of the camera 102 is the same while the surgical environment is not sterile and after the draping techniques have been applied to make the surgical environment sterile.
- when the exemplary intra-operative localization system 1100 is used in a sterile environment, it may be necessary to use only sterile instruments 110 in order to maintain the sterile environment, whereas non-sterile instruments 110 , such as non-sterile registration instruments 110 , may be used during non-sterile stages.
- it may be desirable to have a tracker 708 rigidly fixed relative to the patient's anatomy 112 (i.e. head) in a sterile environment.
- a sterile tracker 708 is rigidly attached to the head-clamp 108 through the patient drape 1002 .
- the patient drape 1002 includes a window 1006 to make visible the patient anatomy 112 .
- the sterile tracker 708 is connected to a non-sterile tracker mount 1102 through the patient drape 1002 without compromising sterility.
- the patient drape 1002 may include an adaptor with a non-sterile-side connection for attachment to the tracker mounting arm 702 , and a sterile side for attachment to the sterile tracker 708 and/or the sterile tracker mount 1102 .
- the geometrical properties of the adaptor may be known to the computing unit 114 , such that the relative position between a non-sterile tracker 708 (when mounted to the tracker mounting arm 702 before the patient drape 1002 is applied) and a sterile tracker 708 (when mounted to the tracker mounting arm 702 after the patient drape 1002 is applied) is known to the intra-operative localization system 1100 . Additionally, the location of the optically detectable features of the tracker 708 may also be known.
- the sterile tracker 708 may also be configured to puncture the patient drape 1002 and attach to the tracker mounting arm 702 and/or the tracker mount 1102 , the punctured part of the patient drape 1002 being covered by the base of the tracker 708 such that contamination to the sterile side of the patient drape 1002 is highly improbable.
- the patient drape 1002 may be sandwiched between the sterile tracker 708 and the tracker mounting arm 702 , the patient drape 1002 being sufficiently thin so as to not significantly affect the position of the sterile tracker 708 on the tracker mounting arm 702 and allow a sufficiently strong connection between the sterile tracker 708 and the tracker mounting arm 702 to not allow the sterile tracker 708 to fall off the tracker mounting arm 702 due to movement of the patient drape 1002 during the surgical procedure.
- the intra-operative localization system 1100 may use a sterile tracker 708 and a non-sterile tracker 708 at different stages of a surgical procedure.
- the sterile tracker 708 may have the same geometry as the non-sterile tracker 708 ; alternatively, the respective geometries may be different, in which case the difference may be known to the computing unit 114 and be factored into calculations of poses at different stages of a surgical procedure accordingly.
- the camera 102 , camera mounting arm 104 , camera drape 904 , camera drape window 906 and shroud 902 may be configured similar to the above described sterile draped camera intra-operative localization system 1000 .
- the camera mounting arm 104 protrudes from the camera mounting arm opening 1004 .
- the camera 102 may provide a user interface (comprising for example buttons, indicator lights and/or displays).
- the intra-operative localization system 1100 may allow the user to access to the user interface both when sterile and non-sterile.
- the user interface may be accessible and/or functional through the sterile camera drape 904 . This may be accomplished, for example, by the camera drape 904 being of clear and flexible material.
- FIG. 12 depicting a computer-implemented method 1200 for storing differences in geometries of a non-sterile tracker 708 and a sterile tracker 708 and using a registration between a camera 102 and the non-sterile tracker 708 and the stored differences when calculating poses during a sterile stage of a surgery for a patient.
- a computing unit 114 stores (at 1202 ) the differences between geometries of a non-sterile tracker 708 and a sterile tracker 708 , the non-sterile tracker 708 for use during a non-sterile stage of a surgery for a patient and the sterile tracker 708 for use in place of the non-sterile tracker 708 during a sterile stage of the surgery for the patient.
- the computing unit 114 calculates (at 1204 ) a registration of the camera 102 with respect to the non-sterile tracker 708 during the non-sterile stage.
- the computing unit 114 uses the registration and differences stored by the computing unit 114 when calculating poses.
- the computing unit 114 may store a relative position between the non-sterile tracker 708 when mounted to a mounting arm 702 before the patient drape 1002 is applied and the sterile tracker 708 when mounted to the mounting arm 702 via the sterile tracker adaptor of the patient drape 1002 after the patient drape 1002 is applied, based on the geometrical properties of the adaptor.
- the computing unit 114 may use the stored relative position when calculating poses.
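- The stored geometric difference can be applied as a rigid transform chained onto the camera's existing registration; the following is a minimal sketch, assuming 4×4 homogeneous matrices and a hypothetical function name, not the described implementation.

```python
import numpy as np

def sterile_tracker_registration(T_cam_nonsterile, T_nonsterile_sterile):
    """Registration of the camera to the sterile tracker, computed from the
    camera's registration to the non-sterile tracker and the stored
    geometric difference between the two trackers.

    T_cam_nonsterile     : pose of the non-sterile tracker in the camera
                           frame (4x4 homogeneous matrix)
    T_nonsterile_sterile : stored rigid transform mapping sterile-tracker
                           coordinates into non-sterile-tracker coordinates
    """
    # Chaining the stored difference onto the existing registration avoids
    # re-registering the camera after the sterile tracker is swapped in.
    return T_cam_nonsterile @ T_nonsterile_sterile
```

A point expressed in sterile-tracker coordinates then maps into the camera frame identically whether it is transformed through the chained registration or step by step.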
- it may be desirable to reposition the camera 102 , for example, in order to achieve better viewing angles of the instruments 110 being tracked as part of the surgical procedure.
- the positional relationship between the camera 102 and the patient's anatomy 112 may be registered and moving the camera 102 would compromise this registration.
- a system is described herein to provide a “Move Camera” function, allowing the camera 102 to be moved between a plurality of positions and orientations while maintaining a registration that allows the position and orientation of instruments 110 to be tracked relative to the patient's anatomy 112 .
- the camera 102 may be repositioned without compromising the registration using the following system.
- a tracker 708 either sterile or non-sterile, is rigidly attached to the head-clamp 108 and is within the field of view and trackable by the camera 102 .
- Computing unit 114 captures a first pose of the rigidly-mounted tracker 708 relative to the camera 102 and the patient anatomy 112 , and computes a tracker-patient anatomy registration (i.e. the registration between the coordinate-frame of the tracker 708 and the coordinate-frame of the patient anatomy 112 ), which is stored in the memory of the computing unit 114 .
- after the camera 102 is moved, the computing unit 114 captures a second pose of the rigidly-mounted tracker 708 relative to the camera 102 and the patient anatomy 112 and computes a new camera-tracker registration (i.e. the registration between the coordinate-frame of the camera 102 in its new position and orientation and the coordinate-frame of the tracker 708 ).
- the computing unit 114 then computes a new camera-patient anatomy registration (i.e. the registration between the coordinate-frame of the camera 102 in its new position and orientation with the coordinate-frame of the patient anatomy 112 ) by applying the tracker-patient anatomy registration stored in the memory of the computing unit 114 to the new camera-tracker registration.
- the memory of the computing unit 114 may include instructions to display guidance to a user describing how to capture the poses of the camera 102 and to compute the registrations accordingly.
- the computing unit 114 may also display graphical instructions to the user on a display of the computing unit 114 to guide them through the aforementioned steps.
- the computing unit 114 may be further configured to display a graphical indication of how the camera's working volume aligns with the surgical site, after capturing the first pose and before capturing the second pose (i.e. to give a user visual feedback to help them align the camera to a location that has an improved view of the surgical site).
- FIG. 13 depicting a computer-implemented method 1300 for updating a registration of a camera 102 with respect to a patient's anatomy 112 when the camera 102 is moved from a first pose to a second pose, based on the camera's pose relative to a tracker 708 and the patient's anatomy 112 .
- a computing unit 114 captures (at 1302 ) a first pose between the camera 102 having a first positional relationship with respect to the patient's anatomy 112 and the tracker 708 .
- the computing unit 114 captures (at 1304 ) a second pose between the camera 102 having a second positional relationship with respect to the patient's anatomy 112 and the tracker 708 .
- the computing unit 114 calculates (at 1306 ) an updated registration of the camera 102 with respect to the patient's anatomy 112 .
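- The transform chain underlying the "Move Camera" steps above can be sketched with 4×4 homogeneous matrices: because the tracker is rigid with respect to the anatomy, the tracker-to-anatomy transform captured at the first camera pose remains valid at the second. The function and variable names below are illustrative assumptions, not the described implementation.

```python
import numpy as np

def update_camera_registration(T_cam1_tracker, T_cam1_anatomy, T_cam2_tracker):
    """Recompute the camera-to-anatomy registration after the camera moves.

    T_cam1_tracker : tracker pose in the first camera frame (4x4)
    T_cam1_anatomy : anatomy registration in the first camera frame (4x4)
    T_cam2_tracker : tracker pose in the second camera frame (4x4)

    The tracker-to-anatomy transform is invariant under camera motion:
        T_tracker_anatomy = inv(T_cam1_tracker) @ T_cam1_anatomy
    so the updated registration is
        T_cam2_anatomy = T_cam2_tracker @ T_tracker_anatomy
    """
    T_tracker_anatomy = np.linalg.inv(T_cam1_tracker) @ T_cam1_anatomy
    return T_cam2_tracker @ T_tracker_anatomy
```

The result is independent of how the camera moved: any rigid camera motion cancels through the tracker observation.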
- a "4-up" style visualization of the patient's anatomy may be used to display multiple planes of a three dimensional ("3D") medical image (e.g. MRI or CT image) of the patient's anatomy 112 .
- FIG. 14 illustrates a “4-up” view 1400 that would be displayed on a display of a computing unit in an intra-operative localization system.
- the "4-up" style of visualization 1400 includes, for a current location within the patient's 3D medical image: (1) a two dimensional cross-section of the 3D medical image in the coronal plane 1402 of the patient's anatomy 112 , (2) a two dimensional cross-section of the 3D medical image in the sagittal plane 1404 of the patient's anatomy 112 , (3) a two dimensional cross-section of the 3D medical image in the transverse plane 1406 of the patient's anatomy 112 , and (4) an isometric view 1408 of the 3D medical image.
- the “4-up” view 1400 may update in real-time based on the position of a navigated surgical instrument 110 .
- the two-dimensional coronal, sagittal and transverse cross-sections of the 3D medical image ( 1402 , 1404 , 1406 ) may be updated in real time to reflect the position of a tracked probe, where the planes of the coronal, sagittal and transverse two dimensional cross-sections of the 3D image ( 1402 , 1404 , 1406 ) pass through the current position of the tip of the probe relative to the patient's anatomy 112 .
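- As a simplified sketch of how the three cross-sections can track the probe tip, assume the 3D medical image is held as a NumPy array indexed [sagittal, coronal, transverse] and that the tip position has already been converted to a voxel index; both the axis convention and the function name are assumptions for illustration.

```python
import numpy as np

def four_up_slices(volume, tip_index):
    """Return the sagittal, coronal and transverse cross-sections of a 3D
    image volume passing through the tracked probe tip.

    volume    : 3D array indexed [sagittal, coronal, transverse]
    tip_index : (i, j, k) voxel index of the probe tip; indices outside
                the volume are clipped to its bounds
    """
    i, j, k = (int(np.clip(v, 0, s - 1)) for v, s in zip(tip_index, volume.shape))
    return volume[i, :, :], volume[:, j, :], volume[:, :, k]
```

In a live display loop, the tracked tip pose would be mapped through the image registration to a voxel index each frame, and the three returned slices redrawn.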
- the isometric view 1408 may be modified to enhance visualization of the anatomical features of interest.
- the isometric view 1408 may be modified to provide cut-away views of the patient's anatomy 112 such that regions of interest inside the anatomical volume, for example the brain, may be displayed.
- the regions of interest may include structures or lesions identified during pre-operative planning. Further, the regions of interest may be displayed in a way that is visibly distinguished from other areas within the anatomy.
- the areas of interest may be pre-operatively identified and may be pre-operatively segmented within the medical image such that they may be viewed and/or manipulated independently from the other anatomy.
- the isometric view 1408 may be modified in real-time based on the pose of a tracked probe.
- the cut-away view of the patient's anatomy 112 may be displayed based on the pose of the tracked probe.
- FIG. 15 illustrates orthogonal three-dimensional cut-planes 1500 defined by the pose of a probe 1514 .
- the probe may comprise a shaft 1508 , a tip 1510 , and a body 1512 with optically trackable features.
- a first cut-plane 1502 is defined as being perpendicular to the shaft 1508 of the probe 1514 and containing the point defined by the tip 1510 of the probe 1514 .
- a second cut-plane 1504 is defined as being parallel to the front face of the probe 1514 and containing the vector defined by the shaft 1508 of the probe 1514 .
- a third cut-plane 1506 is defined as being perpendicular to the front face of the probe 1514 and containing the vector defined by the shaft 1508 of the probe 1514 .
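The three cut-plane definitions above can be expressed compactly as point-normal pairs; the following sketch assumes the probe pose is given as a tip position, a unit shaft direction, and a unit front-face normal perpendicular to the shaft (the function name and representation are assumptions):

```python
import numpy as np

def probe_cut_planes(tip, shaft_dir, face_normal):
    """Three orthogonal cut-planes from a tracked probe pose, each as a
    (point, unit-normal) pair.

    tip         : 3-vector, probe tip position
    shaft_dir   : unit 3-vector along the probe shaft
    face_normal : unit 3-vector normal to the probe's front face
                  (assumed perpendicular to shaft_dir)
    """
    tip = np.asarray(tip, dtype=float)
    s = np.asarray(shaft_dir, dtype=float)
    f = np.asarray(face_normal, dtype=float)
    plane1 = (tip, s)        # perpendicular to the shaft, containing the tip
    plane2 = (tip, f)        # parallel to the front face, containing the shaft
    n3 = np.cross(s, f)      # perpendicular to the front face, containing the shaft
    plane3 = (tip, n3 / np.linalg.norm(n3))
    return plane1, plane2, plane3
```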
- FIG. 16 illustrates a hand-held intra-operative localization system 1600 showing camera 102 held in a user's hand 1612 , while tracking the pose of a tracker 106 in the field of view 1610 of the camera 102 where the tracker 708 is registered to the patient's anatomy 112 (i.e. head).
- the registration includes image registration of a patient medical image to the patient anatomy 112 .
- the tracker 708 is fixedly attached to the patient's anatomy 112 , which may be accomplished, for example, via a mounting structure 1614 : the mounting structure 1614 may be attached to a head-clamp 104 , or the mounting structure 1614 may be affixed directly to the patient's anatomy 112 .
- a tracker 708 may comprise individual fiducial markers attached to the patient's anatomy 112 forming a trackable array.
- the tracker 708 may be attached with non-invasive means (e.g. suction cups, elastic straps, glasses frames, stickers, individually attachable fiducial markers, clamps).
- it may be advantageous for the tracker 708 to be rigidly anchored directly to bone using invasive means, such as bone screws.
- a tracker 708 is attached to a patient's anatomy 112 (i.e. head) via a mounting structure 1614 that includes a head band and a frame that contacts a patient's ears and the bridge of their nose.
- the pose of the camera 102 may be used to modify the visualization displayed on the computing unit 114 , for example, by modifying any of the coronal, sagittal and transverse two dimensional cross-sections 1402 , 1404 , 1406 and the isometric view 1408 to correspond to the camera 102 coordinate-frame.
- FIG. 17 illustrates orthogonal camera-based reference planes 1700 defined by the camera coordinate-frame.
- the camera 102 may be held in a user's hand 1612 and the orthogonal three-dimensional cut-planes 1700 will be defined to correspond with the current pose of the camera 102 .
- the camera-based reference plane used to modify the coronal plane 1402 displayed in the 4-up view 1400 is referred to as the coronal′ plane 1702 .
- the camera-based reference plane used to modify the sagittal plane 1404 displayed in the 4-up view 1400 is referred to as the sagittal′ plane 1704 .
- the camera-based reference plane used to modify the transverse plane 1406 displayed in the 4-up view 1400 is referred to as the transverse′ plane 1706 .
- Each of the orthogonal reference-planes may share a common origin 1708 displaced from the camera 102 .
- the origin 1708 is displaced by a distance “d” from the camera 102 along the optical axis 1712 .
- the distance “d” may be any distance and may be selected by the user. Further, the distance “d” may be selected such that when any anatomy of interest is located at the origin 1708 , the tracker 708 is viewable by the camera 102 (i.e. the tracker 708 is within the camera's working volume).
- the coronal′ plane 1702 may be further defined as being parallel to the plane of the camera's optical imager 1710 and perpendicular to the optical axis 1712 of the camera 102 .
- the sagittal′ plane 1704 may be further defined as being perpendicular to the plane of the camera's optical imager 1710 and parallel to the vector of the vertical axis 1714 of the camera 102 .
- the transverse′ plane 1706 may be further defined as being perpendicular to the plane of the camera's optical imager 1710 and perpendicular to the vector of the vertical axis 1714 of the camera 102 .
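Under the definitions above, the common origin and the three plane normals fall directly out of the camera pose. A minimal sketch, assuming the camera pose is a 4x4 homogeneous matrix whose rotation columns are the imager's horizontal axis, vertical axis and optical axis (the function name and matrix convention are assumptions):

```python
import numpy as np

def camera_reference_planes(T_cam, d):
    """Camera-based coronal', sagittal' and transverse' reference planes.

    T_cam : 4x4 camera pose; rotation columns assumed to be the imager
            horizontal axis (x), vertical axis (y) and optical axis (z)
    d     : user-selected standoff distance along the optical axis
    Returns the shared origin and a dict of unit plane normals.
    """
    R, t = T_cam[:3, :3], T_cam[:3, 3]
    x_axis, y_axis, optical_axis = R[:, 0], R[:, 1], R[:, 2]
    origin = t + d * optical_axis            # common origin displaced by d
    normals = {
        "coronal'": optical_axis,            # parallel to imager, perp. to optical axis
        "sagittal'": x_axis,                 # perp. to imager, parallel to vertical axis
        "transverse'": y_axis,               # perp. to imager, perp. to vertical axis
    }
    return origin, normals
```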
- the coronal′ plane 1702 , sagittal′ plane 1704 and transverse′ plane 1706 may be used instead of anatomical planes to visualize the patient's anatomy 112 via the slices depicted in the 4-up style visualization displayed on the display of the computing unit 114 .
- the patient's anatomical reference planes may be used.
- the system may allow a user to select which reference planes (i.e. camera-based or patient based) are displayed, for example, via buttons located on the camera 102 .
- the isometric view 1408 may also be modified to correspond to the camera-based reference planes 1702 , 1704 , 1706 , either in conjunction with, or independently from, the two-dimensional slices.
- when the isometric view 1408 is modified independently from the slices to correspond to the camera-based reference planes 1702 , 1704 , 1706 , the two-dimensional slices in the 4-up style visualization displayed on the display of the computing unit 114 will remain based on the coronal, sagittal and transverse patient reference planes.
- FIG. 18 depicts a computer-implemented method 1800 for modifying a view of a patient's anatomy in a medical image based on camera reference planes and the pose of a camera.
- a computing unit 114 provides (at 1802 ) at least one view of the patient's anatomy 112 for display to a user on a display unit.
- the computing unit 114 receives (at 1804 ) optical signals from a camera 102 comprising pose data of a tracker 708 having a fixed positional relationship with respect to the patient's anatomy 112 .
- the computing unit 114 calculates (at 1806 ) a registration of the tracker coordinate frame to the patient's anatomy 112 and (at 1808 ) the pose of the camera 102 with respect to the patient's anatomy 112 .
- the computing unit modifies (at 1810 ) the at least one view based on camera-based reference planes and the pose of the camera 102 with respect to the patient's anatomy 112 .
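Steps 1804 through 1810 reduce to composing rigid transforms: the camera measures the tracker pose in its own frame, the registration relates the tracker frame to the anatomy, and the two together give the camera pose with respect to the anatomy. A minimal sketch, assuming 4x4 homogeneous matrices and the (hypothetical) naming convention T_a_b for the pose of frame b expressed in frame a:

```python
import numpy as np

def camera_pose_in_anatomy(T_cam_tracker, T_anatomy_tracker):
    """Pose of the camera in the anatomy frame.

    T_cam_tracker     : tracker pose measured in the camera frame (step 1804)
    T_anatomy_tracker : registration of the tracker frame to the anatomy (step 1806)
    """
    # Invert the measurement to get the camera pose in the tracker frame,
    # then carry it into the anatomy frame via the registration (step 1808).
    T_tracker_cam = np.linalg.inv(T_cam_tracker)
    return T_anatomy_tracker @ T_tracker_cam
```

The resulting pose is what step 1810 uses to orient the camera-based reference planes within the medical image.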
- the camera 102 may provide a projector for projecting a visible pattern onto the patient's anatomy 112 .
- the projector may be any means of projecting the visible pattern onto the patient's anatomy 112 .
- the visible pattern may be generated via two planar lasers, the planar lasers being perpendicular to each other, and parallel to the sagittal′ plane 1704 and transverse′ plane 1706 , and further passing through the origin 1708 , displaced from the camera by distance “d”.
- the location of where the reference planes intersect with the patient's outer surface may be physically represented on the patient, while the views displayed on the display of the computing unit 114 are based on those same reference planes.
- a line-projecting laser may be used, the line passing through the origin. Therefore, a user may have an enhanced ability to visualize a patient's anatomy 112 based on the displayed views and the projected pattern on the patient.
- the camera 102 and a display 1902 may be head-mounted and may be contained in an augmented reality headset 1904 worn by the surgeon 1906 .
- Display 1902 may include a projector and a surface upon which to project.
- the surface may be a glass or plastic surface of the headset, such as a lens carried by an eyeglasses frame.
- the headset 1904 may be coupled to computing unit 114 such as via one or more cables/cabling 1908 providing an augmented reality system 1900 .
- Computing unit 114 may also comprise an integrated display device (e.g. display screen) for presenting information to the surgeon or others.
- the computing unit 114 can compute the pose of the camera 102 relative to the tracker 708 and patient anatomy 112 as the surgeon 1906 moves their head, thus ensuring the proper alignment of the overlay of the virtual view of the patient anatomy 112 on the user's actual view of the patient.
- the headset may incorporate computing unit 114 itself (not shown).
- the camera may be located closer to an eye of the surgeon, such as on a corner of the frame of the headset, in front of a portion of the glass/plastic lens to more closely align with the surgeon's field of vision.
- the augmented reality display in such an augmented reality system is preferably transparent to allow the surgeon to see through the display of the computing unit 114 to directly see the patient.
- the computing unit 114 may receive a real-time feed from the camera 102 of the patient anatomy. This real-time feed from the camera 102 may be displayed on the augmented reality display of the computing unit 114 .
- the overlaid virtual view of the patient's anatomy 112 may be opaque or partially transparent. In any headset embodiment, it may be preferable to only display a single view, rather than the “4-up” view displayed on a non-augmented reality display of a computing unit 114 . If a single view is presented, it may be based on the coronal′ plane 1702 and the origin 1708 .
- a virtual view of any pre-identified regions of interest may also be persistently displayed from a perspective that matches the coordinate-frame of the camera 102 and thus be representative of the user's actual perspective of the patient's anatomy 112 .
- the relative pose between the tracker 708 , which is attached to the patient, and the camera 102 , whose pose can be manipulated by a user, may be used to control the pan and tilt of an isometric view 1408 , displayed to the user.
- the camera 102 may provide a button, which when pressed, causes the computing unit 114 , connected to the camera 102 , to enter a pan or tilt mode, in which the changes to the relative pose of the tracker 708 and camera 102 cause a change to the pan and/or tilt of the displayed isometric view 1408 .
- in tilt mode, for example, a change in the relative pose of the tracker 708 and camera 102 results in a corresponding change in the tilt of the displayed isometric view 1408 of the medical image.
- in pan mode, for example, a relative translation of the tracker 708 and camera 102 causes a corresponding translation of the isometric view 1408 of the medical image.
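The pan and tilt interaction can be sketched as follows; the class and method names, and the simple mapping of the pose change to a net rotation angle (tilt) and a translation (pan), are illustrative assumptions rather than the patent's specified implementation:

```python
import numpy as np

class PanTiltController:
    """Maps changes in the relative camera-tracker pose, measured since
    the mode button was pressed, onto the displayed isometric view."""

    def __init__(self):
        self.reference = None  # relative pose captured on mode entry

    def enter_mode(self, T_rel):
        # T_rel: 4x4 relative pose of the tracker in the camera frame
        self.reference = np.asarray(T_rel, dtype=float)

    def update(self, T_rel):
        # pose change since the mode was entered
        delta = np.linalg.inv(self.reference) @ np.asarray(T_rel, dtype=float)
        # tilt mode: net rotation angle of the pose change, in degrees
        cos_a = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
        tilt_deg = float(np.degrees(np.arccos(cos_a)))
        # pan mode: translation component of the pose change
        pan = delta[:3, 3]
        return tilt_deg, pan
```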
- an intra-operative localization system may provide a virtual probe to aid in planning the navigation of the surgical procedure.
- FIG. 20 illustrates a corresponding probe and virtual probe pair 2000 .
- the system provides a virtual probe 2004 comprising a body 2008 with optically trackable features and a location for a user to grasp, such as a handle; the virtual probe 2004 may not have a physical shaft or tip.
- An intra-operative localization system using a virtual probe 2004 may further comprise a computing unit 114 that has access to the location of the virtual tip 2006 of the virtual probe 2004 , which is located at a distance, “d T ”, relative to the body 2008 .
- the computing unit 114 stores or has access to a stored definition of the virtual probe 2004 .
- the computing unit 114 then provides a navigational display of the patient's anatomy 112 , where the displayed view(s) are modified based on the position of the virtual tip 2006 of the virtual probe 2004 .
- the system may also provide a probe 1514 , comprising a body 1512 with optically trackable features and a location for a user to grasp, as well as a shaft 1508 and a tip 1510 extending from the body 1512 .
- the probe 1514 and the virtual probe 2004 may have similar or identical features on the respective bodies 1512 , 2008 .
- the main difference between the two probes 1514 , 2004 may be that the probe 1514 provides a physical shaft 1508 with a tip 1510 for localization, whereas the virtual probe 2004 does not.
- the virtual tip 2006 location relative to the virtual probe body 2008 (as accessed by the computing unit 114 ) is the same as the physical tip 1510 location of the probe 1514 relative to the probe body 1512 .
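Locating the virtual tip is a single frame transformation of the stored offset. A minimal sketch with hypothetical names, assuming the tracked body pose is a 4x4 homogeneous transform into the anatomy frame and the stored tip offset "d_T" is expressed in the body frame:

```python
import numpy as np

def virtual_tip_position(T_anatomy_body, tip_offset_body):
    """Virtual tip position in anatomy coordinates.

    T_anatomy_body  : 4x4 tracked pose of the virtual-probe body
    tip_offset_body : 3-vector, stored tip offset in the body frame
                      (the stored definition the computing unit accesses)
    """
    p = np.append(np.asarray(tip_offset_body, dtype=float), 1.0)
    return (np.asarray(T_anatomy_body, dtype=float) @ p)[:3]
```

Because the physical probe's tip shares the same offset relative to its body, the same calculation serves both probes; only the source of the offset (stored definition versus physical shaft) differs.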
- the system may further be configured to provide the virtual probe 2004 for non-sterile use, and the probe 1514 for sterile use.
- a system for performing a navigated medical procedure comprising: a camera configured to be mounted relative to a patient's anatomy by a mounting arm, the camera being configured to detect optical signals comprising pose information of an object at the surgical site and to provide the optical signals to a computing unit for calculating pose; a tracker configured to provide optical signals for detection by the camera, the tracker attached to or inherently a part of the object; the mounting arm further configured to provide positional adjustment to orient the camera toward the surgical site; the camera and mounting arm being further configured to be enclosed within a sterile camera drape for use within a sterile field; the position of the camera relative to the anatomy not changing when enclosed within the sterile camera drape.
- the system may further include the sterile camera drape.
- the sterile camera drape may further provide a window to allow for optical transmission of signals comprising pose information.
- the system may further comprise a shroud for securing the sterile camera drape to the camera.
- the camera may further comprise shroud features to mate the shroud with the camera to secure the sterile camera drape.
- the system may be configured to secure the sterile camera drape to the camera such that the window is secured in alignment with the optical path of the camera.
- the shroud may be configured to secure the camera drape to the camera via spring forces.
- the mounting arm may be configured to rigidly attach to the camera.
- the system may further comprise a camera clamp to rigidly hold the camera, the camera clamp being further configured to provide a mounting mechanism to the mounting arm.
- the mounting arm may be configured to releasably and repeatably attach to the camera.
- the camera or camera clamp may provide a kinematic mounting mechanism and the mounting arm may provide a complementary kinematic mounting mechanism.
- the mounting arm may be configured to provide positional alignment via lockable joints.
- the mounting arm may comprise at least one joint for positional adjustment where the at least one joint is a lockable ball joint.
- the mounting arm may comprise multiple joints where the multiple joints are lockable by a single user-adjustable mechanism.
- the mounting arm positional adjustment may be performed when enclosed within the sterile camera drape, for example via adjustment members that are grippable through the drape.
- the mounting arm may be configured for mounting to a patient immobilizer or a patient positioner.
- the patient immobilizer may be a head clamp.
- the mounting arm may be configured for rigid fixation to the patient's anatomy.
- Rigid fixation may be provided through a Mayfield clamp or via bone screws secured to the patient's anatomy.
- the patient's anatomy may be one of a cranium, a spine, a pelvis, a femur, a tibia, a hip, a knee, a shoulder, an ankle.
- a system for performing a navigated surgical procedure comprising: a sterile camera drape configured to provide a sterile barrier for a camera mounting arm and a camera attached thereto.
- the sterile drape may be configured to allow positional adjustment of the positionally-adjustable camera mounting arm when providing the sterile barrier.
- the sterile camera drape may further provide a window to allow for optical transmission of signals comprising pose information from the camera to a computing unit.
- the sterile camera drape window may be made of a rigid, thin and flat optically transparent material.
- the sterile camera drape may be configured to be secured in alignment with a camera such that the optical window is in alignment with the optical path of the camera.
- the sterile camera drape may be configured to extend from the camera to at least the base of the camera mounting arm.
- the sterile camera drape may comprise a mechanism for providing a continuous sterile barrier with a patient drape.
- the mechanism may be a sterile elastic band, configured to tightly hold together the patient drape with the sterile camera drape.
- the mechanism may be an adhesive strip, which is configured to be applied at a location where the sterile camera drape and patient drape intersect.
- the mechanism may be a plurality of adhesive sections encircling the sterile camera drape at various locations along the length of the sterile camera drape, configured to enable circumferential adhesion to a patient drape at a desired location along the length of the camera drape.
- a system for performing a navigated medical procedure comprising: a mounting arm configured to attach to a camera, the camera configured to detect optical signals comprising pose information of objects at a surgical site and to provide the optical signals to a computing unit for calculating pose, the mounting arm having a proximal end comprising an attachment mechanism and a distal end comprising a base mounting mechanism.
- the mounting arm may comprise a user-adjustable mechanism to adjust the relative position and orientation of the proximal and distal ends in up to 6 DOF.
- the user-adjustable mechanism may comprise at least one lockable ball joint.
- the user-adjustable mechanism may comprise a gooseneck mechanism.
- the base mounting mechanism may be configured to attach to one of: a mobile cart; an operating table, by providing a compatible clamp for operating room table rails; or a head clamp, by providing a starburst connector.
- the mounting arm may comprise a second connector at the distal end.
- the second connector may comprise a same connector that is complementary to the base mounting mechanism.
- the mounting arm may be further configured to selectively attach to a tracker.
- the system may comprise a second mounting arm, the second mounting arm configured to attach to a tracker.
- the second mounting arm may be configured to attach to a sterile tracker and a non-sterile tracker, for example, one at a time.
- the second mounting arm may be configured to attach to a sterile tracker through a sterile patient drape.
- a system for performing a navigated medical procedure comprising: a virtual probe comprising a body providing a tracker and a surface (e.g. handle) to be grasped by a user, the tracker of the virtual probe configured to provide optical signals comprising pose data to a camera in communication with a computing unit, the computing unit configured to provide a view of a patient's anatomy for display, the view of the patient's anatomy being based on a medical image, the computing unit further configured to modify the view based on a registration and further based on a location of a tip of the virtual probe, the location of the tip of the virtual probe relative to the pose of the tracker represented by the pose data being accessible in a memory to the computing unit.
- the view may be modified to show the location of the tip of the virtual probe in the medical image.
- the medical image may be one of a CT-scan and an MRI-scan.
- the system may further comprise a probe comprising a body comprising a tracker and a user-graspable aspect, and further comprising a shaft and a tip extending from the body.
- the location of the tip of the virtual probe may have the same positional relationship to the virtual probe body as the positional relationship between the probe tip and the probe body.
- the virtual probe may be provided for non-sterile use and the probe being provided for sterile use.
- the patient's anatomy may be a cranium and brain.
- the patient's anatomy may be a hip or knee.
- the patient's anatomy may be a vertebra.
Abstract
Systems, methods and devices are described herein for performing a navigated surgical procedure involving a patient's anatomy in sterile and non-sterile surgical environments. A camera may be used to determine a registration of the camera coordinate-frame to the patient anatomy or optionally a tracker in relation to the patient anatomy. A drape may be applied to permit use in a sterile surgical environment. The camera may be moved from its original position to enable access to patient anatomy while maintaining a registration of the camera coordinate-frame with the patient anatomy. Alternatively, the camera may be used in a hand-held or head-mounted manner. A visualization of the patient anatomy may be displayed on a computing unit, with visualization reference planes defined by the pose of an instrument or the camera. The visualization may be presented on a display of a computing unit or as part of a head mounted augmented reality system.
Description
- This application is a continuation of U.S. application Ser. No. 16/331,236 (the “'236 application”), having a filing or 371(c) date of Mar. 7, 2019 and entitled “Systems and Methods for Surgical Navigation, Including Image-Guided Navigation of a Patient's Head”, which is incorporated by reference in its entirety. The '236 application is a 371 of PCT/IB2017/055400, filed Sep. 7, 2017 (the “PCT application”), having the same title, and which claims the benefit of U.S. Provisional Patent Application No. 62/384,410, filed Sep. 7, 2016, entitled “Systems and Methods for Surgical Navigation, Including Image-Guided Navigation of a Patient's Head”, which is incorporated by reference herein. This application incorporates by reference U.S. Provisional Patent Application No. 62/328,978, filed May 27, 2016, entitled “Systems and Methods to perform image registration and scan 3D surfaces for intra-operative localization”, and PCT Patent Application No. PCT/CA2017/000104, filed Apr. 28, 2017 and entitled “Systems, Methods and Devices to Scan 3D Surfaces for Intra-operative Localization.”
- The present application relates to intra-operative localization systems for use in sterile and non-sterile surgical environments and more particularly to systems, methods and devices to track the pose of instruments relative to patient anatomy, to move a camera of an intra-operative localization system from its original mounting position while maintaining a registration between the camera's coordinate-frame and the patient anatomy, to drape an intra-operative localization system to enable use in a sterile environment and to display a visualization of a medical image of the patient anatomy in such environments.
- Many types of surgery benefit from precise positional navigation of surgical instruments with respect to a patient's anatomy (for example, a patient's head, spine, or joint). For example, in neurosurgery, surgical treatment may include localizing physical lesions identified on a pre-operative image (e.g. MRI) within a patient's brain to perform biopsies, excisions, ablations, etc. Other examples include Functional Endoscopic Sinus Surgery (FESS) in ENT surgery and Deep Brain Stimulation (DBS).
- Systems, methods and devices are described herein for performing a navigated surgical procedure involving a patient's anatomy in sterile and non-sterile surgical environments. A camera may be used to determine a registration of the camera coordinate-frame to the patient anatomy or optionally a tracker in relation to the patient anatomy. A drape may be applied to permit use in a sterile surgical environment. The camera may be moved from its original position to enable access to patient anatomy while maintaining a registration of the camera coordinate-frame with the patient anatomy. Alternatively, the camera may be used in a hand-held or head-mounted manner. A visualization of the patient anatomy may be displayed on a computing unit, with visualization reference planes defined by the pose of an instrument or the camera. The visualization may be presented on a display of a computing unit or as part of a head mounted augmented reality system.
- There is described a method comprising: releasably coupling a proximal end of a non-sterile camera mounting arm to a surgical clamp immobilizing a patient's anatomy; releasably coupling a non-sterile camera to a distal end of the non-sterile camera mounting arm; following a registration between a coordinate frame of the non-sterile camera and the patient's anatomy in a computing unit: draping the non-sterile camera and the non-sterile camera mounting arm with a camera drape to provide a sterile barrier between the patient's anatomy and the non-sterile camera and non-sterile camera mounting arm; wherein draping is performed without moving a position of the non-sterile camera relative to the patient's anatomy.
- The camera drape may be configured to permit the computing unit to use pose data received from the non-sterile camera after the draping with the registration performed before the draping. The camera drape may be configured to permit a transmission of optical signals to the non-sterile camera without distortion.
- Draping the non-sterile camera may comprise enclosing a closed end of a tube-like camera drape over the non-sterile camera, the camera drape extending to cover the non-sterile camera mounting arm.
- The method may comprise using a holding mechanism to hold the camera drape in place over the non-sterile camera. The holding mechanism may comprise a shroud that mechanically clips onto the non-sterile camera. The camera drape may comprise a drape optical window and the holding mechanism holds the optical window in a fixed and correct alignment with optics of the non-sterile camera.
- The method may further comprise sealing an interface of an open end of the camera drape to a patient drape to maintain a continuous sterile barrier. The patient drape may comprise an opening through which the non-sterile camera mounting arm extends and the interface may be defined by the opening of the patient drape and the open end of the camera drape. The camera drape may comprise a sterile elastic band or an adhesive and the method comprises using the sterile elastic band or adhesive when sealing the interface.
- The non-sterile camera may be coupled to the computing unit via a cable and the camera drape encloses a portion of the cable.
- The method may comprise rigidly fixing a non-sterile tracker relative to the patient's anatomy to perform the registration; and following the draping, rigidly fixing a sterile tracker relative to the patient's anatomy to perform surgical navigation without performing a second registration. The non-sterile tracker and sterile tracker may be affixed to a same tracker mounting arm having a same position. If a geometry of the non-sterile tracker and a geometry of the sterile tracker are different, a difference may be factored into calculations of poses by the computing unit when the respective non-sterile tracker and sterile tracker are used.
- There is disclosed a computer implemented method comprising: performing, in a computing unit, a registration between a coordinate frame of a non-sterile camera and a patient's anatomy, where the non-sterile camera is releasably coupled to a distal end of a non-sterile camera mounting arm and a proximal end of the non-sterile camera mounting arm is releasably coupled to a surgical clamp immobilizing the patient's anatomy and the non-sterile camera communicates pose data to the computing unit; and, following a draping of the non-sterile camera and the non-sterile camera mounting arm by a camera drape to provide a sterile barrier between the patient's anatomy and the non-sterile camera and non-sterile camera mounting arm, where the draping is performed without moving a position of the camera relative to the patient's anatomy: calculating, by the computing unit, poses of sterile instruments relative to the patient's anatomy using the registration to provide surgical navigation during a surgical procedure. The camera drape may be configured to permit a transmission of optical signals to the non-sterile camera without distortion.
- The registration may be performed using pose data of non-sterile instruments.
- The camera drape may comprise a drape optical window and a holding mechanism may hold the optical window in a fixed and correct alignment with optics of the non-sterile camera.
- The method may comprise: receiving at the computing unit pose data from the non-sterile camera of a non-sterile tracker rigidly fixed relative to the patient's anatomy to perform the registration; and following the draping, receiving at the computing unit pose data from the non-sterile camera of a sterile tracker rigidly fixed relative to the patient's anatomy to provide the surgical navigation without performing a second registration. The non-sterile tracker and sterile tracker may be affixed to a same tracker mounting arm having a same position. If a geometry of the non-sterile tracker and a geometry of the sterile tracker are different, a difference is factored into calculations of poses by the computing unit when the respective non-sterile tracker and sterile tracker are used.
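Folding the stored geometric difference between the non-sterile and sterile trackers into the original registration amounts to one transform composition, which is how a second registration is avoided. A minimal sketch; the function name and frame-naming convention are assumptions:

```python
import numpy as np

def sterile_registration(T_anatomy_nonsterile, T_nonsterile_sterile):
    """Reuse a registration made with the non-sterile tracker after it is
    swapped for a sterile tracker on the same mount.

    T_anatomy_nonsterile : original registration, mapping the non-sterile
                           tracker frame into the anatomy frame
    T_nonsterile_sterile : stored geometric difference, i.e. the pose of
                           the sterile tracker frame within the
                           non-sterile tracker frame
    Returns the registration of the sterile tracker frame to the anatomy.
    """
    return np.asarray(T_anatomy_nonsterile) @ np.asarray(T_nonsterile_sterile)
```

If the two trackers are geometrically identical, the difference transform is the identity and the original registration is used unchanged.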
- The patient's anatomy may be a cranium.
- There is disclosed a system to drape a patient, a camera mounting arm and a camera attached thereto to provide a sterile barrier for performing a navigated surgical procedure, the system comprising: a sterile camera drape to cover the camera mounting arm and the camera mounted on a distal end of the camera mounting arm, the sterile camera drape comprising a closed end adjacent the camera when draped and an open end distal from the closed end; and a sealing mechanism to seal the camera drape to a sterile patient drape that covers the patient to maintain a continuous sterile barrier.
- The sterile patient drape may provide an opening to receive a proximal end of the camera mounting arm. The open end of the sterile camera drape and the opening of the sterile patient drape may form an interface which is substantially sealed by the sealing mechanism.
- The sealing mechanism may comprise a sterile elastic band, the sterile elastic band engaging the camera drape near or at the open end and the sterile patient drape near or at the opening to form the interface.
- The sterile elastic band may be affixed to the sterile camera drape near or at the open end.
- The sealing mechanism may comprise an adhesive affixed near or at the open end of the sterile camera drape. The adhesive may comprise one or more circumferential adhesive rings comprising an adhesive side and a side affixed near or at the open end of the sterile camera drape, the adhesive side of the one or more circumferential adhesive rings engaging the sterile patient drape near or at the opening to form the interface.
- The sterile patient drape may further comprise: a tubular protrusion with a closed end distal from the sterile patient drape; an adapter near or at the closed end of the tubular protrusion, the adapter comprising a non-sterile connector on a non-sterile side of the sterile patient drape and a sterile connector on a sterile side of the sterile patient drape; where the non-sterile connector is configured to attach to a tracker mounting arm and the sterile connector is configured to attach to a sterile tracker or a sterile tracker mount.
- The sterile patient drape may further comprise a tubular protrusion with a closed end distal from the sterile patient drape; and the closed end of the tubular protrusion may be engaged between the sterile tracker and a tracker mounting arm, and the closed end of the tubular protrusion may be sufficiently thin to not significantly affect the position of the sterile tracker attached to the tracker mounting arm.
- There is disclosed a computer implemented method comprising the steps of: storing, by a computing unit, the differences between geometries of a non-sterile tracker and a sterile tracker, the non-sterile tracker for use during a non-sterile stage of a surgery for a patient and the sterile tracker for use in place of the non-sterile tracker during a sterile stage of the surgery for the patient; calculating, by the computing unit, a registration of a camera with respect to the non-sterile tracker during the non-sterile stage; and during the sterile stage where the sterile tracker is used in place of the non-sterile tracker and the patient is draped with a patient drape, using the registration and differences stored by the computing unit when calculating poses.
- The method may further comprise, before the step of calculating the updated registration, the steps of: storing, by the computing unit, a relative position between the non-sterile tracker when mounted to a mounting arm before the patient drape is applied and the sterile tracker when mounted to the mounting arm via the sterile tracker adaptor of the patient drape after the patient drape is applied based on the geometrical properties of the adaptor; during the sterile stage where the sterile tracker is used in place of the non-sterile tracker and the patient is draped with the patient drape, using the relative position stored by the computing unit when calculating poses.
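The stored geometric difference between the two trackers can be applied as a fixed transform when calculating poses during the sterile stage. A minimal sketch with 4x4 homogeneous transforms in NumPy; the names and numeric values below are illustrative assumptions, not taken from this disclosure:

```python
import numpy as np

def make_pose(R=None, t=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = np.eye(3) if R is None else R
    T[:3, 3] = t
    return T

# Hypothetical stored quantities (names and values are illustrative):
# T_cam_nonsterile: non-sterile tracker pose in the camera frame, captured
#                   during the non-sterile registration stage.
# T_diff:           stored geometric difference between the sterile and
#                   non-sterile trackers on the same mounting arm (e.g. a
#                   thin drape-adaptor offset).
T_cam_nonsterile = make_pose(t=(0.10, 0.02, 0.50))   # metres
T_diff = make_pose(t=(0.0, 0.0, 0.005))

# During the sterile stage the camera observes the sterile tracker:
T_cam_sterile = T_cam_nonsterile @ T_diff

# Applying the stored difference recovers the equivalent non-sterile pose,
# so the registration from the non-sterile stage remains usable.
recovered = T_cam_sterile @ np.linalg.inv(T_diff)
assert np.allclose(recovered, T_cam_nonsterile)
```

Composing with the inverse of the stored difference is one straightforward way to reuse the original registration; the patent text itself does not prescribe a particular matrix formulation.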
- There is disclosed a system for performing a navigated medical procedure, the system comprising: a computer readable storage medium storing instructions which, when executed on a computing unit, configure the computing unit to: capture a first pose between a camera having a first positional relationship with respect to the anatomy of a patient and a tracker having a second positional relationship with respect to the anatomy of the patient; capture a second pose between the camera having a third positional relationship with respect to the anatomy of the patient and the tracker having the second positional relationship; and calculate an updated registration of the camera with respect to the patient's anatomy based on a first registration, the first pose and the second pose.
- The system may comprise a camera mounting arm configured to hold the camera in the first positional relationship, and further configured to be positionally adjustable such that the camera is moveable to the third positional relationship.
- The system may further comprise the tracker for mounting to have the second positional relationship.
- The system may further comprise a tracker mounting arm configured to hold the tracker in the second positional relationship.
- The instructions may configure the computing unit to calculate the first registration based on pose data and a medical image of the patient's anatomy.
- The instructions may configure the computing unit to provide graphical instructions to a display unit instructing a user to perform at least one of the functions of moving the camera or capturing a pose.
- The instructions may configure the computing unit to, after capturing the first pose and before capturing the second pose, provide to a display unit a graphical indication representing the alignment of a working volume of the camera with a surgical site.
- There is disclosed a computer implemented method comprising the steps of: capturing, by a computing unit, a first pose between a camera having a first positional relationship with respect to the anatomy of a patient and a tracker having a second positional relationship with respect to the anatomy of the patient; capturing, by the computing unit, a second pose between the camera having a third positional relationship with respect to the anatomy of the patient and the tracker having the second positional relationship; and calculating, by the computing unit, an updated registration of the camera with respect to the patient's anatomy based on a first registration, the first pose and the second pose.
- The method may further comprise the step of calculating, by the computing unit, the first registration based on pose data and a medical image of the patient's anatomy.
- The method may further comprise the step of providing, by the computing unit, graphical instructions to a display unit instructing a user to perform at least one of the functions of moving the camera or capturing a pose.
- The method may further comprise, after capturing the first pose and before capturing the second pose, the step of providing, by the computing unit, a graphical indication to a display unit representing the alignment of a working volume of the camera with a surgical site.
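The registration-update steps above can be expressed compactly with homogeneous transforms: because the tracker keeps its second (fixed) relationship to the anatomy while the camera moves, the first registration can be carried over to the new camera position. A sketch under that assumption, with illustrative helper names and synthetic values:

```python
import numpy as np

def update_registration(R1, P1, P2):
    """Updated camera-to-anatomy registration after the camera is moved.

    R1: first registration (anatomy expressed in the first camera frame)
    P1: first captured pose (tracker in the first camera frame)
    P2: second captured pose (tracker in the moved camera frame)
    The tracker keeps a fixed relationship to the anatomy, so
    T_tracker_anatomy = inv(P1) @ R1 and the updated registration is
    R2 = P2 @ inv(P1) @ R1.
    """
    return P2 @ np.linalg.inv(P1) @ R1

def _pose(t):
    """4x4 homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Synthetic check built from a known, fixed tracker-to-anatomy transform.
T_K_A = _pose((0.0, 0.05, 0.0))    # anatomy in the tracker frame (fixed)
P1 = _pose((0.1, 0.0, 0.4))        # tracker seen from the first camera pose
P2 = _pose((0.0, 0.1, 0.5))        # tracker seen after the camera moves
R1 = P1 @ T_K_A                    # first registration
R2 = update_registration(R1, P1, P2)
assert np.allclose(R2, P2 @ T_K_A) # updated registration is consistent
```

The single matrix product `P2 @ inv(P1)` is exactly the camera's motion between the two captures, which is why only the two poses and the first registration are needed.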
- There is disclosed a system for visualization of a patient's anatomy, comprising: a camera configured to receive optical signals from a tracker having a fixed positional relationship to a patient's anatomy, the optical signals comprising pose data, and to communicate the optical signals to a computing unit; a computer-readable storage device storing instructions which, when executed on the computing unit, configure the computing unit to: provide at least one view of the patient's anatomy based on a medical image of the patient's anatomy to a display unit for display to a user; receive from the camera the optical signals comprising the pose data; calculate, based on the pose data and registration data, the pose of the camera with respect to the patient's anatomy; and modify the view based on camera-based reference planes and the pose of the camera. The tracker may be non-invasively attached to the patient. The camera may be handheld and movements of the camera may cause the computing unit to modify the view. The camera may be head-mounted and head movements may cause the computing unit to modify the view. The display unit may comprise a head-mounted display and the view provided to the display may be overlaid on the user's view of the patient. The display unit may be transparent to permit a user to see through the display unit. The view provided to the display unit may be partially transparent. The system may further comprise a projector for projecting a visible pattern onto the patient's anatomy, the visible pattern corresponding to the camera-based reference planes. The camera-based reference planes may comprise one plane perpendicular to an optical axis of the camera and displaced from the camera by a positive distance, d, along the optical axis such that when an area of interest of the patient's anatomy is located at the distance, d, from the camera, the tracker is in view of the camera. The distance, d, may be modified.
- There is disclosed a computer implemented method comprising the steps of: providing, by a computing unit, at least one view of a patient's anatomy based on a medical image of the patient's anatomy to a display unit for display to a user; receiving, by the computing unit, optical signals from a camera comprising pose data of a tracker having a fixed positional relationship with respect to the patient's anatomy; calculating, by the computing unit, based on the pose data and registration data, the pose of the camera with respect to the patient's anatomy; and modifying, by the computing unit, the view based on camera-based reference planes and the pose of the camera.
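The camera-based reference plane described above (perpendicular to the optical axis, displaced by a positive distance d) can be computed from the camera pose alone. A short illustrative sketch; it assumes the optical axis is the camera's +z axis, which is a common convention but not stated in this disclosure:

```python
import numpy as np

def camera_reference_plane(T_anat_cam, d):
    """Reference plane perpendicular to the camera's optical axis.

    T_anat_cam: 4x4 pose of the camera in the medical-image (anatomy) frame.
    d:          positive displacement along the optical axis.
    Returns (point, normal) in anatomy coordinates. The optical axis is
    assumed to be the camera's +z axis, a common but unstated convention.
    """
    origin = T_anat_cam[:3, 3]   # camera position in the anatomy frame
    normal = T_anat_cam[:3, 2]   # third column: camera z-axis direction
    return origin + d * normal, normal

# With an identity camera pose, the plane sits d along +z of the camera.
point, normal = camera_reference_plane(np.eye(4), 0.3)
assert np.allclose(point, (0.0, 0.0, 0.3))
```

A cut-plane through the medical image at `point` with normal `normal` would then follow the camera as it moves, which is the view-modification behaviour the method describes.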
- These and other aspects will be apparent to those of ordinary skill in the art. While some teachings herein may be described with reference to one aspect such as a system (or apparatus), or a method (or process), it will be understood that, as is applicable, equivalent aspects are included herein. For example, when a computer system is disclosed herein, an equivalent computer implemented method is also included, and vice versa. Equivalent computer program products comprising a non-transient medium storing instructions to configure a computer system or perform a computer implemented method are also contemplated for disclosed computer systems (or computing units) and computer implemented methods.
- Embodiments disclosed herein will be more fully understood from the detailed description and the corresponding drawings, which form a part of this application, and in which:
-
FIG. 1 depicts an intra-operative localization system in the context of a navigated cranial procedure using a camera attached to a head-clamp and a tracker attached to a probe. -
FIG. 2 depicts a head-clamp with a starburst mechanism connector. -
FIG. 3 depicts a mounting arm with a starburst mechanism connector. -
FIG. 4 depicts an alternative mounting arm with a gooseneck mechanism. -
FIG. 5 depicts a camera-head-clamp assembly where a camera is mounted to a head-clamp. -
FIG. 6 shows a camera-mounting arm assembly where a camera is mounted to a camera mounting arm. -
FIG. 7 depicts a camera-tracker-head-clamp assembly where a camera and a tracker are mounted to a head-clamp. -
FIG. 8 depicts an intra-operative localization system where a tracker mounted to a head-clamp and a camera separately mounted to a cart. -
FIG. 9 depicts a sterile draped intra-operative localization system with a camera drape. -
FIG. 10 depicts a sterile draped intra-operative localization system with a camera drape and patient drape. -
FIG. 11 depicts a sterile draped intra-operative localization system with a camera drape, a patient drape and a tracker mounted to a mounting arm under the patient drape without breaking the sterile barrier. -
FIG. 12 depicts a computer implemented method for storing differences in geometries of a sterile tracker and a non-sterile tracker and using a registration between a camera and the non-sterile tracker and the stored differences when calculating poses during a sterile stage of a surgery for a patient. -
FIG. 13 depicts a computer implemented method for updating a registration of a camera with respect to a patient's anatomy when the camera is moved from a first pose to a second pose, based on the camera's pose relative to a tracker and the patient's anatomy. -
FIG. 14 depicts a 4-up view of the visualization of medical images of a patient's anatomy. -
FIG. 15 depicts orthogonal cut-planes defined relative to a probe. -
FIG. 16 depicts a handheld intra-operative localization system wherein a camera is moveable in a user's hand and a tracker is mounted to a structure on the patient's head. -
FIG. 17 depicts orthogonal reference planes defined by the pose of the camera coordinate frame. -
FIG. 18 depicts a computer implemented method for modifying a view of a patient's anatomy in a medical image based on camera reference planes and the pose of a camera. -
FIG. 19 depicts a head-mounted intra-operative localization system wherein a camera is mounted to a structure of a user's head and a tracker is mounted to a structure on the patient's head. -
FIG. 20 depicts a probe and corresponding virtual probe. - Described herein are systems and methods for performing a navigated surgical procedure involving a patient's anatomy. As the primary example, an image-guided cranial neurosurgical treatment is provided; however, it should be evident that the systems, devices, apparatuses, methods and computer-implemented methods described herein may be applied to any anatomy requiring treatment (e.g. a cranium, a spine, a pelvis, a femur, a tibia, a hip, a knee, a shoulder, or an ankle).
- Several systems, methods and devices will be described below as embodiments. The scope of the claims should not be limited by the embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.
- Reference in the specification to “one embodiment”, “preferred embodiment”, “an embodiment”, or “embodiments” means that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment, and may be in more than one embodiment. Also, such phrases in various places in the specification are not necessarily all referring to the same embodiment or embodiments.
- A computing unit may comprise a laptop, workstation, or other computing device having at least one processing unit and at least one storage device such as memory storing software (instructions and/or data) as further described herein to configure the execution of the computing unit.
-
FIG. 1 illustrates an exemplary intra-operative localization system 100, in the context of a navigated cranial (brain) procedure. In this intra-operative localization system 100, a camera 102 is shown attached to a mounting arm 104 rigidly mounted to a head-clamp 108, with its field of view oriented towards the site of the surgery. A tracker 106 is attached to an instrument 110 (e.g. a probe), the tracker 106 providing optically detectable features for detection by the camera 102. An instrument may be any type of instrument used in a surgical environment, such as a probe, a tool for cutting or resecting tissue, a tool for retracting tissue, or an imaging tool such as an ultrasound probe. The camera 102 transmits camera data (including image data or pose data associated with the tracker 106) to a computing unit 114. The computing unit 114 performs the necessary processing to calculate the position and orientation (pose) of the instrument 110 with respect to the patient's anatomy 112 (i.e. brain), and to display clinically relevant information to the surgeon. The computing unit 114 may also have access to medical image data (such as a magnetic resonance (MRI) image and/or a computed tomography (CT) image of the patient's anatomy, i.e. head/brain), and may further display the navigational information relative to this medical image. - The
intra-operative localization system 100 is registered to the patient's anatomy 112; that is, the patient's anatomical planes/axes/features have a relationship, known to the computing unit 114, with respect to the intra-operative localization system 100 coordinate frame. - In operation, the intra-operative localization system is used in three stages: (1) pre-operative set-up, (2) patient set-up, registration and planning, and (3) surgical navigation.
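The pose calculation just described composes three relationships: the registration (anatomy in the camera frame), the measured tracker pose, and the instrument calibration. A minimal sketch with homogeneous transforms in NumPy; the helper, frame names, and toy values are assumptions for illustration only:

```python
import numpy as np

def _pose(t):
    """4x4 homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def effector_in_anatomy(T_cam_anat, T_cam_tracker, T_tracker_effector):
    """Pose of an instrument effector in the patient-anatomy frame.

    T_cam_anat:         registration (anatomy expressed in the camera frame)
    T_cam_tracker:      instrument-tracker pose measured by the camera
    T_tracker_effector: instrument calibration (effector in the tracker frame)
    """
    return np.linalg.inv(T_cam_anat) @ T_cam_tracker @ T_tracker_effector

# Toy values: anatomy frame coincident with the camera frame, tracker 0.4 m
# ahead of the camera, effector tip 0.1 m beyond the tracker origin.
T_eff = effector_in_anatomy(np.eye(4),
                            _pose((0.0, 0.0, 0.4)),
                            _pose((0.0, 0.0, 0.1)))
assert np.allclose(T_eff[:3, 3], (0.0, 0.0, 0.5))
```

With an Image Registration in hand, the same chain extended by the image-to-anatomy mapping yields the effector position directly in MRI/CT coordinates for display.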
- Pre-operative setup using the
intra-operative localization system 100 includes the following exemplary steps: (a) non-sterile instruments 110 to be used for registration and planning are calibrated, if necessary; (b) medical image data (e.g. MRI data, CT data) of the patient's anatomy 112 is loaded onto the computing unit 114; (c) registration landmarks are selected, if necessary depending on the method of registration. The pre-operative set-up steps may be performed in advance of entering the operating room. Given the small, portable nature of the exemplary intra-operative localization system 100, the pre-operative steps may be performed by a trained user off-site at any convenient time or location. - Calibrating an
instrument 110 generally refers to determining or confirming the spatial relationship between the "effector" of the instrument 110 and the tracker 106 associated with that instrument 110. Various tools/jigs/software routines may be used for instrument 110 calibration. The "effector" of an instrument 110 refers to the aspect of the instrument 110 for which the navigational information is useful. For example: the tip of a biopsy needle; the shaft axis of a probe; the axis, plane, or pattern of a laser; the position and/or orientation of an implantable device. Patient set-up, registration and planning using the intra-operative localization system 100 includes the following exemplary steps: (a) the patient and intra-operative localization system 100 are brought into the operating room; (b) the patient's anatomy 112 (i.e. head) is immobilized via a head-clamp 108; (c) the camera 102 is mounted to the mounting arm 104 which is in turn connected to the head-clamp 108; (d) landmarks and/or features are localized to generate a registration between the camera coordinate-frame and the patient's anatomy 112 (i.e. head), referred to as Localizer System Registration, and optionally the camera 102 with a medical image of the patient's anatomy 112 (i.e. head), as a mapping between the camera coordinate-frame and the coordinate-frame of the medical image or model, referred to as Image Registration. The term "registration" generally refers to at least Localizer System Registration, but may also include Image Registration when required; (e) the registration is verified (e.g. a check to ensure that a virtual spatial representation of a tracked instrument 110 relative to the patient's anatomy 112 (i.e. head) matches the physical spatial relationship between the tracked instrument 110 and the patient's anatomy 112 (i.e.
head); (f) the surgical intervention is planned by, for example, identifying the location of a craniotomy that would provide an optimal path towards a lesion to be resected and/or biopsied. - Surgical navigation using the
intra-operative localization system 100 includes the following exemplary steps during sterile surgical procedures: (a) the patient is prepped and draped; (b) the camera 102 is draped without moving its position relative to the patient's anatomy 112 (i.e. head); (c) the sterile instruments 110 (e.g. a probe connected to a tracker 106) are calibrated. Surgical navigation, regardless of whether it is sterile or non-sterile, includes the step of calculating the pose of at least one instrument 110 and displaying a representation of that instrument 110 on the computing unit 114 in relation to a medical image of the patient's anatomy 112 (i.e. head). Optionally, further registration verifications may be performed to check if the patient's position relative to the camera 102 of the intra-operative localization system 100 is accurate. - A head-
clamp 108 may be any type of clamp for restricting the movement of the patient's head. FIG. 2 illustrates an exemplary head-clamp 200 in the form of a Mayfield-style clamp, which immobilizes a patient's skull by clamping it with three points of fixation 204. The head-clamp 200 may include a connector 202 to connect the head-clamp 200 to an operating table and optionally other mounting arms. The connector 202 of the head-clamp 200 may be any form of connector. The exemplary connector depicted in FIG. 2 is a starburst mechanism. A starburst mechanism is advantageous for rigid connections, since it allows rotational adjustment at the time of connection, but when secured, provides a highly rigid structure that is unlikely to shift when forces or impacts are applied. - With reference to
FIG. 3, in one embodiment, a mounting arm 300 is provided, the mounting arm 300 configured to mount to a head-clamp 108, and thus provide a rigid reference to the patient's anatomy 112 (i.e. the skull, head-clamp 108 and mounting arm 300 have a rigid and fixed positional relationship). The mounting arm 300 comprises a camera mount 306 connected to the arm mechanism allowing for positional alignment of the camera 102. The mounting arm 300 may provide a complementary connector 302 to the head-clamp connector 202. The complementary connector 302 may be any form of connector. The exemplary complementary connector 302 depicted in FIG. 3 is a complementary starburst mechanism. - The
connector 302 on the mounting arm 300 is used to rigidly attach to the head-clamp 108, and may optionally provide an additional connector interface, such as the same starburst interface, for further attachment thereto (e.g. to attach to the operating table, or to attach a second mounting arm 300). The mounting arm 300 is preferably adjustable, such that the camera's working volume may be aligned with the working volume of the surgical instruments 110, referred to as the surgical site. This may be achieved by providing up to 6 Degrees of Freedom (DOF) positional adjustment in the mounting arm 300 with respect to the patient's anatomy 112 (i.e. head). The mounting arm 300 may incorporate any mechanism to achieve positional alignment. The exemplary embodiment in FIG. 3 depicts a starburst-style connector 302 which provides 1 DOF; an angular joint 308 provides another DOF; two ball joints 304 provide multiple degrees of freedom each; in aggregate, this exemplary embodiment of a mounting arm 300 provides 6 DOF of positional alignment. - Once aligned in the desired orientation with respect to the surgical site, the mounting
arm 300 must be rigidly locked in place. This may be accomplished via various mechanisms, including lockable ball joints 304, lockable angular joints 308, lockable gooseneck mechanisms, or any other form of moveable and rigidly lockable mechanism. Knobs may be provided so that the user can lock the mounting arm in place. - With respect to
FIG. 4, an alternate goose-neck mounting arm 400 that is also capable of 6 DOF of positional alignment is depicted. The exemplary embodiment of the goose-neck mounting arm 400 comprises a camera and/or tracker mount 406 connected to a "goose-neck" arm mechanism 404 that allows 6 DOF positional alignment, which is in turn connected to a clamp 402 to rigidly connect the goose-neck mounting arm to the operating table, the head-clamp 108, or any other surface of the surgical environment. - The
exemplary mounting arm 300 and alternative goose-neck mounting arm 400 may be adapted to attach to a tracker instead of a camera. -
FIG. 5 depicts an exemplary camera-head-clamp assembly 500 comprising the head-clamp 108 holding a patient's anatomy 112 (i.e. head), mounting arm 104 and camera 102, where the camera 102 is generally oriented toward the patient's anatomy 112 (signifying the alignment of the camera's working volume with the surgical site). - The
camera 102 may be rigidly fixed to the mounting arm 104 via a camera mount 306, which provides a rigid attachment mechanism fixing the camera 102 to the mounting arm 104. Rigid mechanisms connecting the camera 102 to the mounting arm 104 may include the camera 102 being permanently attached to the mounting arm 104 (e.g. by being integrally formed, via welding, adhesion, or fastening), cam locks, threaded connectors, dovetail connections, or magnetic connectors. The rigid mechanism of the camera mount 306 holds the camera 102 in a fixed position relative to the mounting arm 104 at least for the duration of the surgically navigated portion of the procedure. Alternatively, a releasable and repeatable connection is also contemplated, such that the camera 102 may be removed from the camera mount 306 of the mounting arm 104 (e.g. for convenience), and re-attached in the exact same position. Any highly repeatable connection mechanism is contemplated for this purpose, including magnetic kinematic mounts. The camera mounting arm 104 is connected to the head-clamp 108 via the connector 202 of the head-clamp 108 being connected to the complementary connector 302 of the mounting arm 104. -
FIG. 6 shows an exemplary camera-mounting arm assembly 600 (similar to the assembly 500 of FIG. 5) but where a camera 102 is mounted to a mounting arm 104 via a camera clamp 602 (the clamp 602 having an upper portion and a lower portion, a threaded adjustment mechanism and a hinge joint, such that the threaded adjustment mechanism can apply sufficient force to the camera 102 to rigidly hold it in place, the clamp 602 further having a camera mount 608 comprised of a force-applying feature (e.g. magnet(s)) and kinematic locating features (in this case, mating hemispherical features and v-slots)). The mounting arm 104 provides a complementary and mating camera mount 306. The result is that (a) the camera 102/clamp 602 can be attached to the mounting arm 104 via camera mount 306; (b) the camera 102/clamp 602 can be removed from the mounting arm 104 by pulling it off the camera mount 306; and (c) the camera 102/clamp 602 can be reattached with the exact same positional relationship to the mounting arm 104 via the camera mount 306. The camera 102 includes features 604 configured to optionally connect a shroud. Features 604 may be a raised surface from the shape of the camera body (such as is shown) or an indentation (not shown) or a combination of same for receiving a cooperating surface of the shroud. - In some embodiments, it may be desirable to either have a non-fixed camera location, or to have the ability to move or reposition the
camera 102 relative to the surgical site. In order to facilitate these options, the system may include a tracker with a fixed and rigid positional relationship with respect to the head-clamp 108, for example, via a mounting arm for the tracker. - For example,
FIG. 7 shows a camera-tracker-head-clamp assembly 700 in which both a camera 102 and a tracker 708 are mounted to a head-clamp 108 restricting the movement of the patient's anatomy 112 (i.e. head). The camera 102 may be mounted to the camera mounting arm 104 via the camera mount 306. The camera mounting arm 104 may be connected via a connector 302 to a complementary head-clamp connector 202. The camera 102 may be oriented toward the surgical site, and the tracker 708 is preferably fixed in a position that is within the working volume of the camera 102 (i.e. the tracker 708 is in close proximity to the surgical site, and generally orientated such that it is within the working volume of the camera 102). The tracker 708 may be rigidly attached to the head-clamp 108 in a permanent manner (for the duration of the procedure). For example, the tracker 708 may be integrally formed with its own mounting arm 702 which is in turn connected to the head-clamp 108 via, for example, a clamp 704. The tracker 708 may also be releasably and repeatably coupled to the tracker mounting arm 702 (with a high degree of positional repeatability) via a tracker mount 706. When rigidly attached to the mounting arm 702/head-clamp 108, the tracker 708 serves as a rigid reference for the patient's anatomy 112, and the camera 102 need not maintain a rigid and fixed position with respect to the patient's anatomy 112. As described further herein, software in the computing unit 114 may be configured to perform a registration so as to register the positional relationship between the tracker 708 and the patient's anatomy 112 and thereafter track the pose of the patient's anatomy 112 in the field of view (working volume) of the camera 102 even if the camera 102 is in a position that is different from the camera's position at registration. Exemplary benefits of this configuration include the ability to adjust the position of the camera 102 mid-procedure (e.g.
to attain a better viewing angle of the surgical site) and the ability to mount the camera 102 on objects other than the patient's anatomy 112, such as a cart, the operating table (via standard operating table attachment rails), or using the camera 102 in a handheld or surgeon-mounted (e.g. head-mounted) manner. -
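Why the camera may move freely once the tracker is fixed can be sketched in a few lines: the tracker-to-anatomy transform captured at registration is constant, so any later camera observation of the tracker recovers the anatomy pose. The frame names and toy world-frame values below are assumptions for illustration:

```python
import numpy as np

def _pose(t):
    """4x4 homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def anatomy_in_camera(T_cam_tracker, T_tracker_anat):
    """Track the anatomy from any camera position via the fixed tracker.

    T_cam_tracker:  live tracker pose measured by the (possibly moved) camera
    T_tracker_anat: anatomy pose in the tracker frame, fixed at registration
    """
    return T_cam_tracker @ T_tracker_anat

# Toy world-frame poses: the tracker is rigidly fixed near the anatomy.
T_W_A = _pose((0.0, 0.0, 1.0))            # anatomy in an arbitrary world frame
T_W_K = _pose((0.1, 0.0, 1.0))            # tracker, fixed to the anatomy
T_K_A = np.linalg.inv(T_W_K) @ T_W_A      # captured once, at registration

# Move the camera between two unrelated positions; tracking still works.
for T_W_C in (_pose((0.0, 0.5, 0.0)), _pose((0.3, 0.2, 0.1))):
    T_C_K = np.linalg.inv(T_W_C) @ T_W_K  # what the camera measures
    T_C_A = anatomy_in_camera(T_C_K, T_K_A)
    assert np.allclose(T_W_C @ T_C_A, T_W_A)
```

The invariant quantity is `T_K_A`; everything camera-dependent is re-measured each frame, which is what permits cart-mounted, handheld, or head-mounted camera use.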
FIG. 8 illustrates an exemplary intra-operative localization system 800 where the camera 102 is rigidly attached to a cart 802 providing the computing unit 114. The camera 102 is mounted to a mounting arm 104 which is in turn rigidly connected to the cart 802. A head-clamp 108 restricts the movement of the patient's anatomy 112. A tracker 708 is mounted via a tracker mount 706 to a tracker mounting arm 702, which is in turn rigidly connected to the head-clamp 108 via, for example, a clamp 704. - With reference to
FIGS. 9-11, it is desirable to use an intra-operative localization system in a sterile surgical procedure. FIG. 9 depicts an exemplary sterile draped intra-operative localization system 900. In exemplary intra-operative localization system 900, a camera 102 is attached to a mounting arm 104 via a camera mount 306. Camera 102 is enclosed in a sterile drape 904 affixed to the camera 102 via a shroud 902. The drape 904 extends to the base of the mounting arm 104 near a mounting arm connector 302 which attaches the mounting arm 104 to the head-clamp 108 at a head-clamp connector 202. The head-clamp 108 restricts the motion of the patient's anatomy 112 (i.e. head). The camera 102 is connected via a cable 908 to the computing unit 114. - In the exemplary embodiment, the
intra-operative localization system 900 includes a sterile drape 904 with an optical window 906 at its distal end. The drape 904 includes a long tube-like body that encloses the camera 102, a proximate portion of the cable 908 connecting the camera 102 to the computing unit 114, and the mounting arm 104. The drape 904 may terminate at the base of the mounting arm 104, which may be located at its connector 302 (i.e. the attachment point to the connector 202 of the head-clamp 108). The proximate portion of the cable is typically at least the portion of the cable that extends from the camera to (more or less) a length of the mounting arm 104. This length should provide sufficient freedom to move the mounting arm 104 without impacting sterility requirements. A slightly longer or shorter length may also function similarly. The drape 904 is intended to cover the portion of the cable that may come into contact with personnel or tools (during the surgery when sterility is to be maintained) at or about the location of the surgery near the patient's head. - The exemplary
intra-operative localization system 900 provides a mechanism to hold the drape 904 in place on the camera 102. This is desired in order to hold the drape window 906 in alignment with the optics of camera 102 (e.g. glass or other material about the lens opening (not shown)), or the optical path. This mechanism may be a shroud 902 that mechanically clips, or otherwise connects, to the camera 102 via the shroud connecting features 604 of the camera 102 while holding the drape window 906 in a fixed and correct alignment with the camera's optics, without puncturing or compromising the sterile barrier (i.e. by sandwiching the sterile drape 904 between the body of the shroud 902 and the camera 102). The shroud 902 may be capable of deflecting, such that the shroud 902 provides a spring force to hold the camera 102 and the drape 904 assembly together. The shroud 902 may provide locating features intended to mate with complementary locating features on the camera 102, to enforce the correct alignment between the shroud 902 and the camera 102. The drape window 906 is designed to allow for undistorted transmission of optical signals; it may be constructed of a rigid, thin and flat optically transparent material. - With reference to
FIG. 10, in a further illustration, the sterile draped intra-operative localization system 1000 is similar to system 900 and is used where the patient is also draped for the surgical procedure. The camera 102, shroud 902, mounting arm 104, head-clamp 108 (not shown), camera drape 904 and drape window 906 may be arranged as described above for the exemplary localization system 900 depicted in FIG. 9. In the present embodiment, a patient drape 1002 covers the patient and the head-clamp 108. The patient drape 1002 has an opening 1006 to expose the surgical site of the patient's anatomy 112 (i.e. head). The patient drape 1002 further provides a mounting arm 104 opening 1004 such that the camera 102 and mounting arm 104 covered by a camera drape 904 can stick out through the patient drape 1002. The draping procedure is performed according to aseptic techniques. - The interface between the
patient drape 1002 and the camera drape 904 may be substantially sealed to maintain a continuous sterile barrier. To maintain this barrier, the sterile patient drape 1002 and camera drape 904 may be configured in the following ways. A sterile elastic band may be used to hold the patient drape 1002 opening tightly around the camera drape 904. The elastic band may be provided in the sterile packaging of the sterile patient drape 1002 and camera drape 904 as a separate unit. Alternatively, the elastic band may be pre-attached to the camera drape 904 or patient drape 1002. The camera drape 904 may comprise one or multiple circumferential rings (or a spiral) with adhesive at or near the end of the camera drape 904 (distal from the window end) such that the patient drape 1002 can be adhered to the camera drape 904 along the outer circumference of drape 904. The adhesive rings may be covered by strips, and exposed for use by removing the adhesive strip covering. Multiple circumferential adhesive rings may be provided so that the desired location along the length of the patient drape 1002 and the camera drape 904 interface may be used. The camera drape 904 may provide an adhesive strip, either partially or removably attached thereto. The adhesive strip may be used to secure the camera drape 904 to the patient drape 1002. Other fasteners may be contemplated, including hook and eye, pull fasteners, etc. configured to maintain the sterile barrier. - With reference to
FIG. 11, in certain surgically-navigated procedures, it may be desirable to use the intra-operative localization system 100 in non-sterile as well as sterile environments in the same procedure. For example, in neurosurgical applications, it may be desirable to perform registration (i.e. localization system registration and/or image registration) prior to the establishment of the sterile field because the registration landmarks are non-sterile. -
FIG. 11 depicts a sterile draped intra-operative localization system 1100. The exemplary intra-operative localization system 1100 is conducive to sterile and non-sterile use within the same procedure, since it is configured to be used with or without the sterile camera drape 904. The sterile camera drape 904 may be applied to the camera 102 and/or mounting arm 104 without moving the camera 102 relative to the patient's anatomy 112, such that the position of the camera 102 is the same while the surgical environment is not sterile and after the draping techniques have been applied to make the surgical environment sterile. When the exemplary intra-operative localization system 1100 is used in sterile use, it may be necessary to use only sterile instruments 110 in order to maintain the sterile environment. Conversely, when the exemplary intra-operative localization system 1100 is used in non-sterile use, it may be acceptable to use non-sterile instruments 110 (such as non-sterile registration instruments 110). - As previously described, it is advantageous to have a
tracker 708 rigidly fixed relative to the patient's anatomy 112 (i.e. head). It may be desirable to have a tracker 708 rigidly fixed relative to the patient's anatomy 112 in a sterile environment. In the exemplary embodiment depicted in FIG. 11, a sterile tracker 708 is rigidly attached to the head-clamp 108 through the patient drape 1002. The patient drape 1002 includes a window 1006 to make visible the patient anatomy 112. The sterile tracker 708 is connected to a non-sterile tracker mount 1102 through the patient drape 1002 without compromising sterility. The patient drape 1002 may include an adaptor with a non-sterile-side connection for attachment to the tracker mounting arm 702, and a sterile side for attachment to the sterile tracker 708 and/or the sterile tracker mount 1102. The geometrical properties of the adaptor may be known to the computing unit 114, such that the relative position between a non-sterile tracker 708 (when mounted to the tracker mounting arm 702 before the patient drape 1002 is applied) and a sterile tracker 708 (when mounted to the tracker mounting arm 702 after the patient drape 1002 is applied) is known to the intra-operative localization system 1100. Additionally, the location of the optically detectable features of the tracker 708 may also be known. - The
sterile tracker 708 may also be configured to puncture the patient drape 1002 and attach to the tracker mounting arm 702 and/or the tracker mount 1102, the punctured part of the patient drape 1002 being covered by the base of the tracker 708 such that contamination to the sterile side of the patient drape 1002 is highly improbable. - The
patient drape 1002 may be sandwiched between the sterile tracker 708 and the tracker mounting arm 702, the patient drape 1002 being sufficiently thin so as to not significantly affect the position of the sterile tracker 708 on the tracker mounting arm 702, while still allowing a connection between the sterile tracker 708 and the tracker mounting arm 702 strong enough that the sterile tracker 708 does not fall off the tracker mounting arm 702 due to movement of the patient drape 1002 during the surgical procedure. - The
intra-operative localization system 1100 may use a sterile tracker 708 and a non-sterile tracker 708 at different stages of a surgical procedure. The sterile tracker 708 may have the same geometry as the non-sterile tracker 708; alternatively, the respective geometries may be different, in which case the difference may be known to the computing unit 114 and be factored into calculations of poses at different stages of a surgical procedure accordingly. - The
camera 102, camera mounting arm 104, camera drape 904, camera drape window 906 and shroud 902 may be configured similar to the above-described sterile draped camera intra-operative localization system 1000. The camera mounting arm 104 protrudes from the camera mounting arm opening 1004. The camera 102 may provide a user interface (comprising, for example, buttons, indicator lights and/or displays). The intra-operative localization system 1100 may allow the user to access the user interface both when sterile and non-sterile. When the camera drape 904 has been applied and the surgical environment is sterile, the user interface may be accessible and/or functional through the sterile camera drape 904. This may be accomplished, for example, by the camera drape 904 being of clear and flexible material. - Reference is now made to
FIG. 12 depicting a computer-implemented method 1200 for storing differences in geometries of a non-sterile tracker 708 and a sterile tracker 708 and using a registration between a camera 102 and the non-sterile tracker 708 and the stored differences when calculating poses during a sterile stage of a surgery for a patient. A computing unit 114 stores (at 1202) the differences between geometries of a non-sterile tracker 708 and a sterile tracker 708, the non-sterile tracker 708 for use during a non-sterile stage of a surgery for a patient and the sterile tracker 708 for use in place of the non-sterile tracker 708 during a sterile stage of the surgery for the patient. The computing unit 114 calculates (at 1204) a registration of the camera 102 with respect to the non-sterile tracker 708 during the non-sterile stage. During the sterile stage, where the sterile tracker 708 is used in place of the non-sterile tracker 708 and the patient is draped with a patient drape 1002, the computing unit 114 (at 1206) uses the registration and differences stored by the computing unit 114 when calculating poses. - In one embodiment, the
computing unit 114 may store a relative position between the non-sterile tracker 708 when mounted to a mounting arm 702 before the patient drape 1002 is applied and the sterile tracker 708 when mounted to the mounting arm 702 via the sterile tracker adaptor of the patient drape 1002 after the patient drape 1002 is applied, based on the geometrical properties of the adaptor. During the sterile stage, where the sterile tracker 708 is used in place of the non-sterile tracker 708 and the patient is draped with the patient drape 1002, the computing unit 114 may use the stored relative position when calculating poses. - It may be desirable to reposition the
camera 102, for example, in order to achieve better viewing angles of the instruments 110 being tracked as part of the surgical procedure. However, where the camera 102 is rigidly mounted to the head-clamp 108, the positional relationship between the camera 102 and the patient's anatomy 112 may be registered, and moving the camera 102 would compromise this registration. - A system is described herein to provide a "Move Camera" function, allowing the
camera 102 to be moved between a plurality of positions and orientations while maintaining a registration that allows the position and orientation of instruments 110 to be tracked relative to the patient's anatomy 112. - After mounting the
camera 102 to the patient anatomy 112 to generate a camera-patient anatomy registration, which is stored in the memory of a computing unit 114, the camera 102 may be repositioned without compromising the registration using the following system. A tracker 708, either sterile or non-sterile, is rigidly attached to the head-clamp 108 and is within the field of view and trackable by the camera 102. The computing unit 114 captures a first pose of the rigidly-mounted tracker 708 relative to the camera 102 and the patient anatomy 112, computes a tracker-patient anatomy registration (i.e. a registration between the coordinate-frames of the tracker 708 and the patient's anatomy 112) based on this initial pose, and stores this tracker-patient anatomy registration in the memory of the computing unit 114. The camera 102 may now be repositioned to a desired alignment with the surgical site, subject to the rigidly-mounted tracker 708 remaining trackable within the field of view and working volume of the camera 102. The computing unit 114 captures a second pose of the rigidly-mounted tracker 708 relative to the camera 102 and the patient anatomy 112 and computes a new camera-tracker registration (i.e. the registration between the coordinate-frame of the camera 102 in its new position and orientation and the coordinate-frame of the rigidly-mounted tracker 708), which is then used by the computing unit 114 to compute a new camera-patient anatomy registration (i.e. the registration between the coordinate-frame of the camera 102 in its new position and orientation and the coordinate-frame of the patient anatomy 112) by applying the tracker-patient anatomy registration stored in the memory of the computing unit 114 to the new camera-tracker registration. - The memory of the
computing unit 114 may include instructions for display to a user describing how to capture the poses of the camera 102 and to compute the registrations accordingly. The computing unit 114 may also display graphical instructions to the user on a display of the computing unit 114 to guide them through the aforementioned steps. - The
computing unit 114 may be further configured to display a graphical indication of how the camera's working volume aligns with the surgical site, after capturing the first pose and before capturing the second pose (i.e. to give a user visual feedback to help them align the camera to a location that has an improved view of the surgical site). - Reference is now made to
FIG. 13 depicting a computer-implemented method 1300 for updating a registration of a camera 102 with respect to a patient's anatomy 112 when the camera 102 is moved from a first pose to a second pose, based on the camera's pose relative to a tracker 708 and the patient's anatomy 112. A computing unit 114 captures (at 1302) a first pose between the camera 102, having a first positional relationship with respect to the patient's anatomy 112, and the tracker 708. The computing unit 114 captures (at 1304) a second pose between the camera 102, having a third positional relationship with respect to the patient's anatomy 112, and the tracker 708. Using the first pose, the second pose and a first registration, the computing unit 114 calculates (at 1306) an updated registration of the camera 102 with respect to the patient's anatomy 112. - During the planning and execution of an image-guided medical intervention, visualization of the patient's
anatomy 112 relative to the navigation system is required. A "4-up" style visualization of the patient's anatomy may be used to display multiple planes of a three dimensional ("3D") medical image (e.g. MRI or CT image) of the patient's anatomy 112. -
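As an illustration of how such multi-plane cross-sections might be pulled from a volumetric image, the following is a minimal numpy sketch. It assumes an axis-aligned voxel array with an (i, j, k) = (sagittal, coronal, transverse) index convention; real DICOM/NIfTI data carries its own orientation metadata that must be honoured, and the function name is hypothetical:

```python
import numpy as np

def four_up_slices(volume, location_ijk):
    """Extract the three orthogonal 2D cross-sections of a 3D medical image
    at the voxel nearest a given location.

    volume:       3D array assumed indexed (sagittal i, coronal j, transverse k).
    location_ijk: current location already expressed in voxel indices.
    """
    # Round to the nearest voxel and clamp to the image bounds.
    idx = [int(round(c)) for c in location_ijk]
    i = int(np.clip(idx[0], 0, volume.shape[0] - 1))
    j = int(np.clip(idx[1], 0, volume.shape[1] - 1))
    k = int(np.clip(idx[2], 0, volume.shape[2] - 1))
    # One fixed index per plane yields the three 2D cross-sections.
    return volume[i, :, :], volume[:, j, :], volume[:, :, k]
```

The fourth panel of the 4-up view, the isometric rendering, would be produced by a 3D renderer rather than simple slicing.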
FIG. 14 illustrates a "4-up" view 1400 that would be displayed on a display of a computing unit in an intra-operative localization system. The "4-up" style of visualization 1400 includes, for a current location within the patient's 3D medical image: (1) a two dimensional cross-section of the 3D medical image in the coronal plane 1402 of the patient's anatomy 112, (2) a two dimensional cross-section of the 3D medical image in the sagittal plane 1404 of the patient's anatomy 112, (3) a two dimensional cross-section of the 3D medical image in the transverse plane 1406 of the patient's anatomy 112, and (4) an isometric view 1408 of the 3D medical image. - During navigation in a surgical procedure, the "4-up"
view 1400 may update in real-time based on the position of a navigated surgical instrument 110. For example, the two dimensional coronal, sagittal and transverse cross-sections of the 3D medical image (1402, 1404, 1406) may be updated in real time to reflect the position of a tracked probe, where the planes of the coronal, sagittal and transverse two dimensional cross-sections of the 3D image (1402, 1404, 1406) reflect the current tip of the probe relative to the patient's anatomy 112. - The
isometric view 1408 may be modified to enhance visualization of the anatomical features of interest. For example, the isometric view 1408 may be modified to provide cut-away views of the patient's anatomy 112 such that regions of interest inside the anatomical volume, for example the brain, may be displayed. The regions of interest may include structures or lesions identified during pre-operative planning. Further, the regions of interest may be displayed in a way that is visibly distinguished from other areas within the anatomy. For example, the areas of interest may be pre-operatively identified and may be pre-operatively segmented within the medical image such that they may be viewed and/or manipulated independently from the other anatomy. - The
isometric view 1408 may be modified in real-time based on the pose of a tracked probe. For example, the cut-away view of the patient's anatomy 112 may be displayed based on the pose of the tracked probe. FIG. 15 illustrates orthogonal three-dimensional cut-planes 1500 defined by the pose of a probe 1514. The probe may comprise a shaft 1508, a tip 1510, and a body 1512 with optically trackable features. A first cut-plane 1502 is defined as being perpendicular to the shaft 1508 of the probe 1514 and containing the point defined by the tip 1510 of the probe 1514. A second cut-plane 1504 is defined as being parallel to the front face of the probe 1514 and containing the vector defined by the shaft 1508 of the probe 1514. A third cut-plane 1506 is defined as being perpendicular to the front face of the probe 1514 and containing the vector defined by the shaft 1508 of the probe 1514. -
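The three cut-planes defined above can be computed directly from the tracked probe pose. A minimal sketch (hypothetical function name), representing each plane as a point/unit-normal pair and assuming the front-face normal is perpendicular to the shaft:

```python
import numpy as np

def probe_cut_planes(tip, shaft_dir, face_normal):
    """Three orthogonal cut-planes from a tracked probe pose, each returned
    as (point_on_plane, unit_normal).

    tip:         3-vector, probe tip position.
    shaft_dir:   vector along the probe shaft.
    face_normal: vector normal to the probe's front face (assumed
                 perpendicular to shaft_dir).
    """
    shaft_dir = shaft_dir / np.linalg.norm(shaft_dir)
    face_normal = face_normal / np.linalg.norm(face_normal)
    # Perpendicular to the shaft, containing the tip point (cut-plane 1502).
    plane1 = (tip, shaft_dir)
    # Parallel to the front face, containing the shaft vector (cut-plane 1504).
    plane2 = (tip, face_normal)
    # Perpendicular to the front face, containing the shaft vector (cut-plane 1506):
    # its normal is orthogonal to both the shaft and the face normal.
    plane3 = (tip, np.cross(shaft_dir, face_normal))
    return plane1, plane2, plane3
```

All three planes are anchored at the tip here; the second and third planes only need to contain the shaft line, so any point on the shaft would serve equally well.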
FIG. 16 illustrates a hand-held intra-operative localization system 1600 showing a camera 102 held in a user's hand 1612, while tracking the pose of a tracker 106 in the field of view 1610 of the camera 102, where the tracker 708 is registered to the patient's anatomy 112 (i.e. head). The registration includes image registration of a patient medical image to the patient anatomy 112. The tracker 708 is fixedly attached to the patient's anatomy 112, which may be accomplished, for example, via a mounting structure 1614: the mounting structure 1614 may be attached to a head-clamp 108, or the mounting structure 1614 may be affixed directly to the patient's anatomy 112 (e.g. via bone screws, suction cups, elastic straps, or structures, such as a glasses-style frame, for referencing off of a patient's features such as ears, nose, eyes), or a tracker 708 may comprise individual fiducial markers attached to the patient's anatomy 112 forming a trackable array. In some applications, it may be advantageous for the tracker 708 to be attached with non-invasive means (e.g. suction cups, elastic straps, glasses frames, stickers, individually attachable fiducial markers, clamps). In some applications, it may be advantageous for the tracker 708 to be rigidly anchored directly to bone using invasive means, such as bone screws. In the exemplary embodiment depicted in FIG. 16, a tracker 708 is attached to a patient's anatomy 112 (i.e. head) via a mounting structure 1614 that includes a head band and a frame that contacts a patient's ears and the bridge of their nose. In this embodiment, the pose of the camera 102 may be used to modify the visualization displayed on the computing unit 114, for example, by modifying any of the coronal, sagittal and transverse two dimensional cross-sections or the isometric view 1408 to correspond to the camera 102 coordinate-frame. -
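Both the hand-held mode described above and the "Move Camera" function described earlier rest on the same rigid-transform composition: the live camera-tracker pose is combined with the stored tracker-anatomy registration to recover the camera-anatomy relationship. A minimal sketch, not the patent's implementation, using 4x4 homogeneous transforms and hypothetical names (T_a_b maps frame-b coordinates into frame a):

```python
import numpy as np

def camera_to_anatomy(T_cam_tracker, T_tracker_anat):
    """Compose the live camera-tracker pose with the stored tracker-anatomy
    registration to obtain the camera-anatomy transform."""
    return T_cam_tracker @ T_tracker_anat

def reregister_after_move(T_cam1_anat, T_cam1_tracker, T_cam2_tracker):
    """'Move Camera' update: derive the (fixed) tracker-anatomy registration
    from the first captured pose, then reuse it with the pose captured after
    the camera has been repositioned."""
    # Tracker-anatomy registration is constant because both are rigidly mounted.
    T_tracker_anat = np.linalg.inv(T_cam1_tracker) @ T_cam1_anat
    # New camera-anatomy registration from the camera's second viewpoint.
    return camera_to_anatomy(T_cam2_tracker, T_tracker_anat)
```

Because the tracker-anatomy registration is stored once, the camera may be moved (or hand-held and continuously moving) as long as the tracker stays within the camera's working volume.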
FIG. 17 illustrates orthogonal camera-based reference planes 1700 defined by the camera coordinate-frame. The camera 102 may be held in a user's hand 1612 and the orthogonal three-dimensional cut-planes 1700 will be defined to correspond with the current pose of the camera 102. The camera-based reference plane used to modify the coronal plane 1402 displayed in the 4-up view 1400 is referred to as the coronal′ plane 1702. The camera reference plane used to modify the sagittal plane 1404 displayed in the 4-up view 1400 is referred to as the sagittal′ plane 1704. The camera reference plane used to modify the transverse plane 1406 displayed in the 4-up view 1400 is referred to as the transverse′ plane 1706. Each of the orthogonal reference planes (the coronal′ plane 1702, the sagittal′ plane 1704, and the transverse′ plane 1706) may share a common origin 1708 displaced from the camera 102. In the exemplary embodiment depicted in FIG. 17, the origin 1708 is displaced by a distance "d" from the camera 102 along the optical axis 1712. The distance "d" may be any distance and may be selected by the user. Further, the distance "d" may be selected such that when any anatomy of interest is located at the origin 1708, the tracker 708 is viewable by the camera 102 (i.e. the tracker 708 is within the camera's working volume). The coronal′ plane 1702 may be further defined as being parallel to the plane of the camera's optical imager 1710 and perpendicular to the optical axis 1712 of the camera 102. The sagittal′ plane 1704 may be further defined as being perpendicular to the plane of the camera's optical imager 1710 and parallel to the vector of the vertical axis 1714 of the camera 102. The transverse′ plane 1706 may be further defined as being perpendicular to the plane of the camera's optical imager 1710 and perpendicular to the vector of the vertical axis 1714 of the camera 102.
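The camera-based reference planes follow directly from the camera pose and the user-selected displacement "d". A minimal sketch under an assumed axis convention (columns of the rotation part taken as the camera's horizontal, vertical and optical axes; the function name is hypothetical), with each plane returned as an origin/unit-normal pair:

```python
import numpy as np

def camera_reference_planes(T_world_cam, d):
    """Coronal'/sagittal'/transverse' planes from a 4x4 camera pose.

    T_world_cam: camera pose in world coordinates; rotation columns are
                 assumed to be the camera's x (horizontal), y (vertical)
                 and z (optical) axes.
    d:           displacement of the shared origin along the optical axis.
    """
    R, t = T_world_cam[:3, :3], T_world_cam[:3, 3]
    x_axis, y_axis, z_axis = R[:, 0], R[:, 1], R[:, 2]
    origin = t + d * z_axis  # common origin 1708, on the optical axis
    # Coronal': parallel to the imager plane, perpendicular to the optical axis.
    coronal_p = (origin, z_axis)
    # Sagittal': perpendicular to the imager plane, parallel to the vertical axis.
    sagittal_p = (origin, x_axis)
    # Transverse': perpendicular to the imager plane and to the vertical axis.
    transverse_p = (origin, y_axis)
    return coronal_p, sagittal_p, transverse_p
```

Each plane's normal is simply one camera axis, so recomputing the planes as the hand-held camera moves costs only a matrix read per frame.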
The coronal′ plane 1702, sagittal′ plane 1704 and transverse′ plane 1706 may be used instead of anatomical planes to visualize the patient's anatomy 112 via the slices depicted in the 4-up style visualization displayed on the display of the computing unit 114. Alternatively, the patient's anatomical reference planes may be used. The system may allow a user to select which reference planes (i.e. camera-based or patient-based) are displayed, for example, via buttons located on the camera 102. The isometric view 1408 may also be modified to correspond to the camera-based reference planes 1702, 1704, 1706. Alternatively, the isometric view 1408 is modified independently from the slices to correspond to the camera-based reference planes, while the slices displayed on the computing unit 114 remain based on the coronal, sagittal and transverse patient reference planes. - Reference is now made to
FIG. 18 depicting a computer-implemented method 1800 for modifying a view of a patient's anatomy in a medical image based on camera reference planes and the pose of a camera. A computing unit 114 provides (at 1802) at least one view of the patient's anatomy 112 for display to a user on a display unit. The computing unit 114 receives (at 1804) optical signals from a camera 102 comprising pose data of a tracker 708 having a fixed positional relationship with respect to the patient's anatomy 112. Using the pose data, the computing unit 114 calculates (at 1806) a registration of the tracker coordinate frame to the patient's anatomy 112 and (at 1808) the pose of the camera 102 with respect to the patient's anatomy 112. The computing unit modifies (at 1810) the at least one view based on camera-based reference planes and the pose of the camera 102 with respect to the patient's anatomy 112. - To aid in improving the visual relationship between the views displayed on the display of the
computing unit 114 and the camera 102, where the views displayed on the display of a computing unit 114 are based on or modified by the camera-based reference planes, the camera 102 may provide a projector for projecting a visible pattern onto the patient's anatomy 112. The projector may be any means of projecting the visible pattern on the patient anatomy 112. For example, the visible pattern may be generated via two planar lasers, the planar lasers being perpendicular to each other, parallel to the sagittal′ plane 1704 and transverse′ plane 1706 respectively, and further passing through the origin 1708, displaced from the camera by distance "d". In this way, the location where the reference planes intersect with the patient's outer surface (e.g. their skin) may be physically represented on the patient, while the views displayed on the display of the computing unit 114 are based on those same reference planes. Alternatively, a line-projecting laser may be used, the line passing through the origin. Therefore, a user may have an enhanced ability to visualize a patient's anatomy 112 based on the displayed views and the projected pattern on the patient. - With reference to
FIG. 19, in another embodiment, rather than being handheld, the camera 102 and a display 1902 may be head-mounted and may be contained in an augmented reality headset 1904 worn by the surgeon 1906. Display 1902 may include a projector and a surface upon which to project. In the illustration of FIG. 19, the surface may be a glass or plastic surface of the headset, such as a lens carried by an eyeglasses frame. The headset 1904 may be coupled to computing unit 114, such as via one or more cables/cabling 1908, providing an augmented reality system 1900. Computing unit 114 may also comprise an integrated display device (e.g. display screen) for presenting information to the surgeon or others. As described herein, the computing unit 114 can compute the pose of the camera 102 relative to the tracker 708 and patient anatomy 112 as the surgeon 1906 moves their head, thus ensuring the proper alignment of the overlay of the virtual view of the patient anatomy 112 on the user's actual view of the patient. The headset may incorporate computing unit 114 itself (not shown). The camera may be located closer to an eye of the surgeon, such as on a corner of the frame of the headset, in front of a portion of the glass/plastic lens, to more closely align with the surgeon's field of vision. - The augmented reality display in such an augmented reality system is preferably transparent to allow the surgeon to see through the display of the
computing unit 114 to directly see the patient. The computing unit 114 may receive a real-time feed of the patient anatomy from the camera 102. This real-time feed from the camera 102 may be displayed on the augmented reality display of the computing unit 114. The overlaid virtual view of the patient's anatomy 112 may be opaque or partially transparent. In any headset embodiment, it may be preferable to display only a single view, rather than the "4-up" view displayed on a non-augmented reality display of a computing unit 114. If a single view is presented, it may be based on the coronal′ plane 1702 and the origin 1708. A virtual view of any pre-identified regions of interest (pre-identified in the patient's medical image) may also be persistently displayed from a perspective that matches the camera 102 coordinate-frame and thus be representative of the user's actual perspective of the patient's anatomy 112. - The advantage of these alternative embodiments that modify the 4-up view based on the pose of the camera, whether hand-held or head-mounted, is that it displays the
underlying patient anatomy 112 in a more intuitive manner to the user. - Further, the relative pose between the
tracker 708, which is attached to the patient, and the camera 102, whose pose can be manipulated by a user, may be used to control the pan and tilt of an isometric view 1408 displayed to the user. The camera 102 may provide a button which, when pressed, causes the computing unit 114, connected to the camera 102, to enter a pan or tilt mode, in which changes to the relative pose of the tracker 708 and camera 102 cause a change to the pan and/or tilt of the displayed isometric view 1408. In tilt mode, for example, a change in the relative pose of the tracker 708 and camera 102 results in a corresponding change in the tilt of the displayed isometric view 1408 of the medical image. In pan mode, for example, a relative translation of the tracker 708 and camera 102 causes a corresponding translation of the isometric view 1408 of the medical image. - In an embodiment, with reference to
FIG. 20, an intra-operative localization system may provide a virtual probe to aid in planning the navigation of the surgical procedure. FIG. 20 illustrates a corresponding probe and virtual probe pair 2000. The system provides a virtual probe 2004 comprising a body 2008, the body 2008 with optically trackable features and a location for a user to grasp or a handle; the virtual probe 2004 may not have a physical shaft or tip. An intra-operative localization system using a virtual probe 2004 may further comprise a computing unit 114 that has access to the location of the virtual tip 2006 of the virtual probe 2004, which is located a distance "dT" relative to the body 2008. For example, the computing unit stores or has access to a stored definition of the virtual probe (e.g. a probe model). The computing unit 114 then provides a navigational display of the patient's anatomy 112, where the displayed view(s) are modified based on the position of the virtual tip 2006 of the virtual probe 2004. The system may also provide a probe 1514, comprising a body 1512 with optically trackable features and a location for a user to grasp, as well as a shaft 1508 and a tip 1510 extending from the body 1512. The probe 1514 and the virtual probe 2004 may have similar or identical features on the respective bodies 1512 and 2008; however, the probe 1514 provides a physical shaft 1508 with a tip 1510 for localization, whereas the virtual probe 2004 does not. In a further embodiment, the virtual tip 2006 location relative to the virtual probe body 2008 (as accessed by the computing unit 114) is the same as the physical tip 1510 location of the probe 1514 relative to the probe body 1512. The system may further be configured to provide the virtual probe 2004 for non-sterile use, and the probe 1514 for sterile use.
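Locating the virtual tip reduces to transforming a stored body-frame offset by the tracked body pose. A minimal sketch (hypothetical function name; the tip is assumed to lie a distance dT along the body's local z axis, whereas in practice the stored probe model defines the actual offset):

```python
import numpy as np

def virtual_tip_position(T_cam_body, d_T):
    """Compute the virtual tip of a tip-less probe from its tracked body pose.

    T_cam_body: 4x4 pose of the probe body in the camera frame.
    d_T:        stored distance of the virtual tip from the body, assumed
                here to lie along the body's local z axis.
    """
    tip_in_body = np.array([0.0, 0.0, d_T, 1.0])  # virtual tip in the body frame
    return (T_cam_body @ tip_in_body)[:3]          # same point in the camera frame
```

The same computation applies to the physical probe 1514: only the stored offset differs (or, as the embodiment above notes, does not differ at all).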
- There is disclosed a system for performing a navigated medical procedure, the system comprising: a camera configured to be mounted relative to a patient's anatomy by a mounting arm, the camera being configured to detect optical signals comprising pose information of an object at the surgical site, and to provide the optical signals to a computing unit for calculating pose; a tracker configured to provide optical signals for detection by the camera, the tracker attached to or inherently a part of the object; the mounting arm further configured to provide positional adjustment to orient the camera toward the surgical site; the camera and mounting arm being further configured to be enclosed within a sterile camera drape for use within a sterile field; the position of the camera relative to the anatomy not changing when enclosed within the sterile camera drape.
- The system may further include the sterile camera drape. The sterile camera drape may further provide a window to allow for optical transmission of signals comprising pose information. The system may further comprise a shroud for securing the sterile camera drape to the camera. The camera may further comprise shroud features to mate the shroud with the camera to secure the sterile camera drape. The system may be configured to secure the sterile camera drape to the camera such that the window is secured in alignment with the optical path of the camera. The shroud may be configured to secure the camera drape to the camera via spring forces.
- The mounting arm may be configured to rigidly attach to the camera. The system may further comprise a camera clamp to rigidly hold the camera, the camera clamp being further configured to provide a mounting mechanism to the mounting arm. The mounting arm may be configured to releasably and repeatably attach to the camera. The camera or camera clamp may provide a kinematic mounting mechanism and the mounting arm may provide a complementary kinematic mounting mechanism. The mounting arm may be configured to provide positional alignment via lockable joints.
- The mounting arm may comprise at least one joint for positional adjustment where the at least one joint is a lockable ball joint. The mounting arm may comprise multiple joints where the multiple joints are lockable by a single user-adjustable mechanism. The mounting arm positional adjustment may be performed when enclosed within the sterile camera drape, for example via adjustment members that are grippable through the drape.
- The mounting arm may be configured for mounting to a patient immobilizer or a patient positioner. The patient immobilizer may be a head clamp.
- The mounting arm may be configured for rigid fixation to the patient's anatomy. Rigid fixation may be provided through a Mayfield clamp or secured to the patient's anatomy via bone screws.
- The patient's anatomy may be one of a cranium, a spine, a pelvis, a femur, a tibia, a hip, a knee, a shoulder, an ankle.
- There is disclosed a system for performing a navigated surgical procedure, the system comprising: a sterile camera drape configured to provide a sterile barrier for a camera mounting arm and a camera attached thereto. The sterile drape may be configured to allow positional adjustment of the positionally-adjustable camera mounting arm when providing the sterile barrier.
- The sterile camera drape may further provide a window to allow for optical transmission of signals comprising pose information from the camera to a computing unit. The sterile camera drape window may be made of a rigid, thin and flat optically transparent material. The sterile camera drape may be configured to be secured in alignment with a camera such that the optical window is in alignment with the optical path of the camera.
- The sterile camera drape may be configured to extend from the camera to at least the base of the camera mounting arm.
- The sterile camera drape may comprise a mechanism for providing a continuous sterile barrier with a patient drape. The mechanism may be a sterile elastic band, configured to tightly hold together the patient drape with the sterile camera drape. The mechanism may be an adhesive strip, which is configured to be applied at a location where the sterile camera drape and patient drape intersect. The mechanism may be a plurality of adhesive sections encircling the sterile camera drape at various locations along the length of the sterile camera drape, configured to enable circumferential adhesion to a patient drape at a desired location along the length of the camera drape.
- There is disclosed a system for performing a navigated medical procedure, the system comprising: a mounting arm, configured to attach to a camera configured to detect optical signals comprising pose information of objects at a surgical site and to provide the optical signals to a computing unit for calculating pose, the mounting arm having a proximal end comprising an attachment mechanism, and a distal end comprising a base mounting mechanism.
- The mounting arm may comprise a user-adjustable mechanism to adjust the relative position and orientation of the proximal and distal ends in up to 6 DOF. The user-adjustable mechanism may comprise at least one lockable ball joint. The user-adjustable mechanism may comprise a gooseneck mechanism.
- The base mounting mechanism may be configured to attach to one of: a mobile cart; an operating table, by providing a compatible clamp for operating room table rails; or a head clamp, by providing a starburst connector.
- The mounting arm may comprise a second connector at the distal end. The second connector may comprise a same connector that is complementary to the base mounting mechanism.
- The mounting arm may be further configured to selectively attach to a tracker.
- The system may comprise a second mounting arm, the second mounting arm configured to attach to a tracker. The second mounting arm may be configured to attach to a sterile tracker and a non-sterile tracker, for example, one at a time. The second mounting arm may be configured to attach to a sterile tracker through a sterile patient drape.
- There is disclosed a system for performing a navigated medical procedure, the system comprising: a virtual probe comprising a body providing a tracker and a surface (e.g. handle) to be grasped by a user, the tracker of the virtual probe configured to provide optical signals comprising pose data to a camera in communication with a computing unit, the computing unit configured to provide a view of a patient's anatomy for display, the view of the patient's anatomy being based on a medical image, the computing unit further configured to modify the view based on a registration and further based on a location of a tip of the virtual probe, the location of the tip of the virtual probe relative to the pose of the tracker represented by the pose data being accessible in a memory to the computing unit.
- The view may be modified to show the location of the tip of the virtual probe in the medical image. The medical image may be one of a CT-scan and an MRI-scan.
- The system may further comprise a probe comprising a body comprising a tracker and a user-graspable aspect, and further comprising a shaft and a tip extending from the body.
- The location of the tip of the virtual probe may have the same positional relationship to the virtual probe body as the positional relationship between the probe tip and the probe body.
- The virtual probe may be provided for non-sterile use and the probe may be provided for sterile use.
- The patient's anatomy may be a cranium and brain. The patient's anatomy may be a hip or knee. The patient's anatomy may be a vertebra.
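The tip-location computation described above reduces to a chain of rigid transforms. The following is a hedged sketch only, with illustrative names and values not taken from the patent: `T_cam_tracker` stands in for the 4x4 tracker pose reported by the camera, `tip_offset_tracker` for the fixed tip offset held in memory, and `T_image_cam` for the registration mapping camera coordinates into medical-image coordinates.

```python
import numpy as np

# Hedged sketch of the tip-location computation. Transform names and the
# 10 mm tip offset are illustrative assumptions, not values from the
# patent: T_cam_tracker is the tracker pose reported by the camera,
# tip_offset_tracker is the stored tip offset in the tracker frame, and
# T_image_cam maps camera coordinates into medical-image coordinates.

def tip_in_image(T_image_cam, T_cam_tracker, tip_offset_tracker):
    """Return the probe-tip location in medical-image coordinates."""
    tip_h = np.append(tip_offset_tracker, 1.0)   # homogeneous point
    return (T_image_cam @ T_cam_tracker @ tip_h)[:3]

# With identity poses, the tip sits at its stored offset:
tip = tip_in_image(np.eye(4), np.eye(4), np.array([0.0, 0.0, 10.0]))
```

Because the same offset is applied for the tracked body of the virtual probe as for the physical probe, the displayed tip location is consistent between the sterile and non-sterile devices.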
- It is to be understood that this subject matter is not limited to particular embodiments described, and as such may vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
- As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the teachings herein. Any recited method can be carried out in the order of events recited or in any other order which is logically possible.
- Various embodiments have been described herein with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the broader scope of the disclosed embodiments as set forth in the claims that follow.
Claims (24)
1. A computer implemented method comprising executing by a processor steps comprising:
a) receiving optical information from a camera, the optical information comprising pose data for a tracker coupled in a fixed positional relationship with a patient anatomy, wherein each of the tracker and a three dimensional (“3D”) medical image is registered to the patient anatomy;
b) determining a pose of the camera in relation to the patient anatomy using the pose data of the tracker and the registration of the tracker;
c) displaying one or more two dimensional (“2D”) cross-sections of the 3D medical image, each of the one or more 2D cross-sections selected from the 3D medical image based on a camera coordinate frame and the pose of the camera.
2. The method of claim 1 , wherein each 2D cross-section of the one or more 2D cross-sections is selected to comprise a respective view of the 3D medical image corresponding to an orthogonal camera-based reference plane defined by the camera coordinate frame.
3. The method of claim 1 , wherein the 3D medical image comprises a segmented image.
4. The method of claim 1 comprising, prior to step c):
displaying the one or more 2D cross-sections based on an anatomical coordinate frame;
receiving user input to change a basis of the one or more 2D cross-sections to be based on the camera coordinate frame; and
performing step c) in response to receiving the user input.
5. The method of claim 4 , wherein each 2D cross-section of the one or more 2D cross-sections based on the anatomical coordinate frame is selected to comprise a respective view of the 3D image based on a coronal plane, a sagittal plane, or a transverse plane of the patient anatomy.
6. The method of claim 4 , wherein the user input is received by way of a user interface of the camera.
7. The method of claim 1 comprising, following step c):
d) receiving user input to change a basis of at least some of the 2D cross-sections to be based on an anatomical coordinate frame; and
e) updating the displaying of the at least some of the 2D cross-sections to be based on the anatomical coordinate frame in response to the user input.
8. The method of claim 1 , wherein each of the one or more 2D cross-sections is further selected based on a distance measured from a location of the camera along an optical axis in a field of view, wherein the distance defines an origin for determining the one or more 2D cross-sections from the 3D medical image.
9. The method of claim 1 comprising registering the tracker to the patient anatomy and registering the 3D medical image to the patient anatomy.
10. The method of claim 1 comprising providing an interface to receive input selecting a camera coordinate frame basis or an anatomical coordinate frame basis for displaying the 2D cross-sections of the 3D medical image.
11. The method of claim 1 comprising updating the displaying of the 2D cross-sections in response to an updated pose of the camera determined from updated pose data of the tracker.
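The slice selection recited in claims 1 and 8 can be sketched as follows: a slicing origin is taken at a fixed distance along the camera's optical axis and mapped into voxel indices of the 3D medical image, and three orthogonal slices through that origin are extracted. This is a minimal illustration under stated assumptions (the optical axis is camera +z, the camera frame is axis-aligned with the volume, and all function names are invented for this sketch):

```python
import numpy as np

# Hedged sketch of claims 1 and 8. The axis convention, voxel handling,
# and the axis-aligned simplification are assumptions for illustration.

def slice_origin_voxel(T_image_cam, distance_mm, voxel_size_mm):
    """Voxel index of the point `distance_mm` along the optical axis."""
    p_cam = np.array([0.0, 0.0, distance_mm, 1.0])   # on the optical axis
    p_img_mm = (T_image_cam @ p_cam)[:3]             # into image space (mm)
    return np.round(p_img_mm / voxel_size_mm).astype(int)

def camera_cross_sections(volume, origin_idx):
    """Three orthogonal 2D slices through origin_idx. A full system
    would resample along the camera's own axes as the camera moves;
    here the camera frame is assumed axis-aligned with the volume."""
    i, j, k = origin_idx
    return volume[i, :, :], volume[:, j, :], volume[:, :, k]

vol = np.zeros((50, 50, 50))                         # stand-in 3D medical image
origin = slice_origin_voxel(np.eye(4), distance_mm=20.0, voxel_size_mm=1.0)
slices = camera_cross_sections(vol, origin)
```

Re-running the origin computation with each updated camera pose (claim 11) keeps the displayed cross-sections following the camera.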
12. A system comprising:
a tracker configured to couple with a patient anatomy in a fixed positional relationship;
a camera configured to provide optical information comprising pose data of the tracker;
an intraoperative computing unit having a processor, the computing unit configured to:
receive optical information from the camera, the optical information comprising pose data for the tracker coupled in a fixed positional relationship with the patient anatomy, wherein each of the tracker and a three dimensional (“3D”) medical image is registered to the patient anatomy;
determine a pose of the camera in relation to the patient anatomy using the pose data of the tracker and the registration of the tracker; and
display one or more two dimensional (“2D”) cross-sections of the 3D medical image, each of the one or more 2D cross-sections selected based on a camera coordinate frame and the pose of the camera.
13. The system of claim 12 , wherein each 2D cross-section of the one or more 2D cross-sections is selected to comprise a respective view of the 3D medical image corresponding to an orthogonal camera-based reference plane defined by the camera coordinate frame.
14. The system of claim 12 , wherein the computing unit is configured to:
provide an interface to receive input selecting a camera coordinate frame or an anatomical coordinate frame as a basis for displaying the 2D cross-sections of the 3D medical image;
receive user input selecting the anatomical coordinate frame; and
update the display of the one or more 2D cross-sections to be based on the anatomical coordinate frame.
15. The system of claim 14 , wherein at least one 2D cross-section of the one or more 2D cross-sections based on the anatomical coordinate frame is selected to comprise a respective view of the 3D image based on a coronal plane, a sagittal plane, or a transverse plane of the patient anatomy.
16. The system of claim 14 , wherein the user input is received by way of a user interface of the camera.
17. The system of claim 12 , wherein the computing unit is configured to select each of the one or more 2D cross-sections further based on a distance measured from a location of the camera along an optical axis in a field of view, wherein the distance defines an origin for determining the one or more 2D cross-sections from the 3D medical image.
18. The system of claim 12 , wherein the camera comprises a hand-held camera.
19. The system of claim 12 wherein the computing unit is configured to update the display of the 2D cross-sections in response to an updated pose of the camera determined from updated pose data of the tracker.
20. The system of claim 12 comprising a projector coupled with the camera, the projector configured to project a visible pattern to visualize one or more camera reference planes on an outer surface of the patient anatomy.
21. A method comprising steps of:
operating a camera to provide optical information to an intraoperative computing unit, the optical information comprising pose data for a tracker coupled in a fixed positional relationship with a patient anatomy, wherein each of the tracker and a three dimensional (“3D”) medical image is registered to the patient anatomy by the intraoperative computing unit; and
receiving a display of one or more two dimensional (“2D”) cross-sections of the 3D medical image, each of the one or more 2D cross-sections based on a camera coordinate frame and a pose of the camera in relation to the patient anatomy, the pose of the camera determined by the intraoperative computing unit using the pose data of the tracker and the registration of the tracker.
22. The method of claim 21 comprising operating a user interface to provide an input to configure the display of the one or more 2D cross-sections of the 3D medical image, the input provided to select a basis for the one or more 2D cross-sections, the basis comprising a camera coordinate frame basis or an anatomical coordinate frame basis.
23. The method of claim 21 comprising operating a projector coupled to the camera to project a visible pattern to visualize one or more camera reference planes on an outer surface of the patient anatomy.
24. The method of claim 21 comprising: moving the camera to provide updated optical information including updated pose data of the tracker; and receiving an update of the one or more 2D cross-sections responsive to the updated pose data of the tracker.
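The basis-selection interface recited in claims 4, 7, 10, 14, and 22 amounts to a toggle between two slicing bases. A minimal sketch, in which the enum values, plane labels, and function names are assumptions for illustration rather than anything specified in the patent:

```python
from enum import Enum

# Hedged sketch of the basis toggle: user input switches the slicing
# basis between the camera coordinate frame and the anatomical frame.

class SliceBasis(Enum):
    CAMERA = "camera"
    ANATOMICAL = "anatomical"

def planes_for_basis(basis):
    """Labels of the three 2D cross-sections for the chosen basis."""
    if basis is SliceBasis.ANATOMICAL:
        return ("coronal", "sagittal", "transverse")
    return ("camera-x", "camera-y", "camera-z")

planes_initial = planes_for_basis(SliceBasis.ANATOMICAL)  # initial display
planes_toggled = planes_for_basis(SliceBasis.CAMERA)      # after user input
```

In such a design, the anatomical basis yields the conventional coronal/sagittal/transverse views, while the camera basis yields views that re-orient as the hand-held camera moves.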
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/140,898 US20230263586A1 (en) | 2016-09-07 | 2023-04-28 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662384410P | 2016-09-07 | 2016-09-07 | |
PCT/IB2017/055400 WO2018047096A1 (en) | 2016-09-07 | 2017-09-07 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
US201916331236A | 2019-03-07 | 2019-03-07 | |
US18/140,898 US20230263586A1 (en) | 2016-09-07 | 2023-04-28 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/331,236 Continuation US11666407B2 (en) | 2016-09-07 | 2017-09-07 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
PCT/IB2017/055400 Continuation WO2018047096A1 (en) | 2016-09-07 | 2017-09-07 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230263586A1 true US20230263586A1 (en) | 2023-08-24 |
Family
ID=61561346
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/331,236 Active 2040-09-09 US11666407B2 (en) | 2016-09-07 | 2017-09-07 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
US18/140,898 Pending US20230263586A1 (en) | 2016-09-07 | 2023-04-28 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/331,236 Active 2040-09-09 US11666407B2 (en) | 2016-09-07 | 2017-09-07 | Systems and methods for surgical navigation, including image-guided navigation of a patient's head |
Country Status (5)
Country | Link |
---|---|
US (2) | US11666407B2 (en) |
EP (1) | EP3509526A4 (en) |
JP (2) | JP7328653B2 (en) |
AU (1) | AU2017323599B2 (en) |
WO (1) | WO2018047096A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
US11160610B2 (en) | 2017-02-07 | 2021-11-02 | Intellijoint Surgical Inc. | Systems and methods for soft tissue navigation |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
FR3095331A1 (en) | 2019-04-26 | 2020-10-30 | Ganymed Robotics | Computer-assisted orthopedic surgery procedure |
US11093038B2 (en) | 2019-05-14 | 2021-08-17 | Synchron Australia Pty Limited | Systems and methods for generic control using a neural signal |
EP3808282A3 (en) | 2019-09-01 | 2021-06-30 | Bb Surgical Devices, S.L. | Universal surgical access system |
WO2021086972A1 (en) | 2019-10-29 | 2021-05-06 | Synchron Australia Pty Limited | Systems and methods for configuring a brain control interface using data from deployed systems |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US20210275274A1 (en) * | 2020-03-05 | 2021-09-09 | John B. Clayton | Fixed Camera Apparatus, System, and Method for Facilitating Image-Guided Surgery |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
ES2898124B2 (en) * | 2021-09-15 | 2022-10-13 | Gonzalez Gonzalez Igor | SYSTEM FOR NAVIGATION IN THE IMPLANTATION OF KNEE REVISION PROSTHESIS |
TR2021020732A2 (en) * | 2021-12-22 | 2022-01-21 | Atatuerk Ueniversitesi Rektoerluegue Bilimsel Arastirma Projeleri Bap Koordinasyon Birimi | A GUIDING COVER AND ITS IMPLEMENTATION |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120253200A1 (en) * | 2009-11-19 | 2012-10-04 | The Johns Hopkins University | Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors |
US20130317351A1 (en) * | 2012-05-22 | 2013-11-28 | Vivant Medical, Inc. | Surgical Navigation System |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5662111A (en) * | 1991-01-28 | 1997-09-02 | Cosman; Eric R. | Process of stereotactic optical navigation |
US6529765B1 (en) | 1998-04-21 | 2003-03-04 | Neutar L.L.C. | Instrumented and actuated guidance fixture for sterotactic surgery |
ATE540634T1 (en) * | 2005-06-06 | 2012-01-15 | Intuitive Surgical Operations | LAPAROSCOPIC ULTRASONIC ROBOT SYSTEM FOR SURGICAL PURPOSES |
WO2011134083A1 (en) * | 2010-04-28 | 2011-11-03 | Ryerson University | System and methods for intraoperative guidance feedback |
US8657809B2 (en) * | 2010-09-29 | 2014-02-25 | Stryker Leibinger Gmbh & Co., Kg | Surgical navigation system |
AU2011342900A1 (en) * | 2010-12-17 | 2013-07-18 | Intellijoint Surgical Inc. | Method and system for aligning a prosthesis during surgery |
US9355289B2 (en) | 2011-06-01 | 2016-05-31 | Matrix It Medical Tracking Systems, Inc. | Sterile implant tracking device and method |
EP2983606B1 (en) | 2013-03-15 | 2018-11-14 | Stryker Corporation | Assembly for positioning a sterile surgical drape relative to optical position sensors |
US9247998B2 (en) * | 2013-03-15 | 2016-02-02 | Intellijoint Surgical Inc. | System and method for intra-operative leg position measurement |
US20140318551A1 (en) * | 2013-04-29 | 2014-10-30 | Contour Fabricators, Inc. | Craniotomy Drape and Method of Simultaneously Draping a Sterile Barrier Over a Patient and Navigation Tracker |
EP3104803B1 (en) * | 2014-02-11 | 2021-09-15 | KB Medical SA | Sterile handle for controlling a robotic surgical system from a sterile field |
US20150282735A1 (en) | 2014-04-04 | 2015-10-08 | Izi Medical Products,Llc | Reference device for surgical navigation system |
US9737370B2 (en) | 2014-10-14 | 2017-08-22 | Synaptive Medical (Barbados) Inc. | Patient reference tool |
WO2016065458A1 (en) * | 2014-10-29 | 2016-05-06 | Intellijoint Surgical Inc. | Systems and devices including a surgical navigation camera with a kinematic mount and a surgical drape with a kinematic mount adapter |
US10265854B2 (en) * | 2016-08-04 | 2019-04-23 | Synaptive Medical (Barbados) Inc. | Operating room safety zone |
- 2017
  - 2017-09-07 EP EP17848243.6A patent/EP3509526A4/en not_active Withdrawn
  - 2017-09-07 AU AU2017323599A patent/AU2017323599B2/en active Active
  - 2017-09-07 JP JP2019513888A patent/JP7328653B2/en active Active
  - 2017-09-07 US US16/331,236 patent/US11666407B2/en active Active
  - 2017-09-07 WO PCT/IB2017/055400 patent/WO2018047096A1/en unknown
- 2023
  - 2023-04-28 US US18/140,898 patent/US20230263586A1/en active Pending
  - 2023-05-10 JP JP2023078064A patent/JP2023100926A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2023100926A (en) | 2023-07-19 |
AU2017323599B2 (en) | 2023-04-13 |
EP3509526A4 (en) | 2020-07-01 |
JP2019532693A (en) | 2019-11-14 |
US20190183590A1 (en) | 2019-06-20 |
EP3509526A1 (en) | 2019-07-17 |
JP7328653B2 (en) | 2023-08-17 |
WO2018047096A1 (en) | 2018-03-15 |
AU2017323599A1 (en) | 2019-04-04 |
US11666407B2 (en) | 2023-06-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230263586A1 (en) | Systems and methods for surgical navigation, including image-guided navigation of a patient's head | |
JP2019532693A5 (en) | ||
US11648064B2 (en) | Motorized full field adaptive microscope | |
US20220079686A1 (en) | 3d navigation system and methods | |
CA2944313C (en) | Reference device for surgical navigation system | |
US8509503B2 (en) | Multi-application robotized platform for neurosurgery and resetting method | |
WO2010067267A1 (en) | Head-mounted wireless camera and display unit | |
CA2960523C (en) | End effector for a positioning device | |
Caversaccio et al. | Computer assistance for intraoperative navigation in ENT surgery | |
WO2006075331A2 (en) | Image-guided robotic system for keyhole neurosurgery | |
Kwartowitz et al. | Toward image-guided robotic surgery: determining intrinsic accuracy of the da Vinci robot | |
WO2002080773A1 (en) | Augmentet reality apparatus and ct method | |
US20240268919A1 (en) | Robotically coordinated surgical visualization | |
Kadi et al. | Stereotactic brain surgery: instrumentation, automation, and image guidance | |
Vilsmeier et al. | Introduction of the Passive Marker Neuronavigation System VectorVision | |
Karahalios et al. | Image-guided spinal surgery | |
Nathoo et al. | SURGICAL NAVIGATION SYSTEM TECHNOLOGIES | |
Franceschini et al. | Computer-Aided Surgery in Otolaryngology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTELLIJOINT SURGICAL INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HLADIO, ANDRE NOVOMIR;FANSON, RICHARD TYLER;RYTERSKI, ERIC J.;AND OTHERS;SIGNING DATES FROM 20180306 TO 20190307;REEL/FRAME:063482/0730 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |