WO2020234409A1 - Intraoperative imaging-based surgical navigation - Google Patents


Info

Publication number: WO2020234409A1
Application number: PCT/EP2020/064177 (EP2020064177W)
Authority: WIPO (PCT)
Prior art keywords: lung, medical instrument, deflated state, interventional, controller
Other languages: French (fr)
Inventors: Torre Michelle BYDLON, Paul Thienphrapa, Alvin Chen
Original Assignee: Koninklijke Philips N.V.
Priority: US provisional application 62/851,174
Application filed by Koninklijke Philips N.V.
Publication of WO2020234409A1


Classifications

    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B6/12 Devices for detecting or locating foreign bodies
    • A61B6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • G06T7/251 Analysis of motion using feature-based methods involving models
    • A61B1/313 Endoscopes for introducing through surgical openings, e.g. laparoscopes
    • A61B2017/00694 Means correcting for movement of or for synchronisation with the body
    • A61B2017/00809 Lung operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2051 Electromagnetic tracking systems
    • A61B2034/2061 Tracking using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/372 Details of monitor hardware
    • A61B2090/502 Headgear, e.g. helmet, spectacles
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/037 Emission tomography
    • A61B6/4085 Cone-beams
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/30061 Lung

Abstract

A controller (420) for assisting navigation in an interventional procedure includes a memory (520) that stores instructions, and a processor (510) that executes the instructions. When executed by the processor (510), the instructions cause the controller to implement a process that includes registering (S260) coordinate systems of an interventional medical instrument (450) and an intra-operative image of a lung and calculating (S260) a deformation between the lung in the deflated state and the lung in the inflated state. The process also includes applying (S260) the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state and generating (S280) an image of the modified three-dimensional model of the lung in the deflated state.

Description

INTRAOPERATIVE IMAGING-BASED SURGICAL NAVIGATION
BACKGROUND
[001] Lung cancer is a deadly form of cancer, and surgery is the preferred treatment for early-stage tumors. Historically, the most invasive form of surgery has been open surgery, in which the chest is split open to expose a large portion of the lung. In open surgery, surgical tools such as scalpels are inserted through a large opening in the thorax and used to remove the tumor. Open surgical techniques allow physical access for palpation, so that tumors can be sensed by touch.
[002] In the past, tumors were typically large enough to be sensed by touch, so surgeons could find them even though they were invisible to the eye. In recent years, both the detection of embedded lung tumors and the techniques for surgically removing (resecting) them have improved. A minimally invasive technique for resecting lung tumors, called video-assisted thoracoscopic surgery (VATS), has emerged. Additionally, lung cancer screening programs have grown; such programs tend to identify tumor nodules at an earlier stage, when they are smaller and more difficult to discern by touch.
[003] VATS was developed to provide a minimally invasive approach to lung tumor resection. In VATS, a small camera is inserted into the chest cavity through a small port (i.e., a small hole or incision) and the surgical instruments are inserted through the same port or other small ports. However, palpation to sense the tumors by touch is more difficult under VATS due to constrained access and the absence of haptic feedback, and the entire resection is done using the camera view. FIG. 1 illustrates a known VATS implementation for lung resection. In FIG. 1, a thoracoscope or a small camera stick is inserted through the rib cage of a patient P as one of the instruments. Vision that is otherwise occluded is restored via the thoracoscope or small camera. In FIG. 1, instrument #1 and instrument #2 are separately inserted into the patient P via two separate small incisions to perform the resection. In recent years, robotic surgery has emerged as a minimally invasive approach similar to and competitive with VATS.
[004] Nevertheless, three major challenges are still encountered in lung surgery today, regardless of the type of lung surgery being performed. First, the surgeon determines the location of the tumor from a pre-operative CT scan acquired with the lung fully inflated, well before surgery; when the lung is collapsed during surgery, the three-dimensional (3D) orientation of the lung, and hence the location of the tumor, no longer matches the images from the pre-operative CT scan. Second, the lung is complex, with many blood vessels and airways that must be carefully dissected and addressed before the tumor and any feeding airways or vessels are removed. Third, since small, non-palpable tumors are very hard to locate in the lung, especially with VATS or robotic surgery, extra healthy lung tissue may be removed to avoid the possibility of leaving tumor tissue behind, and this further compromises lung function.
[005] To overcome the challenges noted above, surgeons and research groups have investigated ways of improving the surgical workflow to better guide tumor resection. This can be done by implanting dyes or markers into the tumor or with better imaging techniques.
[006] While intra-operative imaging is rarely used today, in one known clinical trial intra-operative cone-beam CT is used for needle-guided insertion of a marker. The marker is placed in the center of the tumor, and a string or wire extends to the surface of the lung. The thoracoscope is then inserted and the traditional VATS procedure is completed, with the surgeon following the wire to the marker at the center of the tumor. Visible and fluorescent dyes can also be injected into the tumor to serve as a visual marker for the surgeon as the tissue is dissected. Another known mechanism provides a deformable registration algorithm that calculates a deformation matrix between cone-beam CT images of inflated and deflated lungs in phantom and animal models.
[007] Nevertheless, the benefits of minimally invasive surgery and earlier detection are not yet enough to precisely locate and resect lung tumors today with safe margins while sparing healthy tissue. Additional investigation now leads to the intraoperative imaging-based surgical navigation described herein.
SUMMARY
[008] According to an aspect of the present disclosure, a controller for assisting navigation in an interventional procedure includes a memory that stores instructions, and a processor that executes the instructions. When executed by the processor, the instructions cause the controller to implement a process that includes registering coordinate systems of an interventional medical instrument and an intra-operative image of a lung; and calculating a deformation between the lung in the deflated state and the lung in the inflated state. The process implemented when the processor executes the instructions also includes applying the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and generating an image of the modified three-dimensional model of the lung in the deflated state.
[009] According to another aspect of the present disclosure, a controller for assisting navigation in an interventional procedure includes a memory that stores instructions, and a processor that executes the instructions. When executed by the processor, the instructions cause the controller to implement a process that includes segmenting a cone-beam computed tomography image of a lung in a deflated state during the interventional procedure to obtain a three-dimensional original model of the lung in the deflated state. The process implemented when the processor executes the instructions also includes registering coordinate systems of an interventional medical instrument to the three-dimensional original model of the lung in the deflated state; and generating an image of the original three-dimensional model of the lung in the deflated state.
[010] According to still another aspect of the present disclosure, a system for assisting navigation in an interventional procedure includes a cone-beam computed tomography imaging apparatus and a computer. The cone-beam computed tomography imaging apparatus generates a cone-beam computed tomography image of a lung in a deflated state. The computer includes a controller with a memory that stores instructions and a processor that executes the instructions. When executed by the processor, the instructions cause the controller to implement a process that includes registering coordinate systems of an interventional medical instrument and an intraoperative image of the lung and calculating a deformation between the lung in the deflated state and the lung in the inflated state. The process also includes applying the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and generating an image of the modified three-dimensional model of the lung in the deflated state.
BRIEF DESCRIPTION OF THE DRAWINGS
[011] The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
[012] FIG. 1 illustrates a known VATS implementation for lung resection.
[013] FIG. 2A illustrates a method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[014] FIG. 2B illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[015] FIG. 3 illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[016] FIG. 4 illustrates a system for intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[017] FIG. 5 illustrates a general computer system, on which a method of intraoperative imaging-based surgical navigation can be implemented, in accordance with another representative embodiment.
DETAILED DESCRIPTION
[018] In the following detailed description, for purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments. Nonetheless, systems, devices, materials and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only, and is not intended to be limiting. The defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
[019] It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept.
[020] The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. As used in the specification and appended claims, the singular forms of terms ‘a’, ‘an’ and ‘the’ are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms "comprises" and/or "comprising" and/or similar terms, when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[021] Unless otherwise noted, when an element or component is said to be “connected to”, “coupled to”, or “adjacent to” another element or component, it will be understood that the element or component can be directly connected or coupled to the other element or component, or intervening elements or components may be present. That is, these and similar terms encompass cases where one or more intermediate elements or components may be employed to connect two elements or components. However, when an element or component is said to be “directly connected” to another element or component, this encompasses only cases where the two elements or components are connected to each other without any intermediate or intervening elements or components.
[022] In view of the foregoing, the present disclosure, through one or more of its various aspects, embodiments and/or specific features or sub-components, is thus intended to bring out one or more of the advantages as specifically noted below. For purposes of explanation and not limitation, example embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. However, other embodiments consistent with the present disclosure that depart from specific details disclosed herein remain within the scope of the appended claims. Moreover, descriptions of well-known apparatuses and methods may be omitted so as to not obscure the description of the example embodiments. Such methods and apparatuses are within the scope of the present disclosure.
[023] As described herein, mechanisms for intraoperative imaging-based surgical navigation are useful in mitigating the challenges of VATS or RATS (robotic-assisted thoracoscopic surgery). The mechanisms described herein are placed in the context of the surgical setup and workflow, so that components and procedural steps are sequenced to enable productive use of time and resources. The embodiments of intraoperative imaging-based surgical navigation described below each typically involve intraoperative cone-beam CT or fluoroscopy image-based registration methods, along with image-based tracking.
[024] FIG. 2A illustrates a method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[025] In FIG. 2A, the method starts with imaging at S210. The imaging at S210 may be computed tomography and/or positron emission tomography-computed tomography performed prior to surgery. The imaging at S210 may be performed at a different time and place, and under the supervision and control of different personnel than the imaging using the equipment in the system 400 of FIG. 4 described later. The imaging at S210 involves imaging the lung in the inflated state using computed tomography prior to an interventional procedure. That is, the imaging at S210 may result in the lung in the inflated state being imaged using computed tomography. The imaging at S210 may optionally involve imaging the lung in a partially inflated state using computed tomography, where the partially inflated state may be different than the intraoperative state.
[026] Next, the method of FIG. 2A continues with pre-operative segmentation at S220. A segmentation algorithm is applied to the images obtained in S210 to produce a three-dimensional (3D) model of the anatomical features (e.g., airways, vessels, fissures, tumor, and/or lymph nodes) of the lung that was imaged at S210. The segmentation is a representation of the surface of structures of the anatomical features (e.g., airways, vessels, fissures, tumor and/or lymph nodes) and consists, for example, of a set of points in three-dimensional (3D) coordinates on the surfaces of the lung, and triangular plane segments defined by connecting neighboring groups of three points, such that the entire structure is covered by a mesh of non-intersecting triangular planes. A three-dimensional model of the lung is obtained by segmenting at least one of imagery obtained based on cone-beam computed tomography imaging of the lung in the inflated state, in an alternative to S210, and imagery obtained based on computed tomography imaging of the lung in the inflated state as in S210.
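The surface representation described above (3D points plus triangles connecting neighboring point triples) can be held in a simple vertex/face structure. The sketch below is illustrative only; the function name and the NumPy-based arrays are assumptions rather than part of the disclosure, and a surface-area routine stands in for the kind of geometric query such a mesh model supports.

```python
import numpy as np

def mesh_surface_area(vertices, faces):
    """Total area of a triangle mesh: vertices is (N, 3) 3D points,
    faces is (M, 3) triples of vertex indices forming triangles."""
    tris = vertices[faces]              # (M, 3, 3): corner coordinates per triangle
    a = tris[:, 1] - tris[:, 0]         # first edge vector of each triangle
    b = tris[:, 2] - tris[:, 0]         # second edge vector
    # Triangle area = half the magnitude of the cross product of two edges
    return 0.5 * np.linalg.norm(np.cross(a, b), axis=1).sum()

# Toy mesh: one unit right triangle in the z = 0 plane (area 0.5)
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
faces = np.array([[0, 1, 2]])
area = mesh_surface_area(verts, faces)  # → 0.5
```

A real segmentation pipeline would populate such arrays from the CT volume (e.g., via an isosurface extraction step), which is beyond this sketch.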
[027] At S230, the method of FIG. 2A includes imaging of an inflated lung. The image of the inflated lung at S230 may be considered intraoperative and may be an intraoperative image taken using computed tomography or cone-beam computed tomography. The imaging of the inflated lung at S230 may be with a cone-beam CT imaging apparatus. That is, at the beginning of surgery a cone-beam CT scan of the patient may be completed when the lung is still inflated. The cone-beam CT image of the inflated lung can be used subsequently to register the pre operative image(s) obtained at S210 to the intra-operative state and make any alignment adjustments to the patient’s positioning. To be clear, the imaging at S210 and the imaging at S230 may both be via cone-beam CT even when performed in different places and/or different times and/or with different cone-beam CT imaging apparatuses.
[028] In one or more embodiments, the 3D model created at S220 may be updated after the images from S210 and S230 are registered. For example, CT images from S210 may be registered with cone-beam CT images from S230, and then the 3D model created at S220 is updated prior to the process described next for S240. For example, a rigid transformation may be applied after S230 to the 3D model from S220 to just account for differences in patient position. In another example, a deformable transformation may be applied after S230 to the 3D model from S220 to account for non-rigid changes in the lung.
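As a rough sketch of the rigid update described above, a 4x4 homogeneous transform can be applied to the model vertices to account solely for differences in patient position. The matrix values and function name below are hypothetical, chosen only to illustrate the operation.

```python
import numpy as np

def apply_rigid(vertices, T):
    """Apply a 4x4 homogeneous rigid transform T to (N, 3) model vertices."""
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])  # to homogeneous coords
    return (homo @ T.T)[:, :3]                                 # back to 3D

# Hypothetical repositioning: 90-degree rotation about z plus a 10 mm x-translation
T = np.array([[0.0, -1.0, 0.0, 10.0],
              [1.0,  0.0, 0.0,  0.0],
              [0.0,  0.0, 1.0,  0.0],
              [0.0,  0.0, 0.0,  1.0]])
moved = apply_rigid(np.array([[1.0, 0.0, 0.0]]), T)  # → [[10.0, 1.0, 0.0]]
```

The deformable (non-rigid) variant mentioned in the same paragraph would replace the single matrix with a spatially varying displacement field, but the per-vertex application pattern is the same.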
[029] At S240, a scope is inserted into the patient and the lung is deflated. The scope may be a thoracoscope. The scope is inserted into the chest cavity and may be, but is not necessarily, inserted into the lung. The lung is deflated after the scope is inserted in S240.
[030] At S250, the method of FIG. 2A includes imaging of the deflated lung. The imaging of the deflated lung at S250 may be via cone-beam CT and may be performed with the same cone- beam CT imaging apparatus used in S230. Alternatively, the imaging at S250 may be visible light imaging. When the imaging at S250 is via cone-beam CT, the imaging at S250 may involve the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using cone-beam computed tomography during the interventional procedure. In another alternative, the imaging of the deflated lung at S250 may be via one or more X-rays, which may involve the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using X-ray during the interventional procedure.
[031] At S260, the method of FIG. 2A includes registering the scope to the deflated lung. The registration at S260 may involve aligning the coordinate system of the scope inserted into the chest cavity while the lung is deflated and the coordinate system of an intra-operative image of the lung. Registration of coordinate systems may be performed in a variety of ways, such as by identifying and aligning landmarks present in both coordinate systems. The intra-operative image of the lung may be that taken at S230 and may be an intra-operative image taken using computed tomography or cone-beam computed tomography.
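Landmark-based alignment of two coordinate systems, as mentioned above, is commonly posed as a least-squares rigid fit; one standard solution is the Kabsch algorithm. The sketch below assumes paired landmarks have already been identified in both coordinate systems; all names are illustrative and not part of the disclosure.

```python
import numpy as np

def register_landmarks(src, dst):
    """Least-squares rigid transform (R, t) with R @ src_i + t ≈ dst_i,
    for paired (N, 3) landmark sets, via the Kabsch algorithm."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: recover a known rotation and translation from 6 paired landmarks
rng = np.random.default_rng(1)
landmarks = rng.random((6, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([2.0, -1.0, 0.5])
R_est, t_est = register_landmarks(landmarks, landmarks @ R_true.T + t_true)
```

With noiseless paired landmarks the fit is exact; clinical landmarks would carry localization error, and the least-squares formulation averages that error across points.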
[032] A deformation matrix may be calculated between the cone-beam CT image of the inflated lung from S230 and the cone-beam CT image of the deflated lung from S250 and applied to the 3D model of the anatomical features which is created at S220 and already updated at or after S230. This results in a new 3D model of the anatomical features in the deflated state. The 3D model can be used for anatomical reference and guidance during surgery and may exist as its own feature. The deformation matrix may be a comprehensive all-encompassing deformation matrix which includes CT to cone-beam CT of the inflated lung to cone-beam CT of the deflated lung. Alternatively, the deformation matrix may be applied in two steps, where one deformation matrix is applied for CT to the cone-beam CT of the inflated lung (e.g., after S230 as described above), and then a second deformation matrix is applied for the transformation from the cone-beam CT of the inflated lung to the cone-beam CT of the deflated lung.
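For affine approximations of the deformation, the two-step alternative described above (CT to inflated cone-beam CT, then inflated to deflated) is equivalent to applying one composed matrix. A minimal sketch, with hypothetical matrices standing in for the actual registration outputs:

```python
import numpy as np

# T1: stand-in for the CT -> inflated cone-beam CT alignment (rigid shift)
T1 = np.array([[1.0, 0.0, 0.0, 5.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 0.0],
               [0.0, 0.0, 0.0, 1.0]])
# T2: stand-in for the inflated -> deflated deformation (30% compression along z)
T2 = np.diag([1.0, 1.0, 0.7, 1.0])

# Comprehensive matrix: apply T1 first, then T2
T_total = T2 @ T1

p = np.array([1.0, 2.0, 3.0, 1.0])   # a model point in homogeneous coordinates
two_step = T2 @ (T1 @ p)             # sequential application
one_step = T_total @ p               # composed application → same result
```

A true lung deflation is non-rigid, so in practice the "matrix" would be a dense displacement field from deformable registration; matrix composition above only illustrates why the one-step and two-step formulations in the paragraph are interchangeable.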
[033] At S270, the method of FIG. 2A includes tracking movement of the scope relative to the lung surface. That is, movement of an interventional medical instrument such as a scope may be tracked visually via the tissue surface of the lung; this may be with a visible light camera (such as the traditional thoracoscope), by hyperspectral imaging, or by near-infrared (NIR) fluorescence imaging, to name a few. An interventional medical instrument may alternatively be tracked with external tracking technologies such as electromagnetic tracking using sensors, optical shape sensing (OSS), and other forms of tracking technologies for tracking interventional medical instruments. Applying a surface-feature tracking algorithm or different tracking methods to the scope allows the system used to implement S270 to move the 3D model on a monitor as the scope is moved with respect to the lung. In other embodiments, a 3D model may be presented on a headset screen or glasses, such as by using augmented reality. In yet other embodiments, a 3D model may be presented as a hologram.
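Surface-feature tracking of the kind described above can be approximated, at its simplest, by estimating the frame-to-frame shift of lung-surface texture in the scope video. The brute-force block-matching sketch below is a toy stand-in for a real feature tracker; the function name and the synthetic frames are assumptions.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Brute-force block matching: integer (dy, dx) shift that best aligns
    curr back onto prev by minimizing the sum of squared differences."""
    best, best_shift = np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Undo a candidate shift and score the residual
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            ssd = float(((prev - shifted) ** 2).sum())
            if ssd < best:
                best, best_shift = ssd, (dy, dx)
    return best_shift

# Synthetic "lung surface" texture, then the same texture shifted by (2, 1)
rng = np.random.default_rng(0)
frame = rng.random((32, 32))
moved = np.roll(np.roll(frame, 2, axis=0), 1, axis=1)
shift = estimate_shift(frame, moved)   # → (2, 1)
```

A deployed tracker would instead use sparse feature matching or optical flow over distinctive surface features, but the principle of recovering inter-frame motion is the same.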
[034] At S280, the method of FIG. 2A includes augmenting the scope view. For example, because the thoracoscope is registered to the cone-beam CT images at S260 and hence the 3D model of the deflated anatomical features, the 3D model can be overlaid on the scope video-feed. Augmenting can be performed at S280 in other ways, such as by highlighting a tumor in the view of the scope by brightness or color, by visually warning of anatomical features that should be avoided, and in other ways that are supplemental to the teachings herein.
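Overlaying the 3D model on the scope video feed, as described above, requires projecting model points, expressed in the scope camera's coordinate frame after the registration of S260, into pixel coordinates. A minimal pinhole-camera sketch, with assumed intrinsic parameters that are not taken from the disclosure:

```python
import numpy as np

def project_points(pts_cam, fx, fy, cx, cy):
    """Pinhole projection of (N, 3) points in the camera frame (z > 0)
    to (N, 2) pixel coordinates, given focal lengths and principal point."""
    z = pts_cam[:, 2]
    u = fx * pts_cam[:, 0] / z + cx
    v = fy * pts_cam[:, 1] / z + cy
    return np.stack([u, v], axis=1)

# A model point 50 mm in front of the scope, 5 mm off-axis;
# fx, fy, cx, cy are hypothetical scope-camera intrinsics
pix = project_points(np.array([[5.0, 0.0, 50.0]]),
                     fx=500.0, fy=500.0, cx=320.0, cy=240.0)  # → [[370.0, 240.0]]
```

The projected pixel locations are where the overlay (e.g., a highlighted tumor contour) would be drawn on the video frame; lens distortion correction, which a real thoracoscope would need, is omitted here.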
[035] At S290, the method of FIG. 2A includes resecting a tumor in the lung, based on intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 2A. The tumor can be resected at S290 using the augmented view of the thoracoscope, and the augmented view helps ensure that the surgeon knows where the tumor is located within the lung.
[036] In FIG. 2A, S270, S280 and S290 are shown partially overlapping in the vertical direction on the page. This reflects that the tracking at S270 may be performed continually before and during the augmenting of the scope view at S280, and both may be performed continually during the resecting of the tumor at S290. In other words, while methods described herein are generally shown as a series of discrete steps performed separately in sequence, some steps in the methods may be performed continually while other steps in the methods are also performed.
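The concurrent relationship between S270 and S280 can be sketched as a producer/consumer pair: a tracking thread continually publishes scope poses while the augmentation loop consumes them. This is a structural sketch only; the pose values are placeholders, not a tracking implementation.

```python
import threading
import queue

# Tracking (S270) runs continually, publishing poses; augmentation (S280)
# consumes each pose to redraw the overlay. Pose values are placeholders.
pose_queue = queue.Queue()
N_FRAMES = 5

def track():
    """Producer: continual tracking loop publishing scope poses."""
    for i in range(N_FRAMES):
        pose_queue.put({"frame": i, "pose": (0.0, 0.0, float(i))})
    pose_queue.put(None)  # sentinel: tracking finished

def augment():
    """Consumer: augmentation loop drawing the overlay at each pose."""
    frames = []
    while True:
        item = pose_queue.get()
        if item is None:
            break
        frames.append(item["frame"])  # overlay the 3D model at this pose
    return frames

tracker = threading.Thread(target=track)
tracker.start()
augmented = augment()
tracker.join()
assert augmented == list(range(N_FRAMES))
```

A real system would additionally run the resection-support display (S290) off the same continually updated pose stream rather than as a discrete final step.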
[037] The explanation for the tracking at S270 above assumes that the thoracoscope is a white light scope which only “sees” visible colors. However, other methods may be employed for tracking movement of the interventional medical device at S270, such as if there are not enough features in the white light view. That is, movement of an interventional medical instrument may also be tracked at S270 based on light emitted in a frequency band outside of a visible region. Other methods include use of a hyperspectral camera/scope, a near-infrared or infrared scope, or a fluorescence scope, each of which “sees” the tissue at wavelengths outside of the visible region. Alternative (non-optical) imaging modalities such as endoscopic ultrasound may also be incorporated as part of the augmented reality view.
[038] FIG. 2B illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[039] The method in FIG. 2B overlaps at the beginning with the method of FIG. 2A, and descriptions of the overlapping parts are not detailed since they may be the same as in FIG. 2A. In FIG. 2B, the method again starts with imaging at S210, continues with pre-operative segmentation at S220, and includes imaging of an inflated lung at S230. The method of FIG. 2B also includes inserting a scope into the patient and deflating the lung at S240, imaging the deflated lung at S250, and registering the scope to the deflated lung at S260.
[040] However, in the method of FIG. 2B, the deflated lung is imaged again so that additional images are acquired during the interventional procedure, and the 3D model(s) of the deflated lung is updated at S285 based on the additional images as movement of the scope is tracked relative to the lung surface at S270 and the scope view is augmented at S280. The method in FIG. 2B concludes again with resecting a tumor in the lung, based on intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 2B. However, the re-imaging of the deflated lung and updating of the 3D model(s) of the deflated lung at S285 may be performed selectively and dynamically intraoperatively in order to improve the intraoperative surgical navigation.
[041] In the embodiment of FIG. 2B, additional cone-beam CT images or fluoroscopy images can be acquired during surgery to update the 3D models of the deflated lung at S285. In the case of fluoroscopy, a sparse 3D reconstruction of the scene can be achieved by taking two or more projections, identifying common image features between those projections, and then reconstructing the 3D positions of only those features. Alternatively, a 3D reconstruction of the scene can be achieved by using an image analysis algorithm to automatically find anatomical landmarks to use in registering the image. As a result of the additional imaging at S285, the interventional medical instrument is imaged using multiple x-ray projections during the interventional procedure, since the interventional medical instrument is inserted into the patient at S240 and imaging of the deflated lung is performed both at S250 and again at S285.
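The sparse reconstruction from two or more fluoroscopy projections can be illustrated with linear (DLT) triangulation of a common image feature. The 3×4 projection matrices below are hypothetical C-arm geometries chosen for the sketch, not values from the disclosure.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature observed at pixel x1 in
    projection P1 and pixel x2 in projection P2 (both 3x4 matrices)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null-space vector = homogeneous 3D point
    return X[:3] / X[3]

# Two hypothetical C-arm poses (e.g., AP and a 90-degree oblique view).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])
P2 = np.hstack([R, np.array([[0.0], [0.0], [5.0]])])

# A feature (e.g., an anatomical landmark) at a known test position.
X_true = np.array([1.0, 2.0, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

assert np.allclose(triangulate(P1, P2, x1, x2), X_true, atol=1e-6)
```

Repeating this for each feature matched across the projections yields the sparse 3D scene reconstruction described above; only the matched features, not the full volume, are reconstructed.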
[042] In the embodiments of FIGs. 2A and 2B, one intraoperative cone-beam CT scan of the lung in the inflated state and one intraoperative cone-beam CT scan of the lung in the deflated state may be obtained and used. In some embodiments that are alternative to those in FIGs. 2A and 2B, rather than having two intraoperative cone-beam CT scans, a single cone-beam CT scan of the lung in the deflated state can be used. In these embodiments, a deformable registration between the cone-beam CT image of the lung in the deflated state and the pre-operative CT image may be required. Workflow is simplified by using only one intraoperative scan such as the single cone-beam CT scan of the lung in the deflated state in these alternative embodiments.
[043] In some embodiments, when performing the deformable registration, anatomical structures of interest may have already been segmented from the pre-operative CT image. If so, then these segmentations can be used to guide the registration at S260. In one embodiment, a second set of segmentations are performed on the cone-beam CT image(s) from S250 and used in the registration at S260. Alternatively, it is still possible to register the cone-beam CT image(s) from S230 and/or S250 with the pre-operative CT image(s) from S210 without the need for segmentation in either imaging step. In these alternative embodiments, an advantage of simplicity is obtained with a tradeoff of potential loss of accuracy.
[044] In other embodiments, if contrast fluoroscopy is used during the surgery separate from the imaging at S210, S230 and S250, the additional information provided by the contrast images may be used to update the deflated models at S285 and improve the registration accuracy. For example, additional information in contrast images may be vessels shown more clearly.
[045] FIG. 3 illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[046] At S305, the method of FIG. 3 starts with inserting a scope in the patient and deflating a lung in the patient. That is, in the embodiment of FIG. 3, pre-operative imaging may not be required. Instead, the patient may be prepared for surgery immediately before the surgery and the thoracoscope inserted through a port.
[047] At S310, the method of FIG. 3 continues with imaging the deflated lung. That is, the lung is collapsed, and a cone-beam CT image is acquired of the deflated lung at S310. The imaging at S310 also involves the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using cone-beam computed tomography during the interventional procedure while the lung is in the deflated state.
[048] At S321, the method of FIG. 3 includes segmenting the image(s) of the deflated lung taken at S310. Important structures of the lung anatomy, like the vessels, airways, and tumor, are segmented directly from the cone-beam CT image of the deflated lung. As a result, a 3D model of the anatomy in the deflated state is generated showing the segmentation.
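As an illustrative toy version of the segmentation at S321, an intensity window can separate soft-tissue structures from the air-filled background of a cone-beam CT volume. A clinical system would use dedicated vessel, airway, and tumor segmentation algorithms; the volume and HU values here are synthetic.

```python
import numpy as np

def segment_structures(volume_hu, lo, hi):
    """Toy intensity-window segmentation of a CBCT volume in Hounsfield
    units; returns a boolean mask of voxels inside [lo, hi]."""
    return (volume_hu >= lo) & (volume_hu <= hi)

# Synthetic 3x3x3 "deflated lung" volume: air background (-800 HU)
# with one soft-tissue "tumor" voxel (40 HU) at the centre.
vol = np.full((3, 3, 3), -800.0)
vol[1, 1, 1] = 40.0

tumor_mask = segment_structures(vol, -100.0, 100.0)
assert tumor_mask.sum() == 1 and tumor_mask[1, 1, 1]

# Voxel coordinates of the segmented structure, usable as the basis of a
# simple 3D model of the anatomy in the deflated state.
model_voxels = np.argwhere(tumor_mask)
assert model_voxels.tolist() == [[1, 1, 1]]
```

In practice the mask would then be converted to a surface mesh (e.g., by marching cubes) to serve as the 3D model shown during navigation.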
[049] At S360, the scope is registered to the deflated lung in the method of FIG. 3. Similar to the embodiments of FIGs. 2A and 2B, a pose of the thoracoscope is registered to the 3D model.
[050] At S370, movement of the scope is tracked relative to the lung surface in the method of FIG. 3. Similar again to the embodiments of FIGs. 2A and 2B, the pose of the thoracoscope can be tracked with respect to the lung surface or otherwise, such as with electromagnetic sensors. The tracking at S370 may be based on light emitted in a frequency band within a visible region or based on light emitted in a frequency band outside of a visible region.
[051] At S380, the scope view is augmented in the method of FIG. 3. Similar once again to the embodiments of FIGs. 2A and 2B, the thoracoscope view can be augmented with the 3D model information.
[052] The method of FIG. 3 again concludes with resecting a tumor in the lung, based on the intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 3.
[053] An advantage of the workflow of the embodiment in FIG. 3 is that deformable registration between images from CT and from cone-beam CT is not required, and a cone-beam CT image of the inflated lung is not needed either.
[054] Another hybrid embodiment is applicable when segmentation of a cone-beam CT image of a deflated lung is difficult. In this hybrid approach, an intraprocedural cone-beam CT image of an inflated lung is obtained but again without a pre-operative CT imaging process. In this embodiment, the segmentation is performed on the cone-beam CT image of the inflated lung, and registration is performed to map the cone-beam CT image segmentation model to the model based on the intraoperative cone-beam CT image of the deflated lung.
[055] FIG. 4 illustrates a system for intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
[056] The system 400 of FIG. 4 includes a first medical imaging system 410, a computer 420, a display 425, and a tracked device 450.
[057] An example of the first medical imaging system 410 is a cone-beam computed tomography imaging system. A cone-beam computed tomography imaging system provides three-dimensional (3D) imaging with X-rays in the shape of a cone. A cone-beam computed tomography imaging system differs from a computed tomography imaging system. For example, a computed tomography imaging system generates X-ray beams in the shape of a rotating fan to capture slices of a limited thickness, whereas a cone-beam computed tomography imaging system generates the X-ray beams in the shape of the cone. Also, a patient does not have to advance or move at all in the cone-beam CT, whereas a patient advances during a CT procedure. Thus, the difference between cone-beam CT and CT is not necessarily a matter of simply flipping a switch to activate different modes using the same system; rather, cone-beam CT and CT as described herein may involve imaging by entirely different systems. In FIG. 4, the first medical imaging system 410 is typically the cone-beam CT, and performs imaging such as at S230, S250, S285, and S310.
[058] The computer 420 may include a controller described herein. A controller described herein may include a combination of a memory that stores instructions and a processor that executes the instructions in order to implement processes described herein. A controller may be housed within or linked to a workstation such as the computer 420 or another assembly of one or more computing devices, a display/monitor, and one or more input devices (e.g., a keyboard, joysticks and mouse) in the form of a standalone computing system, a client computer of a server system, a desktop or a tablet. The descriptive label for the term “controller” herein facilitates a distinction between controllers as described herein without specifying or implying any additional limitation to the term “controller”. The term “controller” broadly encompasses all structural configurations, as understood in the art of the present disclosure and as exemplarily described in the present disclosure, of an application specific main board or an application specific integrated circuit for controlling an application of various principles as described in the present disclosure. The structural configuration of the controller may include, but is not limited to, processor(s), computer-usable/computer-readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
[059] Additionally, although Fig. 4 shows components networked together, two such components may be integrated into a single system. For example, the computer 420 may be integrated with the display 425 and/or with the first medical imaging system 410. That is, in some embodiments, functionality attributed to the computer 420 may be implemented by (e.g., performed by) a system that includes the first medical imaging system 410. On the other hand, the networked components shown in Fig. 4 may also be spatially distributed such as by being distributed in different rooms or different buildings, in which case the networked components may be connected via data connections. In still another embodiment, one or more of the components in Fig. 4 is not connected to the other components via a data connection, and instead is provided with input or output manually such as by a memory stick or other form of memory. In yet another embodiment, functionality described herein may be performed based on functionality of the elements in Fig. 4 but outside of the system shown in Fig. 4.
[060] The computer 420 in Fig. 4 may include some or all elements and functionality of the general computer system described below with respect to Fig. 5. For example, the computer 420 may include a controller for registering a scope to a cone-beam CT image of a deflated lung, for tracking a scope, and/or for augmenting a view of a scope. A process executed by a controller may include receiving a three-dimensional model of anatomy of a lung that is the subject of an interventional procedure.
[061] The display 425 may be used to display the three-dimensional models, the cone-beam CT images obtained at S230, S250, S285 and S310, the scope views and the augmented scope views, and other imagery and views described herein. Imagery obtained during a medical intervention may be, for example, imagery of an inflated lung, imagery of a deflated lung, and imagery or positional information of the tracked device 450. Imagery that may be displayed on a display 425 includes imagery obtained during a medical intervention, imagery of a 3D model of a lung generated based on segmentation, and other visual information described herein.
[062] As the term “display” is used herein, the term should be interpreted to include a class of features such as a “display device” or “display unit”, and these terms encompass an output device, or a user interface adapted for displaying images and/or data. A display may output visual, audio, and/or tactile data. Examples of a display include, but are not limited to: a computer monitor, a television screen, a touch screen, a tactile electronic display, a Braille screen, a cathode ray tube (CRT), a storage tube, a bistable display, electronic paper, a vector display, a flat panel display, a vacuum fluorescent display (VFD), light-emitting diode (LED) displays, an electroluminescent display (ELD), plasma display panels (PDP), a liquid crystal display (LCD), organic light-emitting diode (OLED) displays, a projector, and a head-mounted display.
[063] Movement of the tracked device 450 may be tracked using white light, infrared or near-infrared light, electromagnetism, or other tracking technologies such as optical shape sensing. That is, movement of an interventional medical instrument as a tracked device 450 may be tracked either based on light emitted in a frequency band within a visible region or based on light emitted in a frequency band outside of a visible region. The tracking of the tracked device 450 results in positions and/or a pose of the tracked device 450 being sent to the computer 420. The computer 420 processes positions of the tracked device 450 and the medical images from the first medical imaging system 410 to, for example, perform the registration at S260 and the tracking at S270, and to control the augmentation at S280.
[064] In an embodiment implemented with features of FIGs. 2A, 2B, 3 and/or 4 described above, a full surgical workflow incurs a minimal number of steps and components to set up, while also providing ease and consistency in performing each step. As a result, the methods described herein can be efficiently executed. Several tradeoffs are possible in implementing the intraoperative imaging-based surgical navigation described herein. For example, reliability may be traded with simplicity, so that achieving a reliable workflow via, for example, more user input, can be offset with a simpler workflow via, for example, less user involvement.
[065] One such tradeoff is between tracking and deformable registration. In a favorable workflow, a surgeon can immediately locate anatomical features in the thoracoscope view in real time. But since features such as tumors are embedded in tissue and invisible, such information must be transferred from a different modality which resides in a different coordinate space. Furthermore, the tumor can undergo deformation between image acquisitions, placing this problem in the realm of deformable registration. Deformable registration may be replaced in some embodiments with tracking discrete anatomical features or surrogates thereof in real time.
[066] Another tradeoff described herein is the use of extrinsic tracking via markers with intrinsic tracking via image features. Insofar as marker placement can complicate workflow and potentially raise health risks, the efficient procedures described herein which do not require or use physical markers can be seen as a tradeoff with the use of such markers.
[067] One other tradeoff described herein is the tradeoff between simplicity and accuracy based on using imaging of an inflated lung. Whereas deformation compensation, tracking, and/or augmentation facilities are most beneficial at the deflated lung state, since the resection is performed when the lung is in the deflated state, workflows addressing the deflated lung exclusively are simplified. Therefore, sacrificing simplicity by transformatively mapping anatomical features identified preoperatively, when the lung is inflated, may be appropriate in other embodiments.
[068] FIG. 5 illustrates a general computer system, on which a method of intraoperative imaging-based surgical navigation can be implemented, in accordance with another representative embodiment.
[069] The computer system 500 can include a set of instructions that can be executed to cause the computer system 500 to perform any one or more of the methods or computer-based functions disclosed herein. The computer system 500 may operate as a standalone device or may be connected, for example, using a network 501, to other computer systems or peripheral devices.
[070] In a networked deployment, the computer system 500 may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 500 can also be implemented as or incorporated into various devices, such as the first medical imaging system 410, the computer 420, a second medical imaging system in the embodiment of FIG. 4 (not shown), a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. The computer system 500 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices. In an embodiment, the computer system 500 can be implemented using electronic devices that provide voice, video or data communication. Further, while the computer system 500 is illustrated in the singular, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
[071] As illustrated in Fig. 5, the computer system 500 includes a processor 510. A processor for a computer system 500 is tangible and non-transitory. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. A processor is an article of manufacture and/or a machine component. A processor for a computer system 500 is configured to execute software instructions to perform functions as described in the various embodiments herein. A processor for a computer system 500 may be a general-purpose processor or may be part of an application specific integrated circuit (ASIC). A processor for a computer system 500 may also be a microprocessor, a microcomputer, a processor chip, a controller, a microcontroller, a digital signal processor (DSP), a state machine, or a programmable logic device. A processor for a computer system 500 may also be a logical circuit, including a programmable gate array (PGA) such as a field programmable gate array (FPGA), or another type of circuit that includes discrete gate and/or transistor logic. A processor for a computer system 500 may be a central processing unit (CPU), a graphics processing unit (GPU), or both. Additionally, any processor described herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
[072] A “processor” as used herein encompasses an electronic component which is able to execute a program or machine executable instruction. References to the computing device comprising “a processor” should be interpreted as possibly containing more than one processor or processing core. The processor may for instance be a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems. The term computing device should also be interpreted to possibly refer to a collection or network of computing devices each including a processor or processors. Many programs have instructions performed by multiple processors that may be within the same computing device or which may even be distributed across multiple computing devices.
[073] Moreover, the computer system 500 may include a main memory 520 and a static memory 530, where memories in the computer system 500 may communicate with each other via a bus 508. Memories described herein are tangible storage mediums that can store data and executable instructions and are non-transitory during the time instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. A memory described herein is an article of manufacture and/or machine component. Memories described herein are computer-readable mediums from which data and executable instructions can be read by a computer. Memories as described herein may be random access memory (RAM), read only memory (ROM), flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, or any other form of storage medium known in the art. Memories may be volatile or non-volatile, secure and/or encrypted, unsecure and/or unencrypted.
[074] “Memory” is an example of a computer-readable storage medium. Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
[075] As shown, the computer system 500 may further include a video display unit 550, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT). Additionally, the computer system 500 may include an input device 560, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 570, such as a mouse or touch-sensitive input screen or pad. The computer system 500 can also include a disk drive unit 580, a signal generation device 590, such as a speaker or remote control, and a network interface device 540.
[076] In an embodiment, as depicted in Fig. 5, the disk drive unit 580 may include a computer-readable medium 582 in which one or more sets of instructions 584, e.g. software, can be embedded. Sets of instructions 584 can be read from the computer-readable medium 582. Further, the instructions 584, when executed by a processor, can be used to perform one or more of the methods and processes as described herein. In an embodiment, the instructions 584 may reside completely, or at least partially, within the main memory 520, the static memory 530, and/or within the processor 510 during execution by the computer system 500.
[077] In an alternative embodiment, dedicated hardware implementations, such as application-specific integrated circuits (ASICs), programmable logic arrays and other hardware components, can be constructed to implement one or more of the methods described herein. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
[078] In accordance with various embodiments of the present disclosure, the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing can be constructed to implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
[079] The present disclosure contemplates a computer-readable medium 582 that includes instructions 584 or receives and executes instructions 584 responsive to a propagated signal; so that a device connected to a network 501 can communicate voice, video or data over the network 501. Further, the instructions 584 may be transmitted or received over the network 501 via the network interface device 540.
[080] Accordingly, intraoperative imaging-based surgical navigation enables a surgeon to navigate to a lung tumor without the use of physical markers, so that markers do not have to be placed well before or even immediately before a lung tumor is resected in a surgery. As a result, intraoperative imaging-based surgical navigation can be used with minimally invasive surgery and result in more complete and more accurate removal of lung tumors, reduced requirements for follow-up surgeries and subsequent radiation/chemotherapy or avoidance of recurrence. Additionally, intraoperative imaging-based surgical navigation can help avoid removing healthy tissue, which helps avoid compromising lung function and/or prolonging recovery times. Moreover, avoiding or reducing the placement of physical markers by using intraoperative image-based surgical navigation can avoid additional complications and hospital/patient burden.
[081] The representative embodiments described above help alleviate some of the challenges described herein for lung tumor resections by providing improved 3D guidance in procedures involving resecting a tumor from the deflated lung. The representative embodiments described herein can be used to provide surgical workflows with intra-operative imaging and performed without use of physical markers being placed in the tumor.
[082] Although intraoperative imaging-based surgical navigation has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of intraoperative imaging-based surgical navigation in its aspects. Although intraoperative imaging-based surgical navigation has been described with reference to particular means, materials and embodiments, intraoperative imaging-based surgical navigation is not intended to be limited to the particulars disclosed; rather intraoperative imaging-based surgical navigation extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.
[083] The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of the disclosure described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
[084] One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept. Moreover, although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
[085] The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description, with each claim standing on its own as defining separately claimed subject matter.
[086] The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.

Claims

CLAIMS:
1. A controller (420) for assisting navigation in an interventional procedure, comprising:
a memory (520) that stores instructions, and
a processor (510) that executes the instructions,
wherein, when executed by the processor (510), the instructions cause the controller (420) to implement a process that includes:
registering (S260) coordinate systems of an interventional medical instrument (450) and an intra-operative image of a lung;
calculating (S260) a deformation between the lung in a deflated state (S250) and the lung in an inflated state (S230);
applying (S260) the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and
generating (S280) an image of the modified three-dimensional model of the lung in the deflated state.
2. The controller (420) of claim 1, wherein the process implemented when the processor (510) executes the instructions further comprises:
determining (S270) a location of the interventional medical instrument (450) in the deflated state; and
augmenting (S280) a view from the interventional medical instrument (450) at the location during the interventional procedure with features of the modified three-dimensional model of the lung in the deflated state.
3. The controller (420) of claim 2, wherein the process implemented when the processor (510) executes the instructions further comprises:
tracking (S270) movement of the interventional medical instrument (450) relative to the lung in the deflated state;
updating (S280) the view from the interventional medical instrument (450), based on the movement of the interventional medical instrument, with the features of the modified three-dimensional model of the lung in the deflated state.
4. The controller (420) of claim 3, wherein the movement of the interventional medical instrument (450) is tracked (S270) based on light emitted in a frequency band outside of a visible region.
5. The controller (420) of claim 3, wherein the process implemented when the processor (510) executes the instructions further comprises:
acquiring (S285) additional images during the interventional procedure; and
updating (S285) the modified three-dimensional model of the lung in the deflated state based on the additional images.
6. The controller (420) of claim 1, wherein the interventional medical instrument (450) is imaged (S250) using cone-beam computed tomography during the interventional procedure while the lung is in the deflated state, and
the lung in the inflated state is imaged (S230) using cone-beam computed tomography during the interventional procedure.
7. The controller (420) of claim 1, wherein the interventional medical instrument (450) is imaged (S250) using cone-beam computed tomography during the interventional procedure while the lung is in the deflated state,
the lung in the inflated state is imaged (S210) using computed tomography prior to the interventional procedure to obtain a computed tomography image of the lung in the inflated state, and
the interventional medical instrument (450) imaged (S250) while the lung is in the deflated state is registered (S260) to the computed tomography image of the lung in the inflated state.
8. The controller (420) of claim 1, wherein the interventional medical instrument (450) is imaged using multiple X-ray projections during the interventional procedure while the lung is in the deflated state,
the lung in the inflated state is imaged (S210) using computed tomography prior to the interventional procedure to obtain a computed tomography image of the lung in the inflated state, and
the interventional medical instrument (450) imaged while the lung is in the deflated state is registered (S260) to the computed tomography image of the lung in the inflated state.
9. The controller (420) of claim 1, wherein the three-dimensional original model of the lung is obtained by segmenting (S220) at least one of imagery obtained based on cone-beam computed tomography imaging of the lung in the inflated state and imagery obtained based on computed tomography imaging of the lung in the inflated state.
10. A controller (420) for assisting navigation in an interventional procedure, comprising:
a memory (520) that stores instructions, and
a processor (510) that executes the instructions,
wherein, when executed by the processor (510), the instructions cause the controller (420) to implement a process that includes:
segmenting (S321) a cone-beam computed tomography image of a lung in a deflated state during the interventional procedure to obtain a three-dimensional original model of the lung in the deflated state;
registering (S360) coordinate systems of an interventional medical instrument (450) to the three-dimensional original model of the lung in the deflated state; and
generating (S380) an image of the original three-dimensional model of the lung in the deflated state.
11. The controller (420) of claim 10, wherein the process implemented when the processor (510) executes the instructions further comprises:
determining (S370) a location of the interventional medical instrument (450) in the deflated state; and
augmenting (S380) a view from the interventional medical instrument (450) at the location during the interventional procedure with features of the original three-dimensional model of the lung in the deflated state.
12. The controller (420) of claim 11, wherein the process implemented when the processor (510) executes the instructions further comprises:
tracking (S370) movement of the interventional medical instrument (450) relative to the lung in the deflated state;
updating (S380) the view from the interventional medical instrument, based on the movement of the interventional medical instrument (450), with the features of the original three-dimensional model of the lung in the deflated state.
13. The controller (420) of claim 11, wherein the interventional procedure is performed without use of a physical marker to mark a location on the lung in the deflated state.
14. A system (400) for assisting navigation in an interventional procedure, comprising:
a cone-beam computed tomography imaging apparatus (410) that generates a cone-beam computed tomography image of a lung in a deflated state;
a computer (420) comprising a controller (420) with a memory (520) that stores instructions and a processor (510) that executes the instructions,
wherein, when executed by the processor (510), the instructions cause the controller (420) to implement a process that includes:
registering (S260) coordinate systems of an interventional medical instrument (450) and an intraoperative image of the lung;
calculating (S260) a deformation between the lung in the deflated state and the lung in an inflated state;
applying (S260) the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and
generating (S280) an image of the modified three-dimensional model of the lung in the deflated state.
15. The system (400) of claim 14, further comprising:
the interventional medical instrument (450), wherein the interventional medical instrument (450) comprises a thoracoscope, and
wherein the process implemented when the processor (510) executes the instructions further comprises determining (S270) a location of the thoracoscope in the deflated state and augmenting (S280) a view from the thoracoscope at the location during the interventional procedure with features of the modified three-dimensional model of the lung in the deflated state.
16. The system (400) of claim 15, wherein the process implemented when the processor (510) executes the instructions further comprises:
tracking (S270) movement of the interventional medical instrument (450) relative to the lung in the deflated state;
updating (S280) the view from the interventional medical instrument, based on the movement of the interventional medical instrument, with the features of the modified three-dimensional model of the lung in the deflated state.
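Outside the claim language, the central operation recited in claims 1 and 14 — calculating a deformation between the inflated and deflated lung and applying it to a three-dimensional original model — can be illustrated as a per-vertex displacement applied to a surface mesh. The sketch below is a hypothetical NumPy illustration only, not part of the disclosure or the claims; the function name, the uniform displacement field, and the toy vertex coordinates are all assumptions for the example.

```python
import numpy as np

def apply_deformation(vertices, displacement):
    """Apply a per-vertex displacement field to a 3-D surface model.

    vertices     -- (N, 3) vertex positions of the inflated-lung model (mm)
    displacement -- (N, 3) displacement vectors, e.g. sampled at the model
                    vertices from a deformable registration of the
                    intraoperative deflated-state image to the
                    inflated-state image
    Returns the (N, 3) vertex positions of the deflated-state model.
    """
    v = np.asarray(vertices, dtype=float)
    d = np.asarray(displacement, dtype=float)
    if v.shape != d.shape:
        raise ValueError("vertices and displacement must have equal shapes")
    return v + d

# Toy data: three vertices of an inflated-lung mesh, displaced 5 mm
# inferiorly and 2 mm medially to mimic collapse (values are illustrative).
inflated = np.array([[0.0, 0.0, 0.0],
                     [10.0, 0.0, 0.0],
                     [0.0, 10.0, 0.0]])
field = np.tile([-2.0, 0.0, -5.0], (3, 1))
deflated = apply_deformation(inflated, field)
print(deflated[0])  # first deflated-state vertex
```

In practice the displacement field would be non-uniform and would come from a deformable inflated-to-deflated registration, such as the combined model- and image-driven approach of Uneri et al. cited below, rather than the constant field assumed here.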
PCT/EP2020/064177 2019-05-22 2020-05-20 Intraoperative imaging-based surgical navigation WO2020234409A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201962851174P 2019-05-22 2019-05-22
US62/851,174 2019-05-22

Publications (1)

Publication Number Publication Date
WO2020234409A1 true WO2020234409A1 (en) 2020-11-26

Family

ID=70802864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/064177 WO2020234409A1 (en) 2019-05-22 2020-05-20 Intraoperative imaging-based surgical navigation

Country Status (1)

Country Link
WO (1) WO2020234409A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140073907A1 (en) * 2012-09-12 2014-03-13 Convergent Life Sciences, Inc. System and method for image guided medical procedures
WO2016178690A1 (en) * 2015-05-07 2016-11-10 Siemens Aktiengesellschaft System and method for guidance of laparoscopic surgical procedures through anatomical model augmentation
US20180161102A1 (en) * 2014-10-30 2018-06-14 Edda Technology, Inc. Method and system for estimating a deflated lung shape for video assisted thoracic surgery in augmented and mixed reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
UNERI ALI ET AL: "Deformable registration of the inflated and deflated lung in cone-beam CT-guided thoracic surgery: Initial investigation of a combined model- and image-driven approach", MEDICAL PHYSICS, AIP, MELVILLE, NY, US, vol. 40, no. 1, 18 December 2012 (2012-12-18), pages 17501-1 - 17501-8, XP012170938, ISSN: 0094-2405, [retrieved on 20121218], DOI: 10.1118/1.4767757 *

Similar Documents

Publication Publication Date Title
US20200405433A1 (en) System and method for dynamic validation, correction of registration for surgical navigation
Bernhardt et al. The status of augmented reality in laparoscopic surgery as of 2016
US10074176B2 (en) Method, system and apparatus for displaying surgical engagement paths
EP2637593B1 (en) Visualization of anatomical data by augmented reality
US10074177B2 (en) Method, system and apparatus for quantitative surgical image registration
EP3398552A1 (en) Medical image viewer control from surgeon's camera
JP2019511931A (en) Alignment of Surgical Image Acquisition Device Using Contour Signature
Bertolo et al. Systematic review of augmented reality in urological interventions: the evidences of an impact on surgical outcomes are yet to come
Luo et al. Augmented reality navigation for liver resection with a stereoscopic laparoscope
Liu et al. Intraoperative image‐guided transoral robotic surgery: pre‐clinical studies
Yaniv et al. Applications of augmented reality in the operating room
US20140275994A1 (en) Real time image guidance system
EP3666218A1 (en) Systems for imaging a patient
WO2020234409A1 (en) Intraoperative imaging-based surgical navigation
Chen et al. Video-guided calibration of an augmented reality mobile C-arm
US10102681B2 (en) Method, system and apparatus for adjusting image data to compensate for modality-induced distortion
US11045261B2 (en) Method, system and apparatus for surface rendering using medical imaging data
US10893843B2 (en) System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US11172184B2 (en) Systems and methods for imaging a patient
WO2020182997A1 (en) Dynamic interventional three-dimensional model deformation
Mirota Video-based navigation with application to endoscopic skull base surgery
Wang et al. Augmented Reality for Digital Orthopedic Applications
CN113614844A (en) Dynamic intervention three-dimensional model deformation
WO2021074422A1 (en) Dynamic tissue imagery updating

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20727624

Country of ref document: EP

Kind code of ref document: A1