WO2020234409A1 - Intraoperative imaging-based surgical navigation - Google Patents

Intraoperative imaging-based surgical navigation

Info

Publication number
WO2020234409A1
Authority
WO
WIPO (PCT)
Prior art keywords
lung
medical instrument
deflated state
interventional
controller
Application number
PCT/EP2020/064177
Other languages
English (en)
Inventor
Torre Michelle BYDLON
Paul Thienphrapa
Alvin Chen
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V.
Publication of WO2020234409A1

Classifications

    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 6/12: Arrangements for detecting or locating foreign bodies
    • A61B 6/487: Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • G06T 7/251: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving models
    • A61B 1/313: Endoscopes for introducing through surgical openings, e.g. laparoscopes
    • A61B 2017/00694: Means correcting for movement of or for synchronisation with the body
    • A61B 2017/00809: Lung operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2061: Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/372: Surgical systems with images on a monitor during operation; details of monitor hardware
    • A61B 2090/502: Supports for surgical instruments; headgear, e.g. helmet, spectacles
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/037: Emission tomography
    • A61B 6/4085: Cone-beams
    • G06T 2207/10081: Computed X-ray tomography [CT]
    • G06T 2207/30061: Lung

Definitions

  • [001] Lung cancer is a deadly form of cancer, and surgery is the treatment of choice for early-stage tumors.
  • Historically, the most invasive form of surgery is open surgery, in which the chest is split open to expose a large portion of the lung.
  • Surgical tools such as scalpels are inserted through a large opening in the thorax and used to remove the tumor.
  • Open surgical techniques allow physical access for palpation, so that tumors can be sensed by touch.
  • Video-assisted thoracoscopic surgery (VATS) was developed to provide a minimally invasive approach to lung tumor resection.
  • In VATS, a small camera is inserted into the chest cavity through a small port (i.e., a small hole or incision), and the surgical instruments are inserted through the same port or other small ports.
  • palpation to sense the tumors by touch is more difficult under VATS due to constrained access and the absence of haptic feedback, and the entire resection is done using the camera view.
  • FIG. 1 illustrates a known VATS implementation for lung resection.
  • a thoracoscope or a small camera stick is inserted through the rib cage of a patient P as one of the instruments.
  • surgeons and research groups have investigated ways of improving the surgical workflow to better guide tumor resection. This can be done by implanting dyes or markers into the tumor or with better imaging techniques.
  • intra-operative cone-beam CT is used for a needle-guided insertion of a marker.
  • the marker is placed in the center of the tumor and a string/wire comes out to the surface of the lung.
  • The thoracoscope is then inserted and the traditional VATS procedure is completed, with the surgeon following the wire to the marker at the center of the tumor. Visible and fluorescent dyes can also be injected into the tumor to serve as a visual marker for the surgeon as the tissue is dissected.
  • Another known mechanism provides a deformable registration algorithm that calculates a deformation matrix between cone-beam CT images of inflated and deflated lungs in phantom and animal models.
  • a controller for assisting navigation in an interventional procedure includes a memory that stores instructions, and a processor that executes the instructions.
  • the instructions When executed by the processor, the instructions cause the controller to implement a process that includes registering coordinate systems of an interventional medical instrument and an intra-operative image of a lung; and calculating a deformation between the lung in the deflated state and the lung in the inflated state.
  • the process implemented when the processor executes the instructions also includes applying the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and generating an image of the modified three-dimensional model of the lung in the deflated state.
  • a controller for assisting navigation in an interventional procedure includes a memory that stores instructions, and a processor that executes the instructions.
  • the instructions When executed by the processor, the instructions cause the controller to implement a process that includes segmenting a cone-beam computed tomography image of a lung in a deflated state during the interventional procedure to obtain a three-dimensional original model of the lung in the deflated state.
  • the process implemented when the processor executes the instructions also includes registering coordinate systems of an interventional medical instrument to the three-dimensional original model of the lung in the deflated state; and generating an image of the original three-dimensional model of the lung in the deflated state.
  • a system for assisting navigation in an interventional procedure includes a cone-beam computed tomography imaging apparatus and a computer.
  • the cone-beam computed tomography imaging apparatus generates a cone-beam computed tomography image of a lung in a deflated state.
  • the computer includes a controller with a memory that stores instructions and a processor that executes the instructions. When executed by the processor, the instructions cause the controller to implement a process that includes registering coordinate systems of an interventional medical instrument and an intraoperative image of the lung and calculating a deformation between the lung in the deflated state and the lung in the inflated state.
  • the process also includes applying the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state; and generating an image of the modified three-dimensional model of the lung in the deflated state.
  • FIG. 1 illustrates a known VATS implementation for lung resection.
  • FIG. 2A illustrates a method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • FIG. 2B illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • FIG. 3 illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • FIG. 4 illustrates a system for intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • FIG. 5 illustrates a general computer system, on which a method of intraoperative imaging-based surgical navigation can be implemented, in accordance with another representative embodiment.
  • mechanisms for intraoperative imaging-based surgical navigation are useful in mitigating the challenges of VATS or RATS (robot-assisted thoracoscopic surgery).
  • the mechanisms described herein are placed in the context of the surgical setup and workflow, so that components and procedural steps are sequenced to enable productive use of time and resources.
  • the embodiments of intraoperative imaging-based surgical navigation described below each typically involve intraoperative cone-beam CT or fluoroscopy image-based registration methods, along with image-based tracking.
  • FIG. 2A illustrates a method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • the method starts with imaging at S210.
  • the imaging at S210 may be computed tomography and/or positron emission tomography-computed tomography performed prior to surgery.
  • the imaging at S210 may be performed at a different time and place, and under the supervision and control of different personnel than the imaging using the equipment in the system 400 of FIG. 4 described later.
  • the imaging at S210 involves imaging the lung in the inflated state using computed tomography prior to an interventional procedure. That is, the imaging at S210 may result in the lung in the inflated state being imaged using computed tomography.
  • the imaging at S210 may optionally involve imaging the lung in a partially inflated state using computed tomography, where the partially inflated state may be different than the intraoperative state.
  • the method of FIG. 2A continues with pre-operative segmentation at S220.
  • An algorithm is used to perform the segmentation and is applied to the images obtained in S210 to produce a three-dimensional (3D) model of the anatomical features (e.g., airways, vessels, fissures, tumor, and/or lymph nodes) of the lung that was imaged at S210.
  • Segmentation is a representation of the surface of structures of the anatomical features (e.g., airways, vessels, fissures, tumor and/or lymph nodes) and consists, for example, of a set of points in three-dimensional (3D) coordinates on the surfaces of the lung, and triangular plane segments defined by connecting neighboring groups of three points, such that the entire structure is covered by a mesh of non-intersecting triangular planes.
  • a three-dimensional model of the lung is obtained by segmenting imagery of the lung in the inflated state obtained either by cone-beam computed tomography imaging (as an alternative to S210) or by computed tomography imaging (as in S210); a mesh of the kind described above can be extracted directly from such volumetric imagery, as in the sketch below.
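By way of illustration only, the following minimal sketch shows one way such a triangular surface mesh might be extracted from a segmented volume; the file name, threshold, and use of nibabel/scikit-image are assumptions for the sketch, not part of the disclosure.

```python
import nibabel as nib                 # assumed I/O library for the volume
from skimage import measure          # marching cubes implementation

# Load a binary segmentation mask of an airway/vessel/tumor structure
# ("segmentation.nii.gz" is a hypothetical file name).
mask = nib.load("segmentation.nii.gz").get_fdata()

# Extract a surface mesh: 'verts' are 3D points on the structure surface,
# 'faces' are triangles formed by connecting neighboring groups of three
# points, covering the structure with non-intersecting triangular planes.
verts, faces, normals, _ = measure.marching_cubes(mask, level=0.5)
print(f"mesh with {len(verts)} vertices and {len(faces)} triangles")
```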
  • the method of FIG. 2A includes imaging of an inflated lung.
  • the image of the inflated lung at S230 may be considered intraoperative and may be an intraoperative image taken using computed tomography or cone-beam computed tomography.
  • the imaging of the inflated lung at S230 may be with a cone-beam CT imaging apparatus. That is, at the beginning of surgery a cone-beam CT scan of the patient may be completed when the lung is still inflated.
  • the cone-beam CT image of the inflated lung can be used subsequently to register the pre-operative image(s) obtained at S210 to the intra-operative state and make any alignment adjustments to the patient's positioning.
  • the imaging at S210 and the imaging at S230 may both be via cone-beam CT even when performed in different places and/or different times and/or with different cone-beam CT imaging apparatuses.
  • the 3D model created at S220 may be updated after the images from S210 and S230 are registered.
  • CT images from S210 may be registered with cone-beam CT images from S230, and then the 3D model created at S220 is updated prior to the process described next for S240.
  • a rigid transformation may be applied after S230 to the 3D model from S220 to account only for differences in patient position.
  • a deformable transformation may be applied after S230 to the 3D model from S220 to account for non-rigid changes in the lung; both cases are sketched below.
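As a minimal sketch only (matrix values, the displacement field, and variable names are assumptions): a rigid transformation can be applied to the model's vertices as a single 4x4 homogeneous matrix, while a deformable transformation can be represented as a per-vertex displacement field.

```python
import numpy as np

def apply_rigid(verts: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous rigid transform T to an (N, 3) vertex array."""
    homo = np.hstack([verts, np.ones((len(verts), 1))])   # (N, 4)
    return (homo @ T.T)[:, :3]

def apply_deformable(verts: np.ndarray, displacement: np.ndarray) -> np.ndarray:
    """Apply a per-vertex displacement field (N, 3) for non-rigid lung changes."""
    return verts + displacement

# Example: shift the model to account for a difference in patient position.
T = np.eye(4)
T[:3, 3] = [5.0, -2.0, 0.0]            # hypothetical translation in mm
verts = np.random.rand(100, 3)         # stand-in for mesh vertices from S220
moved = apply_rigid(verts, T)
```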
  • a scope is inserted into the patient and the lung is deflated.
  • the scope may be a thoracoscope.
  • the scope is inserted into the chest cavity and may be, but is not necessarily, inserted into the lung.
  • the lung is deflated after the scope is inserted in S240.
  • the method of FIG. 2A includes imaging of the deflated lung.
  • the imaging of the deflated lung at S250 may be via cone-beam CT and may be performed with the same cone-beam CT imaging apparatus used in S230.
  • the imaging at S250 may be visible light imaging.
  • the imaging at S250 may involve the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using cone-beam computed tomography during the interventional procedure.
  • the imaging of the deflated lung at S250 may be via one or more X-rays, which may involve the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using X-ray during the interventional procedure.
  • the method of FIG. 2A includes registering the scope to the deflated lung.
  • the registration at S260 may involve aligning the coordinate system of the scope inserted into the chest cavity while the lung is deflated and the coordinate system of an intra-operative image of the lung. Registration of coordinate systems may be performed in a variety of ways, such as by identifying and aligning landmarks present in both coordinate systems, as in the sketch below.
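The following is a minimal sketch, assuming paired landmarks have already been identified in both coordinate systems; it computes the least-squares rigid alignment between them (the Kabsch algorithm), which is one of several ways such a registration could be performed.

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst.

    src, dst: (N, 3) arrays of paired landmarks, e.g. points identified in
    both the scope's coordinate system and the intra-operative image.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical paired landmarks; here dst is src shifted by a translation.
src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
R, t = rigid_register(src, src + [1.0, 2.0, 3.0])
```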
  • the intra-operative image of the lung may be that taken at S230 and may be an intra-operative image taken using computed tomography or cone-beam computed tomography.
  • a deformation matrix may be calculated between the cone-beam CT image of the inflated lung from S230 and the cone-beam CT image of the deflated lung from S250 and applied to the 3D model of the anatomical features which is created at S220 and already updated at or after S230. This results in a new 3D model of the anatomical features in the deflated state.
  • the 3D model can be used for anatomical reference and guidance during surgery and may exist as its own feature.
  • the deformation matrix may be a single, comprehensive deformation matrix that covers the full chain from CT to the cone-beam CT of the inflated lung to the cone-beam CT of the deflated lung.
  • alternatively, the deformation matrix may be applied in two steps, where one deformation matrix is applied for CT to the cone-beam CT of the inflated lung (e.g., after S230 as described above), and then a second deformation matrix is applied for the transformation from the cone-beam CT of the inflated lung to the cone-beam CT of the deflated lung, as in the sketch below.
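For the rigid part of such a chain, composing the two steps reduces to a matrix product; the sketch below (variable names assumed) shows the order of composition. Deformable components would instead be composed by chaining displacement fields.

```python
import numpy as np

# Step one: CT -> cone-beam CT of the inflated lung.
T_ct_to_inflated = np.eye(4)       # placeholder for an estimated transform
# Step two: inflated cone-beam CT -> deflated cone-beam CT.
T_inflated_to_deflated = np.eye(4)

# Applying step one first and step two second is equivalent to this single
# comprehensive matrix (the right-most transform is applied first).
T_comprehensive = T_inflated_to_deflated @ T_ct_to_inflated
```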
  • the method of FIG. 2A includes tracking movement of the scope relative to the lung surface. That is, movement of an interventional medical instrument such as a scope may be tracked visually via the tissue surface of the lung; this may be with a visible light camera (such as the traditional thoracoscope), or by hyperspectral imaging, or by near-infrared (NIR) fluorescence imaging to name a few.
  • An interventional medical instrument may alternatively be tracked with external tracking technologies such as electromagnetic tracking using sensors, optical tracking using optical shape sensing (OSS), and by other forms of tracking technologies for tracking interventional medical instruments.
  • Applying a surface-feature tracking algorithm or different tracking methods to the scope allows the system used to implement S270 to move the 3D model on a monitor as the scope is moved with respect to the lung; a feature-tracking sketch follows below.
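One common way to realize such surface-feature tracking is sparse optical flow between consecutive scope frames. This sketch uses OpenCV's Lucas-Kanade tracker; the frame variables and parameter values are assumptions, and a real system would add outlier rejection and pose estimation on top.

```python
import cv2
import numpy as np

def track_surface_features(prev_gray: np.ndarray, next_gray: np.ndarray):
    """Track lung-surface features between two grayscale thoracoscope frames."""
    # Detect strong corner-like features on the tissue surface.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)
    # Follow them into the next frame with pyramidal Lucas-Kanade flow.
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None)
    good_old = p0[status.flatten() == 1]
    good_new = p1[status.flatten() == 1]
    # The average feature displacement approximates scope motion relative to
    # the lung surface and can drive the on-screen 3D model update at S270.
    return (good_new - good_old).reshape(-1, 2).mean(axis=0)
```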
  • a 3D model may be presented on a headset screen or glasses, such as by using augmented reality.
  • a 3D model may be presented as a hologram.
  • the method of FIG. 2A includes augmenting the scope view.
  • the 3D model can be overlaid on the scope video feed, as in the blending sketch below. Augmenting can be performed at S280 in other ways, such as by highlighting a tumor in the view of the scope by brightness or color, by visually warning of anatomical features that should be avoided, and in other ways that are supplemental to the teachings herein.
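A minimal overlay sketch, assuming the 3D model has already been rendered from the tracked scope pose into an image the same size as the video frame (the rendering step itself is outside this sketch):

```python
import cv2

def augment(frame, rendering, alpha=0.35):
    """Blend the rendered 3D model into the live thoracoscope view."""
    # frame: current scope image; rendering: model rendered from the scope
    # pose (both assumptions for this sketch).
    return cv2.addWeighted(rendering, alpha, frame, 1.0 - alpha, 0.0)
```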
  • the method of FIG. 2A includes resecting a tumor in the lung, based on intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 2A.
  • the tumor can be resected at S290 using the augmented view of the thoracoscope and the augmented view helps ensure the surgeon knows where the tumor is located within the lung.
  • S270, S280 and S290 are shown partially overlapping in the vertical direction on the page. This reflects that the tracking at S270 may be performed continually before and during the augmenting of the scope view at S280, and both may be performed continually during the resecting of the tumor at S290.
  • Although the methods described herein are generally shown as a series of discrete steps performed separately in sequence, some steps in the methods may be performed continually while other steps in the methods are also performed.
  • the thoracoscope is a white light scope which only "sees" visible colors.
  • other methods may be employed for the tracking movement of the interventional medical device at S270, such as if there are not enough features in the white light view. That is, movement of an interventional medical instrument may also be tracked at S270 based on light emitted in a frequency band outside of a visible region.
  • Other methods include use of a hyperspectral camera/scope, a near-infrared or infrared scope, or a fluorescence scope, each of which "sees" the tissue at wavelengths outside of the visible region.
  • Alternative (non-optical) imaging modalities such as endoscopic ultrasound may also be incorporated as part of the augmented reality view.
  • FIG. 2B illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • FIG. 2B overlaps at the beginning with the method of FIG. 2A, and descriptions of the overlapping parts are not detailed since they may be the same as in FIG. 2A.
  • the method again starts with imaging at S210, continues with pre-operative segmentation at S220, and includes imaging of an inflated lung at S230.
  • the method of FIG. 2B also includes inserting a scope into the patient and deflating the lung at S240, imaging the deflated lung at S250, and registering the scope to the deflated lung at S260.
  • the deflated lung is imaged again so that additional images are acquired during the interventional procedure, and the 3D model(s) of the deflated lung is updated at S285 based on the additional images as movement of the scope is tracked relative to the lung surface at S270 and the scope view is augmented at S280.
  • the method in FIG. 2B concludes again with resecting a tumor in the lung, based on intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 2B.
  • the re-imaging of the deflated lung and updating of the 3D model(s) of the deflated lung at S285 may be performed selectively and dynamically intraoperatively in order to improve the intraoperative surgical navigation.
  • additional cone-beam CT images or fluoroscopy images can be acquired during surgery to update the 3D models of the deflated lung at S285.
  • a sparse 3D reconstruction of the scene can be achieved by taking two or more projections, identifying common image features between those projections, and then reconstructing the 3D positions of only those features, as in the triangulation sketch below.
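This sketch illustrates only the triangulation step, assuming the projection matrices of two X-ray views and matched 2D feature points are already known; all of the inputs shown here are hypothetical values.

```python
import cv2
import numpy as np

# P1, P2: 3x4 projection matrices of two fluoroscopy views (assumed known
# from the imaging geometry, e.g. the C-arm poses).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-10.0], [0.0], [0.0]])])

# Matched 2D features in each projection, shape (2, N).
pts1 = np.array([[100.0, 200.0], [120.0, 210.0]]).T
pts2 = np.array([[95.0, 200.0], [115.0, 211.0]]).T

# Homogeneous 4xN result; divide by the last row to get sparse 3D positions.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).T               # (N, 3) reconstructed feature points
```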
  • a 3D reconstruction of the scene can be achieved by using an image analysis algorithm to automatically find anatomical landmarks to use in registering the image.
  • the interventional medical instrument is imaged using multiple X-ray projections during the interventional procedure, since the interventional medical instrument is inserted into the patient at S240 and imaging of the deflated lung is performed both at S250 and again at S285.
  • one intraoperative cone-beam CT scan of the lung in the inflated state and one intraoperative cone-beam CT scan of the lung in the deflated state may be obtained and used.
  • a single cone-beam CT scan of the lung in the deflated state can be used.
  • a deformable registration between the cone-beam CT image of the lung in the deflated state and the pre-operative CT image may be required; one illustrative implementation is sketched below. Workflow is simplified by using only one intraoperative scan, such as the single cone-beam CT scan of the lung in the deflated state, in these alternative embodiments.
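Deformable registration can be implemented in many ways; the sketch below uses SimpleITK's B-spline free-form registration as one illustrative option. The file names, control-grid size, and optimizer settings are assumptions, not prescriptions from the disclosure.

```python
import SimpleITK as sitk

fixed = sitk.ReadImage("cbct_deflated.nii.gz", sitk.sitkFloat32)   # hypothetical
moving = sitk.ReadImage("preop_ct.nii.gz", sitk.sitkFloat32)       # hypothetical

# Free-form deformation parameterized by a coarse B-spline control grid.
tx = sitk.BSplineTransformInitializer(fixed, [8, 8, 8])

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5,
                         numberOfIterations=100)
reg.SetInitialTransform(tx, inPlace=True)
reg.SetInterpolator(sitk.sitkLinear)

out_tx = reg.Execute(fixed, moving)
# Warp the pre-operative CT (and, by extension, its segmentations) into the
# deflated-lung coordinate frame.
warped = sitk.Resample(moving, fixed, out_tx, sitk.sitkLinear, 0.0)
```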
  • anatomical structures of interest may have already been segmented from the pre-operative CT image. If so, then these segmentations can be used to guide the registration at S260.
  • a second set of segmentations are performed on the cone-beam CT image(s) from S250 and used in the registration at S260.
  • an advantage of simplicity is obtained with a tradeoff of potential loss of accuracy.
  • the additional information provided by the contrast images may be used to update the deflated models at S285 and improve the registration accuracy.
  • additional information in contrast images may be vessels shown more clearly.
  • FIG. 3 illustrates another method of intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • the method of FIG. 3 starts with inserting a scope in the patient and deflating a lung in the patient. That is, in the embodiment of FIG. 3 pre-operative imaging may not be required. Instead the patient may be prepared for surgery immediately before the surgery and the thoracoscope inserted through a port.
  • the method of FIG. 3 continues with imaging the deflated lung. That is, the lung is collapsed, and a cone-beam CT image is acquired of the deflated lung at S310.
  • the imaging at S310 also involves the scope as the interventional medical instrument, such that the interventional medical instrument is imaged using cone-beam computed tomography during the interventional procedure while the lung is in the deflated state.
  • the method of FIG. 3 includes segmenting the image(s) of the deflated lung taken at S310.
  • Important structures of the lung anatomy, like the vessels, airways, and tumor, are segmented directly from the cone-beam CT image of the deflated lung.
  • a 3D model of the anatomy in the deflated state is generated showing the segmentation.
  • the scope is registered to the deflated lung in the method of FIG. 3. Similar to the embodiments of FIGs. 2A and 2B, the pose of the thoracoscope is registered to the 3D model.
  • movement of the scope is tracked relative to the lung surface in the method of FIG. 3. Similar again to the embodiments of FIGs. 2A and 2B, the pose of the thoracoscope can be tracked with respect to the lung surface or otherwise, such as with electromagnetic sensors.
  • the tracking at S370 may be based on light emitted in a frequency band within a visible region or based on light emitted in a frequency band outside of a visible region.
  • the scope view is augmented in the method of FIG. 3. Similar once again to the embodiments of FIGs. 2A and 2B, the thoracoscope view can be augmented with the 3D model information.
  • FIG. 3 again concludes with resecting a tumor in the lung, based on the intraoperative surgical navigation enabled by the preceding features, functions and/or steps of FIG. 3.
  • An advantage of the workflow of the embodiment in FIG. 3 is that deformable registration between images from CT and from cone-beam CT is not required, and a cone-beam CT image of the inflated lung is not needed either.
  • Another hybrid embodiment is applicable when segmentation of a cone-beam CT image of a deflated lung is difficult.
  • an intraprocedural cone-beam CT image of an inflated lung is obtained but again without a pre-operative CT imaging process.
  • the segmentation is performed on the cone-beam CT image of the inflated lung, and registration is performed to map the cone-beam CT image segmentation model to the model based on the intraoperative cone-beam CT image of the deflated lung.
  • FIG. 4 illustrates a system for intraoperative imaging-based surgical navigation, in accordance with a representative embodiment.
  • the system 400 of FIG. 4 includes a first medical imaging system 410, a computer 420, a display 425, and a tracked device 450.
  • An example of the first medical imaging system 410 is a cone-beam computed tomography imaging system.
  • a cone-beam computed tomography imaging system provides three-dimensional (3D) imaging with X-rays in the shape of a cone.
  • a cone-beam computed tomography imaging system differs from a computed tomography imaging system.
  • a computed tomography imaging system generates X-ray beams in the shape of a rotating fan to capture slices of a limited thickness, whereas a cone-beam computed tomography imaging system generates the X-ray beams in the shape of the cone.
  • a patient does not have to advance or move at all in the cone-beam CT, whereas a patient advances during a CT procedure.
  • the difference between cone-beam CT and CT is not necessarily a matter of simply flipping a switch to activate different modes using the same system; rather, cone-beam CT and CT as described herein may involve imaging by entirely different systems.
  • the first medical imaging system 410 is typically the cone-beam CT, and performs imaging such as at S230, S250, S285, and S310.
  • the computer 420 may include a controller described herein.
  • a controller described herein may include a combination of a memory that stores instructions and a processor that executes the instructions in order to implement processes described herein.
  • a controller may be housed within or linked to a workstation such as the computer 420 or another assembly of one or more computing devices, a display/monitor, and one or more input devices (e.g., a keyboard, joysticks and mouse) in the form of a standalone computing system, a client computer of a server system, a desktop or a tablet.
  • The descriptive label for the term "controller" herein facilitates a distinction between controllers as described herein without specifying or implying any additional limitation to the term "controller".
  • The term "controller" broadly encompasses all structural configurations, as understood in the art of the present disclosure and as exemplarily described in the present disclosure, of an application-specific main board or an application-specific integrated circuit for controlling an application of various principles as described in the present disclosure.
  • the structural configuration of the controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
  • Although FIG. 4 shows components networked together, two such components may be integrated into a single system.
  • the computer 420 may be integrated with the display 425 and/or with the first medical imaging system 410. That is, in some embodiments, functionality attributed to the computer 420 may be implemented by (e.g., performed by) a system that includes the first medical imaging system 410.
  • the networked components shown in FIG. 4 may also be spatially distributed, such as by being distributed in different rooms or different buildings, in which case the networked components may be connected via data connections.
  • the computer 420 in FIG. 4 may include some or all elements and functionality of the general computer system described below with respect to FIG. 5.
  • the computer 420 may include a controller for registering a scope to a cone-beam CT image of a deflated lung, for tracking a scope, and/or for augmenting a view of a scope.
  • a process executed by a controller may include receiving a three-dimensional model of anatomy of a lung that is the subject of an interventional procedure.
  • the display 425 may be used to display the three-dimensional models, the cone-beam CT images obtained at S230, S250, S285 and S310, the scope views and the augmented scope views, and other imagery and views described herein.
  • Imagery obtained during a medical intervention may be, for example, imagery of an inflated lung, imagery of a deflated lung, and imagery or positional information of the tracked device 450.
  • Imagery that may be displayed on a display 425 includes imagery obtained during a medical intervention, imagery of a 3D model of a lung generated based on segmentation, and other visual information described herein.
  • the term“display” should be interpreted to include a class of features such as a“display device” or“display unit”, and these terms encompass an output device, or a user interface adapted for displaying images and/or data.
  • a display may output visual, audio, and/or tactile data.
  • Examples of a display include, but are not limited to: a computer monitor, a television screen, a touch screen, tactile electronic display, Braille screen, Cathode ray tube (CRT), Storage tube, Bistable display, Electronic paper, Vector display, Flat panel display, Vacuum fluorescent display (VF), Light-emitting diode (LED) displays, Electroluminescent display (ELD), Plasma display panels (PDP), Liquid crystal display (LCD), Organic light-emitting diode displays (OLED), a projector, and Head-mounted display.
  • Movement of the tracked device 450 may be tracked using white light, infrared or near-infrared light, electromagnetism, or other tracking technologies such as optical shape sensing. That is, movement of an interventional medical instrument as a tracked device 450 may be tracked either based on light emitted in a frequency band within a visible region or based on light emitted in a frequency band outside of a visible region.
  • the tracking of the tracked device 450 results in positions and/or pose of the tracked device 450 being sent to the computer 420.
  • the computer 420 processes positions of the tracked device 450 and the medical images from the first medical imaging system 410 to, for example, perform the registration at S260 and the tracking at S270, and to control the augmentation at S280.
  • a full surgical workflow incurs a minimal number of steps and components to set up, while also providing ease and consistency in performing each step.
  • the methods described herein can be efficiently executed.
  • Several tradeoffs are possible in implementing the intraoperative imaging-based surgical navigation described herein. For example, reliability may be traded with simplicity, so that achieving a reliable workflow via, for example, more user input, can be offset with a simpler workflow via, for example, less user involvement.
  • FIG. 5 illustrates a general computer system, on which a method of intraoperative imaging-based surgical navigation can be implemented, in accordance with another representative embodiment.
  • the computer system 500 can include a set of instructions that can be executed to cause the computer system 500 to perform any one or more of the methods or computer-based functions disclosed herein.
  • the computer system 500 may operate as a standalone device or may be connected, for example, using a network 501, to other computer systems or peripheral devices.
  • the computer system 500 may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
  • the computer system 500 can also be implemented as or incorporated into various devices, such as the first medical imaging system 410, the computer 420, a second medical imaging system in the embodiment of FIG. 4 (not shown), a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the computer system 500 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices.
  • the computer system 500 can be implemented using electronic devices that provide voice, video or data communication.
  • the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
  • the computer system 500 includes a processor 510.
  • a processor for a computer system 500 is tangible and non-transitory. As used herein, the term "non-transitory" is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term "non-transitory" specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • a processor is an article of manufacture and/or a machine component.
  • a processor for a computer system 500 is configured to execute software instructions to perform functions as described in the various embodiments herein.
  • a processor for a computer system 500 may be a general-purpose processor or may be part of an application specific integrated circuit (ASIC).
  • a processor for a computer system 500 may also be a microprocessor, a microcomputer, a processor chip, a controller, a microcontroller, a digital signal processor (DSP), a state machine, or a programmable logic device.
  • a processor for a computer system 500 may also be a logical circuit, including a programmable gate array (PGA) such as a field programmable gate array (FPGA), or another type of circuit that includes discrete gate and/or transistor logic.
  • a processor for a computer system 500 may be a central processing unit (CPU), a graphics processing unit (GPU), or both. Additionally, any processor described herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
  • A "processor" as used herein encompasses an electronic component which is able to execute a program or machine-executable instruction.
  • References to the computing device comprising "a processor" should be interpreted as possibly containing more than one processor or processing core.
  • the processor may for instance be a multi-core processor.
  • a processor may also refer to a collection of processors within a single computer system or distributed amongst multiple computer systems.
  • the term computing device should also be interpreted to possibly refer to a collection or network of computing devices each including a processor or processors. Many programs have instructions performed by multiple processors that may be within the same computing device or which may even be distributed across multiple computing devices.
  • the computer system 500 may include a main memory 520 and a static memory 530, where memories in the computer system 500 may communicate with each other via a bus 508.
  • Memories described herein are tangible storage mediums that can store data and executable instructions and are non-transitory during the time instructions are stored therein.
  • the term "non-transitory" is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
  • the term "non-transitory" specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • a memory described herein is an article of manufacture and/or machine component.
  • Memories described herein are computer- readable mediums from which data and executable instructions can be read by a computer.
  • Memories as described herein may be random access memory (RAM), read only memory (ROM), flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, or any other form of storage medium known in the art.
  • Memories may be volatile or non-volatile, secure and/or encrypted, unsecure and/or unencrypted.
  • “Memory” is an example of a computer-readable storage medium.
  • Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to, RAM memory, registers, and register files. References to "computer memory" or "memory" should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
  • the computer system 500 may further include a video display unit 550, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT). Additionally, the computer system 500 may include an input device 560, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 570, such as a mouse or touch-sensitive input screen or pad. The computer system 500 can also include a disk drive unit 580, a signal generation device 590, such as a speaker or remote control, and a network interface device 540.
  • The disk drive unit 580 may include a computer-readable medium 582 in which one or more sets of instructions 584, e.g. software, can be embedded. Sets of instructions 584 can be read from the computer-readable medium 582. Further, the instructions 584, when executed by a processor, can be used to perform one or more of the methods and processes as described herein. In an embodiment, the instructions 584 may reside completely, or at least partially, within the main memory 520, the static memory 530, and/or within the processor 510 during execution by the computer system 500.
  • Dedicated hardware implementations, such as application-specific integrated circuits (ASICs), programmable logic arrays and other hardware components, can be constructed to implement one or more of the methods described herein.
  • One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
  • the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing can be constructed to implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
  • the present disclosure contemplates a computer-readable medium 582 that includes instructions 584 or receives and executes instructions 584 responsive to a propagated signal, so that a device connected to a network 501 can communicate voice, video or data over the network 501. Further, the instructions 584 may be transmitted or received over the network 501 via the network interface device 540.
  • intraoperative imaging-based surgical navigation enables a surgeon to navigate to a lung tumor without the use of physical markers, so that markers do not have to be placed well before or even immediately before a lung tumor is resected in a surgery.
  • intraoperative imaging-based surgical navigation can be used with minimally invasive surgery and result in more complete and more accurate removal of lung tumors, reduced requirements for follow-up surgeries and subsequent radiation/chemotherapy or avoidance of recurrence.
  • intraoperative imaging-based surgical navigation can help avoid removing healthy tissue, which helps avoid compromising lung function and/or prolonging recovery times.
  • avoiding or reducing the placement of physical markers by using intraoperative image-based surgical navigation can avoid additional complications and hospital/patient burden.
  • the representative embodiments described above help alleviate some of the challenges described herein for lung tumor resections by providing improved 3D guidance in procedures involving resecting a tumor from the deflated lung.
  • the representative embodiments described herein can be used to provide surgical workflows with intra-operative imaging and performed without use of physical markers being placed in the tumor.
  • Although intraoperative imaging-based surgical navigation has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of intraoperative imaging-based surgical navigation in its aspects. Although intraoperative imaging-based surgical navigation has been described with reference to particular means, materials and embodiments, intraoperative imaging-based surgical navigation is not intended to be limited to the particulars disclosed; rather, intraoperative imaging-based surgical navigation extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A controller (420) for assisting navigation in an interventional procedure includes a memory (520) that stores instructions and a processor (510) that executes the instructions. When executed by the processor (510), the instructions cause the controller to implement a process that includes registering (S260) coordinate systems of an interventional medical instrument (450) and an intraoperative image of a lung, and calculating (S260) a deformation between the lung in the deflated state and the lung in the inflated state. The process also includes applying (S260) the deformation to a three-dimensional original model of the lung in the inflated state to generate a modified three-dimensional model of the lung in the deflated state, and generating (S280) an image of the modified three-dimensional model of the lung in the deflated state.
PCT/EP2020/064177 2019-05-22 2020-05-20 Intraoperative imaging-based surgical navigation WO2020234409A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962851174P 2019-05-22 2019-05-22
US62/851,174 2019-05-22

Publications (1)

Publication Number Publication Date
WO2020234409A1 (fr) 2020-11-26

Family

ID: 70802864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/064177 WO2020234409A1 (fr) Intraoperative imaging-based surgical navigation

Country Status (1)

Country Link
WO (1) WO2020234409A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140073907A1 (en) * 2012-09-12 2014-03-13 Convergent Life Sciences, Inc. System and method for image guided medical procedures
US20180161102A1 (en) * 2014-10-30 2018-06-14 Edda Technology, Inc. Method and system for estimating a deflated lung shape for video assisted thoracic surgery in augmented and mixed reality
WO2016178690A1 (fr) * 2015-05-07 2016-11-10 Siemens Aktiengesellschaft Système et procédé de guidage d'interventions chirurgicales laparoscopiques par l'intermédiaire d'une augmentation du modèle anatomique

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Uneri, A. et al., "Deformable registration of the inflated and deflated lung in cone-beam CT-guided thoracic surgery: Initial investigation of a combined model- and image-driven approach", Medical Physics, vol. 40, no. 1, 18 December 2012, pages 17501-1 to 17501-8, XP012170938, ISSN 0094-2405, DOI: 10.1118/1.4767757 *

Similar Documents

Publication Publication Date Title
US11717376B2 (en) System and method for dynamic validation, correction of registration misalignment for surgical navigation between the real and virtual images
Luo et al. Augmented reality navigation for liver resection with a stereoscopic laparoscope
US20220156925A1 (en) Dynamic interventional three-dimensional model deformation
US11564748B2 (en) Registration of a surgical image acquisition device using contour signatures
US10074176B2 (en) Method, system and apparatus for displaying surgical engagement paths
Bertolo et al. Systematic review of augmented reality in urological interventions: the evidences of an impact on surgical outcomes are yet to come
US10074177B2 (en) Method, system and apparatus for quantitative surgical image registration
US20130245461A1 (en) Visualization of Anatomical Data by Augmented Reality
WO2023246521A1 (fr) Method, apparatus and electronic device for lesion localization based on mixed reality
EP3398552A1 (fr) Control of a medical image viewer from the surgeon's camera
Liu et al. Toward intraoperative image-guided transoral robotic surgery
US20210290309A1 (en) Method, system and apparatus for surface rendering using medical imaging data
US20230172574A1 (en) System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
Liu et al. Intraoperative image‐guided transoral robotic surgery: pre‐clinical studies
US20240041558A1 (en) Video-guided placement of surgical instrumentation
Alam et al. A review on extrinsic registration methods for medical images
US20140275994A1 (en) Real time image guidance system
CN114828767A (zh) Dynamic tissue image update
WO2020234409A1 (fr) Intraoperative imaging-based surgical navigation
US10102681B2 (en) Method, system and apparatus for adjusting image data to compensate for modality-induced distortion
Chen et al. Video-guided calibration of an augmented reality mobile C-arm
Wang et al. Stereoscopic augmented reality for single camera endoscopy: a virtual study
Noblet et al. Registration of 2D monocular endoscopy to 3D CBCT for video-assisted thoracoscopic surgery
Kersten-Oertel et al. 20 Augmented Reality for Image-Guided Surgery
Mirota Video-based navigation with application to endoscopic skull base surgery

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20727624

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 20727624

Country of ref document: EP

Kind code of ref document: A1