WO2015091159A2 - Method and imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device - Google Patents

Method and imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device

Info

Publication number
WO2015091159A2
WO2015091159A2 (PCT/EP2014/077218, EP2014077218W)
Authority
WO
WIPO (PCT)
Prior art keywords
interventional device
ray
optical
interventional
imaging data
Prior art date
Application number
PCT/EP2014/077218
Other languages
English (en)
Other versions
WO2015091159A3 (fr)
Inventor
Petrus Johannes WITHAGEN
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015091159A2
Publication of WO2015091159A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical

Definitions

  • The invention relates to an imaging system and a method for combined X-ray and optical three-dimensional modeling of at least one interventional device.
  • US 2012/0057671 A1 describes a system and a method for monitoring a guided intervention device, including determining a position of an intervention device inside a subject using a radiation source to image the intervention device.
  • A circular acquisition is performed to update the position of the intervention device, wherein the acquisition includes skipping view angles by turning off a radiation source at given angular positions.
  • A model of the intervention device is generated to provide a virtual image of the intervention device against a background of the subject. Movement of the intervention device according to the described method is modeled during the skipped view angles to provide substantially real-time tracking of the intervention device.
  • WO 2009/003664 A1 describes a system and a method for simulating a manual interventional operation by a user on an object with a surgical instrument.
  • The described system comprises a tool for simulating the surgical instrument, comprising a manual stick supporting at least two spherical markers rigidly connected to each other, at least one of them comprising a pattern on its surface; a box with at least one aperture comprising a working volume reachable with the tool through said aperture; means for capturing the position and axial orientation of said markers within said working volume and deducing from them the movements, the 3D position and the 3D orientation of the tool when operated by the user within said working volume; means for visualizing a 3D model of the surgical instrument simulated by the tool in motion inside said working volume; means for simulating an image corresponding to said object; and means for simulating the action of said surgical instrument.
  • US 2013/0216025 A1 describes a system and a method for adaptive imaging including a shape sensing system coupled to an interventional device to measure spatial characteristics of the interventional device in a subject.
  • An image module of the described system is configured to receive the spatial characteristics and generate one or more control signals in accordance with the spatial characteristics.
  • An imaging device of the described system is configured to image the subject in accordance with the control signals.
  • An aspect of the present invention relates to a method for combined X-ray and optical three-dimensional modeling of at least one interventional device, comprising the steps of: acquiring X-ray imaging data of the at least one interventional device in a first scan area by means of an X-ray detector of an X-ray imaging module; acquiring optical imaging data of the at least one interventional device in a second scan area by means of at least one optical camera of an optical imaging module; and calculating a three-dimensional model of the at least one interventional device based on the acquired X-ray imaging data and the acquired optical imaging data by means of a 3D-processing module.
  • A further aspect of the invention relates to an imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device, the imaging system comprising: an X-ray imaging module comprising an X-ray detector, which is configured to acquire X-ray imaging data of the at least one interventional device in a first scan area; an optical imaging module comprising at least one optical camera, which is configured to acquire optical imaging data of the at least one interventional device in a second scan area; a 3D-processing module, which is configured to calculate a three-dimensional model of the at least one interventional device based on the acquired X-ray imaging data and the acquired optical imaging data and which is configured to plan an interventional procedure by using the three-dimensional model of the at least one interventional device; and a display module, which is adapted to display the three-dimensional model of the at least one interventional device.
  • In such a situation the second phase cannot be performed: the object needs to be placed without guidance, and the user has to position it based only on the planning image.
  • The user advantageously has a three-dimensional image of the planning with the interventional device positioned at the right location and orientation.
  • The present invention advantageously allows the user to plan his task of positioning the actual device at the planned position and/or in the planned orientation.
  • The user has the visual image from his eyes, e.g. during open surgery, or three-dimensional X-ray images, e.g. during minimally invasive procedures.
  • The present invention provides high-resolution optical cameras integrated in a separate housing or integrated in the medical system close to the patient and registered to the X-ray system.
  • The present invention further provides a three-dimensional model of the patient, a three-dimensional model of the device, e.g. delivered by the manufacturer or created with the optical system itself, and methods to recognize the position of the device from the images and to overlay the three-dimensional model of the device at its current position and/or orientation over the three-dimensional scan of the patient.
  • The optical camera can be used to optically track the interventional device and to update in real time the position of the associated three-dimensional model, overlaid on the images used for planning the interventional procedure. This approach simplifies the planning and the positioning of the interventional device at the planned position.
  • Because the camera is fixed to and/or geometrically registered with the X-ray detector, no position calibration or geometric registration of the optical camera is necessary, apart from the self-registration or auto-calibration of the X-ray computed tomography or medical X-ray imaging system. Also, the position of the camera can be changed or modified during the intervention or interventional procedure without losing the tracking of the interventional device or having to re-calibrate the system. This approach simplifies the usage of the imaging system (see the transform-chaining sketch after this list).
  • The method further comprises the step of planning an interventional procedure by using the three-dimensional model of the at least one interventional device.
  • The three-dimensional model of the at least one interventional device is calculated based on three-dimensional data of the interventional device.
  • The method further comprises the step of virtually positioning the three-dimensional model of the at least one interventional device for planning the interventional procedure.
  • The method further comprises the step of acquiring X-ray imaging data of a patient undergoing an interventional procedure by means of the at least one X-ray detector of the X-ray imaging module.
  • The method further comprises the step of acquiring optical imaging data of a patient undergoing an interventional procedure by means of the at least one optical camera of the optical imaging module.
  • The method further comprises the step of acquiring imaging data of a needle, of a surgical instrument, of an implant or of any other interventional device, wherein the imaging data relates to X-ray imaging data or to optical imaging data.
  • The method further comprises the step of positioning the three-dimensional model of the at least one interventional device using real-time guidance and feedback from a tracking system, which tracks the at least one interventional device during the interventional procedure, generating a tracked position of the at least one interventional device.
  • The method further comprises the step of comparing the tracked position of the at least one interventional device with a planned position of the at least one interventional device during the interventional procedure (a pose-comparison sketch is given after this list).
  • The at least one optical camera is geometrically registered to the X-ray detector.
  • The 3D-processing module is configured to calculate the three-dimensional model of the at least one interventional device based on three-dimensional data of the interventional device.
  • Figure 1 shows a schematic diagram of an imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • Figure 2 shows a schematic diagram of an imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • Figure 3 shows a flowchart diagram of a method for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • Figure 1 shows a schematic diagram of an imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • Figure 1 shows an optical imaging module 20 for an imaging system 100, the optical imaging module 20 comprising at least one optical camera 21, which is configured to acquire optical imaging data of the at least one interventional device and which is configured to be coupled to an X-ray detector 11 of the imaging system 100.
  • The at least one optical camera 21 may be integrated into the X-ray detector 11; in other words, the at least one optical camera 21 and the X-ray detector 11 may share a common housing. Alternatively, the at least one optical camera 21 may be mounted in a separate free-standing mount or in a separate housing.
  • Figure 2 shows a schematic diagram of an imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • An imaging system 100 for combined X-ray and optical three-dimensional modeling of at least one interventional device 50 is shown in Figure 2. The imaging system 100 may comprise an X-ray imaging module 10, an optical imaging module 20, a 3D-processing module 30, and a display module 40.
  • The X-ray imaging module 10 may comprise an X-ray detector 11, which is configured to acquire X-ray imaging data of the at least one interventional device in a first scan area FS1.
  • The optical imaging module 20 may comprise at least one optical camera 21, which is configured to acquire optical imaging data of the at least one interventional device in a second scan area FS2.
  • The 3D-processing module 30 may be configured to calculate a three-dimensional model of the at least one interventional device 50 based on the acquired X-ray imaging data and the acquired optical imaging data, and to plan an interventional procedure by using the three-dimensional model or data representation of the at least one interventional device 50.
  • The display module 40 may be adapted to display the three-dimensional model of the at least one interventional device 50.
  • The X-ray imaging module 10, the optical imaging module 20, the 3D-processing module 30, and the display module 40 may be coupled by a network or by any other communication system that transfers data between these components.
  • The X-ray detector 11 may be an image detector, possibly comprising individual detector elements angled towards one another with regard to the X-ray source 70 or the focal spot of an X-ray generating device.
  • The at least one optical camera 21 may be a digital camera, i.e. a camera that encodes images and videos digitally.
  • The 3D-processing module 30 may be a digital signal processor (DSP), which is a specialized microprocessor with an architecture optimized for the operational needs of digital signal processing, or a field-programmable gate array (FPGA), which is an integrated circuit designed to be configured by a customer or a designer after manufacturing.
  • The 3D-processing module 30 may comprise a processor which is configured to perform the steps of the method for combined X-ray and optical three-dimensional modeling of at least one interventional device 50.
  • The display module 40 may be a holographic display, a liquid-crystal display (LCD), a flat-panel display, an electronic visual display, or a video display.
  • The interventional device 50 may be a needle, a surgical instrument, or an implant.
  • The imaging system 100 may further comprise an operating table 60 and an X-ray source 70.
  • The operating table 60 may be adapted for a patient P undergoing an interventional procedure, which is monitored by the imaging system 100.
  • Figure 3 shows a flowchart diagram of a method for combined X-ray and optical three-dimensional modeling of at least one interventional device according to an exemplary embodiment of the invention.
  • The method for combined X-ray and optical three-dimensional modeling of at least one interventional device may comprise the following steps:
  • Acquiring S1 X-ray imaging data of the at least one interventional device in a first scan area FS1 by means of an X-ray detector 11 of an X-ray imaging module 10 is performed.
  • Acquiring S2 optical imaging data of the at least one interventional device in a second scan area FS2 by means of at least one optical camera 21 of an optical imaging module 20 is conducted.
  • Calculating S3 a three-dimensional model of the at least one interventional device based on the acquired X-ray imaging data and the acquired optical imaging data by means of a 3D-processing module 30 is performed.
  • The steps of the method for combined X-ray and optical three-dimensional modeling of at least one interventional device may be performed by recursive or iterative repetition, in reverse order, or simultaneously (a minimal pipeline sketch of steps S1 to S3 is given after this list).
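
The following minimal Python sketch illustrates how the three claimed steps could fit together: a device pose obtained from the X-ray imaging data (S1) and one obtained from the optical imaging data (S2) are fused into a single rigid pose for the three-dimensional device model (S3). The publication does not prescribe a particular fusion algorithm; the weighted blend, the data-class layout and all function names below are illustrative assumptions only.

```python
"""Minimal sketch (not taken from the publication) of the S1-S3 pipeline:
acquire X-ray and optical data of an interventional device and fuse them
into one rigid pose for the 3D device model.  All names are illustrative."""

from dataclasses import dataclass
import numpy as np


@dataclass
class DevicePose:
    """Rigid pose of the device model: rotation R (3x3) and translation t (3,)."""
    R: np.ndarray
    t: np.ndarray


def acquire_xray_pose() -> DevicePose:
    # S1: stand-in for pose estimation from X-ray imaging data in scan area FS1.
    return DevicePose(R=np.eye(3), t=np.array([10.0, 0.0, 50.0]))


def acquire_optical_pose() -> DevicePose:
    # S2: stand-in for pose estimation from the optical camera in scan area FS2,
    # already expressed in the X-ray coordinate frame (see the next sketch).
    return DevicePose(R=np.eye(3), t=np.array([10.5, -0.2, 49.6]))


def fuse_poses(xray: DevicePose, optical: DevicePose, w_optical: float = 0.7) -> DevicePose:
    # S3: naive fusion - average the translations, blend the rotation matrices,
    # then re-orthonormalise the blend via SVD so it remains a valid rotation.
    t = (1.0 - w_optical) * xray.t + w_optical * optical.t
    R_blend = (1.0 - w_optical) * xray.R + w_optical * optical.R
    U, _, Vt = np.linalg.svd(R_blend)
    return DevicePose(R=U @ Vt, t=t)


if __name__ == "__main__":
    model_pose = fuse_poses(acquire_xray_pose(), acquire_optical_pose())
    print("Fused device position:", model_pose.t)
```

In a real system the two acquire stubs would be replaced by device detection in the X-ray projections and in the camera images, both expressed in a common coordinate frame.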
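
The remark above, that a camera fixed to and geometrically registered with the X-ray detector needs no separate calibration, can be made concrete with homogeneous transforms: one constant camera-to-detector transform, chained with the detector pose reported by the system geometry, maps any optically tracked device pose into the X-ray (world) frame, even after the detector moves. The sketch below assumes 4x4 homogeneous matrices; all numeric values are illustrative.

```python
"""Sketch of why a camera fixed to the X-ray detector needs no separate
calibration: with one constant camera-to-detector transform, any detector
motion known to the system also moves the camera, so optically tracked
device poses can always be expressed in the X-ray/world frame."""

import numpy as np


def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


def rot_z(angle_deg: float) -> np.ndarray:
    a = np.deg2rad(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])


# Constant mounting transform: camera pose expressed in the detector frame
# (fixed once the camera is mounted on or in the detector housing).
T_detector_camera = make_transform(np.eye(3), np.array([0.05, 0.0, 0.02]))

# Detector pose in the world (X-ray) frame, known from the system geometry;
# it changes whenever the C-arm or gantry moves.
T_world_detector = make_transform(rot_z(30.0), np.array([0.0, 0.0, 1.2]))

# Device pose measured by the optical camera, in camera coordinates.
T_camera_device = make_transform(np.eye(3), np.array([0.10, -0.05, 0.60]))

# Chaining the transforms places the device in the world frame without any
# separate calibration of the camera itself.
T_world_device = T_world_detector @ T_detector_camera @ T_camera_device
print("Device position in X-ray/world frame:", T_world_device[:3, 3])
```

Because the camera-to-detector transform is fixed by the mounting, moving the detector only changes the detector pose, which the imaging system already knows, so tracking continues without re-calibration.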
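
For the step of comparing the tracked position of the interventional device with its planned position, one simple deviation measure is the translational offset together with the angle of the relative rotation between the two poses. The function below is a generic sketch of such a comparison, not a method specified in the publication; the example poses are assumed values.

```python
"""Sketch of comparing the tracked device pose with the planned pose:
report the translational offset and the residual rotation angle."""

import numpy as np


def pose_deviation(R_planned, t_planned, R_tracked, t_tracked):
    """Return (distance, angle_deg) between planned and tracked rigid poses."""
    distance = float(np.linalg.norm(t_tracked - t_planned))
    # Relative rotation between the two orientations.
    R_rel = R_planned.T @ R_tracked
    # Rotation angle from the trace, clipped for numerical safety.
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    angle_deg = float(np.degrees(np.arccos(cos_angle)))
    return distance, angle_deg


if __name__ == "__main__":
    R_plan, t_plan = np.eye(3), np.array([12.0, 5.0, 40.0])
    # Tracked pose: slightly rotated about z and offset from the plan.
    a = np.deg2rad(2.0)
    R_trk = np.array([[np.cos(a), -np.sin(a), 0.0],
                      [np.sin(a),  np.cos(a), 0.0],
                      [0.0,        0.0,       1.0]])
    t_trk = np.array([12.4, 5.1, 39.7])

    dist, ang = pose_deviation(R_plan, t_plan, R_trk, t_trk)
    print(f"Offset from plan: {dist:.2f} (length units), {ang:.2f} deg")
```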

Abstract

The invention relates to a system and a method for combined X-ray and optical three-dimensional modeling of at least one interventional device (50), comprising the steps of: acquiring (S1) X-ray imaging data of the at least one interventional device in a first scan area (FS1) by means of an X-ray detector (11) of an X-ray imaging module; acquiring (S2) optical imaging data of the at least one interventional device in a second scan area (FS2) by means of at least one optical camera (21) of an optical imaging module (20); and calculating (S3) a three-dimensional model of the at least one interventional device based on the acquired X-ray imaging data and the acquired optical imaging data by means of a 3D-processing module (30).
PCT/EP2014/077218 2013-12-18 2014-12-10 Method and imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device WO2015091159A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP13198040.1 2013-12-18
EP13198040 2013-12-18

Publications (2)

Publication Number Publication Date
WO2015091159A2 true WO2015091159A2 (fr) 2015-06-25
WO2015091159A3 WO2015091159A3 (fr) 2015-08-20

Family

ID=49841550

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/077218 WO2015091159A2 (fr) 2013-12-18 2014-12-10 Method and imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device

Country Status (1)

Country Link
WO (1) WO2015091159A2 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009003664A1 (fr) 2007-06-29 2009-01-08 Dies Srl System for simulating a manual interventional operation
US20120057671A1 (en) 2009-05-20 2012-03-08 Koninklijke Philips Electronics N.V. Data acquisition and visualization mode for low dose intervention guidance in computed tomography
US20130216025A1 (en) 2010-10-27 2013-08-22 Koninklijke Philips Electronics N.V. Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017082449A1 (fr) * 2015-11-13 2017-05-18 한국전기연구원 Three-dimensional image generating method and system using multi-energy X-ray imaging and an optical image
US10244999B2 (en) 2015-11-13 2019-04-02 Korea Electrotechnology Research Institute Three-dimensional image generating method and system using multi-energy X-ray image and optical image

Also Published As

Publication number Publication date
WO2015091159A3 (fr) 2015-08-20

Similar Documents

Publication Publication Date Title
Gavaghan et al. A portable image overlay projection device for computer-aided open liver surgery
JP7204663B2 (ja) System, device, and method for improving surgical accuracy using an inertial measurement unit
JP2020511239A (ja) System and method for augmented reality display in navigated surgery
US9924914B2 (en) X-ray recording system
US7643862B2 (en) Virtual mouse for use in surgical navigation
EP2950735B1 (fr) Alignment correction based on detection of changes in image data
US20070167701A1 (en) Computer assisted orthopaedic surgery system with light source and associated method
US20070016008A1 (en) Selective gesturing input to a surgical navigation system
US20200352657A1 (en) Operating room remote monitoring
US20210378757A1 (en) Augmented Reality with Medical Imaging
JP2014511731A (ja) Safety in a dynamic 3D healthcare environment
US20220323164A1 (en) Method For Stylus And Hand Gesture Based Image Guided Surgery
WO2020186075A1 (fr) Method for fluoroscopic surgical registration
JP7466541B2 (ja) Positioning of a medical X-ray imaging apparatus
CN111973273A (zh) Surgical navigation system, method, device, and medium based on AR technology
Habert et al. RGBDX: First design and experimental validation of a mirror-based RGBD X-ray imaging system
EP2944284B1 (fr) System for the precision guidance of surgical interventions on a patient
US11576557B2 (en) Method for supporting a user, computer program product, data medium and imaging system
WO2015091159A2 (fr) Method and imaging system for combined X-ray and optical three-dimensional modeling of at least one interventional device
US20240115325A1 (en) Camera tracking system for computer assisted surgery navigation
US11432898B2 (en) Tracing platforms and intra-operative systems and methods using same
Ansari et al. A Hybrid-Layered System for Image-Guided Navigation and Robot-Assisted Spine Surgeries
JP2024056663A (ja) Camera tracking system for computer-assisted surgical navigation
WO2022008701A1 (fr) Navigated medical imaging
Rodman The fundamentals of... Image-guided surgery

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14809428

Country of ref document: EP

Kind code of ref document: A2