WO2006067676A2 - Visualization of a tracked interventional device - Google Patents

Visualization of a tracked interventional device

Info

Publication number
WO2006067676A2
Authority
WO
WIPO (PCT)
Prior art keywords
interventional device
planes
bounding
interventional
orientation
Prior art date
Application number
PCT/IB2005/054212
Other languages
English (en)
Other versions
WO2006067676A3 (fr)
Inventor
Eltjo H. Haselhoff
Guillaume R. P. Thelissen
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2006067676A2 publication Critical patent/WO2006067676A2/fr
Publication of WO2006067676A3 publication Critical patent/WO2006067676A3/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B 5/066 Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or X-ray imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/7425 Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/12 Arrangements for detecting or locating foreign bodies
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/41 Medical

Definitions

  • The invention relates to a method of visualizing a tracked interventional device in the field of interventional procedures, which includes the steps of acquiring images from a region of interest of an object and displaying multiple intersecting planes whose intersection identifies the position of the device.
  • Typically, one or more planes having a fixed relationship to the position of the tracked device are used for this purpose.
  • One known method incorporates three orthogonal slices that intersect at the tip of an interventional catheter.
  • The known method relates to the field of interventional magnetic resonance (MR) imaging, and specifically to image-guided interventional procedures performed inside an open-access MR imaging system.
  • In that method, a three-dimensional MR image data set of the patient's head is acquired and registered to the anatomy of the patient.
  • The location of an interventional device, for example a biopsy needle, is tracked using an optical tracking system.
  • The tracked device is rendered in the three-dimensional view, and two-dimensional slices of data reformatted from the previously acquired three-dimensional data set are displayed.
  • The reformatted slice planes follow the position of the tracked interventional device, sweeping through the volumetric data set.
  • This object is achieved by a method of visualizing the position of an interventional device according to the invention, which is characterized in that the multiple planes identifying the position of the interventional device are displayed as bounding planes, as further explained below.
  • Bounding planes are planes displayed in a multi-dimensional display, where each plane is rendered only up to where it intersects another plane; a minimal geometric sketch of this clipping rule follows at the end of this section.
  • In the known method, by contrast, each intersecting plane is rendered in its entirety, as shown in Fig. 1.
  • This has the disadvantage that the display contains redundant information, which could confuse the operating surgeon. It is an insight of the inventors that by using bounding planes as defined above, only the relevant portions of the data are displayed, thereby increasing the readability of the displayed image.
  • The invention is applicable in a multitude of instances where an interventional device needs to be tracked in or near an object, using a multi-dimensional data set of the object to visualize the position of the interventional device in relation to the object.
  • Suitable tracking methods include techniques based on MR imaging, MR spectroscopy, electromagnetism (EM), ultrasound (U/S), radio frequency (RF), X-ray, computed tomography (CT), etc.
  • Any type of three-dimensional image data set can be used for the visualization, including MRI, CT, X-ray, U/S, etc.
  • The object could be a human patient, with the structure being the patient's internal anatomy.
  • The object could alternatively be a clinical mannequin like the SimMan (Laerdal Medical) or the Human Patient Simulator (Medical Education Technologies), and the structure could be the plumbing or tubing inside the mannequin.
  • As the interventional device is brought into the vicinity of a region of interest of the object, for example a part of the patient's anatomy like the head, the interventional device is tracked and rendered on a display device, along with a three-dimensional view of the entire object, for example the head.
  • A typical view of the object could be one as viewed along the axis of the interventional device, or along the orientation determined by the tracked parts or points on the interventional device, though other views are equally plausible.
  • According to the invention, multiple bounding planes that contain the tracked part or parts are simultaneously rendered.
  • A typical example might include rendering three orthogonal bounding planes, for example the coronal, axial and sagittal (CAS) planes, with respect to the patient's anatomy.
  • The intersection of the rendered bounding planes indicates the position of at least one of the tracked parts.
  • The invention can thus be used to verify the position of the interventional device with respect to the object's structure.
  • One or more parts of the device may be tracked.
  • One such commonly tracked part is the tip of a biopsy needle.
  • Other tracked parts include tips and other points along the length of a catheter, tips of needles used for ablation, parts of endoscopic devices, etc.
  • Typically, a three-dimensional image data set is acquired prior to the start of the interventional procedure.
  • In one embodiment, the data are fit into a generic "hollow shell" model of the head such that nothing is displayed beyond the boundaries of the shell model.
  • Alternatively, a more realistic head model created from the acquired data set using volume-rendering techniques may be used. If the head is fixed by some means, for example a stereotactic frame, and is not moved between the image acquisition and the interventional procedure, then the acquired image data will correspond to the anatomy. In such and similar cases where there is no movement of the head, no further preparation of the image data set may be required. However, where movement may occur, some form of hardware or software registration may need to be performed so that the acquired image data set correlates with the patient's anatomy; a sketch of a simple point-based software registration follows at the end of this section.
  • Examples of hardware registration include physically adjusting the patient's position, physically adjusting hardware settings and/or positions, adjusting acquisition parameters, etc., depending on the acquisition method being used.
  • A typical rendering using the bounding planes would depict a solid sector cutaway of the patient's head, with its origin or intersection point corresponding to the tip of the interventional device, as shown, for example, in Fig. 3.
  • The intersection of the bounding planes preferably identifies the position of the interventional device within the anatomy of the head.
  • As the device moves, corresponding bounding planes extracted from the acquired data set are preferably automatically and periodically repositioned on the multi-dimensional display such that their intersection identifies the position of the interventional device.
  • In this way, the intersection of the planes will always indicate the position of the tracked part or parts of the interventional device with respect to the patient's anatomy.
  • Movement of the interventional device may result in an apparent opening or closing of the solid sector cutaway.
  • The orientation of the bounding planes can be determined by the orientation of the interventional device; a sketch of deriving such a device-aligned frame follows at the end of this section.
  • A preferred orientation would be a view where the interventional device subtends an adjustable acute angle with each of the three bounding planes used to identify its position.
  • Another preferred orientation would be a view where the interventional device subtends an adjustable obtuse angle with one or more of the bounding planes.
  • Alternatively, the bounding planes could be arranged to visualize the anatomy at locations neighboring the position of the interventional device.
  • For example, the bounding planes could display a location slightly ahead of, behind, or to the side of the current position of the interventional device, thereby giving the operator a quick view of the anatomy neighboring the area of interest.
  • Such planes are henceforth referred to as pilot planes. Pilot planes are useful in confirming the trajectory of the interventional device during an interventional procedure, for example in a situation where some structures have shifted due to gross patient movement, or where intra-cranial structures have moved, for example due to release of pressure after the skull was opened. A sketch of computing a pilot-plane intersection point follows at the end of this section.
  • The use of pilot planes to visualize locations neighboring the position of the interventional device could be controlled by the operator, such as a surgeon or an interventional radiologist, using any of an array of known techniques like foot pedals, voice-activated devices, etc. It may also be done manually by another person at the instruction of the operator.
  • The orientation of the pilot planes can be determined by the orientation of the interventional device, or independently by the operator.
  • The original bounding planes identifying the position of the interventional device preferably need not be disturbed to visualize neighboring anatomy. Instead, an additional set of bounding planes could be used as the pilot planes, preferably under the control of an operator, thereby giving the operator the ability to simultaneously visualize both the current position and a possible future position of the interventional device.
  • The invention further relates to a system as defined in Claim 8, which is arranged to acquire and display data according to the invention.
  • The computer program in accordance with the invention is defined in Claim 9.
  • The computer program in accordance with the invention can be loaded into the working memory of the system claimed in Claim 8.
  • The computer program may be available on a data carrier, for example on CD-ROM or DVD-ROM discs; it is also possible to download the computer program from a network, such as the World Wide Web.
  • The system is also arranged to prepare the acquired images for multi-dimensional display, and to display multiple bounding planes that intersect such that their intersection identifies the position of an interventional device with respect to the structure of an object.
  • Generally speaking, such systems are provided with means to acquire images from a region of interest of an object, with means to process the acquired images, and with means to display them on a viewer, such as a computer monitor.
  • Fig. 1 shows planes rendered in their entirety according to the prior art.
  • Fig. 2 shows a block diagram of a system set up according to the invention, to (1) acquire images from a region of interest of an object and prepare them for multidimensional display on a display device, (2) track an interventional device, and (3) display multiple bounding planes that intersect, where the intersection identifies the position of the interventional device.
  • Fig. 3 shows a preferred embodiment of the invention, where the interventional device subtends adjustable acute angles with each of the three bounding planes, and where the intersection point of the three bounding planes corresponds to the tip of the interventional device.
  • Fig. 4 shows a preferred embodiment of the invention, where the interventional device subtends an adjustable obtuse angle with one bounding plane and adjustable acute angles with the other two bounding planes, and where the intersection point of the three bounding planes corresponds to the tip of the interventional device.
  • Fig. 2 is a diagrammatic representation of an interventional radiology suite in a healthcare institution that is set up to operate according to the invention.
  • The figure shows a plurality of image acquisition systems 10 whereby the image information of a patient to be examined is acquired.
  • The images so acquired can be in the form of multiple slices or of a three-dimensional volume.
  • Shown in particular are an X-ray CT system, a U/S system and an MR imaging system.
  • Each of the imaging means is connected to a data processor 40, for example a computer.
  • The data processor is programmed to carry out the method according to the invention to prepare the images for multi-dimensional display.
  • Fig. 2 also shows a plurality of interventional devices 20 that could be used to perform a variety of interventional procedures on the patient.
  • Fig. 2 further shows a plurality of tracking means 30 to track the interventional device. Shown in particular are an MR imaging system, an X-ray system and a U/S system. Depending on the tracking means 30, the interventional device 20 is modified to enable it to be tracked using the tracking method.
  • In operation, a three-dimensional image of the area of interest of the patient 50 is acquired using an image acquisition means 10.
  • The data processor 40 prepares the image data for multi-dimensional display and displays it on the display means 60.
  • The tracking means 30 transmits the position of the interventional device 20 to the data processor 40.
  • The display device 60 displays the location and orientation of the interventional device 20 in relation to the patient 50.
  • The multi-dimensional display is updated periodically and automatically to indicate the new position of the interventional device 20 on the display means 60; a sketch of this update cycle follows at the end of this section.
  • The appropriate bounding planes are calculated by the data processor 40 and displayed on the display means 60, such that the intersection of the bounding planes indicates the tracked portion of the interventional device 20.
  • Fig. 3 shows a preferred embodiment of the invention, where the interventional device 20 subtends an adjustable acute angle with each of the three bounding planes.
  • The intersection point of the three bounding planes follows the position of the tracked portion of the interventional device 20, specifically the tip of the interventional device 20 in this figure. It may be noted that as the interventional device 20 is inserted further into the head along the direction of its long axis, new bounding planes would be displayed such that their intersection indicated the position of the tip of the interventional device deeper within the anatomy. This would result in an apparent opening of the solid sector cutaway. If the interventional device 20 were moved in a direction other than along its long axis, the new bounding planes displayed would result in an apparent translation of the bounding planes.
  • Fig. 4 shows a preferred embodiment of the invention, where the interventional device 20 subtends an adjustable obtuse angle with one of the bounding planes, and adjustable acute angles with the other two bounding planes.
  • The intersection point of the three bounding planes indicates the tracked portion of the interventional device 20, specifically its tip in this figure.
  • Accordingly, the image in Fig. 4 shows a larger (or more open) solid sector cutaway compared to Fig. 3. It may be noted that the view in Fig. 4 shows the anterior-left cutaway of the head, though other views, like anterior-right (as shown in Fig. 3), posterior-right, or combinations of oblique planes, are equally possible.
  • As the interventional device 20 is moved, new bounding planes would be displayed such that their intersection point still indicated the position of the tip of the interventional device 20. This would result in an apparent translation of the solid sector cutaway in the direction of travel of the interventional device. Depending on the direction of movement, such apparent translation may occur in combination with an apparent opening or closing of the solid sector cutaway. Movement of the interventional device may also result in an apparent opening or closing of the solid sector cutaway without any apparent translation at all.
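
A minimal sketch, in Python with NumPy, of the bounding-plane clipping rule described above: each of the three orthogonal planes through the tracked tip is rendered only from the tip out to the volume boundary on the cut side, rather than in its entirety. The array vol, the tip coordinates and the octant convention are illustrative assumptions, not part of the patent.

    import numpy as np

    def bounded_plane_slices(vol, tip, octant=(1, 1, 1)):
        # Bounding-plane rule: each orthogonal plane through the tracked tip
        # is clipped at the tip and extends only into the cut octant, so no
        # plane is rendered in its entirety (contrast Fig. 1).
        slices = []
        for axis in range(3):
            idx = []
            for a in range(3):
                if a == axis:
                    idx.append(tip[a])                       # plane fixed at the tip
                elif octant[a] > 0:
                    idx.append(slice(tip[a], vol.shape[a]))  # tip -> upper bound
                else:
                    idx.append(slice(0, tip[a] + 1))         # lower bound -> tip
            slices.append(vol[tuple(idx)])
        return slices

    # Example: three clipped orthogonal patches meeting at a hypothetical tip.
    vol = np.zeros((256, 256, 180))      # stand-in for an acquired 3-D data set
    patches = bounded_plane_slices(vol, (120, 90, 64))

Moving the tip and recomputing the patches reproduces the apparent opening, closing and translation of the solid sector cutaway described for Figs. 3 and 4.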
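Where the description mentions software registration, one common approach (not specified by the patent) is a rigid point-based fit between fiducial positions in image space and in patient or tracker space. A sketch using the Kabsch/Procrustes method, with the paired fiducial arrays as assumed inputs:

    import numpy as np

    def rigid_register(src, dst):
        # Least-squares rigid transform (R, t) with dst ~= src @ R.T + t,
        # fitted from paired (N, 3) fiducial point sets.
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        H = src_c.T @ dst_c                        # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst.mean(axis=0) - R @ src.mean(axis=0)
        return R, t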
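For bounding planes whose orientation follows the device rather than the scanner axes, an orthonormal frame can be derived from the tracked long axis, and the three planes then take the frame vectors as normals. A sketch under that assumption; the helper vector is an arbitrary choice:

    import numpy as np

    def frame_from_device_axis(axis_vec):
        # Orthonormal frame (n1, n2, n3) built around the device's long axis;
        # tilting this frame by a user-set angle would yield the adjustable
        # acute or obtuse configurations of Figs. 3 and 4.
        n1 = np.asarray(axis_vec, dtype=float)
        n1 /= np.linalg.norm(n1)
        helper = np.array([1.0, 0.0, 0.0])
        if abs(n1 @ helper) > 0.9:                 # avoid a near-parallel helper
            helper = np.array([0.0, 1.0, 0.0])
        n2 = np.cross(n1, helper)
        n2 /= np.linalg.norm(n2)
        n3 = np.cross(n1, n2)
        return n1, n2, n3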
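A pilot-plane set differs from the primary bounding planes only in its intersection point, which is offset along the device axis to look slightly ahead of or behind the tip. A sketch, with the millimetre offset and voxel size as assumed parameters:

    import numpy as np

    def pilot_point(tip, axis_vec, offset_mm, voxel_size_mm=1.0):
        # Intersection point for pilot planes: a positive offset looks ahead
        # of the tip along the device's long axis, a negative offset behind.
        d = np.asarray(axis_vec, dtype=float)
        d /= np.linalg.norm(d)
        return np.asarray(tip, dtype=float) + (offset_mm / voxel_size_mm) * d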
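The periodic update cycle of Fig. 2 can be summarized as a polling loop: read the device pose from the tracking means 30, recompute the bounding planes at the tip, and refresh the display means 60. In this sketch, tracker and display are hypothetical interfaces standing in for items 30 and 60, and bounded_plane_slices is the earlier sketch:

    import time

    def navigation_loop(tracker, volume, display, period_s=0.1):
        # Poll the tracker, recompute the bounding planes at the tracked tip,
        # and refresh the display; repeats periodically and automatically.
        while display.is_open():
            tip, axis_vec = tracker.read_pose()           # tracked tip + long axis
            patches = bounded_plane_slices(volume, tip)   # clipped planes at tip
            display.show_bounding_planes(patches, tip, axis_vec)
            time.sleep(period_s)                          # periodic refresh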

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Epidemiology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to a method of visualizing the position of an interventional device, which comprises acquiring images of a region of interest of an object and displaying multiple planes that intersect, the intersection identifying the position of the interventional device. The multiple planes identifying the position of the interventional device are displayed as bounding planes. Bounding planes, as used in the invention, are planes displayed in a multi-dimensional display, each plane being rendered only up to its intersection with another plane. Only the relevant portions of the data are thus displayed, which increases the readability of the displayed image.
PCT/IB2005/054212 2004-12-20 2005-12-13 Visualization of a tracked interventional device WO2006067676A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP04106740 2004-12-20
EP04106740.6 2004-12-20

Publications (2)

Publication Number Publication Date
WO2006067676A2 (fr) 2006-06-29
WO2006067676A3 WO2006067676A3 (fr) 2007-04-05

Family

ID=36602140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2005/054212 WO2006067676A2 (fr) Visualization of a tracked interventional device

Country Status (1)

Country Link
WO (1) WO2006067676A2 (fr)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999000052A1 (fr) * 1997-06-27 1999-01-07 The Board Of Trustees Of The Leland Stanford Junior University Method and device for generating three-dimensional images for 'navigation' purposes

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yamashita, J. et al., "Real Time 3D Model Based Navigation System for Endoscopic Paranasal Sinus Surgery", IEEE Transactions on Biomedical Engineering, vol. 46, no. 1, January 1999, pages 107-116, XP002278967, ISSN 0018-9294 *
Zamorano, L. et al., "Computer-Assisted Neurosurgery System: Wayne State University Hardware and Software Configuration", Computerized Medical Imaging and Graphics, vol. 18, no. 4, July 1994, pages 257-271, XP009001301, ISSN 0895-6111 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090137907A1 (en) * 2007-11-22 2009-05-28 Kabushiki Kaisha Toshiba Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method
US10881375B2 (en) 2007-11-22 2021-01-05 Canon Medical Systems Corporation Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method
DE102008004468A1 (de) * 2008-01-15 2009-07-23 Siemens Aktiengesellschaft Method for monitoring the guidance of an interventional instrument in a stereographic 3D image data set of the examined object
US20090310847A1 (en) * 2008-03-25 2009-12-17 Takeo Matsuzaki Medical image processing apparatus and x-ray diagnosis apparatus
ITGE20080064A1 (it) * 2008-07-24 2010-01-25 Esaote Spa Device and method for guiding surgical tools by ultrasound imaging
EP2147636A1 (fr) 2008-07-24 2010-01-27 Esaote S.p.A. Device and method for guiding surgical tools by ultrasound imaging
US20100022871A1 (en) * 2008-07-24 2010-01-28 Stefano De Beni Device and method for guiding surgical tools
US10492758B2 (en) 2008-07-24 2019-12-03 Esaote, S.P.A. Device and method for guiding surgical tools
WO2010073165A1 (fr) * 2008-12-23 2010-07-01 Koninklijke Philips Electronics, N.V. Automated three-dimensional acoustic imaging for medical procedure guidance

Also Published As

Publication number Publication date
WO2006067676A3 (fr) 2007-04-05

Similar Documents

Publication Publication Date Title
US20200138516A1 (en) Systems and methods for ultrasound image-guided ablation antenna placement
CA2346613C (fr) Method and apparatus for positioning a device in a body
Peters Image-guided surgery: from X-rays to virtual reality
CN1907233B (zh) Device and method for automatically planning a minimally invasive percutaneous surgical access path
JP5632286B2 (ja) MRI surgical system providing real-time visualization using MRI image data and predefined data of surgical tools
EP2222224B1 (fr) Method and system for preoperative percutaneous surgical planning
EP2144568B1 (fr) Targeting method
US8498692B2 (en) Method for displaying a medical implant in an image and a medical imaging system
JP2014522274A (ja) Method and system for overlaying guided brain-stimulation functional data on a live image of the brain
Gering A system for surgical planning and guidance using image fusion and interventional MR
Bao et al. Ultrasound-to-computer-tomography registration for image-guided laparoscopic liver surgery
JP2021526050A (ja) Systems and methods for performing and evaluating a procedure
Carl et al. Preoperative 3-dimensional angiography data and intraoperative real-time vascular data integrated in microscope-based navigation by automatic patient registration applying intraoperative computed tomography
EP2686829B1 (fr) Tracking brain deformation during a neurosurgical operation
WO2006067676A2 (fr) Visualization of a tracked interventional device
Melzer et al. Technology and principles of tomographic image-guided interventions and surgery
CN112236099A (zh) Systems and methods for performing and evaluating a procedure
Adams et al. CAS—a navigation support for surgery
Ito et al. Magnetically guided 3-dimensional virtual neuronavigation for neuroendoscopic surgery: technique and clinical experience
US20220354579A1 (en) Systems and methods for planning and simulation of minimally invasive therapy
Cameron et al. Virtual-reality-assisted interventional procedures.
Mundeleer et al. Development of a computer assisted system aimed at RFA liver surgery
Scherer et al. New preoperative images, surgical planning, and navigation
Vandermeulen et al. Prototype medical workstation for computer-assisted stereotactic neurosurgery
Bates et al. Implementation of an oblique-sectioning visualization tool for line-of-sight stereotactic neurosurgical navigation using the AVW toolkit

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005826744

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: 2005826744

Country of ref document: EP