WO2012001550A1 - Method and system for creating a physician-centric coordinate system - Google Patents

Method and system for creating a physician-centric coordinate system

Info

Publication number
WO2012001550A1
Authority
WO
WIPO (PCT)
Prior art keywords
physician
centric
data
image
tracking
Prior art date
Application number
PCT/IB2011/052339
Other languages
English (en)
Inventor
Neil David Glossop
Thomas Shu Yin Tang
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2012001550A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/368 Correlation of different images or relation of image positions in respect to the body: changing the image on a display according to the operator's position

Definitions

  • the present invention generally relates to an image guided intervention involving a graphic icon of an instrument on a pre-operative scan or an intra-operative scan.
  • the present invention specifically relates to an image guided intervention involving a display of the graphic icon of the instrument within the scan from the viewpoint of a physician.
  • During a minimally invasive intervention in which it is not possible for a physician to directly view the tip of an instrument within the body of a patient (e.g., a needle biopsy, a radiofrequency ablation, etc.), the physician normally makes use of frequent imaging of the patient's body to help properly position the instrument. Particularly, the instrument is inserted into the patient's body, and the location and orientation of the instrument are adjusted within the images while the physician is viewing the images.
  • A computer-assisted image-guided intervention ("IGI") superimposes a tracked instrument (e.g., a needle, a guidewire, a catheter, etc.) on a pre-operative scan and/or an intra-operative scan of the patient (e.g., an ultrasound scan, a CT scan, an MRI scan, etc.).
  • the tracked instrument may also be displayed as a graphic icon indicating its position and orientation relative to another tracked instrument or device.
  • an image of the instrument's position and orientation is overlaid on the pre-operative scan and/or the intra-operative scan.
  • the computer monitor may display the physician's movements in real time, highlighting the instrument's location, trajectory or other information relative to the target.
  • IGI systems typically use a device known as a "position sensor" to determine the position and orientation of instruments.
  • Position indicating elements attached to instruments and the position sensor electronics enable the position sensor to track the position and orientation of the position-indicating element and therefore the instrument.
  • Common position sensor technology includes electromagnetic and optical technologies.
  • motions of the instrument may sometimes be displayed in a counterintuitive form. For example, moving a probe to the left might be displayed on the monitor as a motion to the right. If the physician changes position, or moves the display device, the perception may reverse. This is counterintuitive, disorienting and distracting to the physician. Although the physician may eventually "learn" the motions required to produce the correct motions on the screen, doing so can take time and reduce the usability of the system. This learning time varies and can also lead to incorrect instrument placement. The same problem affects rotations as well as translations: for example, a yaw action may result in a pitch action on the screen.
  • the present invention provides systems, devices and methods for assisting the hand-eye coordination of the physician so that manipulations of the instruments concur with the expected action of the graphic icon of the instrument on the display.
  • One form of the present invention is an image-guided system employing an imaging device, an instrument tracking device, a physician-centric display module and a registration device.
  • the imaging device generates imaging data indicative of an image of an anatomical region of a patient within an imaging coordinate system
  • the instrument tracking device generates tracking data indicative of a location (i.e., a position and/or an orientation) of an instrument within a tracking coordinate system
  • the physician-centric display module generates physician-centric data indicative of a physician centric viewpoint within a physician coordinate system.
  • In response to the data, the registration device generates a display illustrating a physician-centric viewpoint of a graphic icon of the instrument relative to the image of the anatomical region as a function of a transformation application of the physician-centric data to a registration of the imaging data and the tracking data.
  • An illustration of the physician-centric viewpoint of the graphic icon of the instrument relative to the image of the anatomical region includes the graphic icon replicating motion of the instrument within the tracking coordinate system from a viewpoint of a physician.
  • FIG. 1 illustrates an exemplary embodiment of an image-guided system as known in the art.
  • FIG. 2 illustrates an exemplary embodiment of a data registration executed by the image-guided system shown in FIG. 1 as known in the art.
  • FIG. 3 illustrates an exemplary embodiment of an image-guided system in accordance with the present invention.
  • FIG. 4 illustrates an exemplary data registration executed by the image-guided surgical system shown in FIG. 3 in accordance with the present invention.
  • FIG. 5 illustrates an exemplary embodiment of a pre-operative physician-centric viewpoint computation in accordance with the present invention.
  • FIG. 6 illustrates an exemplary embodiment of an intra-operative physician-centric viewpoint computation in accordance with the present invention.
  • FIG. 7 illustrates an exemplary embodiment of tracking indicators in accordance with the present invention.
  • FIG. 8 illustrates an exemplary embodiment of tracking patches in accordance with the present invention.
  • FIG. 9 illustrates an exemplary embodiment of a volume orientation display in accordance with the present invention.
  • FIG. 1 illustrates an image-guided system 10 employing an imaging device 20, an instrument tracking device 30, a registration device 40 and a display device 50.
  • imaging device 20 is broadly defined herein as any device structurally configured for generating imaging data indicative of an image of an anatomical region of a patient (e.g., brains, heart, lungs, abdomen, etc.) within an imaging coordinate system, such as, for example, a generation of imaging data ("ID") indicative of image 23 of an anatomical region of a patient within an imaging coordinate system 22 as shown in FIG. 2.
  • Examples of imaging device 20 include, but are not limited to, any known type of magnetic resonance imaging device, any known type of X-ray imaging device, any known type of ultrasound imaging device and any known type of computed tomography imaging device.
  • instrument tracking device 30 is broadly defined herein as any device structurally configured for generating tracking data indicative of tracking an instrument of any type within a tracking coordinate system, such as, for example, a generation of tracking data ("TD") 31 indicative of a tracking of an instrument 33 within a tracking coordinate system 32 as shown in FIG. 2.
  • Examples of instrument tracking device 30 include, but are not limited to, any known type of electromagnetic tracking device and any known type of optical tracking device.
  • Examples of instrument 33 include, but are not limited to, surgical instruments/tools, imaging instruments/tools and therapeutic instruments/tools.
  • registration device 40 is broadly defined herein as any device structurally configured for registering imaging data 21 and tracking data 31 to thereby generate a transformation matrix that facilitates a registered display ("RGD") 41 via display device 50 of a graphic icon of the instrument relative to the image of the anatomical region, such as, for example, a registration of imaging data 21 and tracking data 31 via a registration algorithm 42 as shown in FIG. 2 to thereby generate a transformation matrix 43 that facilitates a registered display 44 of a graphic icon 34 of instrument 33 relative to image 23 of the anatomical region as shown in FIG. 2.
  • registration algorithm 42 includes, but is not limited to, an iterative closest point ("ICP") algorithm and a singular value decomposition ("SVD") algorithm as known in the art (a minimal SVD registration sketch appears after this list).
  • system 10 is not structurally configured to ascertain the physician's viewpoint of an image-guided intervention and consequently, significantly more often than not, motions of graphic icon 34 of instrument 33 are not displayed as a replication of the actual motions of instrument 33 relative to the anatomical region of the patient from the viewpoint of the physician.
  • the present invention provides a physician-centric display module for ascertaining the physician's viewpoint of image-guided intervention whereby graphic icon 34 of instrument 33 may consistently be displayed as a replication of the actual motions of instrument 33 relative to the anatomical region of the patient from the viewpoint of the physician.
  • FIG. 3 illustrates an image-guided intervention system 11 employing imaging device 20, instrument tracking device 30, a physician-centric display module 70, a registration device 80 and display device 50.
  • physician-centric display module 70 is broadly defined herein as any software, firmware and/or hardware structurally configured for generating physician-centric data ("PCD") 71 indicative of a physician-centric viewpoint within a physician coordinate system (i.e., the viewpoint of the physician is explicitly or implicitly centered within the physician coordinate system), such as, for example, a generation of physician-centric data 71 indicative of a physician-centric viewpoint 73 within a physician coordinate system 72 as shown in FIG. 4.
  • physician-centric data 71 may include an exact computation, an estimation or an approximation of physician-centric viewpoint 73 within the physician coordinate system 72.
  • Various embodiments of module 70 are subsequently provided herein in connection with the description of FIGS. 5-9.
  • registration device 80 is broadly defined herein as any device structurally configured for processing imaging data 21, tracking data 31 and physician-centric data 71 via a mathematical execution 82 of a registration algorithm and a transformation application that facilitates a display of a physician-centric viewpoint of the graphic icon of the instrument.
  • the "physician-centric viewpoint of the graphic icon of the instrument” is broadly defined herein as a display of the graphic icon of the instrument that replicates motion of the instrument from the viewpoint of the physician as the physician navigates the instrument within the tracking coordinate system. More particularly, the displayed motion of the graphic icon of the instrument will mimic the actual motion of the instrument in terms of direction, orientation and proportional degree of movement from the viewpoint of the physician.
  • For example, if the physician moves the instrument to his/her left from a first location on an anatomical object to a second location on the anatomical object, the displayed motion of the graphic icon of the instrument from the first location on a displayed image of the anatomical object to the second location on the displayed image of the anatomical object will mimic this physician-navigated motion of the instrument.
  • registration device 80 registers imaging data 21 and tracking data 31 as previously described herein to generate a base transformation matrix 83.
  • Physician-centric data 71 includes a physician-centric transformation matrix 84 corresponding to physician-centric viewpoint 73 within physician coordinate system 72.
  • In a first embodiment, registration device 80 applies transformation matrix 84 to transformation matrix 83 as known in the art to generate a display 85 of graphic icon 34 relative to image 23 of the anatomical region that replicates motion of surgical instrument 33 as navigated by a physician 74 within tracking coordinate system 32 (see the transform-composition sketch after this list). This is highlighted by the arrows respectively pointing from graphic icon 34 to image 23 in display 85 and from instrument 33 to anatomical region 24 shown in coordinate system 32.
  • FIG. 6 herein provides more detail of this embodiment.
  • In a second embodiment, registration device 80 applies physician-centric transformation matrix 84 to imaging data 21 as known in the art, and thereafter registers the transformed imaging data 21 to tracking data 31 as known in the art to generate base transformation matrix 83, thereby facilitating display 85 of graphic icon 34 relative to image 23 of the anatomical region that again replicates motion of surgical instrument 33 as navigated by physician 74 within tracking coordinate system 32.
  • In a third embodiment, registration device 80 applies physician-centric transformation matrix 84 to tracking data 31 as known in the art, and thereafter registers the transformed tracking data 31 to imaging data 21 as known in the art to generate base transformation matrix 83, thereby facilitating display 85 of graphic icon 34 relative to image 23 of the anatomical region that again replicates motion of surgical instrument 33 as navigated by physician 74 within tracking coordinate system 32.
  • In practice, for all three embodiments, those having ordinary skill in the art will appreciate that the specific transformation application(s) utilized by registration device 80 depend upon many variables including, but not limited to, the actual construction of coordinate systems 22 and 32, the order of multiplication, and the required application of the resultant transformation to display the graphical data. Thus, there are numerous operational modes of module 70 and registration device 80 in practice, and any operational mode of module 70 and registration device 80 in practice is dependent on the specific application of an image-guided system of the present invention.
  • FIGS. 5-9 will now be described with an emphasis on the physician-centric transformation matrix aspect of the present invention.
  • FIG. 5 illustrates a virtual plan of a surgical room 90 that is annotated to determine a patient location 92 (e.g., prone, decubitus right, supine, etc.) relative to a physician-centric position 91 and a display location 93 relative to physician-centric position 91. From this information, it is possible to "correct" the physician's motions so that they accurately reflect the expected movements. Using this information, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination (see the room-plan view-matrix sketch after this list). In practice, surgical room 90 may be annotated in advance, whereby the physician is required to place the patient in a given location and orientation and to stand in a particular location relative to the display device.
  • FIG. 6 illustrates module 70 being "taught" physician-centric transformation matrix 84 in a manner that enables efficient hand-eye coordination by leading the physician through a series of movements and recording his reactions.
  • the physician may be instructed to move a coordinating instrument 101 that has been equipped with a position indicating element in a direction indicated on a display device 100.
  • module 70 may then compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination (see the taught-transform sketch after this list).
  • FIG. 7 illustrates how the relative locations of the patient and the physician may be determined automatically by a set 110 of special indicators 111-113 detected via imaging of the anatomical region or tracking of the instrument. More particularly, indicator 111 may be placed on the patient on the same side of the patient as the physician, indicator 112 may be placed on the patient's head, and indicator 113 may be placed on the patient on a side of the patient opposite the physician. Using this information during a pre-operative scan/intra-operative scan of the patient or a tracking of the instrument, module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
  • indicators 111-113 are distinguishable from one another through geometry, selection of materials or some other physical property that may manifest itself as "brightness" in an MR or CT scan. Additionally, indicators 111-113 may contain position indicating elements detectable by a position sensor, arranged so as to also make the orientation and purpose of indicators 111-113 (i.e., to identify the patient's right-hand side, the physician's side, etc.) known to the position sensor.
  • FIG. 8 illustrates a set 120 of a physician patch 121 to be applied to the physician and a patient patch 122 to be applied to the patient to assist with proper display and reaction of the graphic display. Application of the patches can be done prior to, during or after a diagnostic scan (e.g., a CT scan) to help indicate where the physician is located relative to the patient and the orientation of the patient.
  • patches 121 and 122 may be individually distinguishable by the position sensing device alone, which is able to tell one from the other along with their orientations.
  • module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
  • FIG. 9 illustrates a graphical view of three (3) base displays 130-132 of an image of the anatomical region that facilitates the physician in rotating a 3D display 133 of the image to an orientation that mirrors the physician's view of the patient on the procedure table. More particularly, the physician uses an interface (e.g., a mouse or a trackball) to rotate 3D display 133 so that what is displayed on the screen duplicates his view of the image from his current location (see the interactive-view sketch after this list).
  • module 70 may compute physician-centric transformation matrix 84 as known in the art to be applied so the physician may experience optimal hand-eye coordination.
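
The point-based registration performed by registration algorithm 42 (ICP or SVD, per the description above) can be sketched concretely. This is a minimal illustration only, assuming paired fiducial points are already available in both coordinate systems; the function name rigid_registration_svd, the use of Python/NumPy and the 4x4 homogeneous packing are illustrative choices, not details from the patent.

```python
import numpy as np

def rigid_registration_svd(image_pts, tracking_pts):
    """Estimate the rigid transform mapping tracking-space points onto
    image-space points via SVD (the Kabsch method). Both arrays are Nx3,
    with row i of one array paired with row i of the other."""
    c_img = image_pts.mean(axis=0)          # centroid in image space
    c_trk = tracking_pts.mean(axis=0)       # centroid in tracking space
    H = (tracking_pts - c_trk).T @ (image_pts - c_img)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_img - R @ c_trk
    T = np.eye(4)                           # pack as a 4x4 matrix (cf. matrix 43/83)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

Applying the returned matrix to a tracked tip position yields the corresponding position in image space, which is what allows graphic icon 34 to be drawn over image 23.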
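The transform-composition sketch: the three transformation orderings described in the embodiments above reduce to 4x4 matrix composition. This sketch assumes the column-vector convention (points multiplied on the right); the names T_pc (for physician-centric matrix 84) and T_base (for base matrix 83) are hypothetical shorthand, not terms from the patent.

```python
import numpy as np

def display_point(T_pc, T_base, p_trk):
    """First embodiment: map a tracked point into image space with the
    base registration (T_base), then re-orient the result for the
    physician's viewpoint (T_pc)."""
    p = np.append(np.asarray(p_trk, float), 1.0)  # homogeneous coordinates
    return (T_pc @ T_base @ p)[:3]

# Second embodiment: apply T_pc to the imaging data first, then register
# the transformed imaging data to the tracking data to obtain T_base.
# Third embodiment: apply T_pc to the tracking data first, then register
# the transformed tracking data to the imaging data to obtain T_base.
# As the description notes, the correct ordering depends on how coordinate
# systems 22 and 32 are constructed and on the multiplication convention.
```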
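The room-plan view-matrix sketch: the FIG. 5 embodiment can be read as building a look-at style viewing frame from the annotated positions. A minimal sketch assuming a world frame with the z-axis pointing up; room_plan_view_matrix and its arguments are illustrative names, not details from the patent.

```python
import numpy as np

def _normalize(v):
    return v / np.linalg.norm(v)

def room_plan_view_matrix(physician_pos, patient_pos, up=(0.0, 0.0, 1.0)):
    """Viewing frame for a physician standing at physician_pos and facing
    patient_pos; the rows of R are the physician's right/up/forward axes."""
    physician_pos = np.asarray(physician_pos, float)
    f = _normalize(np.asarray(patient_pos, float) - physician_pos)  # forward
    r = _normalize(np.cross(f, np.asarray(up, float)))              # right
    u = np.cross(r, f)                                              # true up
    R = np.stack([r, u, f])                 # world-to-physician rotation
    T = np.eye(4)                           # candidate for matrix 84
    T[:3, :3] = R
    T[:3, 3] = -R @ physician_pos
    return T
```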
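The taught-transform sketch: the "teaching" embodiment also reduces to a small least-squares alignment between the directions the physician was asked to follow on screen and the hand-motion directions the position sensor actually recorded. Again solved via SVD (Kabsch on direction vectors); it assumes at least two non-collinear direction pairs, and the function name is hypothetical.

```python
import numpy as np

def teach_transform(requested_dirs, recorded_dirs):
    """Rotation R that best maps each recorded hand-motion direction a_i
    onto the on-screen direction b_i that was requested (R @ a_i ~ b_i)."""
    A = np.asarray(recorded_dirs, float)    # Nx3 unit vectors from the sensor
    B = np.asarray(requested_dirs, float)   # Nx3 matching on-screen directions
    U, _, Vt = np.linalg.svd(A.T @ B)       # SVD of the cross-covariance
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # keep a proper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)                           # rotation-only candidate for matrix 84
    T[:3, :3] = R
    return T
```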
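The interactive-view sketch: in the FIG. 9 embodiment the physician produces the desired orientation by hand, so the rotation block of the interactively set view matrix can itself serve as physician-centric transformation matrix 84. A minimal sketch; view is assumed to be the 4x4 model-view matrix exposed by whatever 3D viewer renders display 133.

```python
import numpy as np

def matrix_from_interactive_view(view):
    """Keep the rotation the physician dialed in with the mouse/trackball;
    discard pan and zoom so that only the orientation is carried over."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(view, float)[:3, :3]
    return T
```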

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image-guided system (11) employing an imaging device (20), an instrument tracking device (30), a physician-centric display module (70) and a registration device (80). The imaging device generates imaging data (21) indicative of an image (23) of an anatomical region (24) of a patient within an imaging coordinate system (22). The instrument tracking device generates tracking data (31) indicative of a location of an instrument (33) within a tracking coordinate system (32). The physician-centric display module generates physician-centric data (71) indicative of a physician-centric viewpoint (73) within a physician coordinate system (72). The registration device generates a physician-centric display (81) illustrating a physician-centric viewpoint of a graphic icon (34) of the instrument relative to the image of the anatomical region as a function of a transformation application of the physician-centric data to a registration of the imaging data and the tracking data. An illustration of the physician-centric viewpoint of the graphic icon of the instrument relative to the image of the anatomical region includes the graphic icon replicating motion of the instrument within the tracking coordinate system from the viewpoint of a physician.
PCT/IB2011/052339 2010-06-30 2011-05-27 Method and system for creating a physician-centric coordinate system WO2012001550A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36020510P 2010-06-30 2010-06-30
US61/360,205 2010-06-30

Publications (1)

Publication Number Publication Date
WO2012001550A1 (fr) 2012-01-05

Family

ID=44544053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/052339 WO2012001550A1 (fr) 2010-06-30 2011-05-27 Method and system for creating a physician-centric coordinate system

Country Status (1)

Country Link
WO (1) WO2012001550A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data
US20040138556A1 (en) * 1991-01-28 2004-07-15 Cosman Eric R. Optical object tracking system
US5776050A (en) * 1995-07-24 1998-07-07 Medical Media Systems Anatomical visualization system
WO2009094646A2 (fr) * 2008-01-24 2009-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015075720A1 (fr) * 2013-11-21 2015-05-28 Elbit Systems Ltd. Medical optical tracking system

Similar Documents

Publication Publication Date Title
US10575755B2 (en) Computer-implemented technique for calculating a position of a surgical device
US20210401456A1 (en) Apparatus for Use with Needle Insertion Guidance System
US10342575B2 (en) Apparatus for use with needle insertion guidance system
US7466303B2 (en) Device and process for manipulating real and virtual objects in three-dimensional space
Zhang et al. Electromagnetic tracking for abdominal interventions in computer aided surgery
EP2096523B1 (fr) Système de localisation doté d'un écran tactile virtuel
Andrews et al. Registration techniques for clinical applications of three-dimensional augmented reality devices
US8213693B1 (en) System and method to track and navigate a tool through an imaged subject
JP2023053108A (ja) Image registration and guidance using concurrent X-plane imaging
US20190209241A1 (en) Systems and methods for laparoscopic planning and navigation
US6996430B1 (en) Method and system for displaying cross-sectional images of a body
US8977342B2 (en) Medical intervention device
US20110282188A1 (en) Insertion guidance system for needles and medical components
US20060173269A1 (en) Integrated skin-mounted multifunction device for use in image-guided surgery
US20240008846A1 (en) System for tracking and imaging a treatment probe
US20120059220A1 (en) Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
CN110192917B (zh) System and method for performing a percutaneous navigation procedure
US20140206994A1 (en) Accurate visualization of soft tissue motion on x-ray
WO2015188393A1 (fr) Method for monitoring human organ motion, surgical navigation system, and computer-readable media
EP3544538B1 (fr) Systems for directing interventional instrumentation
EP2720636A1 (fr) System and method for performing guided injection during endoscopic surgery
JP2008126075A (ja) System and method for visual verification of CT registration and feedback
WO2008035271A2 (fr) Device for registering a 3D model
Traub et al. Advanced display and visualization concepts for image guided surgery
WO2016108110A1 (fr) Visualization and tracking of the relative position/orientation between an interventional device and patient anatomical targets in image-guided systems and methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11738287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11738287

Country of ref document: EP

Kind code of ref document: A1