WO2011070492A1 - Ultrasound visualization in x-ray images - Google Patents

Ultrasound visualization in x-ray images

Info

Publication number
WO2011070492A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
ultrasound probe
ray
probe
ray image
Prior art date
Application number
PCT/IB2010/055570
Other languages
English (en)
Inventor
Nicolas P. B. Gogin
Gang Gao
Gerardus H. M. Gijbers
Kawaldeep S. Rhode
Original Assignee
Koninklijke Philips Electronics N.V.
Kcl Enterprises Ltd.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. and Kcl Enterprises Ltd.
Publication of WO2011070492A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211: ... involving processing of medical diagnostic data
    • A61B 6/5229: ... combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B 6/5247: ... combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: ... involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4416: ... related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions

Definitions

  • The present invention relates to x-ray guided procedures. In particular, the invention relates to a method for processing an x-ray image. Furthermore, the invention relates to a system comprising an x-ray system as well as an ultrasound system, wherein the system is equipped with a computer program for performing the method.
  • One of the challenges of image-guided medical and surgical procedures is to efficiently use the information provided by the many imaging techniques the patient may have been through before and during the intervention.
  • In cardiology, for example, the physician often has access to real-time x-ray images acquired by a C-arm. These images have very good spatial and temporal accuracy, enabling the progression of thin catheters and other interventional tools to be followed precisely.
  • A solution consists in using a second imaging modality which is both 3D and able to image soft tissues.
  • One possible choice for this second imaging system is 3D ultrasound imaging.
  • Trans-esophageal probes can be navigated right next to the heart, producing real-time volumetric images with anatomical details that are hardly visible with standard transthoracic ultrasound.
  • Typical interventions currently involving this modality combination are ablation for atrial fibrillation, PFO closure (or other septal default repair), and percutaneous valve repair (PVR). All those interventions are x-ray centric, but in all of them, the simultaneous involvement of ultrasound is either very helpful or completely mandatory to monitor the placement of the tool/endoprosthesis with respect to the soft-tissue anatomy.
  • Although the ultrasound probe can deliver very useful images of the anatomy, an important drawback is the compromise that exists between the temporal acquisition frame rate and the extent of the field of view: a small field of view is therefore necessary to acquire images at a high frame rate.
  • Typically, a volume with a large field of view is first acquired and is used to select small sub-regions within this first acquisition corresponding to the area of interest.
  • The area of interest would include the interventional tools or some of them, so in practice the acquisition volume could be targeted around the interventional tools.
  • Ultrasound-to-x-ray registration is usually performed using image-based registration techniques aiming at aligning common structures visualized by both modalities. This approach has several drawbacks.
  • Ultrasound to x-ray registration can also be achieved using tracking systems which give the position of the ultrasound probe with respect to the x-ray imaging system.
  • The ultrasound probe does not come with a standard tracking system that could be attached to the x-ray imaging system.
  • Many systems have been designed to fill that gap using physical trackers such as magnetic devices. These systems may be expensive and have several disadvantages: they can be disrupted by interference and require additional calibration steps which are prone to error.
  • This is achieved by a method for processing an x-ray image, comprising the steps of receiving an x-ray image, detecting an ultrasound probe in the x-ray image, and visualizing acquisition settings of the ultrasound probe within the x-ray image.
  • The operator can easily adjust the acquisition settings thanks to the information visualized in the x-ray image. This provides an interactive way to change the acquisition settings of the ultrasound acquisition system during an interventional procedure.
  • The acquisition setting may be the field of view of the ultrasound probe.
  • The volume of the field of view of the ultrasound probe can be represented as a truncated pyramid in 3D. This pyramid may be indicated by the outlines of an area which can be visualized by the ultrasound system. Further, the pyramid may be defined by its center together with parameters like the distance to the ultrasound sensor of the probe, a width, a length, an angle and/or a depth of the pyramid.
  • The volume of the field of view may also be a truncated pyramid in one plane having a constant thickness perpendicular to said plane. With an appropriate calibration, the truncated pyramid can be projected and displayed in the x-ray image. As the operator changes the acquisition settings of the probe, the display of the acquisition volume in the x-ray image is automatically updated to provide direct feedback to the operator.
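The projection of such a truncated-pyramid field of view into the x-ray image can be sketched as follows. This is an illustrative geometry only: the pinhole camera model, the focal length, and the probe position are made-up assumptions, not values or algorithms taken from the patent (only the 42.3 degree beam angle echoes the example given later in the text).

```python
import numpy as np

def fov_corners(apex, axis, half_angle, near, far):
    """Eight corners of a truncated square-pyramid field of view in 3D.
    apex: position of the ultrasound sensor; axis: beam direction."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    helper = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper); u /= np.linalg.norm(u)   # first lateral basis vector
    v = np.cross(axis, u)                                # second lateral basis vector
    t = np.tan(half_angle)
    corners = [apex + d * (axis + t * (su * u + sv * v))
               for d in (near, far)                      # near/far truncation planes
               for su in (-1, 1) for sv in (-1, 1)]
    return np.array(corners)                             # shape (8, 3)

def project(points, focal=1000.0):
    """Toy pinhole projection onto the detector plane (z = optical axis)."""
    pts = np.asarray(points, float)
    return focal * pts[:, :2] / pts[:, 2:3]              # perspective divide

# Hypothetical numbers: probe 500 mm from the x-ray source, beaming along
# the optical axis, half of a 42.3 degree beam angle, 20-120 mm depth range.
corners = fov_corners(np.array([0.0, 0.0, 500.0]), [0.0, 0.0, 1.0],
                      np.deg2rad(21.15), 20.0, 120.0)
outline = project(corners)   # 2D outline to draw over the x-ray image
```

Drawing line segments between the projected corners yields the pyramid outline overlaid on the fluoroscopy, and re-running the projection whenever the operator changes the acquisition settings gives the automatic update described above.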
  • One or more parameters, like a main direction, an angle, a distance, a frame rate or a coordinate system, may be visualized in the x-ray image.
  • The visualization of such parameters may be provided, for example, by points or lines or by numerals at an appropriate position in the x-ray image.
  • A main direction may be a direction perpendicular to the surface of the ultrasound sensor or sensors of the ultrasound probe.
  • A distance may be the distance of the ultrasound sensor to the center of the field of view, to the center of a reference coordinate system, to an interventional device also visible in the x-ray image, or to any other predetermined point in the x-ray image.
  • The method comprises the step of registering the probe, including an estimation of a position and an orientation of the probe relative to a reference coordinate system.
  • The reference coordinate system may be any predetermined coordinate system.
  • The reference coordinate system may be within the plane of the x-ray image or may be defined relative to the C-arm of an x-ray system which may be used while performing the method.
  • The method may comprise a step of matching a digitally rendered projection of a 3D model of the probe with the detected probe in the x-ray image, wherein the estimation of the position and orientation of the probe is retrieved from the 3D model of the probe.
  • The 3D model may be retrieved from a CT acquisition or may be a computer-aided design model.
  • A 2D x-ray image of an ultrasound probe may be registered with a 3D model of the probe, which can be either a 3D acquisition of the probe or a computer-aided design (CAD) model.
  • This registration is performed by matching a digitally rendered radiograph of the probe and the real x-ray projection of the probe.
  • A graphics processing unit (GPU) based algorithm may be used to generate digitally rendered radiographs in an efficient way.
  • The 2D-3D registration of the ultrasound probe gives the 3D pose of the probe with respect to the x-ray imaging system.
  • There are several interesting applications, such as merging the ultrasound image with the x-ray image, or ultrasound volume compounding in order to build an extended field of view.
  • The method further comprises the step of detecting an interventional device in the x-ray image and manipulating the probe so that the interventional device is within the field of view of the probe. It is noted that this manipulation may be performed manually as well as automatically. Accordingly, it may be possible to detect and track an interventional device in the 2D x-ray image and to steer an ultrasound probe beam towards this device.
  • The field of view of a probe can be automatically steered, and additionally the appearance of the interventional device in the fluoroscopy may be modified, for example by blinking, flashing or coloring, when the device or at least a part of the device enters or is present in the field of view of the ultrasound probe.
  • The visualization will thus be enhanced and will dramatically help the steering of the ultrasound probe beam in the interventional context.
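The "device enters the field of view" event behind this highlighting can be sketched as a simple geometric containment test. As a simplifying assumption (not from the patent), the truncated pyramid is approximated here by a truncated cone, which keeps the angular test to one line; all positions and angles are hypothetical.

```python
import numpy as np

def in_fov(point, apex, axis, half_angle, near, far):
    """True if a 3D point lies inside a truncated cone approximating the
    ultrasound field of view (apex at the sensor, axis along the beam)."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    v = np.asarray(point, float) - np.asarray(apex, float)
    depth = float(v @ axis)                     # distance along the beam axis
    if not near <= depth <= far:
        return False                            # outside the truncation planes
    lateral = np.linalg.norm(v - depth * axis)  # off-axis distance
    return bool(np.arctan2(lateral, depth) <= half_angle)

apex, axis = np.zeros(3), np.array([0.0, 0.0, 1.0])
tip_inside = in_fov([5.0, 0.0, 60.0], apex, axis, np.deg2rad(20), 20.0, 120.0)
tip_outside = in_fov([50.0, 0.0, 60.0], apex, axis, np.deg2rad(20), 20.0, 120.0)
display_mode = "blinking" if tip_inside else "normal"   # hypothetical highlight rule
```

A transition of the test result from False to True would trigger the blinking/flashing event, and a sustained True the coloring, as described above.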
  • The method may further comprise the step of overlaying an ultrasound image provided by the probe over the x-ray image. Furthermore, it may be possible to overlay a plurality of ultrasound images over only one x-ray image. This may provide an extended field of view.
  • The interventional device may be a flexible or stiff catheter, a biopsy device, a cannula or a trocar.
  • The ultrasound probe may also be a trans-esophageal echocardiography ultrasound probe.
  • A computer program is provided by means of which the above-described method may be performed automatically, or at least predominantly automatically. Therefore, the computer program comprises sets of instructions for storing an x-ray image generated by an x-ray system, sets of instructions for detecting an ultrasound probe in that x-ray image, and sets of instructions for visualizing acquisition settings of the ultrasound probe within the x-ray image.
  • The computer program may comprise sets of instructions for estimating the position and orientation of the ultrasound probe relative to a reference coordinate system. Further, the computer program may comprise sets of instructions for receiving data representing a 3D model of the ultrasound probe.
  • Such a computer program may be implemented, according to a further embodiment of the invention, in a system including an x-ray system, an ultrasound system with an ultrasound probe, and a processing unit.
  • Such a system will also include a monitor for visualization of the ultrasound as well as the x-ray images.
  • Such a computer program is preferably loaded into a work memory of a data processor.
  • The data processor is thus equipped to carry out the method of the invention.
  • Furthermore, the invention relates to a computer-readable medium, such as a CD-ROM, on which the computer program may be stored.
  • The computer program may also be presented over a network like the World Wide Web and can be downloaded into the work memory of the data processor from such a network.
  • Figure 1a shows an ultrasound probe retrieved from a CT acquisition.
  • Figure 1b shows a non-aligned 3D model.
  • Figure 1c shows an aligned 3D model.
  • Figure 2 shows an x-ray image including an ultrasound probe.
  • Figure 3 shows an x-ray image including an ultrasound probe as well as a schematic visualization of the field of view of said probe.
  • Figure 4 is a diagram illustrating the system and method according to the invention.
  • Figure 1 shows, from left to right, an x-ray target image of an ultrasound probe, a non-aligned digitally rendered radiograph (DRR) of an ultrasound probe, as well as an aligned DRR.
  • The 3D model of figure 1b is oriented so that a projection thereof matches the projection of the probe in the x-ray image of figure 1a.
  • The oriented 3D model of figure 1c may subsequently be combined with the x-ray image.
  • Figure 2 shows such an overlay of an aligned DRR 110 on top of an x-ray image of a chest 300 and heart 320, after intensity-based registration and an estimation of the position and orientation of the probe. This gives the position/orientation of the probe with respect to the x-ray imaging system. If both systems are calibrated, the ultrasound image can be merged with the x-ray image. Also shown in figure 2 are interventional devices 200, for example catheters.
  • A coordinate system in front of the ultrasound probe 110 indicates the estimated orientation of the ultrasound sensor elements relative to the image plane of the x-ray image.
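The overlay of an aligned rendering on top of the x-ray frame, as in figure 2, can be sketched as a per-pixel alpha blend. The tint color and blend weight below are arbitrary illustrative choices, not values from the patent, and the images are assumed to be grayscale arrays with values in [0, 1].

```python
import numpy as np

def overlay(xray, drr, alpha=0.4, tint=(0.2, 0.9, 0.2)):
    """Alpha-blend an aligned DRR of the probe over a grayscale x-ray frame.
    Returns an RGB float image; DRR intensity modulates the blend weight."""
    rgb = np.repeat(np.clip(xray, 0.0, 1.0)[..., None], 3, axis=2).astype(float)
    weight = alpha * np.clip(drr, 0.0, 1.0)[..., None]   # per-pixel blend weight
    return (1.0 - weight) * rgb + weight * np.asarray(tint)

# Toy frames: an empty x-ray image and a DRR covering a single pixel.
xray = np.zeros((4, 4))
drr = np.zeros((4, 4)); drr[1, 1] = 1.0
blended = overlay(xray, drr)
```

Modulating the weight by the DRR intensity keeps the underlying anatomy visible through the semi-transparent probe rendering, which is the usual design choice for fluoroscopy overlays.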
  • An x-ray acquisition system is configured to produce real-time 2D x-ray images of an anatomical region during an interventional procedure. This modality does not allow clear visualization of complex soft-tissue anatomy such as the heart.
  • An ultrasound acquisition system, with for example a trans-esophageal echocardiography (TEE) ultrasound probe, is configured to produce images of the anatomy.
  • This ultrasound acquisition system is assumed to lie at least partially in the field of view of the x-ray acquisition system, with sufficient information to recover the coordinate system of the images produced by this system. This is the case, for example, when the whole detector of the ultrasound acquisition system is present in the x-ray image and/or when its position can be estimated from other structures present in the x-ray image.
  • A 3D model of the ultrasound probe may be used to automatically compute the pose of the probe. This may be done by matching the x-ray image of the ultrasound probe with a digitally rendered radiograph generated by transparent projection of the 3D model (cf. figures 1 and 2).
  • An optimization algorithm allows retrieving the six pose parameters of the probe, which give the 3D position of the probe and its 3D orientation with respect to, for example, the C-arm system defining a reference coordinate system.
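The optimization described above can be sketched as iteratively rendering the model at a candidate pose and minimizing a dissimilarity against the observed x-ray image. Everything below is a toy stand-in, not the patented algorithm: the GPU-rendered radiograph is replaced by splatting a random point cloud, the similarity metric is a plain sum of squared differences, only three of the six pose parameters are searched for brevity, and the optimizer is naive coordinate descent with a shrinking step.

```python
import numpy as np

rng = np.random.default_rng(0)
model = rng.normal(size=(200, 3)) * [30, 10, 10]   # hypothetical probe point model (mm)

def render(points, pose, size=64, focal=800.0, sid=1000.0):
    """Toy DRR: rotate/translate the model, pinhole-project, splat into an image."""
    tx, ty, rz = pose                               # 3 of the 6 pose parameters
    c, s = np.cos(rz), np.sin(rz)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    p = points @ rot.T + [tx, ty, sid]
    uv = (focal * p[:, :2] / p[:, 2:3] + size / 2).astype(int)
    img = np.zeros((size, size))
    ok = (uv >= 0).all(1) & (uv < size).all(1)
    np.add.at(img, (uv[ok, 1], uv[ok, 0]), 1.0)     # accumulate "attenuation"
    return img

target = render(model, (3.0, -2.0, 0.1))            # simulated x-ray view of the probe

def registered_pose(steps=60):
    """Coordinate descent on SSD between the rendered DRR and the target."""
    pose = np.zeros(3)
    deltas = np.array([1.0, 1.0, 0.05])
    best = np.sum((render(model, pose) - target) ** 2)
    for _ in range(steps):
        for i in range(3):
            for sign in (-1, 1):
                cand = pose.copy(); cand[i] += sign * deltas[i]
                err = np.sum((render(model, cand) - target) ** 2)
                if err < best:
                    best, pose = err, cand          # keep improving candidates only
        deltas *= 0.8                               # shrink the search radius
    return pose

pose = registered_pose()
```

A real implementation would optimize all six parameters against an actual DRR with a robust similarity measure, but the loop structure (render, compare, update pose) is the same.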
  • The volume of acquisition 130 of the ultrasound probe 110 may be represented as a truncated pyramid in 3D, assuming that the position and orientation of the ultrasound probe 110 with respect to the x-ray image are known.
  • An interventional device 200 with its interventional end portion may be located such that the field of view 130 encompasses that interventional end portion of the device 200.
  • An angle 140 determines the beam angle of the field of view of the ultrasound probe.
  • In the example shown, the beam angle is 42.3 degrees.
  • Figure 4 is a flow chart showing the steps of a method for a combination of ultrasound and x-ray images.
  • The patient is simultaneously imaged by an ultrasound system 100 and an x-ray system 400.
  • The considered ultrasound probe of the ultrasound system 100 is capable of generating synthetically steered beams, preferably in 3D.
  • In step S1, the ultrasound system 100 and the x-ray imaging system 400 are first mutually registered. This can typically be achieved by imaging the probe of the ultrasound system 100 with the x-ray system 400 and, based on the settings 150 and data 160 of the ultrasound system 100, on the settings 410 of the x-ray system 400, plus on the possible use of a probe 3D model 500 or markers, determining the position of the probe in the x-ray referential. From this information, and based on the relevant calibration information, one can determine the parameters of the probe field of view in the x-ray referential, as described above. Data S1c will be exchanged for visualization of the resulting image.
  • In step S2, at the same time, the interventional device (for instance the tip of a catheter) is detected and tracked in the x-ray images.
  • This step relies on data 420 of the x-ray system 400 and on usual object detection means that rely on the spatial signature of the device and possibly on its motion characteristics (for instance, the device is animated by a cardiac motion plus a steering motion, seen in projection).
  • In step S3, it is advantageous to improve the 2D location provided by device tracking in the x-ray images and to try to get a depth estimate of the considered device.
  • Several approaches are possible to reach this goal, among which the exploitation of the device's observed width, the use of other x-ray views under different angulations, for instance in a bi-plane context, or the use of wiggling motions.
  • The width of the ultrasound probe may be estimated, wherein subsequently possible locations of the ultrasound probe are constrained.
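The width-based depth estimation mentioned above follows from similar triangles under a pinhole projection model: an object of known physical width W that appears w pixels wide lies at depth f * W / w for focal length f in pixels. All numbers below are hypothetical.

```python
def depth_from_width(observed_width_px, true_width_mm, focal_px):
    """Pinhole similar-triangles estimate of depth from apparent width:
    depth = focal * true_width / observed_width."""
    return focal_px * true_width_mm / observed_width_px

# A device of known 10 mm width, imaged 8 px wide by an 800 px focal-length
# system, lies roughly 1 m from the x-ray source (made-up numbers).
depth_mm = depth_from_width(observed_width_px=8.0, true_width_mm=10.0,
                            focal_px=800.0)
```

Because the apparent width is only a few pixels, this estimate is coarse, which is why the text also mentions bi-plane views and wiggling motions as alternative depth cues.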
  • The improved device location S3a can then be compared to the found ultrasound field of view S1b, and several commands can be issued accordingly. For instance, a device flashing/blinking command can be issued to the image processing channel of the x-ray data stream, or a probe steering command S4a can be sent to the relevant module.
  • In step S5, the visualization of the device in the x-ray image is adapted based on events such as the entering (blinking/flashing) or the presence (coloring) of the device in the ultrasound field of view.
  • This provides the ultrasound user with an easy way of controlling the steering of the probe based on the high resolution x-ray images. Of course, this steering is also made easier by the visualization of the ultrasound cone as shown in figure 3.
  • The result of step S5 is an enhanced 2D view S5a facilitating the steering of the ultrasound probe.
  • A command S6a can be issued to the beam-steering module of the ultrasound system 100, indicating which field of view should be generated in order to visualize the device at the center of the ultrasound cone (volume or image).
  • The probe steering module, based on the ultrasound/x-ray registration information, will determine and apply the relevant parameter settings enabling this device-driven steering.
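The device-driven steering command can be sketched as computing the azimuth and elevation angles that re-point the beam axis at the tracked device, once its position is known in the probe frame via the registration. The frame construction and all coordinates below are illustrative assumptions, not the patented steering module.

```python
import numpy as np

def steering_angles(device_pos, apex, axis, up):
    """Azimuth/elevation (radians) that would point the beam axis at the
    tracked device; the probe frame is built from the current beam axis
    and an 'up' direction hint."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    right = np.cross(up, axis); right /= np.linalg.norm(right)
    up_vec = np.cross(axis, right)                 # orthonormal probe frame
    v = np.asarray(device_pos, float) - np.asarray(apex, float)
    x, y, z = float(v @ right), float(v @ up_vec), float(v @ axis)
    return np.arctan2(x, z), np.arctan2(y, z)      # azimuth, elevation

# Device 100 mm ahead of the sensor and 10 mm off-axis (made-up geometry):
azimuth, elevation = steering_angles([10.0, 0.0, 100.0], np.zeros(3),
                                     [0.0, 0.0, 1.0], [0.0, 1.0, 0.0])
```

Feeding these two angles to a synthetically steered 3D probe would center the device in the ultrasound cone, matching the goal stated for command S6a.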
  • A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention makes it possible to detect and track an interventional device in a 2D fluoroscopic image and to steer an ultrasound probe beam towards this device. It accordingly relates to a corresponding method and system for visualizing acquisition settings of an ultrasound probe in a fluoroscopic image, in order to facilitate the positioning and orientation of the ultrasound probe relative to the fluoroscopy.
PCT/IB2010/055570 2009-12-09 2010-12-03 Visualisation d'ultrasons dans des images radiographiques WO2011070492A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP09306202.4 2009-12-09
EP09306202 2009-12-09

Publications (1)

Publication Number Publication Date
WO2011070492A1 (fr) 2011-06-16

Family

ID=43733592

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/055570 WO2011070492A1 (fr) 2009-12-09 2010-12-03 Visualisation d'ultrasons dans des images radiographiques

Country Status (1)

Country Link
WO (1) WO2011070492A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014072890A1 (fr) 2012-11-06 2014-05-15 Koninklijke Philips N.V. Amplification d'images échographiques
WO2014102718A1 (fr) 2012-12-28 2014-07-03 Koninklijke Philips N.V. Modélisation de scène en temps réel combinant une imagerie ultrasonore tridimensionnelle (3d) et une imagerie à rayons x bidimensionnelle (2d)
WO2015193150A1 (fr) * 2014-06-17 2015-12-23 Koninklijke Philips N.V. Dispositif de guidage pour sonde d'échocardiographie transœsophagienne
CN107397560A (zh) * 2017-08-21 2017-11-28 中国科学院苏州生物医学工程技术研究所 线靶固定装置及超声体模
EP3659514A1 (fr) * 2018-11-29 2020-06-03 Koninklijke Philips N.V. Identification et localisation de dispositif à base d'image
CN111989045A (zh) * 2018-03-19 2020-11-24 皇家飞利浦有限公司 多模态成像对准

Citations (5)

Publication number Priority date Publication date Assignee Title
US6996430B1 (en) * 1999-08-16 2006-02-07 Super Dimension Ltd Method and system for displaying cross-sectional images of a body
WO2008062358A1 (fr) * 2006-11-22 2008-05-29 Koninklijke Philips Electronics N.V. Combiner des rayons x avec des données acquises de manière intravasculaire
US20080146919A1 (en) * 2006-09-29 2008-06-19 Estelle Camus Method for implanting a cardiac implant with real-time ultrasound imaging guidance
US20090088628A1 (en) * 2007-09-27 2009-04-02 Klaus Klingenbeck-Regn Efficient workflow for afib treatment in the ep lab
US20090185657A1 (en) * 2008-01-18 2009-07-23 Siemens Aktiengesellschaft Registration method


Non-Patent Citations (2)

Title
FRENCH D ET AL: "Computing Intraoperative Dosimetry for Prostate Brachytherapy Using TRUS and Fluoroscopy<1>", ACADEMIC RADIOLOGY, RESTON, VA, US, vol. 12, no. 10, 1 October 2005 (2005-10-01), pages 1262 - 1272, XP025311552, ISSN: 1076-6332, [retrieved on 20051001], DOI: DOI:10.1016/J.ACRA.2005.05.026 *
YINGLIANG MA ET AL: "Evaluation of a robotic arm for echocardiography to X-ray image registration during cardiac catheterization procedures", ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, 2009. EMBC 2009. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE, IEEE, PISCATAWAY, NJ, USA, 3 September 2009 (2009-09-03), pages 5829 - 5832, XP031567118, ISBN: 978-1-4244-3296-7 *

Cited By (14)

Publication number Priority date Publication date Assignee Title
US11373361B2 (en) 2012-11-06 2022-06-28 Koninklijke Philips N.V. Enhancing ultrasound images
CN113855059A (zh) * 2012-11-06 2021-12-31 皇家飞利浦有限公司 增强超声图像
WO2014072890A1 (fr) 2012-11-06 2014-05-15 Koninklijke Philips N.V. Amplification d'images échographiques
US10157491B2 (en) 2012-12-28 2018-12-18 Koninklijke Philips N.V. Real-time scene-modeling combining 3D ultrasound and 2D X-ray imagery
WO2014102718A1 (fr) 2012-12-28 2014-07-03 Koninklijke Philips N.V. Modélisation de scène en temps réel combinant une imagerie ultrasonore tridimensionnelle (3d) et une imagerie à rayons x bidimensionnelle (2d)
US10939881B2 (en) 2014-06-17 2021-03-09 Koninklijke Philips N.V. Guidance device for a tee probe
JP2017518118A (ja) * 2014-06-17 2017-07-06 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Teeプローブのための誘導デバイス
WO2015193150A1 (fr) * 2014-06-17 2015-12-23 Koninklijke Philips N.V. Dispositif de guidage pour sonde d'échocardiographie transœsophagienne
CN107397560A (zh) * 2017-08-21 2017-11-28 中国科学院苏州生物医学工程技术研究所 线靶固定装置及超声体模
CN107397560B (zh) * 2017-08-21 2023-02-03 中国科学院苏州生物医学工程技术研究所 线靶固定装置及超声体模
CN111989045A (zh) * 2018-03-19 2020-11-24 皇家飞利浦有限公司 多模态成像对准
EP3659514A1 (fr) * 2018-11-29 2020-06-03 Koninklijke Philips N.V. Identification et localisation de dispositif à base d'image
WO2020109604A1 (fr) * 2018-11-29 2020-06-04 Koninklijke Philips N.V. Dispositif de suivi basé sur image
CN113164150A (zh) * 2018-11-29 2021-07-23 皇家飞利浦有限公司 基于图像的设备跟踪

Similar Documents

Publication Publication Date Title
US10238361B2 (en) Combination of ultrasound and x-ray systems
JP5345275B2 (ja) 超音波データと事前取得イメージの重ね合わせ
JP4795099B2 (ja) 超音波を用いた電気解剖学的地図と事前取得イメージの重ね合わせ
JP5265091B2 (ja) 2次元扇形超音波イメージの表示
EP2680755B1 (fr) Visualisation pour guidage de navigation
US8428690B2 (en) Intracardiac echocardiography image reconstruction in combination with position tracking system
JP5622995B2 (ja) 超音波システム用のビーム方向を用いたカテーテル先端部の表示
US8364242B2 (en) System and method of combining ultrasound image acquisition with fluoroscopic image acquisition
CA2614033C Coloration des cartes electro-anatomiques permettant d'indiquer la saisie des donnees ultrasoniques
EP2099378B1 Appareil servant à déterminer une position d'un premier objet à l'intérieur d'un second objet
US20160239963A1 (en) Tracking-based 3d model enhancement
JP2018514352A (ja) 後期マーカー配置による融合イメージベース誘導のためのシステムおよび方法
JP2006305358A (ja) 超音波輪郭再構築を用いた3次元心臓イメージング
JP2006305359A (ja) 超音波輪郭再構築を用いた3次元心臓イメージングのためのソフトウエア製品
US20140023250A1 (en) Medical imaging systen and method for providing an image representation supporting accurate guidance of an intervention device in a vessel intervention procedure
EP1727471A1 Systeme pour guider un instrument medical dans le corps d'un patient
EP2925232B1 (fr) Intégration de modalités ultrasons et rayons x
Housden et al. Evaluation of a real-time hybrid three-dimensional echo and X-ray imaging system for guidance of cardiac catheterisation procedures
WO2011070492A1 (fr) 2009-12-09 2010-12-03 Visualisation d'ultrasons dans des images radiographiques
US11628014B2 (en) Navigation platform for a medical device, particularly an intracardiac catheter
AU2013251245B2 (en) Coloring electroanatomical maps to indicate ultrasound data acquisition
Housden et al. X-ray fluoroscopy–echocardiography

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10805837

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10805837

Country of ref document: EP

Kind code of ref document: A1