WO2015121301A1 - Medical imaging optimization - Google Patents


Info

Publication number
WO2015121301A1
WO2015121301A1 (PCT/EP2015/052871)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
volume
imaging
data
settings
Application number
PCT/EP2015/052871
Other languages
English (en)
Inventor
Norbert Strobel
Martin Willibald KOCH
Matthias Hoffmann
Marcus Pfister
Original Assignee
Siemens Aktiengesellschaft
Application filed by Siemens Aktiengesellschaft
Publication of WO2015121301A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/545 Control of apparatus or devices for radiation diagnosis involving automatic set-up of acquisition parameters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 Arrangements for detecting or locating foreign bodies
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/466 Displaying means of special interest adapted to display 3D data

Definitions

  • The present embodiments relate to medical imaging optimization.
  • Radio frequency (RF) catheter ablation may be used to treat heart arrhythmias.
  • RF catheter ablation may be performed using a C-arm fluoroscopy system, with standard X-ray angulations.
  • The main goal during ablative treatment of atrial fibrillation, for example, is the isolation of the pulmonary veins (PV).
  • Anatomic criteria may define an optimal view.
  • The optimal projection is to provide a clear identification of the left atrium and PV junction, as well as a parallel view on the PV ostium.
  • The patient is to be positioned such that the therapy region of interest may be seen well.
  • The X-ray angulations are empirically selected based on general heart anatomy. Common angulations are anterior-posterior (AP), 30-45° right anterior oblique (RAO), and 45-60° left anterior oblique (LAO).
  • Optimal X-ray angulations for the C-arm fluoroscopy system may be adjusted manually by a user of the C-arm fluoroscopy system.
  • The patient is to be placed such that the organ of interest (e.g., the left atrium of the heart) is at the iso-center of the C-arm imaging system.
  • 3D data representing a volume to be imaged is registered with an imaging device, and the imaging device setting is determined based on an identified type of imaging.
  • A virtual view of the volume is generated from the registered 3D data based on the imaging device setting, and is displayed.
  • The imaging device setting is adjusted based on the displayed virtual view.
  • An adjusted virtual view of the volume is generated from the registered 3D data based on the adjusted imaging device setting.
  • A method for optimization of X-ray imaging includes identifying 3D data representing a volume.
  • A processor registers the 3D data representing the volume with an imaging device.
  • Data representing a type of intended imaging of the volume is identified.
  • One or more first imaging device settings are determined based on the registered 3D data representing the volume and the type of the intended imaging.
  • A virtual view of the volume is generated from the registered 3D data representing the volume based on the one or more determined first imaging device settings.
  • A display in communication with the processor displays the generated virtual view of the volume.
  • One or more second imaging device settings are determined.
  • The determination of the one or more second imaging device settings includes an adjustment of an imaging device setting of the one or more determined first imaging device settings.
  • An adjusted virtual view of the volume is generated from the registered 3D data representing the volume based on the adjusted imaging device setting.
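The sequence of acts just described (determine first settings from the intended imaging type, render a virtual view, adjust to second settings, render an adjusted view) can be sketched in a few lines of Python. All function names, field names, and default values below are hypothetical illustrations; the patent does not define an API:

```python
# Sketch of the claimed optimization loop. A "virtual view" is reduced here
# to a record of the settings it was rendered from.

def determine_first_settings(imaging_type):
    # Placeholder lookup: map an intended imaging type to initial settings.
    defaults = {"coronary_angiogram_left": {"lao": 40.0, "cranial": 25.0}}
    return dict(defaults[imaging_type])

def render_virtual_view(volume_3d, settings):
    # Stand-in for rendering the registered 3D data under the given settings.
    return {"volume": volume_3d, "settings": dict(settings)}

def optimize_imaging(volume_3d, imaging_type, user_adjustment):
    settings = determine_first_settings(imaging_type)         # first settings
    first_view = render_virtual_view(volume_3d, settings)     # virtual view
    settings.update(user_adjustment)                          # second settings
    adjusted_view = render_virtual_view(volume_3d, settings)  # adjusted view
    return first_view, adjusted_view

first, adjusted = optimize_imaging("registered-3d-data",
                                   "coronary_angiogram_left",
                                   {"lao": 35.0})
```

The point of the sketch is that the adjustment operates purely on the registered 3D data; no new acquisition is needed between the first and the adjusted view.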
  • A non-transitory computer-readable storage medium stores instructions executable by one or more processors to optimize X-ray imaging.
  • The instructions include identifying 3D data representing a volume.
  • The instructions also include identifying data representing a type of an intended imaging of the volume, and determining, based on the 3D data representing the volume and the type of the intended imaging, one or more imaging device settings.
  • The instructions include generating a virtual view of the volume from the 3D data representing the volume based on the one or more determined imaging device settings.
  • The instructions also include displaying the generated virtual view of the volume, receiving a first input, and adjusting at least one imaging device setting of the one or more determined imaging device settings based on the first input.
  • A system for X-ray imaging includes a processor, a display in communication with the processor, and an imaging device in communication with the processor.
  • The processor is configured to identify 3D data representing a volume, identify data representing a type of an intended imaging of the volume, and determine, based on the registered 3D data representing the volume and the type of the intended imaging, one or more first imaging device settings.
  • The processor is further configured to generate a virtual view of the volume from the registered 3D data representing the volume based on the one or more determined first imaging device settings, and determine one or more second imaging device settings.
  • The determination of the one or more second imaging device settings includes an adjustment of an imaging device setting of the one or more determined first imaging device settings.
  • The processor is also configured to generate an adjusted virtual view of the volume from the registered 3D data representing the volume based on the adjusted imaging device setting, and receive an input.
  • The display is configured to display the generated virtual view of the volume and display the generated adjusted virtual view of the volume.
  • The determination of the one or more second imaging device settings is after the generated virtual view of the volume is displayed.
  • The imaging device is configured to generate data representing the volume based on the received input, using the one or more determined second imaging device settings.
  • Figure 1 shows one embodiment of an imaging system.
  • Figure 2 shows an imaging system including one embodiment of an imaging device.
  • Figure 3 shows a flowchart of one embodiment of a method for imaging optimization.
  • Figure 4 shows exemplary planes of an imaging system.
  • Three-dimensional (3D) data may be used to set up views without the need for additional image acquisitions.
  • This method may be applied to X-ray imaging devices, as the method may spare a patient to be imaged further X-ray radiation.
  • Resulting views and the resulting dose to the patient may be simulated.
  • New views may still be quickly rendered without a need to carry out the actual imaging, which may save time.
  • An imaging device generates 3D data that represents a treatment region, the 3D data is registered with the imaging device, and the registered 3D data is used to determine an optimal imaging view on the treatment region and an optimal imaging technique for the optimal imaging view.
  • The imaging view on the treatment region may be defined by, for example, imaging device parameters (e.g., in the case of X-ray imaging: imaging device position, view orientation, patient table position, source-image distance (SID), source-object distance (SOD), zoom selection, collimator/shutter settings, wedge and finger filter positioning, and/or other imaging device settings).
  • The imaging technique may be defined by, for example, imaging technique parameters such as dose, tube voltage, pre-filtering, and/or other parameters.
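The split between view-defining device parameters and technique parameters can be captured in two containers such as the following. The field names, units, and example values are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class DeviceSettings:
    """Parameters that define the imaging view (hypothetical fields)."""
    primary_angle_deg: float           # RAO/LAO angulation
    secondary_angle_deg: float         # cranial/caudal angulation
    table_position_mm: tuple           # patient table position (x, y, z)
    sid_mm: float                      # source-image distance (SID)
    sod_mm: float                      # source-object distance (SOD)
    zoom: int                          # zoom stage selection
    collimator_mm: tuple = (0.0, 0.0)  # collimator/shutter opening

    @property
    def magnification(self) -> float:
        # Geometric magnification follows directly from SID/SOD.
        return self.sid_mm / self.sod_mm

@dataclass
class TechniqueParameters:
    """Parameters that define the imaging technique (hypothetical fields)."""
    tube_voltage_kv: float       # X-ray tube voltage
    dose_per_frame_uGy: float    # detector entrance dose per frame
    pre_filter_mm_cu: float      # copper pre-filtration thickness

settings = DeviceSettings(30.0, 20.0, (0.0, 0.0, 0.0), 1200.0, 750.0, 2)
technique = TechniqueParameters(80.0, 0.036, 0.3)
```

Keeping the two groups separate mirrors the two-stage workflow described later: the view is chosen first, and the technique is optimized afterward.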
  • Augmented fluoroscopy may be used to set up an imaging view (e.g., an X-ray view).
  • A fluoroscopy overlay image is computed from the registered 3D data to demonstrate how anatomy within the treatment region would look for one or more selected imaging device parameters.
  • Views may be set by user interaction with the 3D data and real-time visual feedback.
  • The user may define and/or change one or more imaging device parameters via an input device such as, for example, a mouse and/or a keyboard, and the system may generate and present a simulated view of the treatment region based on the one or more changed imaging device parameters, for example. Suitable views may be explored by the user without the need for X-ray exposure of the patient, for example.
  • The user may specify the treatment region and a desired field of view prior to starting a therapy.
  • The system may automatically compute which views to select, and how to select collimators, wedge filters, finger filters, and/or other devices based on the specified treatment region and desired field of view. Once the user is satisfied with the simulated view of the treatment region, the imaging technique may be optimized.
  • The imaging technique may be optimized using a Monte Carlo simulation method that allows simultaneous estimates of measures of image quality and patient dose.
  • The Monte Carlo simulation acts as a smart agent for defining the image technique parameters.
  • Image fidelity parameters other than dose may be optimized.
  • Signal-to-noise ratio and scan time may be optimized.
  • Simulation methods other than the Monte Carlo simulation method (e.g., a virtual MRI method) may also be used.
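As a toy illustration of how a Monte Carlo simulation can estimate image quality and patient dose at the same time, the sketch below pushes photons through a single water-equivalent slab: photons that interact count toward dose, photons that exit count toward detector signal, and the quantum-limited SNR scales with the square root of the detected counts. The attenuation coefficient and geometry are simplified assumptions, far from a clinical simulation:

```python
import random

def simulate(n_photons, slab_cm, mu_per_cm=0.2, seed=0):
    """Toy photon-transport Monte Carlo through a homogeneous slab."""
    rng = random.Random(seed)
    detected = absorbed = 0
    for _ in range(n_photons):
        # Sample the free path length from the exponential attenuation law.
        path = rng.expovariate(mu_per_cm)
        if path >= slab_cm:
            detected += 1   # photon exits the patient -> image signal
        else:
            absorbed += 1   # photon interacts -> contributes to patient dose
    snr = detected ** 0.5   # quantum-limited SNR ~ sqrt(detected counts)
    return detected, absorbed, snr

detected, absorbed, snr = simulate(100_000, slab_cm=10.0)
```

Raising the simulated dose (more photons) increases the detected counts and hence the SNR estimate, which is exactly the image-quality/dose trade-off such a simulation lets the system explore before any real exposure.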
  • An imaging and/or therapy may be executed using the imaging device, based on the imaging device parameters and the imaging technique parameters.
  • Figure 1 shows one embodiment of an imaging system 100.
  • The imaging system 100 is representative of an imaging modality.
  • The imaging system 100 includes one or more imaging devices 102 and an image processing system 104.
  • A two-dimensional (2D) or a three-dimensional (3D) (e.g., volumetric) image dataset may be acquired using the imaging system 100.
  • The 2D image data set or the 3D image data set may be obtained contemporaneously with the planning and execution of a medical treatment procedure or at an earlier time. Additional, different, or fewer components may be provided.
  • The imaging device 102 includes a C-arm X-ray device.
  • The imaging device 102 may include a gantry-based X-ray system, a magnetic resonance imaging (MRI) system, an ultrasound system, a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, a fluoroscopy system, another X-ray system, any other now known or later developed imaging system, or any combination thereof.
  • The image processing system 104 is or includes a workstation, a processor of the imaging device 102, or another image processing device.
  • The imaging system 100 may be used to optimize X-ray imaging for RF catheter ablative treatment of atrial fibrillation, for example.
  • The imaging system 100 may be used in other image-guided procedures such as endovascular aneurysm repair, valve replacements, biopsy needle placement, etc.
  • The workstation 104 receives data representing a volume generated by the one or more imaging devices 102.
  • Figure 2 shows one embodiment of the imaging system 100 including the imaging device 102.
  • The imaging device 102 is shown in Figure 2 as a C-arm X-ray device (e.g., a monoplane or biplane C-arm X-ray system).
  • The imaging device 102 may include an energy source 200 and an imaging detector 202 connected together by a C-arm 204.
  • The imaging device 102 is a biplane system and includes an additional energy source and an additional imaging detector (e.g., positioned on an additional C-arm). Additional, different, or fewer components may be provided.
  • The imaging device 102 may be, for example, a gantry-based CT device.
  • The energy source 200 and the imaging detector 202 may be disposed opposite each other.
  • The energy source 200 and the imaging detector 202 are disposed on diametrically opposite ends of the C-arm 204.
  • Arms of the C-arm 204 may be configured to be adjustable lengthwise.
  • The C-arm 204 may be movably attached (e.g., pivotably attached) to a displaceable unit.
  • The C-arm 204 may be moved on a buckling arm robot or other support structure.
  • The robot arm allows the energy source 200 and the imaging detector 202 to move on a defined path around the patient.
  • The C-arm 204 is swept around the patient.
  • Contrast agent may be injected intravenously.
  • The energy source 200 and the imaging detector 202 are connected inside a gantry.
  • The energy source 200 may be a radiation source such as, for example, an X-ray source.
  • The energy source 200 may emit radiation to the imaging detector 202.
  • The imaging detector 202 may be a radiation detector such as, for example, a digital-based X-ray detector or a film-based X-ray detector.
  • The imaging detector 202 may detect the radiation emitted from the energy source 200. Data is generated based on the amount or strength of radiation detected. For example, the imaging detector 202 detects the strength of the radiation (e.g., intensity) received at the imaging detector 202 and generates data based on the strength of the radiation.
  • The data may be considered imaging data, as the data is used to then generate an image.
  • Image data may also include data for a displayed image.
  • The C-arm X-ray device 102 may acquire between 50-500 projections, between 100-200 projections, or between 100-150 projections. In other embodiments, during each rotation, the C-arm X-ray device 102 may acquire between 50-100 projections per second, or between 50-75 projections per second. Any speed, number of projections, dose levels, or timing may be used.
  • A region 206 to be examined (e.g., a volume) is located between the energy source 200 and the imaging detector 202.
  • The region 206 to be examined may include one or more structures S (e.g., one or more volumes of interest).
  • The region 206 may or may not include a surrounding area.
  • The region 206 to be examined may include the one or more structures S and/or other organs or body parts in the surrounding area of the one or more structures S.
  • The data generated by the one or more imaging devices 102 and/or the image processing system 104 may represent (1) a projection of 3D space to 2D or (2) a reconstruction (e.g., computed tomography) of a 3D region from a plurality of 2D projections (e.g., (1) 2D data or (2) 3D data, respectively).
  • The C-arm X-ray device 102 may be used to obtain 2D data or CT-like 3D data.
  • A computed tomography (CT) device may obtain 2D data or 3D data.
  • The data may be obtained from different directions.
  • The imaging device 102 may obtain data representing sagittal, coronal, or axial planes or distribution.
  • The imaging device 102 may be communicatively coupled to the image processing system 104.
  • The imaging device 102 may be connected to the image processing system 104, for example, by a communication line, a cable, a wireless device, a communication circuit, and/or another communication device.
  • The imaging device 102 may communicate the data to the image processing system 104.
  • The image processing system 104 may communicate an instruction such as, for example, a position or angulation instruction to the imaging device 102. All or a portion of the image processing system 104 may be disposed in the imaging device 102, in the same room or different rooms as the imaging device 102, or in the same facility or in different facilities.
  • The image processing system 104 may represent a plurality of image processing systems associated with more than one imaging device 102.
  • The imaging device 102 communicates with an archival system or memory.
  • The image processing system 104 retrieves or loads the 2D or 3D data from the memory for processing.
  • The image processing system 104 includes a processor 208, a display 210 (e.g., a monitor), and a memory 212. Additional, different, or fewer components may be provided.
  • The image processing system 104 may include an input device 214, a printer, and/or a network communications interface.
  • The processor 208 is a general processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array, an analog circuit, a digital circuit, another now known or later developed processor, or combinations thereof.
  • The processor 208 may be a single device or a combination of devices such as, for example, associated with a network or distributed processing. Any of various processing strategies such as, for example, multi-processing, multi-tasking, and/or parallel processing may be used.
  • The processor 208 is responsive to instructions stored as part of software, hardware, integrated circuits, firmware, microcode, or the like.
  • The processor 208 may generate an image from the data.
  • The processor 208 processes the data from the imaging device 102 and generates an image based on the data.
  • The processor 208 may generate one or more angiographic images, fluoroscopic images, top-view images, in-plane images, orthogonal images, side-view images, 2D images, 3D representations or images (e.g., renderings or volumes from 3D data to a 2D display), progression images, multi-planar reconstruction images, projection images, or other images from the data.
  • A plurality of images may be generated from data detected from a plurality of different positions or angles of the imaging device 102 and/or from a plurality of imaging devices 102.
  • The processor 208 may generate a 2D image from the data.
  • The 2D image may be a planar slice of the region 206 to be examined.
  • The C-arm X-ray device 102 may be used to detect data representing voxels of a 3D volume, from which a sagittal image, a coronal image, and an axial image are extracted along a plane.
  • The sagittal image is a side-view image of the region 206 to be examined.
  • The coronal image is a front-view image of the region 206 to be examined.
  • The axial image is a top-view image of the region 206 to be examined.
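For a voxel volume stored as a NumPy array, extracting the three standard planes described above reduces to indexing along one axis each. The (z, y, x) axis convention below is an assumption; real scanner data may be ordered differently:

```python
import numpy as np

# Toy voxel volume with axes assumed ordered (axial/z, coronal/y, sagittal/x).
volume = np.arange(4 * 5 * 6).reshape(4, 5, 6)

axial    = volume[2, :, :]   # top-view slice at z-index 2
coronal  = volume[:, 3, :]   # front-view slice at y-index 3
sagittal = volume[:, :, 1]   # side-view slice at x-index 1
```

Each slice is a 2D planar image of the examined region along one anatomical direction.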
  • The processor may generate a 3D representation or image from the data.
  • The 3D representation illustrates the region 206 to be examined.
  • The 3D representation may be generated from a reconstructed volume (e.g., by combining 2D datasets, such as with computed tomography) obtained by the imaging device 102.
  • A 3D representation may be generated by analyzing and combining data representing different planes through the patient, such as a stack of sagittal planes, coronal planes, and/or axial planes, or a plurality of planes through the patient at different angles relative to the patient. Additional, different, or fewer images may be used to generate the 3D representation.
  • Generating the 3D representation is not limited to combining 2D images. For example, any now known or later developed method may be used to generate the 3D representation.
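The idea that a representation can be recovered from its projections can be shown on a tiny 2D slice using two orthogonal projections and an unfiltered backprojection. Real CT reconstruction uses many angles and filtering (e.g., filtered backprojection); this only demonstrates the underlying principle:

```python
import numpy as np

slice_2d = np.zeros((8, 8))
slice_2d[3, 5] = 1.0              # a single bright voxel in the slice

proj_rows = slice_2d.sum(axis=1)  # 1D projection onto the row axis
proj_cols = slice_2d.sum(axis=0)  # 1D projection onto the column axis

# Backproject: smear each 1D projection back across the image and add.
backprojection = proj_rows[:, None] + proj_cols[None, :]

# The maximum of the backprojection falls at the original voxel position.
peak = np.unravel_index(backprojection.argmax(), backprojection.shape)
```

With only two angles the result is badly blurred (the star artifact along both smear directions), which is why clinical reconstruction acquires projections over a large angular range.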
  • The processor 208 may display the generated images on the monitor 210.
  • The processor 208 may generate the 3D representation and communicate the 3D representation to the monitor 210.
  • The processor 208 and the monitor 210 may be connected by a cable, a circuit, another communication coupling, or a combination thereof.
  • The monitor 210 is a CRT, an LCD, a plasma screen, a flat panel, a projector, or another now known or later developed display device.
  • The monitor 210 is operable to generate images for a two-dimensional view or a rendered three-dimensional representation. For example, a two-dimensional image representing a three-dimensional volume through projection or surface rendering is displayed.
  • The processor 208 may communicate with the memory 212.
  • The processor 208 and the memory 212 may be connected by a cable, a circuit, a wireless connection, another communication coupling, or any combination thereof. Images, data, and other information may be communicated from the processor 208 to the memory 212 for storage, and/or the images, the data, and the other information may be communicated from the memory 212 to the processor 208 for processing.
  • The processor 208 may communicate the generated images, image data, or other information to the memory 212 for storage.
  • The memory 212 is a non-transitory computer-readable storage medium.
  • The computer-readable storage medium may include various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media, and the like.
  • The memory 212 may be a single device or a combination of devices.
  • The memory 212 may be adjacent to, part of, networked with, and/or remote from the processor 208.
  • The processor 208 is programmed with instructions stored in the memory 212 to optimize X-ray imaging during a medical procedure such as, for example, an RF ablative treatment of atrial fibrillation.
  • The goal of such an X-ray imaging optimization may be to isolate the pulmonary veins (PVs) of a patient.
  • Figure 3 shows a flowchart of one embodiment of a method for optimizing medical imaging (e.g., optimizing an intended imaging).
  • The method may be performed using the imaging system 100 shown in Figures 1 and 2 (e.g., at least some of the acts of the method may be performed by the processor 208) or another imaging system.
  • The acts of the method are implemented by one or more processors using instructions from one or more memories.
  • The method is implemented in the order shown, but other orders may be used. Additional, different, or fewer acts may be provided. Similar methods may be used for classifying image data.
  • In act 300, 3D data representing a volume is identified.
  • The processor 208, for example, may identify the 3D data representing the volume within the memory 212.
  • The 3D data may be based on a plurality of 2D datasets generated with the imaging device 102 or another imaging device (e.g., C-arm computed tomography using syngo DynaCT).
  • The volume may represent at least a portion of a patient and may include, for example, at least a part of the heart of the patient.
  • The volume may also include tissue, bone, and air surrounding the heart of the patient.
  • The volume includes one or more other or different body parts or organs of the patient.
  • The imaging device used to generate the plurality of 2D datasets is a C-arm X-ray device.
  • The C-arm X-ray device used to generate the plurality of 2D datasets and/or the 3D data may be the same or a different imaging device than is to be used for the intended imaging.
  • Other imaging devices (e.g., a CT device) may be used.
  • The C-arm X-ray device generates the plurality of 2D datasets by generating a plurality of first projections into the volume over an angular range.
  • The C-arm X-ray device may generate any number of projections over the angular range.
  • The projections may be generated over one or more rotations in the same or alternating directions.
  • The angular range may be an angular range of a C-arm of the C-arm X-ray device.
  • The C-arm X-ray device generates the plurality of 2D datasets from two angles relative to the patient (e.g., view angles).
  • A monoplane imaging device may be positioned in two different view angles, and the patient may be imaged at the sequential view angles.
  • A biplane imaging device may be used, and the patient may be imaged at the two different view angles simultaneously.
  • The plurality of first 2D datasets may be stored in, for example, the memory 212, which is in communication with the processor 208, or another memory.
  • The processor 208, for example, generates the 3D data that represents the volume based on the plurality of 2D datasets.
  • The processor may use all or some of the 2D datasets to generate the 3D dataset.
  • The processor 208 may generate the 3D data when needed for the method of Figure 3, or the processor 208 or another processor may have previously generated the 3D data and stored the 3D data in the memory 212, for example (e.g., on a different day or the same day as the intended imaging).
  • The 3D data represents the entire patient or a continuous portion of the patient.
  • The 3D data represents a collection of reconstructed anatomical structures and/or devices (e.g., tubes modeling vessels or spheres representing cryoballoons).
  • The 3D data representing the volume is registered with an imaging device.
  • The 3D data is registered with the imaging device 102 or another imaging device.
  • The 3D data representing the volume may be registered with the imaging device to be used for the intended imaging.
  • The imaging device used to generate the 3D data may be the same or a different imaging device than the imaging device to be used for the intended imaging.
  • Augmented fluoroscopy may be used in that the 3D data may be fused with live fluoroscopy with augmented or supplemented X-ray views (e.g., with computer-generated information such as graphics, images, or videos).
  • An X-ray device (e.g., a calibrated X-ray device such as the imaging device 102) with a known projection geometry is used for augmented fluoroscopy. If the projection geometry of the imaging device 102, for example, is known, the projection geometry may be used to generate a fused display of fluoro overlay images with live X-ray views.
  • The known projection geometry may also allow objects to be reconstructed from two or more views. 3D objects computed from C-arm views are auto-registered. In other words, in the absence of patient motion, forward projections are automatically registered to X-ray views generated by the imaging device 102, for example.
  • Auto-registered 3D data sets may be used for setting up X-ray views.
  • Auto-registered 3D data is generated with C-arm computed tomography (e.g., syngo DynaCT) involving a plurality of views acquired while rotating the C-arm at least partially around the patient, as described above in act 300.
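With a known (calibrated) projection geometry, forward projection is a single matrix product: a 3x4 matrix maps a homogeneous 3D point onto the detector, which is why objects reconstructed in that geometry are auto-registered to the X-ray views. The intrinsic and pose values below are illustrative, not calibration data from any real system:

```python
import numpy as np

sid = 1200.0           # source-image distance in mm (illustrative)
cx, cy = 512.0, 512.0  # detector principal point in pixels (illustrative)
K = np.array([[sid, 0.0, cx],
              [0.0, sid, cy],
              [0.0, 0.0, 1.0]])                # intrinsic parameters

# Pose: identity rotation, object placed 750 mm in front of the source.
Rt = np.hstack([np.eye(3), np.array([[0.0], [0.0], [750.0]])])
P = K @ Rt                                     # 3x4 projection matrix

point_3d = np.array([10.0, -20.0, 0.0, 1.0])   # homogeneous world point
u, v, w = P @ point_3d
pixel = (u / w, v / w)                         # detector coordinates
```

Because `P` is fixed by the calibration, any reconstructed 3D structure can be overlaid on a live view at the correct pixel position without an additional registration step, provided the patient does not move.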
  • Auto-registered 3D data sets are generated with two reconstructed views using a monoplane imaging device or a biplane imaging device, as also described above in act 300.
  • The 3D data may be registered with X-ray views (e.g., 2D data) generated by the imaging device to be used for the intended imaging (e.g., the imaging device 102).
  • The pre-procedural 3D data is registered with the imaging device 102, for example, or with X-ray views generated by the imaging device 102, using 3D/2D or 2D/3D registration.
  • 3D/3D registration may be used.
  • An orientation of a 3D object within the pre-procedural 3D data may be matched with an orientation of a 3D reconstruction result generated by the imaging device 102, for example (e.g., syngo DynaCT).
  • Registration methods are applied to facilitate the alignment of the 3D data with X-ray views generated by the imaging device 102, for example, with known projection geometry.
  • The appearance of an object (e.g., the volume to be imaged) may be computed and/or simulated.
  • In act 304, data representing a type of an intended imaging of the volume is identified.
  • The processor 208 receives an input identifying the type of the intended imaging of the volume.
  • The input may, for example, be from the user via the input device 214 or another input device.
  • The processor 208 identifies the data representing the type of the intended imaging of the volume, which was previously stored in the memory 212, for example.
  • The input is a combination of a click of a first input device 214 (e.g., a mouse) at a particular position within a GUI displayed on the monitor 210 and/or one or more button presses of a second input device (e.g., a keyboard).
  • The user may click, with the mouse, within a field representing the type of imaging, within the GUI, and the user may use the keyboard to enter the type of imaging.
  • The user may select the type of imaging from a drop-down box within the GUI.
  • The type of imaging may be identified in any number of other ways using any number of other input devices.
  • The processor 208 may automatically determine the type of the intended imaging of the volume based on other inputs and/or other operating parameters of the imaging device 102, for example.
  • The input received from the user may be stored in the memory 212, for example.
  • the data representing the type of the intended imaging may indicate the type of volume to be imaged (e.g., the organ to be imaged) and/or the portion to be imaged.
  • the data representing the type of the intended imaging may speci- fy that the type of imaging is a coronary angiogram, and the organ to be imaged is the left coronary artery or the right coronary artery.
  • Other types of imaging that may be identified include pulmonary angiogram, cerebral angiogram, carotid angiogram, peripheral angiogram, aorta angiogram, and other types of imaging.
  • the data representing the type of the intended imaging may indicate the type of imaging device to be used.
  • the type of imaging device to be used may include a C-arm X-ray device, an MRI device, an ultrasound device, or any combination thereof.
  • one or more first imaging device settings are determined based on the registered 3D data representing the volume and the type of the intended imaging.
  • the processor 208 may automatically determine the one or more first imaging device settings based on the type of the intended imaging.
  • the memory 212 may store defined relationships between types of imaging and recommended imaging device settings, respectively.
  • the processor 208 may determine one or more imaging device settings from the stored relationships, based on the data representing the type of the imaging identified in act 304.
  • the identified type of imaging may be an input, and the processor 208 may determine the one or more imaging device settings as an output based on the defined relationships stored in the memory 212.
  • the processor 208 may determine optimum angle ranges (e.g., of the source and/or the detector) for the intended imaging to be 30-45° for left anterior oblique (LAO), 20-30° for cranial, 20-30° for caudal, and 30-45° for right anterior oblique (RAO) based on the relationships stored in the memory 212.
  • the processor may determine optimum angle ranges for the intended imaging to be 30-45° for LAO, 15-20° for cranial, and 30-45° for caudal. These angular ranges (e.g., views) are known to give good results for the left coronary artery and the right coronary artery, respectively.
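The stored relationship between imaging types and recommended settings can be sketched as a simple lookup table. The structure and key names below are illustrative assumptions; only the numeric angle ranges are taken from the text.

```python
# Hypothetical lookup table: imaging type -> recommended C-arm angle ranges.
# The numeric ranges mirror the left/right coronary artery examples above;
# keys and layout are illustrative, not the patented storage format.
RECOMMENDED_RANGES = {
    "left_coronary_artery": {
        "LAO": (30, 45), "cranial": (20, 30), "caudal": (20, 30), "RAO": (30, 45),
    },
    "right_coronary_artery": {
        "LAO": (30, 45), "cranial": (15, 20), "caudal": (30, 45),
    },
}

def recommended_ranges(imaging_type):
    """Return the stored angle-range relationships for an imaging type."""
    return RECOMMENDED_RANGES[imaging_type]

print(recommended_ranges("left_coronary_artery")["LAO"])  # (30, 45)
```

In the described system the processor would consult such stored relationships in memory 212 and emit the matching ranges as the first imaging device settings.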
  • commonly used angulations for a biplane C-arm X-ray system during an AF ablation procedure include:
  • the C-arm view angles αA and βA refer to the primary and secondary angles of an A-plane, respectively.
  • the view orientations αB and βB define the corresponding rotations of a B-plane, respectively.
  • Positive view angles for the primary angle α are associated with a left anterior oblique (LAO) view, while negative view angles describe projections taken from a right anterior oblique (RAO) view position.
  • Positive view angles for the secondary angle β denote cranial views, while negative view angles denote caudal views.
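The sign convention for the primary and secondary angles can be captured in a small helper function. The labels returned for zero angles ("AP", "neutral") are assumptions not stated in the text.

```python
def view_label(alpha, beta):
    """Map signed C-arm angles to view names per the stated convention:
    positive primary angle -> LAO, negative -> RAO;
    positive secondary angle -> cranial, negative -> caudal."""
    primary = "LAO" if alpha > 0 else "RAO" if alpha < 0 else "AP"
    secondary = "cranial" if beta > 0 else "caudal" if beta < 0 else "neutral"
    return primary, secondary
```

For example, a primary angle of +30° with a secondary angle of -20° corresponds to an LAO/caudal view.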
  • Figure 4 shows exemplary A and B-planes of an interventional C-arm monoplane X-ray system (e.g., imaging system 100) .
  • Figure 4 shows the primary and secondary rotation angles α and β.
  • the X-ray system is in an initial position, with both rotation angles equal to zero, for example.
  • data related to a planned intervention (e.g., using the ablation catheter) is identified.
  • the data related to the planned intervention may be stored in the memory 212 and/or received from the user, for example.
  • the identified data includes 3D intervention region data of an intervention region within the volume, and/or 3D planning data of a planned intervention path (e.g., of the ablation catheter) in the intervention region.
  • the 3D intervention region data is included in or registered with the 3D data.
  • the 3D planning data is included in or registered with the registered 3D intervention region data.
  • the processor 208, for example, generates a 2D projection view (e.g., identifies an optimal view angle) from the 3D data representing the volume based on the identified data related to the planned intervention.
  • the processor 208 may generate the 2D projection view so as to maximize an area of the 3D intervention region or the planned intervention path in the 2D projection view, to maximize an elongation of the 3D intervention region or the planned intervention path in the 2D projection view, to minimize a number of intersections of the planned intervention path in the 2D projection view, to minimize a number of opaque obstacles in an X-ray beam corresponding to the 2D projection view between an X-ray source operable to generate the X-ray beam and the 3D intervention region or the planned intervention path, or any combination thereof.
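As a rough illustration of the "maximize the area in the 2D projection view" criterion, a candidate viewing direction can be scored by orthographically projecting the 3D intervention points and measuring their 2D extent. This is a simplified parallel-projection sketch under assumed conventions, not the patented projection-geometry computation.

```python
import numpy as np

def projected_area(points, view_dir):
    """Score a candidate view: orthographically project 3D points onto the
    plane perpendicular to view_dir and return the area of their 2D
    bounding box (a crude proxy for 'maximize area in the 2D projection')."""
    pts = np.asarray(points, float)
    d = np.asarray(view_dir, float)
    d = d / np.linalg.norm(d)
    # Build an orthonormal basis (u, w) of the projection plane.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(d @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u = u / np.linalg.norm(u)
    w = np.cross(d, u)
    pts2d = np.stack([pts @ u, pts @ w], axis=1)
    extent = pts2d.max(axis=0) - pts2d.min(axis=0)
    return float(extent[0] * extent[1])
```

A face-on view of a planar intervention region scores higher than an edge-on view, matching the intuition behind the optimal-view criteria listed above.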
  • the processor 208 may optimize the 2D projection view with other parameters in mind.
  • the one or more first imaging device settings include an optimal viewing direction
  • the processor 208 determines the optimal view direction based on the registered 3D data representing the volume and the type of the intended imaging. For example, when catheter ablation for atrial fibrillation is identified in act 304, the processor may determine a plane that best fits the registered 3D data.
  • An ablation line is equidistantly sampled.
  • the plane is estimated to minimize the squared distances of the sample points xi to the plane.
  • eigenvectors of the 3D data (e.g., a point cloud) are computed, and the plane normal is given by the cross product of the two dominant eigenvectors: n = e1 × e2 (1)
  • the origin point x0 is calculated as the center of gravity of the sample points.
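A minimal sketch of the described plane fit, assuming the eigenvectors are taken from the covariance matrix of the centered sample points (the text does not specify the decomposition):

```python
import numpy as np

def fit_plane(samples):
    """Least-squares plane through sampled ablation-line points, following
    the description: the origin x0 is the center of gravity of the samples,
    and the normal n = e1 x e2 is the cross product of the two dominant
    eigenvectors of the centered point cloud."""
    pts = np.asarray(samples, float)
    x0 = pts.mean(axis=0)                      # center of gravity
    centered = pts - x0
    # Eigenvectors of the scatter matrix; eigh sorts eigenvalues ascending.
    vals, vecs = np.linalg.eigh(centered.T @ centered)
    e1, e2 = vecs[:, -1], vecs[:, -2]          # two dominant directions
    n = np.cross(e1, e2)
    return x0, n / np.linalg.norm(n)
```

For points sampled from a plane, the returned normal is (up to sign) the plane's unit normal, and x0 lies in the plane.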
  • the processor 208 may define the optimal view direction by a normal vector in 3-D space.
  • the normal vector n = (nx, ny, nz) may be transferred into corresponding primary and secondary angles of an unconstrained C-arm system via trigonometry functions:
  • the vector u0 is set to [1 0 0]^T, the position of the rotation axis.
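A sketch of the normal-to-angles conversion via trigonometric functions. The exact C-arm angle convention is system-specific and the original formula is not reproduced here, so the spherical convention below (and its inverse) is an assumption chosen only to be self-consistent:

```python
import math

def view_vector(alpha, beta):
    """Viewing direction v(alpha, beta) for angles in degrees, in a simple
    spherical convention (an assumption; real C-arm conventions vary)."""
    a, b = math.radians(alpha), math.radians(beta)
    return (math.sin(a) * math.cos(b), math.sin(b), math.cos(a) * math.cos(b))

def angles_from_normal(n):
    """Convert a unit plane normal into primary/secondary angles via
    trigonometric functions; the inverse of view_vector above."""
    nx, ny, nz = n
    alpha = math.degrees(math.atan2(nx, nz))
    beta = math.degrees(math.asin(ny))
    return alpha, beta
```

Round-tripping a viewing direction through both functions recovers the original angle pair, which is the self-consistency property an actual implementation would also need.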
  • Not all angulations may be reached by a conventional C-arm system. Due to mechanical constraints, primary and secondary angles may only vary within a certain range. The domain of possible angulations differs for the A-plane and the B-plane. Accordingly, a mechanically feasible angulation of the C-arm that is as close to the optimal view as possible is estimated. This similarity is expressed by the scalar product of the normal vector on the plane, n, and the viewing direction of the C-arm system, v(α, β). In other words, for optimized frontal/lateral view directions, the magnitude of the scalar product within the given mechanical C-arm constraints is to be maximized/minimized.
  • the problem may be formulated as a constrained maximization problem: argmax |n · vA(αA, βA)| subject to αA,min ≤ αA ≤ αA,max and βA,min ≤ βA ≤ βA,max.
  • the angles αA, βA define the viewing direction vA(αA, βA) of the A-plane.
  • αA,min, αA,max, βA,min, βA,max define the boundaries of the applicable range for each angle separately.
  • both viewing directions, for the A-plane and the B-plane, may be optimized together, where vB(αB, βB) defines the view direction of the B-plane. Similar to the constraints on the A-plane angles, there are mechanical limitations for the B-plane, expressed as αB,min, αB,max, βB,min, βB,max. These define the range of possible view directions. Additional constraints are to be taken into account between the two viewing directions of the A-plane and the B-plane. The angle γA,B describes the angle between the two viewing directions vA and vB.
  • the joint optimization may be formulated as follows: argmax (λ |n · vA(αA, βA)| − μ |n · vB(αB, βB)|), subject to the angle bounds of the A-plane and the B-plane and to the constraint on the angle between the two viewing directions.
  • λ and μ are constant weighting terms.
  • the vectors vA and vB are normalized to length 1. Since the domain of applicable angles is closely bounded, and the C-arm view vector v(α, β) may be precomputed, a grid search strategy may be used to solve this problem. To reduce the computational runtime, the grid search may be replaced with a sequential quadratic programming (SQP) approach to solve the constrained optimization problem numerically.
  • the problem stated in equations 9 and 10 may be transformed into standard form. Unlike the formulation written above, SQP requires the objective function to be minimized.
  • the negative objective function is minimized: argmin −(λ |n · vA(αA, βA)| − μ |n · vB(αB, βB)|), subject to the same constraints.
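The grid-search strategy mentioned above can be sketched as follows. The view-vector convention is an assumption, and in practice the grid search could be swapped for an SQP solver (e.g., SciPy's SLSQP method) as the text suggests.

```python
import numpy as np

def view_vec(alpha, beta):
    """Viewing direction for angles in degrees (a simple spherical
    convention, assumed here; the real mapping is system-specific)."""
    a, b = np.radians(alpha), np.radians(beta)
    return np.array([np.sin(a) * np.cos(b), np.sin(b), np.cos(a) * np.cos(b)])

def best_view_grid(n, a_bounds, b_bounds, step=1.0):
    """Grid search, as suggested in the text, for the mechanically feasible
    angulation maximizing |n . v(alpha, beta)| within the given bounds."""
    n = np.asarray(n, float) / np.linalg.norm(n)
    best, best_score = None, -1.0
    for a in np.arange(a_bounds[0], a_bounds[1] + step, step):
        for b in np.arange(b_bounds[0], b_bounds[1] + step, step):
            score = abs(n @ view_vec(a, b))
            if score > best_score:
                best, best_score = (a, b), score
    return best
```

For a normal pointing straight along the reference axis, the best feasible view within symmetric bounds is the neutral angulation, as expected.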
  • imaging device settings may be determined based on the type of the intended imaging from act 304.
  • the imaging device settings determined in act 306 may include a position of the volume, a position of a patient table on which the volume is supported, a position of an X-ray source of the imaging device, a position of a detector of the imaging device, an angulation and orientation of an X-ray beam generated by the imaging device, a zoom setting, a collimator setting, a shutter setting, or any combination thereof.
  • the one or more first imaging device settings may include any number of other settings for the imaging device 102, for example.
  • the one or more first imaging device settings determined in act 306 may include an MRI scan protocol.
  • the one or more first imaging device settings determined in act 306 may include ultrasound scanner settings.
  • Imaging device settings not automatically determined by the processor 208 may be otherwise determined and/or identified by the processor 208.
  • the processor 208 receives one or more inputs identifying additional first imaging device settings.
  • the input may, for example, be from the user via the input device 214 or another input device.
  • the input is a combination of a click of a first input device 214 (e.g., a mouse) at a particular position within a GUI displayed on the monitor 210 and/or one or more button presses of a second input device 214 (e.g., a keyboard) .
  • the user may click, with the mouse, within a field representing the imaging device setting to be input, within the GUI, and the user may use the keyboard to enter a value for the imaging device setting.
  • the imaging device settings may be entered by the user and/or determined by the processor 208 in any number of other ways using any number of other input devices.
  • the processor 208 may determine the additional first imaging device settings by identifying the one or more first imaging device settings on the memory 212.
  • the one or more first imaging device settings may have been previously stored on the memory 212 by the user, or the one or more first imaging device settings previously stored on the memory 212 may represent settings from a previously executed imaging.
  • the imaging device settings otherwise determined may include, for example, a position of the volume, a position of a patient table on which the volume is supported, a position of an X-ray source of the imaging device, a position of a detector of the imaging device, an angulation and orientation of an X-ray beam generated by the imaging device, a zoom setting (e.g., zoom factor or zoom factors), a collimator setting, a shutter setting, source-image distance (SID), source-object distance (SOD), a filter position (e.g., wedge filter position or finger filter position), or any combination thereof.
  • the one or more first imaging device settings may include any number of other settings for the imaging device 102, for example. If pre-specified (e.g., as part of the X-ray organ program), pre-filtering may be employed as well.
  • the user selects a preconfigured organ program associated with fluoroscopy and radiography, respectively, and the processor 208, for example, optimizes automatic exposure control of the imaging device based on the selected preconfigured organ program.
  • the processor 208 may also determine an effect of the intended imaging on the volume based on the determined one or more first imaging device settings. For example, the processor 208 may calculate an expected dose (e.g., a first expected dose) for the intended imaging of the volume based on the registered 3D data, the one or more first imaging device settings, and any otherwise determined imaging device settings (e.g., imaging device settings received from the user). In one embodiment, the processor 208 calculates the expected dose using a Monte Carlo simulation algorithm. The expected dose may be calculated in other ways.
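As a toy illustration of Monte Carlo dose estimation (far simpler than a full patient-dose simulation), the transmitted fraction of photons through a homogeneous slab can be estimated by sampling exponential free path lengths. All parameters are illustrative assumptions.

```python
import random

def expected_transmission(mu, thickness, n_photons=100_000, seed=42):
    """Toy Monte Carlo sketch (not the patented simulation): estimate the
    fraction of photons traversing a homogeneous slab with attenuation
    coefficient mu [1/cm] and the given thickness [cm]. Each photon's free
    path length is drawn from an exponential distribution; the absorbed
    fraction (1 - transmission) would relate to deposited dose."""
    rng = random.Random(seed)
    transmitted = sum(1 for _ in range(n_photons)
                      if rng.expovariate(mu) > thickness)
    return transmitted / n_photons
```

For mu = 0.2/cm and a 5 cm slab, the estimate converges to the analytic value exp(-1) ≈ 0.368.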
  • the processor 208 may calculate (e.g., optimize) other image fidelity parameters such as, for example, image quality (e.g., signal-to-noise ratio; maximized) or scan time (e.g., minimized).
  • other simulation algorithms including, for example, a virtual MRI method may be used.
  • a virtual view of the volume is generated from the registered 3D data representing the volume based on the one or more determined first imaging device settings.
  • the processor 208 may generate the virtual view from the registered 3D data based on the one or more first imaging device settings determined in act 306 and the otherwise determined imaging device settings (e.g., not automatically determined based on the identified type of imaging but input by the user) using forward rendering.
  • the processor 208 generates the virtual view of the volume from the registered 3D data using identified data (e.g., received from the user or already stored in a memory) for at least view positions of the imaging device 102 (e.g., view positions of one or more sources and one or more detectors of a C-arm X-ray system; determined in act 306), table position, zoom factor, and collimator position.
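Forward rendering of a virtual view can be illustrated with a parallel-beam simplification: summing attenuation along rays and applying the Beer-Lambert law. This is an assumed toy model; the real device would use the registered projection geometry, table position, zoom factor, and collimator position rather than a fixed axis.

```python
import numpy as np

def virtual_view(volume, axis=0):
    """Crude forward rendering of a registered 3D attenuation volume into
    a virtual projection image: line integrals along parallel rays, turned
    into a transmission image via the Beer-Lambert law."""
    vol = np.asarray(volume, float)
    return np.exp(-vol.sum(axis=axis))  # transmission per detector pixel
```

Projecting a (2, 3, 4) volume along the first axis yields a (3, 4) image, mimicking how the virtual view collapses the volume onto the detector plane.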
  • the generated virtual view of the volume is displayed by a display in communication with the processor.
  • the generated virtual view of the volume is displayed to the user via the monitor 210.
  • the generated virtual view may be transmitted to and displayed at other displays.
  • the generated virtual view may be transmitted to another system (e.g., a computer) that is remotely located from the imaging system 100, for example, and displayed to another user (e.g., another doctor) at the other system.
  • the other user may view the generated virtual view at a display of the remotely located system.
  • the user views the generated virtual view at the monitor 210, for example, and determines whether the imaging device settings are sufficient.
  • the user may determine whether the imaging device 102, if positioned according to the imaging device settings determined in act 306, would obtain adequate views of the anatomy involved in the type of imaging identified in act 304.
  • the user may determine whether the entire volume or a particular portion of the volume would be adequately imaged using the imaging device settings determined in act 306.
  • the user may execute the imaging identified in act 304 using the imaging device settings determined in act 306. If the user determines that the imaging device settings are insufficient (e.g., inadequate views of the anatomy involved in the identified type of imaging are produced) , one or more of the imaging device settings may be adjusted.
  • one or more second imaging device settings are determined.
  • the determination of the one or more second imaging device settings includes adjusting an imaging device setting of the one or more determined first imaging device settings and/or one or more of the otherwise determined imaging device settings (e.g., received from the user).
  • the processor 208 receives an input identifying at least one of the first imaging device settings to be changed and the amount the first imaging device setting is to be changed.
  • the input may, for example, be from the user via the input device 214 or another input device.
  • the input is a combination of a click of a first input device 214 (e.g., a mouse) at a particular position within a GUI displayed on the monitor 210 and/or one or more button presses of a second input device 214 (e.g., a keyboard).
  • the user may click, with the mouse, within a field representing the imaging device setting to be adjusted, within the GUI, and the user may use the keyboard to enter a new value for the imaging device setting.
  • the imaging device setting may be adjusted in any number of other ways using any number of other input devices.
  • the processor 208 automatically determines the one or more second imaging device settings.
  • the processor 208 may automatically adjust any number of the imaging device settings determined in act 306.
  • the processor 208 may also determine an effect of the intended imaging on the volume (e.g., the patient) based on the determined one or more second imaging device settings and the other first imaging device settings.
  • the processor 208 may calculate an expected dose (e.g., a second expected dose) for the intended imaging of the volume based on the registered 3D data, the one or more second imaging device settings, and the other first imaging device settings.
  • the processor 208 calculates the expected dose using a Monte Carlo simulation algo- rithm.
  • the expected dose may be calculated in other ways.
  • an adjusted virtual view of the volume is generated from the registered 3D data representing the volume based on the adjusted imaging device setting.
  • the adjusted virtual view is displayed on the monitor 210, for example, to the user.
  • the user may determine that the imaging device settings (e.g., the one or more second imaging device settings and the other first imaging device settings (first imaging device settings not adjusted in act 312)) are appropriate, finalized, and/or set for the type of the intended imaging of the volume based on the adjusted virtual view (e.g., after the adjusted virtual view has been displayed to the user) .
  • the processor may receive an input from the user indicating that the one or more second imaging device settings and the other first imaging device settings are to be set for the intended imaging of the volume.
  • the input may, for example, be from the user via the input device 214 or another input device.
  • the input is a click of the input device 214 (e.g., a mouse) at a particular position within a GUI displayed on the monitor 210 (e.g., a "Start" button within the GUI).
  • the processor 208 may direct the imaging device 102 to execute the intended imaging based on the input received from the user.
  • the imaging device 102 images the patient using the one or more second imaging device settings and the other first imaging device settings.
  • the patient is not imaged using the imaging device 102 until the input is received from the user.
  • acts 312 and 314 may be repeated.
  • act 312 the same imaging device setting and/or a different imaging device setting may be adjusted.
  • the determination of whether the imaging device settings may be set or are to be further adjusted may be based on a comparison of a previously calculated dose (e.g., the first dose, calculated with the one or more first imaging device settings) with the currently calculated dose (e.g., the second dose, calculated with the one or more second imaging device settings and the other imaging device settings).
  • the processor 208, for example, may calculate a difference between the second dose and the first dose, and may display the calculated difference to the user via the monitor 210.
  • the processor 208 may calculate (e.g., optimize) other image fidelity parameters such as, for example, image quality (e.g., signal-to-noise ratio; maximized) or scan time (e.g., minimized).
  • other simulation algorithms including, for example, a virtual MRI method may be used.
  • the user determines whether the one or more second imaging device settings and/or the other first imaging device settings are to be finalized and/or set for the intended imaging based on the calculated difference.
  • the processor 208 automatically determines whether the imaging device settings are to be set for the intended imaging based on the calculated difference.
  • the processor 208 may automatically adjust any number of imaging device settings of the one or more first imaging device settings and determine whether to use the settings based on the calculated difference .
  • the one or more first imaging device settings determined in act 306 include view positions (e.g., angular positions of the C-arm) of the imaging device 102, and the one or more second imaging device settings determined in act 312 (e.g., adjusted first imaging device setting) also include the view positions of the imaging device 102.
  • the method 300 is used to provide feedback to a user regarding the positioning of the imaging device 102 to ultimately provide appropriate views for an identified type of imaging. After final view positions are determined, the X-ray imaging technique may be optimized.
  • the X-ray imaging technique may be optimized using, for example, a Monte Carlo simulation that allows simultaneous estimates of measures of image quality and patient dose based on the registered 3D data or using the registered 3D data as a reference for finding a matching 3D Monte Carlo simulation data set.
  • a 3D Monte Carlo simulation data set may use the registered 3D data as an input into a database including appropriate data (e.g., high-resolution anthropomorphic phantom CT data or a parameterized Monte Carlo simulation model).
  • the Monte Carlo simulation may take all X-ray imaging parameters into account.
  • the Monte Carlo simulation takes tube current, pulse width, tube potential, focal spot size, copper pre-filtering, projection geometry, positions of collimators, wedge filters, finger filters, etc., automatic dose rate control (ADRC) of the system, or any combination thereof into account to simulate how an X-ray image of the 3D data will look under different settings (e.g., dose settings or image quality settings).
  • the Monte Carlo simulation may work together with the ADRC to improve the optimization even further.
  • the Monte Carlo simulation may be used to suggest actions to the user to provide potential dose savings and image quality improvements.
  • the Monte Carlo simulation may act as a smart agent for finding the right imaging technique factors .
  • 3D data, registered to an X-ray imaging device, may be used to set up X-ray views without any need for further X-ray exposure. Resulting fluoro views may be simulated. Additionally, resulting patient dose may be simulated. Augmented fluoroscopy may be used for setting up X-ray views. For example, a fluoro overlay image is computed to demonstrate how anatomy within a treatment volume would look for a selected imaging device orientation and chosen zoom factor, for example. The 3D data may also be used to optimize an X-ray imaging technique after the X-ray views are set up. Monte Carlo simulations, for example, may be used to optimize the X-ray imaging technique.
  • the X-ray imaging technique may be at least partially defined by, for example, dose level, tube voltage, and pre-filtering.
  • selection of the dose level may be simplified, and dose may be saved while keeping the image quality at an acceptable level according to ALARA principles.
  • imaging fidelity parameters may be optimized (e.g., signal-to-noise ratio, scan time) as part of a simulation as part of calculating view angles or once view angles or other imaging device settings (e.g., proper MRI scan protocol, settings for an ultrasound scanner) have been determined.
  • a method for X-ray imaging includes the steps: receiving 3D data of an object or patient; registering the 3D data with the X-ray scanner; receiving information characterizing the type of an intended X-ray scan of the object or patient; and determining dependent on the registered 3D data and the type of the intended X-ray scan one or more of the following X-ray scanner settings:
  • object or patient table position, X-ray source position and detector position, and/or X-ray beam angulation and orientation, and/or at least one of zoom setting, collimator setting, shutter setting.
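The claimed determine-then-adjust flow can be sketched in a few lines; all names, values, and the dictionary layout are illustrative assumptions, not the claimed data model.

```python
def plan_scan(scan_type, relationships, user_overrides=None):
    """Minimal sketch of the claimed flow: settings are first determined
    from stored type->settings relationships, then selectively adjusted
    by user input before the scan is executed."""
    settings = dict(relationships[scan_type])   # first determined settings
    if user_overrides:
        settings.update(user_overrides)         # adjusted (second) settings
    return settings

# Hypothetical stored relationship and a user adjustment of the zoom setting.
rel = {"coronary_angiogram": {"LAO": 40, "cranial": 25, "zoom": 1.0}}
final = plan_scan("coronary_angiogram", rel, {"zoom": 1.5})
```

The untouched settings carry over from the first determination, while the adjusted setting replaces its initial value, mirroring the first/second settings distinction in the claims.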
  • the method includes the additional step of determining dependent on the registered 3D data and the type of the intended X-ray scan one or more of the following X-ray scanner settings: X-ray dose, tube voltage, and/or pre-filter settings.
  • the method includes the additional steps of: generating from the registered 3D data and using the determined X-ray scanner settings a virtual X-ray view of the object or patient; displaying the virtual X-ray view to a user; receiving from a user a command, e.g., via voice control, pedal, touchscreen, joystick, or keyboard; and depending on the received command generating either a single X-ray image or a series of X-ray images of the object or patient using the determined X-ray scanner settings.
  • the method includes the additional steps of: receiving from a user a command for adjusting one or more X-ray scanner settings; generating from the registered 3D data and using the adjusted X-ray scanner settings a virtual X-ray view; displaying the virtual X-ray view to a user; receiving from a user a command, e.g., via voice control, pedal, touchscreen, joystick, or keyboard; and depending on the received command generating a fluoroscopy scan of the object or patient using the determined X-ray scanner settings, or adjusting one or more X-ray scanner settings.
  • the method includes the additional step of calculating, e.g., by use of a Monte Carlo simulation algorithm, from the registered 3D data and using the determined X-ray scanner settings a first expected scan dose for the intended scan of the object or patient.
  • the method includes the additional step of displaying the first expected scan dose to a user.
  • the method includes the additional steps of: automatically determining second, different X-ray scanner settings; calculating from the registered 3D data and using the different X-ray scanner settings a second expected scan dose for the intended scan of the object or patient; comparing the second expected scan dose with the first expected scan dose; and displaying the result of the comparison to a user.
  • the method includes the additional steps of: receiving information related to a planned intervention, in particular but not limited to 3D intervention region data of the intervention region, e.g., myocardium, within the object or patient, the 3D intervention region data being included in or registered with the registered 3D data, and 3D planning data of a planned intervention path, e.g., ablation line, in the intervention region, the 3D planning data being included in or registered with the registered 3D region data; determining dependent on the 3D intervention region data and the 3D planning data a 2D projection view; and determining dependent on the 2D projection view X-ray scanner settings, in particular but not limited to X-ray beam angulation, that allow for generating an X-ray view of the 2D projection plane of the intervention region within the object or patient.
  • the 2D projection view is determined so as to optimize one or more of the following view characteristics: maximize area of 3D intervention region or planned intervention path in the 2D projection; maximize elongation of 3D intervention region or planned intervention path in the 2D projection; minimize number of intersections of planned intervention path in the 2D projection; and minimize number of opaque obstacles in the X-ray beam that is corresponding to the 2D projection between X-ray beam source and the 3D intervention region or planned intervention path.
  • the method includes the additional steps of: receiving directional information characterizing a preferred direction of the 2D projection view, e.g., in relation to the planned intervention path; and determining a 2D projection view dependent on the directional information.


Abstract

The invention relates to the optimization of an imaging setup, in which three-dimensional (3D) data representing a volume to be imaged are registered with an imaging device and imaging device settings are determined based on an identified type of imaging. A virtual view of the volume is generated from the registered 3D data based on one or more of the imaging device settings and is displayed. At least one of the imaging device settings is adjusted based on the displayed virtual view. An adjusted virtual view of the volume is generated from the registered 3D data based on the at least one adjusted imaging device setting.
PCT/EP2015/052871 2014-02-13 2015-02-11 Optimisation d'imagerie médicale WO2015121301A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14155060.8 2014-02-13
EP14155060 2014-02-13


Family

ID=50159028


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170055928A1 (en) * 2015-08-31 2017-03-02 General Electric Company Systems and Methods of Image Acquisition for Surgical Instrument Reconstruction
EP3162288A1 (fr) * 2015-10-30 2017-05-03 Siemens Healthcare GmbH Gestion de protocole d'entreprise
EP3360482A1 (fr) * 2017-02-09 2018-08-15 Koninklijke Philips N.V. Tomographie informatique d'isocentrage sur arceau
US10667869B2 (en) 2017-05-17 2020-06-02 General Electric Company Guidance system for needle procedures

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020045817A1 (en) * 2000-10-17 2002-04-18 Masahide Ichihashi Radiographic image diagnosis apparatus
WO2008001260A2 (fr) * 2006-06-28 2008-01-03 Koninklijke Philips Electronics N. V. Détermination de la trajectoire de rotation optimale pour angiographie par rayons x sur la base d'une carte d'observation vision optimale prédéterminée
WO2013102880A1 (fr) * 2012-01-06 2013-07-11 Koninklijke Philips Electronics N.V. Affichage en temps réel de vues du système vasculaire pour navigation de dispositif optimale

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020045817A1 (en) * 2000-10-17 2002-04-18 Masahide Ichihashi Radiographic image diagnosis apparatus
WO2008001260A2 (fr) * 2006-06-28 2008-01-03 Koninklijke Philips Electronics N. V. Détermination de la trajectoire de rotation optimale pour angiographie par rayons x sur la base d'une carte d'observation vision optimale prédéterminée
WO2013102880A1 (fr) * 2012-01-06 2013-07-11 Koninklijke Philips Electronics N.V. Affichage en temps réel de vues du système vasculaire pour navigation de dispositif optimale

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170055928A1 (en) * 2015-08-31 2017-03-02 General Electric Company Systems and Methods of Image Acquisition for Surgical Instrument Reconstruction
EP3135203A3 (fr) * 2015-08-31 2017-03-08 General Electric Company Systems and methods of image acquisition for surgical instrument reconstruction
CN106725851A (zh) * 2015-08-31 2017-05-31 Systems and methods of image acquisition for surgical instrument reconstruction
US9895127B2 (en) 2015-08-31 2018-02-20 General Electric Company Systems and methods of image acquisition for surgical instrument reconstruction
EP3162288A1 (fr) * 2015-10-30 2017-05-03 Siemens Healthcare GmbH Enterprise protocol management
CN107016148A (zh) * 2015-10-30 2017-08-04 Enterprise protocol management
US10324594B2 (en) 2015-10-30 2019-06-18 Siemens Healthcare Gmbh Enterprise protocol management
EP3360482A1 (fr) * 2017-02-09 2018-08-15 Koninklijke Philips N.V. Iso-centering in C-arm computer tomography
WO2018145930A1 (fr) * 2017-02-09 2018-08-16 Koninklijke Philips N.V. Iso-centering in C-arm computer tomography
US11123025B2 (en) 2017-02-09 2021-09-21 Koninklijke Philips N.V. Iso-centering in C-arm computer tomography
US10667869B2 (en) 2017-05-17 2020-06-02 General Electric Company Guidance system for needle procedures

Similar Documents

Publication Publication Date Title
EP2188782B1 (fr) Coupling the viewing direction of a curved planar reformation view of a blood vessel with the viewing angle on its 3D tubular-structure-rendered voxel volume and/or with the C-arm geometry of a C-arm system of a 3D rotational angiography device
EP2800516B1 (fr) Real-time display of vasculature views for optimal device navigation
EP3161785B1 (fr) System and method for image composition
US9427286B2 (en) Method of image registration in a multi-source/single detector radiographic imaging system, and image acquisition apparatus
JP5739812B2 (ja) Operating method of an angiographic image acquisition device, collimator control unit, angiographic image acquisition device, and computer software
US10426414B2 (en) System for tracking an ultrasonic probe in a body part
RU2711140C2 (ru) Medical image editing
JP2016513540A (ja) System for facilitating intraoperative positioning and guidance
JP2016097261A (ja) Image processing apparatus, image processing program, image processing method, and treatment system
JP2012505009A5 (fr)
CN102727236A (zh) Method and apparatus for generating medical images of body organs by using a 3D model
Zachiu et al. Non-rigid CT/CBCT to CBCT registration for online external beam radiotherapy guidance
US20220277477A1 (en) Image-based guidance for navigating tubular networks
WO2015121301A1 (fr) Medical imaging optimization
US20190005685A1 (en) Systems and Methods for Generating 2D Projection from Previously Generated 3D Dataset
Choi et al. X-ray and magnetic resonance imaging fusion for cardiac resynchronization therapy
US11950947B2 (en) Generation of composite images based on live images
WO2023209014A1 (fr) Registration of projection images to volumetric images
US11123025B2 (en) Iso-centering in C-arm computer tomography
Wagner et al. Continuous-sweep limited angle fluoroscopy guidance for percutaneous needle procedures
EP4128145B1 (fr) Combining angiographic information with fluoroscopic images
EP4312188A1 (fr) Combined optical and non-optical 3D reconstruction
US20130108127A1 (en) Management of patient model data
Wang et al. Target visibility enhancement for C-arm cone beam CT-fluoroscopy-guided hepatic needle placement: implementation and accuracy evaluation
Aksoy et al. 3D–2D registration of vascular structures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 15705260
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 15705260
Country of ref document: EP
Kind code of ref document: A1