WO2014141163A2 - Determining a residual mode image from a dual energy image - Google Patents

Determining a residual mode image from a dual energy image

Info

Publication number
WO2014141163A2
WO2014141163A2 PCT/IB2014/059770
Authority
WO
WIPO (PCT)
Prior art keywords
image data
residual mode
intensity
mode image
pixel
Prior art date
Application number
PCT/IB2014/059770
Other languages
English (en)
French (fr)
Other versions
WO2014141163A3 (en)
Inventor
Rafael Wiemker
Thomas Buelow
André GOOSSEN
Klaus Erhard
Martin Bergtholdt
Harald Sepp Heese
Original Assignee
Koninklijke Philips N.V.
Philips Deutschland Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V., Philips Deutschland Gmbh filed Critical Koninklijke Philips N.V.
Priority to EP14719356.9A priority Critical patent/EP2973391B1/en
Priority to CN201480015630.4A priority patent/CN105122300B/zh
Priority to JP2015562518A priority patent/JP6251297B2/ja
Priority to US14/774,205 priority patent/US9706968B2/en
Publication of WO2014141163A2 publication Critical patent/WO2014141163A2/en
Publication of WO2014141163A3 publication Critical patent/WO2014141163A3/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/48 Diagnostic techniques
    • A61B6/482 Diagnostic techniques involving multiple energy imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/502 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of breast, i.e. mammography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/467 Arrangements for interfacing with the operator or the patient characterised by special input means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2211/00 Image generation
    • G06T2211/40 Computed tomography
    • G06T2211/408 Dual energy

Definitions

  • the invention relates to a method, a computer program, a computer-readable medium and a controller for processing a digital image comprising pixels with intensities recorded at different energy levels. Additionally, the invention relates to an X-ray imaging system.
  • transmission images of a region of interest of a patient may be recorded by detecting X-rays that fall onto a sensor or detector after having passed through the region of interest. These images may then be displayed, and a physician or similarly skilled person may decide whether there are malignant or benign changes in the region of interest.
  • An X-ray imaging system may comprise an energy discriminating detector, i.e. a detector that is adapted for differentiating between X-rays of different energies and for recording images at different energy levels.
  • a dual energy detector may produce two energy images of exactly the same geometry and anatomical area.
  • US 8 165 379 B2 shows a mammography system that is adapted to record low energy and high energy images.
  • Two complementary energy images may show information which may help to discriminate whether visible masses are likely malignant or benign. However, it is not yet clear how to compute malignancy features from the dual energy images.
  • An aspect of the invention relates to a method for processing a digital image, which comprises pixels with intensities relating to different energy levels. That is, each pixel of the image may be associated with at least two intensities, which have been recorded at different energy levels.
  • the digital image may be a digital X-ray image and the energies may relate to different X-ray energies during the recording of the digital X-ray image.
  • the method comprises the steps of: receiving first image data and second image data of the digital image, the first image data encoding a first energy level and the second image data encoding a second energy level; determining a regression model from the first image data and the second image data, the regression model establishing a correlation between intensities of pixels of the first image data and intensities of pixels of the second image data; and calculating residual mode image data from the first image data and the second image data, such that a pixel of the residual mode image data has an intensity based on the difference between an intensity of the second image data at the pixel and a correlated intensity of the pixel of the first image data, the correlated intensity being determined by applying the regression model to the intensity of the pixel of the first image data (see the first code sketch following this list).
  • the basic idea of the invention may be seen in that two energy images, i.e. a first energy image and a second energy image (represented by the first (energy) image data and the second (energy) image data), are recorded and evaluated such that new information is extracted from the two energy images that may not be directly visible by simply comparing them. This new information may be visualized with the residual mode image data.
  • a correlation between the two energy images may be analyzed, and a regression model may be established based on the correlation.
  • the regression model may be used for predicting an intensity of each pixel in the second energy image from the intensity of the same pixel in the first energy image.
  • new images may be computed with the aid of the regression model.
  • the residual mode image (represented by residual mode image data) may display information in the second image that could not be predicted from the first image, i.e. new information.
  • a dominant mode image (represented by dominant mode image data) may display highly correlated components in the two energy images.
  • a computation of malignancy features from dual energy images may be improved.
  • with the residual mode image it may be established which parts of a second energy image significantly add information to a first energy image.
  • a computer-readable medium may be a floppy disk, a hard disk, a USB (Universal Serial Bus) storage device, a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable Read Only Memory) or a FLASH memory.
  • a controller may comprise a processor for executing the computer program, which may be stored in a memory of the controller.
  • an X-ray imaging system which comprises a detector arrangement for recording first image data with X-rays of a first energy level and for recording second image data with X-rays of a second energy level, a controller, which is adapted for generating residual mode image data from the first image data and the second image data, and a display device for displaying the residual mode image data.
  • the X-ray imaging system may be a mammography station.
  • the image data is recorded at a first place and processed at a second place, for example at a workstation remotely connected to an X-ray device.
  • Fig. 1 schematically shows an X-ray imaging system according to an embodiment of the invention.
  • Fig. 2 shows a flow diagram for a method for processing image data according to an embodiment of the invention.
  • Fig. 3 schematically shows a display device for an X-ray imaging system according to an embodiment of the invention.
  • Fig. 4 shows an example for a correlation of energy intensities of two images.
  • Fig. 5 shows an example of a dominant mode image and a residual mode image generated with a method for processing image data according to an embodiment of the invention.
  • Fig. 6 shows an example of a dominant mode image and a residual mode image generated with a method for processing image data according to an embodiment of the invention.
  • Fig. 1 shows an X-ray imaging system 10 that comprises a detector arrangement 12, a controller 14 and a display device 16.
  • the detector arrangement 12, which may be controlled by the controller 14, is adapted for recording digital images at two different energy levels of an object of interest 18, such as a breast.
  • the detector arrangement 12 may comprise an energy discriminating detector 20 or an X-ray source 22 that is adapted for generating X-ray radiation at different energy levels at different times.
  • Fig. 2 shows a flow diagram for a method for processing a digital image 40 that may be recorded by the detector arrangement 12.
  • the method may be executed by the controller 14, which, for example, may comprise a processor running a corresponding computer program.
  • In step 30, the digital image 40 is recorded and received in the controller 14.
  • the controller 14 may control the detector arrangement 12 in such a way that digital image data 42a, 42b is recorded at different energy levels, for example with an energy discriminating detector 20.
  • the image 40 and its parts 42a, 42b may be recorded simultaneously or during a short time period. In such a way, the image 40 may show the same geometry of the region of interest 18 and/or the same anatomical area 18.
  • the method comprises the step of: recording the first image data 42a and the second image data 42b with an X-ray detector arrangement 12 adapted for acquiring X-rays at different X-ray energy levels.
  • Each of the first and second image data 42a, 42b (or energy image data 42a, 42b) comprises pixels associated with intensities relating to the intensity of the respective energy level at the respective pixel.
  • the first and second image data 42a, 42b may have the same size in pixels (as representing the same field of view on the region of interest). It has to be noted that the digital image 40 may comprise more than two sets of image data 42a, 42b, associated with more than two energy levels.
  • the method comprises the step of: receiving first image data 42a and second image data 42b of the digital image 40, the first image data 42a encoding a first energy level and the second image data 42b encoding a second energy level.
  • In step 32, a regression model is determined from the first and second image data 42a, 42b.
  • Fig. 4 shows an example of a pixel-wise scatter plot of the intensities of two sets of energy image data 42a, 42b, which show a high but non-linear correlation. This correlation between the two sets of energy image data is analyzed, to generate a regression model 44.
  • the regression model 44 is a mapping that maps an intensity of a pixel of the first image data 42a to an intensity of the second image data 42b.
  • an intensity of each pixel in the second image data 42b may be predicted from the corresponding intensity in the first image data 42a.
  • the method comprises the step of: determining a regression model 44 from the first image data 42a and the second image data 42b, the regression model 44 establishing a correlation between intensities of pixels of the first image data 42a with intensities of pixels of the second image data 42b.
  • the regression model 44 may be derived either for the whole image area (i.e. all pixels of the image data 42a, 42b) or only for a selected region 50, which may be selected by a user, as will be explained below.
  • the regression model 44 is determined from pixels of only a selected region 50.
  • the correlation between the two sets of image data 42a, 42b may be non-linear. Therefore, it is possible that a non-linear regression model 44 is established, for example a piece-wise linear model or a Support Vector Regression model.
  • the regression model 44 is a non-linear model.
  • the regression model may also be linear and may be based, for example, on linear decorrelation techniques, such as Principal Component Analysis (see the second code sketch following this list).
  • residual mode image data 46 and/or dominant mode image data 48 are determined from the first and second image data 42a, 42b with the aid of the regression model 44.
  • the first and second image data 42a, 42b may be compared and analyzed.
  • intensities of pixels of the one set of image data 42a may be mapped with the regression model 44 to intensities that are comparable with intensities of corresponding pixels of the other set of image data 42b.
  • a corresponding pixel may be a pixel at the same position, which may have the same coordinates.
  • a residual mode image (represented by residual mode image data 46) may be computed containing the residuals of the correlation, i.e. the new information added by the second image data 42b, which could not have been predicted from the first image data 42a with the aid of the regression model 44.
  • the method comprises the step of: calculating residual mode image data 46 from the first image data 42a and the second image data 42b, such that a pixel of the residual mode image data 46 has an intensity based on the difference between an intensity of the second image data 42b at the pixel and a correlated intensity of the pixel of the first image data 42a, the correlated intensity being determined by applying the regression model to the intensity of the pixel of the first image data 42a.
  • the residual mode image data 46 may be used to visualize which parts of the second image data 42b are truly adding information to the first image data 42a, i.e. are non-redundant and cannot simply be predicted from the first image data.
  • the parts of the first and second image data 42a, 42b that are predictable from each other may be computed.
  • a dominant mode image (represented by dominant mode image data 48) may be computed, containing the dominant mode, i.e. the highly correlated component between the two energy images.
  • the method further comprises the step of: calculating dominant mode image data 48, wherein an intensity of a pixel of the dominant mode image data 48 is based on a correlated component of the pixel of the first image data 42a and second image data 42b with respect to the regression model 44.
  • Figs. 5 and 6 show examples of dominant mode images 48 and residual mode images 46 that are depicted side by side and that may be displayed in this way on a display device 16, as will be explained below.
  • the residual mode image data 46 may further be analyzed to find an intensity threshold above which the intensity difference can be considered meaningful, for example to eliminate noise.
  • the residual mode image data 46 may contain mainly noise.
  • An intensity threshold may be set, above which the deviation from the dominant mode image data 48 can be considered meaningful.
  • the threshold intensity established in this way for the residual mode image data 46 may be used to display only pixels which are above a specific noise level.
  • the method further comprises the step of: applying a threshold intensity to the residual mode image data 46, such that pixels of the residual mode image data 46 with an intensity below the threshold intensity are discarded.
  • the noise level of the residual mode image data 46 may be analyzed by means of a curve of Euler characteristics, i.e. for each possible threshold the numbers of holes and blobs are counted to build an Euler histogram (see the third code sketch following this list).
  • Techniques are known for establishing the Euler characteristic simultaneously for possible thresholds in a single pass over the residual mode image data 46.
  • a threshold intensity may be derived from the curve position at which the Euler characteristic drops below a predefined number, indicating larger spatial structures in the residual mode image data 46.
  • the threshold intensity is determined such that an Euler characteristic of the residual mode image data 46 drops below a predefined number at the threshold intensity.
  • In step 38, the residual mode image data 46 and/or the dominant mode image data 48 are displayed on the display device 16.
  • Fig. 3 shows an example of a screen content of the display device 16, which may be a CRT or LCD or similar device.
  • Each set of image data 42a, 42b, 46, 48 may be displayed on the display device.
  • the residual mode image 46 may be presented separately from the dominant mode image 48 as an additional image.
  • the residual mode image 46 may be presented as an overlay or alpha-blended to the dominant mode image 48 (or the other energy images 42a, 42b), either for the whole image area or only for a selected region 50.
  • the residual mode image 46 is presented as an overlay on the dominant mode image 48, for example as a color overlay, wherein the opacity is determined by the intensities of the residual mode image 46 (see the fourth code sketch following this list).
  • the degree of color overlay may be manually controlled by a user, for example with a slider 54 or with a mouse wheel.
  • the method further comprises the step of: displaying the residual mode image data 46 together with further image data 42a, 42b, 48 on a display device 16 by overlaying the further image data with the residual mode image data 46.
  • the further image data may be one of the first image data 42a, the second image data 42b, and the dominant mode image data 48.
  • the residual mode image 46 is toggled in place with the further image data 42a, 42b, for example with the aid of a toggle button 52.
  • a user may manually control the toggling of the images.
  • the method further comprises the step of: displaying the residual mode image data 46 together with further image data 42a, 42b, 48 on a display device 16 by toggling the further image data with the residual mode image data 46 by a user command on the display device 16.
  • the residual mode image 46 may be gradually alpha-blended in gray-scale and in place with the other image data 42a, 42b, 48.
  • the degree of alpha-blending or the binary toggling may be manually controlled by a user, for example with a slider 54 or mouse wheel.
  • the method further comprises the step of: displaying the residual mode image data 46 together with further image data 42a, 42b, 48 on a display device 16 by alpha-blending the further image data with the residual mode image data 46 by a user command.
  • a region 50 of one of the images 42a, 42b, 46, 48 may be selected by a user. For example, a rectangle may be selected with a mouse.
  • the selected region 50 may be used for defining a region in the digital image 40 from which the regression model 44 is determined, i.e. the regression model 44 may be determined from the whole image 40 or only from a part of the image 40.
  • the selected region 50 also may be used for defining a region in one of the images 42a, 42b, 48, to which the residual mode image data 46 is overlaid or alpha-blended. According to an embodiment of the invention, the residual mode image data 46 is displayed only for pixels of only the selected region 50.
  • two images may be linked such that a mouse click or movement in either one of the two images shows a crosshair at the corresponding position in the other image.
  • the user may freely move the selected region over the images 42a, 42b, 46, 48 (like a magic magnifier).
  • the method may be applied to the multiple energy image 40 as a whole, or to a local region of interest, interactively steered by a user like a magic magnifying glass.
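
First sketch: the following minimal Python sketch (assuming NumPy and scikit-learn are available) illustrates one plausible reading of the core computation described above, i.e. fitting a non-linear regression model 44 between the two energy images and splitting the second image into a dominant (predictable) and a residual (non-predictable) component. The choice of support vector regression, the pixel subsampling, and all function and variable names are illustrative assumptions, not the patent's reference implementation.

```python
import numpy as np
from sklearn.svm import SVR


def residual_and_dominant_modes(first_image, second_image, roi_mask=None):
    """Sketch: predict the second-energy intensity from the first-energy
    intensity with a regression model 44, then split the second image into
    a dominant (predictable) component 48 and a residual component 46."""
    x_img = first_image.astype(float)
    y_img = second_image.astype(float)

    # Optionally restrict the model fit to a user-selected region (region 50).
    if roi_mask is None:
        roi_mask = np.ones(x_img.shape, dtype=bool)
    x = x_img[roi_mask].reshape(-1, 1)
    y = y_img[roi_mask]

    # Non-linear regression model 44; fitted on a random pixel subsample
    # to keep the sketch fast (the subsampling is an assumption).
    rng = np.random.default_rng(0)
    subset = rng.choice(len(x), size=min(5000, len(x)), replace=False)
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(x[subset], y[subset])

    # Correlated intensity: prediction of the second image from the first.
    predicted = model.predict(x_img.reshape(-1, 1)).reshape(x_img.shape)

    residual_mode = y_img - predicted   # residual mode image data 46
    dominant_mode = predicted           # dominant mode image data 48
    return residual_mode, dominant_mode
```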
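
Second sketch: where the list above mentions a linear regression model based on linear decorrelation such as Principal Component Analysis, a minimal sketch could treat each pixel's pair of intensities as a 2-D sample, with the first principal component serving as the dominant mode and the second as the residual mode. The scaling and naming are assumptions.

```python
import numpy as np


def pca_modes(first_image, second_image):
    """Sketch: linear decorrelation of the two energy channels.
    Each pixel is a 2-D sample (first, second); projecting onto the
    eigenvectors of the 2x2 covariance matrix yields a dominant
    (largest variance) and a residual (smallest variance) image."""
    samples = np.stack(
        [first_image.ravel(), second_image.ravel()], axis=0
    ).astype(float)
    mean = samples.mean(axis=1, keepdims=True)
    centered = samples - mean
    cov = centered @ centered.T / centered.shape[1]

    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]           # largest variance first
    projected = eigvecs[:, order].T @ centered

    dominant_mode = projected[0].reshape(first_image.shape)
    residual_mode = projected[1].reshape(first_image.shape)
    return dominant_mode, residual_mode
```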
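
Third sketch: the Euler-characteristic-based choice of the noise threshold might be approximated as follows, counting connected foreground blobs and enclosed background holes with scipy.ndimage for each candidate threshold. The single-pass algorithm mentioned in the text is not reproduced; the brute-force threshold loop, the hole approximation, and the cut-off value max_structures are assumptions.

```python
import numpy as np
from scipy import ndimage


def euler_curve(residual_mode, thresholds):
    """Approximate Euler characteristic (blobs minus holes) per threshold."""
    curve = []
    for t in thresholds:
        mask = residual_mode > t
        _, n_blobs = ndimage.label(mask)      # connected foreground blobs
        _, n_bg = ndimage.label(~mask)        # background components
        n_holes = max(n_bg - 1, 0)            # enclosed background = holes
        curve.append(n_blobs - n_holes)
    return np.array(curve)


def euler_threshold(residual_mode, n_thresholds=64, max_structures=10):
    """Sketch: after the noise-dominated peak of the Euler curve, return the
    first threshold at which the characteristic drops below a predefined
    number, indicating that only larger spatial structures remain."""
    thresholds = np.linspace(
        residual_mode.min(), residual_mode.max(), n_thresholds
    )
    curve = euler_curve(residual_mode, thresholds)
    peak = int(np.argmax(curve))
    for i in range(peak, len(thresholds)):
        if curve[i] < max_structures:
            return thresholds[i]
    return thresholds[-1]
```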
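
Fourth sketch: the color overlay with residual-intensity-driven opacity described for the display step could be sketched with matplotlib as follows; the colormap, the normalization, and the opacity gain are assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm


def show_residual_overlay(dominant_mode, residual_mode, threshold=0.0, gain=1.0):
    """Sketch: grayscale dominant mode image 48 with a color-coded residual
    mode overlay 46 whose per-pixel opacity follows the residual intensity;
    pixels below the noise threshold are fully transparent (discarded)."""
    r = np.abs(residual_mode).astype(float)
    norm = (r - r.min()) / (np.ptp(r) + 1e-12)
    rgba = cm.inferno(norm)                     # (H, W, 4) colour-mapped overlay
    alpha = np.clip((r - threshold) * gain, 0.0, 1.0)
    alpha[r < threshold] = 0.0                  # discard sub-threshold pixels
    rgba[..., 3] = alpha                        # per-pixel opacity

    fig, ax = plt.subplots()
    ax.imshow(dominant_mode, cmap="gray")
    ax.imshow(rgba)                             # RGBA overlay, alpha baked in
    ax.set_axis_off()
    plt.show()
```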

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
PCT/IB2014/059770 2013-03-15 2014-03-14 Determining a residual mode image from a dual energy image WO2014141163A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP14719356.9A EP2973391B1 (en) 2013-03-15 2014-03-14 Determining a residual mode image from a dual energy image
CN201480015630.4A CN105122300B (zh) 2013-03-15 2014-03-14 Determining a residual mode image from a dual energy image
JP2015562518A JP6251297B2 (ja) 2013-03-15 2014-03-14 Determination of a residual mode image from a dual energy image
US14/774,205 US9706968B2 (en) 2013-03-15 2014-03-14 Determining a residual mode image from a dual energy image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361787489P 2013-03-15 2013-03-15
US61/787,489 2013-03-15

Publications (2)

Publication Number Publication Date
WO2014141163A2 true WO2014141163A2 (en) 2014-09-18
WO2014141163A3 WO2014141163A3 (en) 2015-08-27

Family

ID=50549361

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/059770 WO2014141163A2 (en) 2013-03-15 2014-03-14 Determining a residual mode image from a dual energy image

Country Status (5)

Country Link
US (1) US9706968B2
EP (1) EP2973391B1
JP (1) JP6251297B2
CN (1) CN105122300B
WO (1) WO2014141163A2

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107111881A (zh) * 2014-12-16 2017-08-29 Koninklijke Philips N.V. Correspondence probability map driven visualization
EP3441005A1 (de) * 2017-08-11 2019-02-13 Siemens Healthcare GmbH Analysis of lesions with the aid of multi-energy CT imaging
US11278254B2 (en) 2017-09-22 2022-03-22 The University Of Chicago System and method for low-dose multi-spectral X-ray tomography

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018212217A1 (ja) 2017-05-16 2018-11-22 JOB Corporation Data processing apparatus and data processing method in X-ray inspection, and X-ray inspection apparatus equipped with said apparatus
CN107595311A (zh) * 2017-08-30 2018-01-19 Shenyang Neusoft Medical Systems Co., Ltd. Dual-energy CT image processing method, apparatus and device
US10830855B2 (en) * 2018-03-28 2020-11-10 University Of Virginia Patent Foundation Free-breathing cine DENSE imaging

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8165379B2 (en) 2008-06-06 2012-04-24 General Electric Company Method of processing radiological images, and, in particular, mammographic images

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4789930A (en) 1985-11-15 1988-12-06 Picker International, Inc. Energy dependent gain correction for radiation detection
AUPO525897A0 (en) 1997-02-24 1997-03-20 Redflex Traffic Systems Pty Ltd Digital image processing
US20020186875A1 (en) * 2001-04-09 2002-12-12 Burmer Glenna C. Computer methods for image pattern recognition in organic material
WO2005101312A2 (en) * 2004-04-13 2005-10-27 Aic Fractal skr-method for evaluating image quality
US7599465B2 (en) 2004-11-19 2009-10-06 General Electric Company Detection of thrombi in CT using energy discrimination
US20080292194A1 (en) * 2005-04-27 2008-11-27 Mark Schmidt Method and System for Automatic Detection and Segmentation of Tumors and Associated Edema (Swelling) in Magnetic Resonance (Mri) Images
CN101394487B (zh) * 2008-10-27 2011-09-14 Huawei Technologies Co., Ltd. Method and system for synthesizing an image
WO2010143100A1 (en) 2009-06-10 2010-12-16 Koninklijke Philips Electronics N.V. Visualization apparatus for visualizing an image data set
WO2011045784A1 (en) 2009-10-13 2011-04-21 Ramot At Tel-Aviv University Ltd. Method and system for processing an image
GB201006046D0 (en) * 2010-04-12 2010-05-26 Ge Healthcare Uk Ltd System and method for determining motion of a biological object
KR20120007892A (ko) 2010-07-15 2012-01-25 Samsung Electronics Co., Ltd. Image processing method and apparatus, and medical imaging system employing the same
US8934697B2 (en) * 2010-11-26 2015-01-13 Koninklijke Philips N.V. Image processing apparatus
US8818069B2 (en) * 2010-11-30 2014-08-26 General Electric Company Methods for scaling images to differing exposure times
US9477875B2 (en) * 2012-11-28 2016-10-25 Japan Science And Technology Agency Cell monitoring device, cell monitoring method and program thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8165379B2 (en) 2008-06-06 2012-04-24 General Electric Company Method of processing radiological images, and, in particular, mammographic images

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107111881A (zh) * 2014-12-16 2017-08-29 Koninklijke Philips N.V. Correspondence probability map driven visualization
JP2018500082A (ja) * 2014-12-16 2018-01-11 Koninklijke Philips N.V. Correspondence probability map driven visualization
CN107111881B (zh) * 2014-12-16 2021-06-15 Koninklijke Philips N.V. Correspondence probability map driven visualization
EP3441005A1 (de) * 2017-08-11 2019-02-13 Siemens Healthcare GmbH Analysis of lesions with the aid of multi-energy CT imaging
US11278254B2 (en) 2017-09-22 2022-03-22 The University Of Chicago System and method for low-dose multi-spectral X-ray tomography

Also Published As

Publication number Publication date
EP2973391B1 (en) 2018-11-14
US20160038112A1 (en) 2016-02-11
JP6251297B2 (ja) 2017-12-20
US9706968B2 (en) 2017-07-18
CN105122300B (zh) 2018-09-25
CN105122300A (zh) 2015-12-02
EP2973391A2 (en) 2016-01-20
JP2016516464A (ja) 2016-06-09
WO2014141163A3 (en) 2015-08-27

Similar Documents

Publication Publication Date Title
US9706968B2 (en) Determining a residual mode image from a dual energy image
US10383602B2 (en) Apparatus and method for visualizing anatomical elements in a medical image
US9928600B2 (en) Computer-aided diagnosis apparatus and computer-aided diagnosis method
JP2021509721A5
JP6423857B2 (ja) Image quality index and/or imaging parameter recommendation based thereon
US9282929B2 (en) Apparatus and method for estimating malignant tumor
Mallett et al. Tracking eye gaze during interpretation of endoluminal three-dimensional CT colonography: visual perception of experienced and inexperienced readers
US11321841B2 (en) Image analysis method, image analysis device, image analysis system, and storage medium
CA3136127A1 (en) Systems and methods for automated and interactive analysis of bone scan images for detection of metastases
JP6475691B2 (ja) Method and X-ray system for computer-aided detection of structures in X-ray images
EP3203914B1 (en) Radiation dose applied to different anatomical stages
KR20120041557A (ko) Method of processing an image, and image processing apparatus and medical imaging system for performing the same
US20150294182A1 (en) Systems and methods for estimation of objects from an image
JP2016537099A (ja) Dual-energy spectral mammography image processing
JP6492553B2 (ja) Image processing apparatus and program
CN111630562A (zh) System for evaluating lung images
Belal et al. 3D skeletal uptake of 18F sodium fluoride in PET/CT images is associated with overall survival in patients with prostate cancer
JP6430500B2 (ja) Method for supporting measurement of tumor response
CN109313803B (zh) Method and apparatus for mapping at least part of a structure in an image of at least part of a body of a subject
US20160217262A1 (en) Medical Imaging Region-of-Interest Detection Employing Visual-Textual Relationship Modelling.
Koo et al. Improved efficiency of CT interpretation using an automated lung nodule matching program
CN106575436B (zh) Contour display for visual assessment of calcified rib-cartilage joints
US10810737B2 (en) Automated nipple detection in mammography
Al-Hinnawi et al. Collaboration between interactive three-dimensional visualization and computer aided detection of pulmonary embolism on computed tomography pulmonary angiography views
JP6245237B2 (ja) Medical image processing apparatus, program installable in the medical image processing apparatus, and medical image processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14719356

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase in:

Ref document number: 2015562518

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 14774205

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2014719356

Country of ref document: EP