WO2008002325A2 - Cross-time inspection method for medical diagnosis - Google Patents

Cross-time inspection method for medical diagnosis

Info

Publication number
WO2008002325A2
WO2008002325A2 (PCT/US2006/049329, US2006049329W)
Authority
WO
WIPO (PCT)
Prior art keywords
images
image
mri
mammography
time
Prior art date
Application number
PCT/US2006/049329
Other languages
French (fr)
Other versions
WO2008002325A3 (en)
Inventor
Shoupu Chen
Lawrence Allen Ray
Zhimin Huo
Original Assignee
Carestream Health, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carestream Health, Inc.
Priority to EP06851504A (published as EP1969563A2)
Priority to JP2008548693A (published as JP2009522004A)
Priority claimed from US11/616,316 (published as US20070160276A1)
Publication of WO2008002325A2
Publication of WO2008002325A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38 Registration of image sequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 Classification; Matching
    • G06F2218/16 Classification; Matching by matching signal segments
    • G06F2218/20 Classification; Matching by matching signal segments by applying autoregressive analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10096 Dynamic contrast-enhanced magnetic resonance imaging [DCE-MRI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images
    • G06V2201/032 Recognition of patterns in medical or anatomical images of protuberances, polyps nodules, etc.

Definitions

  • The registration process of box 1000 (the generic image registration flow of Figure 10, detailed later in this description) will be used in the following description of the cross-time inspection of tissues with different properties.
  • An MRI image sequence 704 contains an exemplary collection of MRI slice sets 706, 708 and 710 for the same object (the breast). Each MRI slice set contains a number of slices that are images (cross-sections) of the object (the breast). Exemplary slices are slice (image) 712 for set 706, slice (image) 714 for set 708, and slice (image) 716 for set 710. The MRI slice sets are purposely taken at different times to capture functional changes of the object over time when a contrast enhancement agent is administered. An exemplary time gap between the MRI slice sets could be 1 minute, 2 minutes, etc.
  • For cross-time inspection of tissues with different properties, besides sequence 704, one or more sequences of MRI images for the same object (the breast) are needed.
  • An exemplary MRI sequence 724 is such a sequence. Sequence 724 is captured at a different time; an exemplary time gap between sequence 724 and sequence 704 could be several months.
  • sequence 724 contains an exemplary collection of MRI slice sets 726,728 and 730 for the same object (the breast).
  • Each MRI slice set contains a number of slices that are images (cross-sections) of the object (the breast).
  • Exemplary slices are slice (image) 732 for set 726, slice (image) 734 for set 728, and slice (image) 736 for set 730.
  • The MRI slice sets are taken at different times to capture functional changes of the object over time. An exemplary time gap between the MRI slice sets could be 1 minute, 2 minutes, etc.
  • An intra-sequence registration (804) is defined as registering slices (images) of the same cross-section of an object within a sequence of MRI image sets.
  • Exemplary slices are slices (images) 712, 714, and 716 for sequence 704, and slices (images) 732, 734, and 736 for sequence 724.
  • An embodiment of intra-sequence registration is discussed in the context of the method of tissue property inspection of a set of images, which acts as an independent entity, illustrated in Figure 3.
  • the need for intra-sequence registration stems from the fact that, during the process of capturing MRI images, inevitable motion of the object (the breast, for example) causes the images for the same cross-section of the object (for example, 712, 714 and 716) to be misaligned. This misalignment may cause errors in the process of tissue property inspection.
  • inter-sequence registration is thus needed and defined as registering slices (images) of the same cross-section of an object from different sequences.
  • One embodiment of inter-sequence registration is pair-wise (2D) registration.
  • Exemplary pairs of slices to be inter-registered are pairs 712 and 732, 714 and 734, and 716 and 736.
  • Another embodiment of inter-sequence registration is volume-wise (3D) registration. In volume-wise (3D) registration, intra-registration is applied to the individual sequences (e.g. 704 and 724) first. Then the intra-registered sequences are input to box 1000; a minimal sketch of this registration bookkeeping follows below.
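The intra-/inter-sequence bookkeeping described above can be summarized in a few lines of code. The sketch below is only an illustration under stated assumptions: each sequence is assumed to be a list of slice sets, each slice set a list of 2D arrays, and register_pair is a hypothetical stand-in for whatever 2D registration routine (rigid or non-rigid) is actually used.

```python
import numpy as np

def register_pair(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Hypothetical 2D registration routine: warp `source` so that it aligns
    with `reference`. Here it is a placeholder that returns the source as-is."""
    return source

def intra_register(sequence, reference_index=0):
    """Intra-sequence registration (step 804): align every slice set in one
    cross-time sequence (e.g. 704) to an arbitrarily chosen reference set."""
    reference_set = sequence[reference_index]
    return [
        [register_pair(slc, ref) for slc, ref in zip(slice_set, reference_set)]
        for slice_set in sequence
    ]

def inter_register(sequence_a, sequence_b):
    """Pair-wise (2D) inter-sequence registration: align corresponding slices
    of a second sequence (e.g. 724) to the first sequence (e.g. 704)."""
    return [
        [register_pair(slc_b, slc_a) for slc_a, slc_b in zip(set_a, set_b)]
        for set_a, set_b in zip(sequence_a, sequence_b)
    ]

# Exemplary workflow: intra-register sequence 704 first, then inter-register 724 to it.
sequence_704 = [[np.zeros((64, 64)) for _ in range(3)] for _ in range(3)]  # 3 sets x 3 slices
sequence_724 = [[np.zeros((64, 64)) for _ in range(3)] for _ in range(3)]
sequence_704 = intra_register(sequence_704)
sequence_724 = inter_register(sequence_704, sequence_724)
```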
  • FIG. 3 is a flow chart illustrating one embodiment of the automatic abnormal tissue detection method of the present invention.
  • the flow chart illustrated in Figure 3 serves as an independent entity that constitutes a self-contained process. Therefore, the flow chart illustrated in Figure 3 is not interpreted as an expansion of step 808. Rather, step 808 and step 804 are explained using the steps shown in the flow chart in Figure 3.
  • a plurality of MRI breast image sets acquired before and after contrast agent injection goes through a series of processes. Each of these processes performs a specific functionality such as alignment, subtraction, segmentation, system identification, and classification.
  • abnormal tissue detection tasks are accomplished by means of dynamic system parameter classification.
  • step 202 is employed for acquiring a plurality of MRI breast image sets before and after an injection of contrast agent at one time.
  • step 202 repeats to acquire another plurality of MRI breast image sets before and after an injection of contrast agent at another time.
  • a plurality of MRI image sets is acquired with the same number (S) of images of the same breast for each set in the same spatial order.
  • the plurality of MRI image sets is taken with a temporal resolution, for example, of around one minute.
  • These MRI image sets can be expressed by I_k(x, y, z), where k is the temporal order index and k ∈ [1, ..., K]; K is the number of sets.
  • the presence of a contrast agent within an imaging voxel results in an increased signal that can be observed over the time course of the image acquisition process. Study of these signal-time curves enables identification of different tissue types due to their differential contrast uptake properties.
  • the K sets of MRI images, I_k(x, y, z), taken after the injection of contrast agent have to be spatially aligned (misalignment correction), in a step 204 (also step 804, intra-sequence registration), with a reference set of MRI images with respect to spatial coordinates x, y.
  • the reference set of MRI images is the set I_0(x, y, z) taken before the injection of the contrast agent.
  • the alignment process ensures that pixels belonging to the same tissue region of the breast have the same x, y coordinates in all the K sets of images.
  • I_k(x, y, z) is input to terminal A (1032)
  • I_0(x, y, z) is input to terminal B (1034)
  • the registered image of I_k(x, y, z) is obtained at output terminal D (1036).
  • An exemplary method employable to realize the alignment function, align(A, B) is a non-rigid registration that aligns A with B and is widely used in medical imaging and remote sensing fields.
  • the registration process has been discussed previously. Persons skilled in the art will recognize that other registration methods could also be used.
  • image pixel intensity increases differently for different breast tissues.
  • This phenomenon indicates that subtracting the image taken before the injection from the image taken after the injection will provide radiologists with clearer information of locations of abnormal tissues in the image.
  • This information can also be used to extract regions from the original MRI breast images for automatic abnormal tissue detection and differentiation.
  • This information is obtained in step 206 in Figure 3, which carries out differencing the plurality of MRI breast image sets, I_k(x, y, z), k ∈ [1, ..., K], with a reference MRI image set to produce a plurality of difference image sets, ΔI_k(x, y, z), k ∈ [1, ..., K].
  • the set of MRI images I_0(x, y, z) is selected as the intensity reference images.
  • the differencing process is executed by subtracting the reference set from each post-contrast set, ΔI_k(x, y, z) = I_k(x, y, z) - I_0(x, y, z), and forming mask images M_k(x, y, z) whose pixels are non-zero only where ΔI_k(x, y, z) exceeds T.
  • T is a statistical intensity threshold.
  • An exemplary value of T is an empirical value 10.
  • the segmentation process in step 208 segments the images in the plurality of MRI breast image sets, I_k(x, y, z), according to the non-zero pixels in the mask images, M_k(x, y, z), to obtain segmented intensity pixels in the images of the plurality of MRI breast image sets.
  • the stage of generating mask images can be omitted and the segmentation process can be realized by executing the following: S_k(x, y, z) = I_k(x, y, z) where I_k(x, y, z) - I_0(x, y, z) > T, and S_k(x, y, z) = 0 otherwise (a minimal code sketch of this step appears below).
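As a concrete illustration of steps 206 and 208, the sketch below differences each post-contrast volume against the pre-contrast reference, thresholds the difference, and keeps only the enhancing pixels. It is a minimal sketch: the array layout and names are assumptions, and only the threshold value T = 10 comes from the text above.

```python
import numpy as np

def difference_and_segment(post_contrast_sets, reference_set, threshold=10.0):
    """Illustrate steps 206 and 208 on registered volumes held as NumPy arrays.

    post_contrast_sets: iterable of K arrays I_k(x, y, z), k = 1..K
    reference_set:      array I_0(x, y, z) acquired before contrast injection
    Returns (difference_sets, segmented_sets), mirroring Delta-I_k and S_k.
    """
    difference_sets, segmented_sets = [], []
    for post in post_contrast_sets:
        delta = post.astype(np.float32) - reference_set.astype(np.float32)  # Delta-I_k
        mask = delta > threshold                  # M_k: non-zero only where uptake exceeds T
        segmented = np.where(mask, post, 0)       # S_k: keep enhancing pixels, zero the rest
        difference_sets.append(delta)
        segmented_sets.append(segmented)
    return difference_sets, segmented_sets

# Usage with synthetic volumes of shape (128, 128, 32): volumes[0] plays the role of I_0.
volumes = [np.random.rand(128, 128, 32) * 100 for _ in range(4)]
diffs, segs = difference_and_segment(volumes[1:], volumes[0])
```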
  • In Figure 4 there is shown a chart that is a replica of the chart shown in Figure 1, except that Figure 4 includes the insertion of a step function curve 302, f(t), and the removal of the normal and fat tissue curves.
  • Pixels that belong to normal and fat tissues are set to zero in images S_k(x, y, z) in the segmentation step 208.
  • the remaining pixels in images S_k(x, y, z) belong to either malignant or benign tissues. It is practically difficult, if not impossible, to differentiate malignant tissue from benign tissue by just assessing the pixel brightness (intensity) in a static form, that is, in individual images.
  • the brightness changes present a distinction between these two types of tissues.
  • the brightness (contrast) curve 304, m(t), of the malignant tissue rises quickly above the step function curve 302 and then asymptotically approaches it, while the brightness (contrast) curve 306, b(t), of the benign tissue rises slowly underneath the step function curve 302 and then asymptotically approaches the step function curve f(t), 302.
  • FIG. 5 An exemplary generic approach to identifying a dynamic system behavior is generally depicted in Figure 5.
  • a step function 402 is used as an excitation.
  • a response 406 to the step function 402 from the dynamic system 404 is fed to a system identification step 408 in order to estimate dynamic parameters of system 404.
  • An exemplary realization of dynamic system modeling 212 (of Figure 3) is shown in Figure 6, which shows an ARX (autoregressive with exogenous input) model 500 (refer to "System Identification Toolbox", by Lennart Ljung, The MathWorks).
  • a general ARX model can be expressed by the equation y(t) = G(q)u(t) + H(q)e(t), where:
  • G(q) (506) and H(q) (504) are the system transfer functions as shown in Figure 6
  • u(t) (502) is the excitation
  • e(t) (508) is the disturbance
  • y(t) (510) is the system output.
  • G(q) (506) and H(q) (504) can be specified in terms of rational functions of q⁻¹ by specifying their numerator and denominator coefficients; for the ARX structure these take the forms G(q) = B(q)/A(q) and H(q) = 1/A(q), where A(q) and B(q) are polynomials in q⁻¹.
  • In Equations (9) and (10), t_0 is the data sampling starting time and N_1 is the number of samples.
  • u(t) is a step function.
  • the corresponding solutions are θ_m and θ_b.
  • the computation of θ realizes the step of dynamic system identification 210 (also step 408); a least-squares sketch of this identification appears below.
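To make the identification step concrete, the sketch below fits a low-order ARX model to a sampled uptake curve by ordinary least squares, assuming a unit-step excitation as in Figure 5. The model orders, the synthetic curves, and the function name are illustrative assumptions, not the patent's specification.

```python
import numpy as np

def identify_arx_step_response(y, na=1, nb=1):
    """Fit an ARX(na, nb) model  A(q) y(t) = B(q) u(t) + e(t)  to a contrast
    uptake curve y sampled at a fixed interval, assuming a unit-step excitation
    u(t) = 1 for t >= 0 (the step function of Figure 5).

    Returns the parameter vector theta = [a_1..a_na, b_1..b_nb].
    """
    y = np.asarray(y, dtype=float)
    u = np.ones_like(y)                      # unit-step excitation
    n0 = max(na, nb)                         # first sample with a complete regressor
    rows, targets = [], []
    for t in range(n0, len(y)):
        past_y = [-y[t - i] for i in range(1, na + 1)]
        past_u = [u[t - i] for i in range(1, nb + 1)]
        rows.append(past_y + past_u)
        targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

# Example: a fast-rising curve that settles above the unit step (malignant-like)
# versus a slow-rising curve that stays below it (benign-like).
t = np.arange(0, 10)
fast = 1.2 * (1 - np.exp(-1.5 * t))
slow = 0.8 * (1 - np.exp(-0.3 * t))
theta_m = identify_arx_step_response(fast)
theta_b = identify_arx_step_response(slow)
```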
  • a supervised learning step 218 is needed.
  • supervised learning is defined as a learning process in which the exemplar set consists of pairs of inputs and desired outputs.
  • the exemplar inputs are θ_m and θ_b (or the known curves)
  • the exemplar desired outputs are indicators O_m and O_b for malignant and benign tumors respectively.
  • step 218 receives M sample breast MRI dynamic curves with known characteristics (benign or malignant) from step 216.
  • An exemplary value for M could be 100.
  • M_m curves belong to malignant tumors and M_b curves belong to benign tumors.
  • Exemplary values for M_m and M_b could be 50 and 50.
  • M_m coefficient vectors (denoted by θ_m^i, i = 1...M_m) represent malignant tumors with indicator O_m, and M_b coefficient vectors (denoted by θ_b^i, i = 1...M_b) represent benign tumors with indicator O_b.
  • These learned coefficient vectors θ_m^i and θ_b^i are used to train a classifier that in turn is used to classify a dynamic contrast curve in a detection or diagnosis process.
  • To increase the specificity (accuracy in differentiating benign tumors from malignant tumors), other factors (step 220) can be incorporated into the training (learning) and classification process. It is known that factors such as the speed of administration of the contrast agent, timing of contrast administration with imaging, acquisition time, and slice thickness affect sensitivity and specificity (refer to "Contrast-enhanced breast MRI: factors affecting sensitivity and specificity", by C.W. Piccoli, Eur. Radiol. 7 (Suppl. 5), S281-S288 (1997)).
  • the vector p_y = [θ, a, ...] is traditionally called a feature vector in the computer vision literature.
  • the notation ℜ^d represents a domain, where d is the domain dimension.
  • the data format in Equation (11) is used in learning step 218 as well as in classification step 214.
  • the data vector p_y can be constructed in a different manner and augmented with different physical or non-physical numerical elements (factors) other than the ones aforementioned.
  • There are known types of classifiers that can be used to accomplish the task of differentiating malignant tumors from benign tumors with the use of dynamic contrast curves along with other physical or non-physical factors.
  • An exemplary classifier is an SVM (support vector machine) (refer to "A tutorial on Support Vector Machines for Pattern Recognition", by C. Burges, Data Mining and Knowledge Discovery, 2(2), 1-47, 1998, Kluwer Academic Publisher, Boston; available at http://ava.technion.ac.il/karniel/CMCC/SVM-tutorial.pdf).
  • An example case of an SVM classifier would be training and classification of data representing two classes that are separable by a hyper-plane.
  • the goal of training the SVM is to determine the free parameters w and b.
  • a scaling can always be applied to w and b such that all the data obey the paired inequalities: r_y(w · p_y + b) - 1 ≥ 0, ∀y (13)
  • Equation (13) can be solved by minimizing a Lagrangian function
  • l_s is the number of support vectors.
  • Classification of a new vector p_new into one of the two classes (malignant and benign) is based on the sign of the decision function. Persons skilled in the art will recognize that in the non-separable case, non-linear SVMs can be used; a minimal training and classification sketch appears below.
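The sketch below illustrates the training and classification step using scikit-learn's SVC, which is just one possible SVM implementation (the patent does not prescribe one). The synthetic parameter vectors, the extra acquisition factors, and all names are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def build_feature_vectors(thetas, extra_factors):
    """Form feature vectors p_y = [theta, factors]: each dynamic system parameter
    vector augmented with other numerical factors (e.g. injection rate, slice thickness)."""
    return np.array([np.concatenate([th, f]) for th, f in zip(thetas, extra_factors)])

# Hypothetical training data: thetas from the identification step, one factor vector
# per case, labels +1 (malignant) and -1 (benign), with M_m = M_b = 50.
rng = np.random.default_rng(0)
thetas_m = rng.normal(loc=[-0.6, 0.5], scale=0.05, size=(50, 2))
thetas_b = rng.normal(loc=[-0.9, 0.1], scale=0.05, size=(50, 2))
factors = np.tile([2.0, 3.0], (50, 1))       # e.g. injection rate (ml/s), slice thickness (mm)

X = np.vstack([build_feature_vectors(thetas_m, factors),
               build_feature_vectors(thetas_b, factors)])
y = np.array([+1] * 50 + [-1] * 50)

clf = SVC(kernel="linear")                   # linear, separable case; an RBF kernel handles the rest
clf.fit(X, y)

p_new = np.concatenate([[-0.65, 0.45], [2.0, 3.0]])
label = clf.predict([p_new])[0]              # class given by the sign of the decision function
score = clf.decision_function([p_new])[0]
```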
  • the above-described method of tissue property inspection of a set of images (also steps 804 and 808) is applied to all the cross-time image sequences, such as 704 and 724, for cross-time tissue property inspection. It is understood that in the present invention, the cross-time image sequences go through the steps of intra-registration and inter-registration before entering step 808.
  • One exemplary execution procedure of the steps of intra-registration and inter-registration for the exemplary sequences is applying intra-registration to sequence 704 first, then applying inter-registration to sequences 704 and 724. People skilled in the art should know that the roles of sequences 704 and 724 are exchangeable.
  • For intra-registering sequence 704 in this particular exemplary execution procedure, arbitrarily select a set of images as the reference image set, e.g. set 706.
  • Images of set 706 are input to terminal B (1034), other image sets (708 and 710) are input to terminal A (1032).
  • the registered images of image sets (708 and 710) are obtained at terminal D (1036).
  • images of sequence 724 are input to terminal A (1032)
  • images of sequence 704 are input to terminal B (1034)
  • the registered images of sequence 724 are obtained at output terminal D (1036).
  • multiple dynamic curves are generated reflecting tissue properties captured in multiple cross-time image sequences (two sequences, 704 and 724, for the current exemplary case) at multiple time instances (two for the current exemplary case). It is well known that these dynamic curves provide the medical professionals with valuable information regarding disease conditions (or progressions) for patients. In step 810, visualization tools are employed for medical professionals to examine concerned regions of the object (regions of interest in the images) for better diagnosis.
  • One embodiment of such visualization facility is illustrated in Figure 9.
  • There is shown in Figure 9 a computer monitor screen 900 (also 104 in Figure 2) connected to an image processor 102 that executes the previously described steps.
  • slice 712 is the first image of I_k, k ∈ [1, 2, 3], across three sets (706, 708 and 710) at spatial location 1; slice 732 is the first image of I_k, k ∈ [1, 2, 3], across three sets (726, 728 and 730) at spatial location 1.
  • Breast images 902 and 912 are shown in slices 712 and 732.
  • Breast images 902 and 912 are the images of a same cross-section of a breast.
  • a medical professional moves a computer mouse 906 (as a user interface) over a location 908 in slice 712.
  • a ghost mouse 916 appears at the same spatial location 918 in slice 732 as 908 in slice 712.
  • the user also can move a computer mouse 916 (as a user interface) over a location 918 in slice 732.
  • a ghost mouse 906 appears at the same spatial location 908 in slice 712 as 918 in slice 732.
  • two dynamic curves (924 solid and 926 dashed) appear at the left side (922) of the screen.
  • the exemplary curves 924 and 926 reflect different tissue properties for the same spot of a breast at two different times.
  • the image sequence containing slice 712 may be taken 6 months prior to capturing the sequence containing slice 732.
  • the medical professional can move the mouse to other locations to examine the change of the tissue properties over time (6 months). With this visualization facility, disease progression can be readily analyzed. People skilled in the art should understand that tissue properties could be represented by other means besides the dynamic curve plots 924 and 926.
  • tissue properties could be represented by colored angiogenesis maps.
  • multiple cross-time image sequences can be processed by the method of the current invention and multiple dynamic curves can be displayed simultaneously for medical diagnosis; a minimal linked-cursor sketch of such a display follows below.
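One way to prototype the Figure 9 presentation is sketched below with Matplotlib: two registered slices, one from each exam, are displayed side by side, and moving the cursor over either image plots the two dynamic curves for that pixel, echoing the ghost-cursor idea. The data here are random stand-ins; array names, shapes, and the figure layout are assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical registered data: K time points for one cross-section, two exam dates.
K, H, W = 5, 128, 128
rng = np.random.default_rng(1)
sequence_a = rng.random((K, H, W))   # stands in for sequence 704 at spatial location 1
sequence_b = rng.random((K, H, W))   # stands in for sequence 724 at spatial location 1

fig, (ax_a, ax_b, ax_curves) = plt.subplots(1, 3, figsize=(12, 4))
ax_a.imshow(sequence_a[0], cmap="gray"); ax_a.set_title("slice 712 (earlier exam)")
ax_b.imshow(sequence_b[0], cmap="gray"); ax_b.set_title("slice 732 (later exam)")
ax_curves.set_title("dynamic curves at cursor")

def on_move(event):
    """Plot the two uptake curves for the pixel under the cursor."""
    if event.inaxes not in (ax_a, ax_b) or event.xdata is None:
        return
    col, row = int(event.xdata), int(event.ydata)
    if not (0 <= row < H and 0 <= col < W):
        return
    ax_curves.clear()
    ax_curves.plot(sequence_a[:, row, col], "-", label="earlier exam (924)")
    ax_curves.plot(sequence_b[:, row, col], "--", label="later exam (926)")
    ax_curves.legend()
    fig.canvas.draw_idle()

fig.canvas.mpl_connect("motion_notify_event", on_move)
plt.show()
```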
  • the subject matter of the present invention relates to digital image processing and computer vision technologies, which is understood to mean technologies that digitally process a digital image to recognize and thereby assign useful meaning to human understandable objects, attributes or conditions, and then to utilize the results obtained in the further processing of the digital image.

Abstract

A cross-time inspection method for medical image diagnosis. A first set of medical images of a subject is accessed wherein the first set is captured at a first time period. A second set of medical images of the subject is accessed, wherein the second set is captured at a second time period. The first and second sets are each comprised of a plurality of medical images. Image registration is performed by mapping the plurality of medical images of the first and second sets to predetermined spatial coordinates. A cross-time image mapping is performed of the first and second sets. Means are provided for interactive cross-time medical image analysis.

Description

CROSS-TIME INSPECTION METHOD FOR MEDICAL DIAGNOSIS
FIELD OF THE INVENTION
The present invention relates to a digital image processing/computer vision method for image analysis and, in particular, to cross-time inspection of tissues of different properties in medical images as a time function (cross-time image sequences).
BACKGROUND OF THE INVENTION
Digital imaging techniques in medicine were implemented in the 1970's with the first clinical use and acceptance of the Computed Tomography or CT scanner. Later, extensive use of x-ray imaging (CT) and the advent of the digital computer and new imaging modalities like ultrasound and magnetic resonance imaging (MRI) have combined to create an explosion of diagnostic imaging techniques in the past three decades.
There are benefits to using digital medical imaging technology in health care. For example, angiographic procedures for looking at the blood vessels in the brain, kidneys, arms and legs, and heart all have benefited from the adaptation of digital medical imaging and image processing technologies.
With digital images, computerized multi-dimensional (e.g., spatial and temporal) image analysis becomes possible. Multi-dimensional image analysis can be used in applications such as automatic quantification of changes (anatomical or functional) in serial image volume scans of body parts, foreign objects localization, consistent diagnostic rendering, and the like.
Also, different medical imaging modalities produce images providing different view of human body function and anatomy that have the potential of enhancing diagnostic accuracy dramatically with the help of the right medical image processing software and visualization tools. For example, X-ray computed tomography (CT) and magnetic resonance imaging (MRI) demonstrate brain anatomy but provide little functional information. Positron emission tomography (PET) and single photon emission computed tomography (SPECT) scans display aspects of brain function and allow metabolic measurements but poorly delineate anatomy. Furthermore, CT and MRI images describe complementary morphologic features. For example, bone and calcifications are best seen on CT images, while soft-tissue structures are better differentiated by MRI. Modalities such as MRI and CT usually provide a stack of images for certain body parts.
It is known that the information gained from different dimensions (spatial and temporal) or modalities is often of a difference or complementary nature. Within the current clinical setting, this difference or complementary image information is a component of a large number of applications in clinical diagnostics settings, and also in the area of planning and evaluation of surgical and radiotherapeutical procedures.
In order to effectively use the difference or complementary information, image features from different dimensions or different modalities have to be superimposed on each other by physicians using a visual alignment system. Unfortunately, such a coordination of multiple images with respect to each other is extremely difficult and even highly trained medical personnel, such as experienced radiologists, have difficulty in consistently and properly interpreting a series of medical images so that a treatment regime can be instituted which best fits the patient's current medical condition.
Another problem encountered by medical personnel today is the large amount of data and numerous images that are obtained from current medical imaging devices. The number of images collected in a standard scan can be in excess of 100 and frequently numbers in the many hundreds. In order for medical personnel to properly review each image takes a great deal of time and, with the many images that current medical technology provides, a great amount of time is required to thoroughly examine all the data.
Accordingly, there exists a need for an efficient approach that uses image processing/computer vision techniques to automatically detect/diagnose diseases. U.S. Publication No. 2004/0064037 (Smith), incorporated herein by reference, is directed to a technique that applies pre-programmed rules that specify the manner in which medical image data is to be classified or otherwise processed. U.S. Publication No. 2003/0095147 (Daw), incorporated herein by reference, relates to a computerized method of medical image processing and visualization.
It is known that malignant breast tumors begin to grow their own blood supply network once they reach a certain size; this is the way the cancer can continue to grow. In a breast MRI scan, a contrast agent injected into the bloodstream can provide information about blood supply to the breast tissues; the agent "lights up" a tumor by highlighting its blood vessel network. Usually, several scans are taken: one before the contrast agent is injected and at least one after. The pre-contrast and post-contrast images are compared and areas of difference are highlighted. It should be recognized that if the patient moves even slightly between the two scans, the shape or size of the image may be distorted, a significant loss of information.
A contrast agent for MRI is Gadolinium or gadodiamide, which provides contrast between normal tissue and abnormal tissue in the brain and body.
Gadolinium looks clear like water and is non-radioactive. After it is injected into a vein, Gadolinium accumulates in the abnormal tissue that may be affecting the body or head. Gadolinium causes these abnormal areas to become bright (enhanced) on the MRI, which makes them easy to see. Gadolinium is then cleared from the body by the kidneys. Gadolinium allows the MRI to define abnormal tissue with greater clarity. Tumors enhance after Gadolinium is given. The exact size and location of the tumor are important in treatment planning and follow-up. Gadolinium is also helpful in finding small tumors by making them bright and easy to see. Dynamic contrast enhanced MRI is used for breast cancer imaging, in particular for those situations that have an inconclusive diagnosis based on x-ray mammography. The MRI study involves intravenous injection of a contrast agent (typically gadopentetate dimeglumine) immediately prior to acquiring a set of T1-weighted MR volumes with a temporal resolution of around a minute. The presence of contrast agent within an imaging voxel results in an increased signal that can be observed over the time course of the experiment.
Study of these signal-time curves enables identification of different tissue types due to their differential contrast uptake properties as illustrated in Figure 1. Typically, cancerous tissue shows a high and fast uptake due to a proliferation of "leaky" angiogenic microvessels, while normal and fatty tissues show little uptake. The uptake (dynamic) curves have often been fitted using a pharmacokinetic model to give a physiologically relevant parameterisation of the curve (refer to P. S. Tofts, B. Berkowitz, M. Schnall, "Quantitative analysis of dynamic Gd-DTPA enhancement in breast tumours using a permeability model", Magn Reson Med 33, pp 564-568, 1995).
U.S. Patent No. 6,353,803 (Degani, Hadassa), incorporated herein by reference, is directed to an apparatus and method for monitoring a system in which a fluid flows and which is characterized by a change in the system with time in space. A preselected place in the system is monitored to collect data at two or more time points correlated to a system event. The data is indicative of a system parameter that varies with time as a function of at least two variables related to system wash-in and wash-out behavior.
Study of these curves/parameters has been used clinically to identify and characterize tumors into malignant or benign classes, although the success has been variable with generally good sensitivity but often very poor specificity (refer to S.C. Rankin "MRI of the breast", Br. J. Radiol 73, pp 806-818, 2000).
While such systems may have achieved certain degrees of success in their particular applications, there is a need for an improved digital image processing method for medical image analysis that overcomes the problems set forth above and addresses the utilitarian needs set forth above.
The present invention provides a method for image analysis and, in particular, for cross-time inspection of tissues of different properties in medical images as a time function.
SUMMARY OF THE INVENTION
An object of the present invention is to provide a method for cross-time inspection of tissues of different properties (for example, abnormal and normal tissues) in medical images as a time function (cross-time image sequences). Any objects provided are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed invention may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
The present invention provides a pattern recognition method for cross-time inspection of tissues of different properties using contrast enhanced MRI images augmented with other physical or non-physical factors. The method includes the steps of acquiring a plurality of medical image cross-time sequences (e.g. MRI images before and after the injection of a contrast enhancement agent); performing intra-registration of the plurality of medical image cross-time sequences with respect to spatial coordinates; performing inter-registration of the plurality of medical image cross-time sequences with respect to spatial coordinates; classifying tissues of different properties for the registered plurality of medical image cross-time sequences; and presenting the classification results for cross-time inspection.
According to one aspect of the invention, there is provided a method for automatic abnormal tissue detection and differentiation using contrast enhanced MRI images augmented with other physical or non-physical factors. The method includes the steps of acquiring a plurality of MRI breast image sets; aligning the plurality of MRI breast images with respect to spatial coordinates; differencing the plurality of MRI breast image sets with a reference MRI image set, producing a plurality of difference image sets; segmenting the plurality of difference image sets, producing a plurality of MRI breast images with segmented intensity pixels; applying dynamic system identification to the segmented intensity pixels, producing a plurality of dynamic system parameters; and classifying the plurality of system parameters augmented with other physical or non-physical factors into different classes.
According to another aspect of the invention, there is provided a method for automatic material classification. The method includes the steps of: acquiring a plurality of image sets of an object sequentially in time; aligning the plurality of image sets with respect to spatial coordinates; differencing the plurality of image sets with a reference image set to produce a plurality of difference image sets; segmenting the plurality of difference image sets to produce a plurality of images with segmented intensity pixels; applying dynamic system identification to the segmented intensity pixels of the plurality of images to produce a plurality of dynamic system parameters; and classifying the plurality of system parameters into different classes.
According to still another aspect of the invention, there is provided a method for abnormal tissue detection using contrast enhanced MRI images. The method includes the steps of: acquiring a plurality of MRI breast image sets sequentially in time; aligning the plurality of MRI breast image sets with respect to spatial coordinates; differencing the plurality of MRI breast image sets with a reference MRI image set to produce a plurality of difference image sets; segmenting the plurality of difference image sets to produce a plurality of MRI breast image sets with segmented intensity pixels; applying a dynamic system identification to the segmented intensity pixels of the plurality of MRI breast image sets to produce a plurality of dynamic system parameters; and classifying the plurality of system parameters into different classes to detect abnormal tissue.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings. The elements of the drawings are not necessarily to scale relative to each other.
FIG. 1 is a graph illustrating dynamic contrast uptake properties (curves) for different breast tissues.
FIG. 2 is a schematic diagram of an image processing system useful in practicing the method in accordance with present invention.
FIG. 3 is a flow chart illustrating one embodiment of the automatic abnormal tissue detection method in accordance with the present invention.
FIG. 4 is a graph illustrating dynamic contrast uptake properties (curves) for malignant and benign tumor tissues.
FIG. 5 is a schematic diagram illustrating the concept of step function response and system identification.
FIG. 6 is a flowchart illustrating a method of system identification in accordance with the present invention.
FIG. 7 is a graph illustrating two cross-time image sequences.
FIG. 8 is a flowchart illustrating one embodiment of the cross-time tissue property inspection method in accordance with the present invention.
FIG. 9 is a graph illustrating a method of cross-time tissue property inspection visualization presentations of the present invention.
FIG. 10 is a flowchart illustrating a method of image registration in accordance with the present invention.
FIG. 11 is a graph illustrating image registration concept.
DETAILED DESCRIPTION OF THE INVENTION
The following is a detailed description of the preferred embodiments of the invention, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures. Figure 2 shows an image processing system 10 useful in practicing the method in accordance with the present invention. System 10 includes a digital MRI image source 100, for example, an MRI scanner, a digital image storage device (such as a compact disk drive), or the like. The digital image from digital MRI image source 100 is provided to an image processor 102, for example, a programmable personal computer, or digital image processing work station such as a Sun Sparc workstation. Image processor 102 can be connected to a display 104 (such as a CRT display or other monitor), an operator interface such as a keyboard 106, and a mouse 108 or other known input device. Image processor 102 is also connected to computer readable storage medium 107. Image processor 102 transmits processed digital images to an output device 109. Output device 109 can comprise a hard copy printer, a long-term image storage device, a connection to another processor, an image telecommunication device connected, for example, to the Internet, or the like.
In the following description, a preferred embodiment of the present invention will be described as a method. However, in another preferred embodiment, the present invention comprises a computer program product for detecting abnormal tissues in a digital MRI image in accordance with the method described. In describing the present invention, it should be recognized that the computer program of the present invention can be utilized by any well-known computer system, such as the personal computer of the type shown in Figure 2. However, other types of computer systems can be used to execute the computer program of the present invention. For example, the method of the present invention can be executed in the computer contained in a digital MRI machine or a PACS (picture archiving communication system). Consequently, the computer system will not be discussed in further detail herein. It will be further recognized that the computer program product of the present invention can make use of image manipulation algorithms and processes that are well known. Accordingly, the present description will be directed in particular to those algorithms and processes forming part of, or cooperating more directly with, the method of the present invention. Thus, it will be understood that the computer program product embodiment of the present invention may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes are conventional and within the ordinary skill in such arts.
Other aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images involved or co- operating with the computer program product of the present invention, are not specifically shown or described herein and can be selected from such algorithms, systems, hardware, components, and elements known in the art.
A computer program for performing the method of the present invention can be stored in a computer readable storage medium. This medium may comprise, for example: magnetic storage media such as a magnetic disk (such as a hard drive or a floppy disk) or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM), or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present invention may also be stored on a computer readable storage medium that is connected to the image processor by way of the Internet or other communication medium. Those skilled in the art will readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
Turning now to Figure 8, the method of cross-time inspection of tissues of different properties in medical images as a time function will be outlined. Figure 8 is a flow chart illustrating one embodiment of the method of cross-time inspection of tissues of different properties in medical images of the present invention. In the embodiment shown in Figure 8, a plurality of medical image cross-time sequences goes through a series of processes. Each of these processes performs a specific functionality such as intra-sequence registration, inter-sequence registration, dynamic curve classification, and visualization and diagnosis.
Next, the concept of image registration is to be introduced. The method of curve classification will be discussed in depth later. Referring now to Fig. 10, the flow chart of the method of a generic image registration process is shown. The process of image registration is to determine a mapping between the coordinates in one space (a two dimensional image) and those in another (another two dimensional image), such that points in the two spaces that correspond to the same feature point of an object are mapped to each other. The process of determining a mapping between the coordinates of two images provides a horizontal displacement map and a vertical displacement map of corresponding points in the two images. The found vertical and horizontal displacement maps are then used to deform one of the involved images to minimize the misalignment between the two.
In terms of image registration terminology, the two images involved in the registration process are referred to as a source image 1020 and a reference image 1022. Denote the source image and the reference image by I(x_t, y_t, t) and I(x_{t+1}, y_{t+1}, t+1), respectively. The notations x and y are the horizontal and vertical coordinates of the image coordinate system, and t is the image index (image 1, image 2, etc.). The origin, (x = 0, y = 0), of the image coordinate system is defined at the center of the image plane. It should be pointed out that the image coordinates, x and y, are not necessarily integers.
For the convenience of implementation, the image (or image pixel) is also indexed as I(i,j), where i and j are strictly integers and the parameter t is ignored for simplicity. This representation aligns with indexing a matrix in the discrete domain. If the image (matrix) has a height of h and a width of w, the corresponding image plane coordinates, x and y, at location (i,j) can be computed as x = i - (w-1)/2.0 and y = (h-1)/2.0 - j. The column index i runs from 0 to w-1. The row index j runs from 0 to h-1.
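As an illustration only, the index-to-coordinate conversion above can be expressed in a few lines of code; the function name is an assumption made for this sketch and is not part of the disclosure.

def index_to_plane_coords(i, j, w, h):
    # Map integer matrix indices (i, j) to image plane coordinates (x, y),
    # with the origin at the center of the image plane.
    x = i - (w - 1) / 2.0
    y = (h - 1) / 2.0 - j
    return x, y

# Example for a 512 x 256 image (w = 512, h = 256):
# index_to_plane_coords(0, 0, 512, 256)      -> (-255.5, 127.5)
# index_to_plane_coords(511, 255, 512, 256)  -> (255.5, -127.5)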
In general, the registration process is to find an optimal transformation function Φ_{t+1}(x_t, y_t) (see step 1002) such that
[x_{t+1}, y_{t+1}, 1]^T = Φ_{t+1}(x_t, y_t) [x_t, y_t, 1]^T    (10-1)
The transformation function of Equation (10-1) is a 3x3 matrix with elements shown in Equation (10-2).
Φ_{t+1} = [ φ_{11}  φ_{12}  φ_{13} ; φ_{21}  φ_{22}  φ_{23} ; φ_{31}  φ_{32}  φ_{33} ]    (10-2)
In fact, the transformation matrix consists of two parts, a rotation part and a translation part:

Φ_{t+1} = [ R  T ; 0^T  1 ]

where R is a 2x2 rotation submatrix and T is a 2x1 translation vector.
Note that the transformation function Φ is either a global function or a local function. A global function Φ transforms every pixel in an image in the same way. A local function Φ transforms each pixel in an image differently based on the location of the pixel. For the task of image registration, the transformation function Φ could be a global function, a local function, or a combination of the two.
In practice, the transformation function Φ generates two displacement maps (step 1004), X(i,j) and Y(i,j), which contain the information that brings pixels in the source image to new positions that align with the corresponding pixel positions in the reference image. In other words, the source image is spatially corrected in step 1008 and becomes a registered source image 1024. For both displacement maps, X(i,j) and Y(i,j), the column index i runs from 0 to w-1 and the row index j runs from 0 to h-1.
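For illustration, spatially correcting a source image with a pair of displacement maps can be sketched as follows. This is only an example under the assumption that X(i,j) and Y(i,j) store per-pixel column and row offsets; the function name and the use of NumPy/SciPy are not part of the disclosed method.

import numpy as np
from scipy.ndimage import map_coordinates

def warp_with_displacement(source, X, Y):
    # source: 2D array (h x w); X, Y: per-pixel column/row offsets (h x w).
    # Each output pixel (j, i) is sampled from the source at
    # (j + Y[j, i], i + X[j, i]) using bilinear interpolation.
    h, w = source.shape
    jj, ii = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([jj + Y, ii + X])
    return map_coordinates(source, coords, order=1, mode="nearest")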
An exemplary result of misalignment correction is shown in Fig. 11. In Fig. 11, on the left is the source image 1102; on the right is the reference image 1106. Clearly, there are varying vertical misalignments between the source image 1102 and the reference image 1106. By applying the steps shown in Fig. 10 to these two images, the vertical misalignment corrected source image is obtained (image 1104).
Note that the registration algorithm used in computing the image transformation function Φ could be a rigid registration algorithm, a non-rigid registration algorithm, or a combination of the two. People skilled in the art understand that there are numerous registration algorithms that can carry out the task of finding the transformation function Φ that generates the needed displacement maps for the correction of the misalignment in two relevant images. Exemplary algorithms can be found in "Medical Visualization with ITK," by Lydia Ng, et al., at http://www.itk.org. Also, people skilled in the art understand that spatially correcting an image with a displacement map can be realized by using any suitable image interpolation algorithm (see "Robot Vision" by Berthold Klaus Paul Horn, The MIT Press, Cambridge, Massachusetts).
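As one concrete illustration of a simple global (rigid, translation-only) registration that yields constant displacement maps, a phase-correlation estimate can be computed with an FFT. This sketch is an assumption-laden example, not the registration algorithm required by the invention; any of the algorithms cited above could be substituted.

import numpy as np

def estimate_translation(source, reference):
    # Phase correlation: the peak of the inverse FFT of the normalized
    # cross-power spectrum gives the integer (row, column) shift that
    # best aligns the source with the reference.
    F_src = np.fft.fft2(source)
    F_ref = np.fft.fft2(reference)
    cross_power = F_ref * np.conj(F_src)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    dj, di = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative offsets.
    h, w = source.shape
    if dj > h // 2:
        dj -= h
    if di > w // 2:
        di -= w
    return dj, di  # constant Y and X displacements, respectively

A global translation (dj, di) corresponds to constant displacement maps X(i,j) = di and Y(i,j) = dj, which can then be applied with the interpolation sketch above.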
For the present invention, the above discussed image registration process can be viewed as a black box 1000 with input terminal A (1032), input terminal B (1034) and output terminal D (1036). Box 1000 will be used in the following description of the present invention of cross-time inspection of tissues with different properties.
Now turning back to Figure 8, the processes of intra-sequence and inter-sequence registration will be described. Exemplary MRI image sequences for an object (a breast, for example) are depicted in Figure 7. An MRI image sequence 704 contains an exemplary collection of MRI slice sets 706, 708 and 710 for the same object (the breast). Each MRI slice set contains a number of slices that are images (cross-sections) of the object (the breast). Exemplary slices are slice (image) 712 for set 706, slice (image) 714 for set 708, and slice (image) 716 for set 710. Purposely, MRI slice sets are taken at different times to capture functional changes of the object in time space when a contrast enhancement agent is administered. An exemplary time gap between the MRI slice sets could be 1 minute, 2 minutes, etc.
For cross-time inspection of tissues with different properties, besides sequence 704, one or more sequences of MRI images for the same object (the breast) are needed. An exemplary MRI sequence 724 is such a sequence. Sequence 724 is captured at a different time. An exemplary time gap between sequence 724 and sequence 704 could be several months.
Similarly, sequence 724 contains an exemplary collection of MRI slice sets 726, 728 and 730 for the same object (the breast). Each MRI slice set contains a number of slices that are images (cross-sections) of the object (the breast). Exemplary slices are slice (image) 732 for set 726, slice (image) 734 for set 728, and slice (image) 736 for set 730. Purposely, MRI slice sets are taken at different times to capture functional changes of the object in time space. An exemplary time gap between the MRI slice sets could be 1 minute, 2 minutes, etc.

An intra-sequence registration (804) is defined as registering slices (images) of the same cross-section of an object within a sequence of MRI image sets. Exemplary slices are slices (images) 712, 714, and 716 for sequence 704, and slices (images) 732, 734, and 736 for sequence 724. An embodiment of intra-sequence registration is discussed in the context of the method of tissue property inspection of a set of images, which acts as an independent entity, illustrated in Figure 3. The need for intra-sequence registration stems from the fact that, during the process of capturing MRI images, due to inevitable object (breast, for example) motion, images (for example, 712, 714 and 716) for the same cross-section of the object present misalignment. This misalignment may cause errors in the process of tissue property inspection.

As stated previously, for cross-time inspection of tissues with different properties, two or more image sequences (such as sequences 704 and 724) obtained at different times are required for the same object. Corresponding slices (such as 712 and 732) in different sequences are most likely misaligned and may have somewhat different shapes. An inter-sequence registration (806) is thus needed and is defined as registering slices (images) of the same cross-section of an object from different sequences. One embodiment of inter-sequence registration is pair-wise (2D) registration. Exemplary pairs of slices to be inter-registered are pairs 712 and 732, 714 and 734, and 716 and 736. Another embodiment of inter-sequence registration is volume-wise (3D) registration. In volume-wise (3D) registration, intra-registration is applied to individual sequences (e.g. 704 and 724) first. Then the intra-registered sequences are input to box 1000. A sketch of this orchestration appears below.
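The following outline shows one way the intra- and inter-sequence registration steps could be chained together in code. It is a hypothetical sketch; register_pair stands in for the black-box registration of Fig. 10 (box 1000) and is not a function defined by this disclosure.

def register_pair(source, reference):
    # Placeholder for box 1000: returns the source slice warped onto
    # the reference slice (terminal A = source, terminal B = reference).
    raise NotImplementedError

def intra_register(sequence, ref_index=0):
    # Register every slice set in one cross-time sequence to a chosen
    # reference set (e.g. set 706 for sequence 704).
    reference_set = sequence[ref_index]
    return [
        [register_pair(slice_, ref_slice) if k != ref_index else slice_
         for slice_, ref_slice in zip(slice_set, reference_set)]
        for k, slice_set in enumerate(sequence)
    ]

def inter_register(sequence_a, sequence_b):
    # Register every slice of sequence_b (e.g. 724) to the corresponding
    # slice of sequence_a (e.g. 704) after both were intra-registered.
    return [
        [register_pair(slice_b, slice_a)
         for slice_a, slice_b in zip(set_a, set_b)]
        for set_a, set_b in zip(sequence_a, sequence_b)
    ]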
Turning now to Figure 3, the method of tissue property inspection of a set of images (also step 808, dynamic curve classification) will be outlined. Figure 3 is a flow chart illustrating one embodiment of the automatic abnormal tissue detection method of the present invention. Note that the flow chart illustrated in Figure 3 serves as an independent entity that constitutes a self-contained process. Therefore, the flow chart illustrated in Figure 3 is not interpreted as an expansion of step 808. Rather, step 808 and step 804 are explained using the steps shown in the flow chart in Figure 3. In the embodiment shown in Figure 3, a plurality of MRI breast image sets acquired before and after contrast agent injection goes through a series of processes. Each of these processes performs a specific functionality such as alignment, subtraction, segmentation, system identification, and classification. In the present invention, abnormal tissue detection tasks are accomplished by means of dynamic system parameter classification.

In the embodiment shown in Figure 3, a first step 202 (also step 802) is employed for acquiring a plurality of MRI breast image sets before and after an injection of contrast agent at one time. For cross-time inspection, step 202 repeats to acquire another plurality of MRI breast image sets before and after an injection of contrast agent at another time. Denote I_0(x, y, z) as a set of MRI images of a breast with a number of images (slices) in a spatial order before an injection of contrast agent, where z ∈ [1,...,S] is the spatial order index, S is the number of images in the set, and x and y are the horizontal and vertical indices, respectively, for an image, where x ∈ [1,...,X] and y ∈ [1,...,Y]. After the administration of contrast agent, a plurality of MRI image sets is acquired with the same number (S) of images of the same breast for each set in the same spatial order z. The plurality of MRI image sets is taken with a temporal resolution, for example, of around one minute. These MRI image sets can be expressed by I_k(x, y, z), where k is the temporal order index and k ∈ [1,...,K]; K is the number of sets. Exemplary sets are 706, 708 and 710 (three sets, K = 3), or 726, 728 and 730 (three sets, K = 3). An exemplary slice I_1(x, y, 1) (at location 1) for set 706 (the first set for sequence 704, k = 1) is slice 712.
The presence of a contrast agent within an imaging voxel results in an increased signal that can be observed over the time course of the image acquisition process. Study of these signal-time curves enables identification of different tissue types due to their differential contrast uptake properties. For the purpose of automatic detection of abnormal tissues, the K sets of MRI images, I_k(x, y, z), taken after the injection of contrast agent have to be spatially aligned (misalignment correction), in a step 204 (also step 804, intra-sequence registration), with a reference set of MRI images with respect to the spatial coordinates x, y. In general, the reference set of MRI images is the set of MRI images, I_0(x, y, z), taken before the injection of the contrast agent. The alignment process ensures that pixels belonging to the same tissue region of the breast have the same x, y coordinates in all the K sets of images. The alignment process executes the following (see also the sketch below):

for k = 1 : K
    for z = 1 : S
        align(I_k(x, y, z), I_0(x, y, z))
    end
end
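A minimal code rendering of this alignment loop is given for illustration; align_pair is a hypothetical stand-in for the registration black box of Fig. 10, and the data layout (NumPy volumes indexed [z, y, x] per temporal set) is an assumption of the sketch.

import numpy as np

def intra_sequence_align(I, I0, align_pair):
    # I:  list of K post-contrast volumes, each of shape (S, Y, X)
    # I0: pre-contrast reference volume of shape (S, Y, X)
    # align_pair(moving, fixed) -> registered moving slice (box 1000)
    K = len(I)
    S = I0.shape[0]
    registered = [np.empty_like(vol) for vol in I]
    for k in range(K):
        for z in range(S):
            registered[k][z] = align_pair(I[k][z], I0[z])
    return registered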
Using the black box 1000, I_k(x, y, z) is input to terminal A (1032), I_0(x, y, z) is input to terminal B (1034), and the registered image of I_k(x, y, z) is obtained at output terminal D (1036). An exemplary method employable to realize the alignment function, align(A, B), is a non-rigid registration that aligns A with B and is widely used in the medical imaging and remote sensing fields. The registration process (misalignment correction) has been discussed previously. Persons skilled in the art will recognize that other registration methods could also be used.

As was shown in Figure 1, after the injection of contrast agent, image pixel intensity increases differently for different breast tissues. This phenomenon indicates that subtracting the image taken before the injection from the image taken after the injection will provide radiologists with clearer information about the locations of abnormal tissues in the image. This information can also be used to extract regions from the original MRI breast images for automatic abnormal tissue detection and differentiation. This information is obtained in step 206 in Figure 3, which carries out differencing the plurality of MRI breast image sets, I_k(x, y, z), k ∈ [1,...,K], with a reference MRI image set to produce a plurality of difference image sets, δI_k(x, y, z), k ∈ [1,...,K]. The set of MRI images, I_0(x, y, z), is selected as the intensity reference images. The differencing process is executed as follows:
for k = 1 : K
    for z = 1 : S
        δI_k(x, y, z) = subtraction(I_k(x, y, z), I_0(x, y, z))
    end
end

wherein the function, subtraction(A, B), subtracts B from A.
In Figure 3 at step 208, the difference images, δI_k(x, y, z), are subject to a segmentation process that first evaluates the plurality of difference image sets δI_k(x, y, z) and produces a plurality of mask image sets, M_k(x, y, z), k ∈ [1,...,K], obtained by executing:

for k = 1 : K
    for z = 1 : S
        for x = 1 : X
            for y = 1 : Y
                if δI_k(x, y, z) > T
                    M_k(x, y, z) = 1
                end
            end
        end
    end
end

wherein the mask image sets, M_k(x, y, z), k ∈ [1,...,K], are initialized with zeros and T is a statistical intensity threshold. An exemplary value of T is an empirical value of 10. The segmentation process in step 208 segments the images in the plurality of MRI breast image sets, I_k(x, y, z), according to the non-zero pixels in the mask images, M_k(x, y, z), to obtain segmented intensity pixels in the images of the plurality of MRI breast image sets. Denoting the resultant images by S_k(x, y, z), k ∈ [1,...,K], the segmentation operation can be expressed as:

for k = 1 : K
    for z = 1 : S
        for x = 1 : X
            for y = 1 : Y
                if M_k(x, y, z) = 1
                    S_k(x, y, z) = I_k(x, y, z)
                end
            end
        end
    end
end

wherein the images, S_k(x, y, z), are initialized as zeros. Persons skilled in the art will recognize that, in practical implementation, the stage of generating mask images can be omitted and the segmentation process can be realized by executing the following:

for k = 1 : K
    for z = 1 : S
        for x = 1 : X
            for y = 1 : Y
                if δI_k(x, y, z) > T
                    S_k(x, y, z) = I_k(x, y, z)
                end
            end
        end
    end
end

wherein the images, S_k(x, y, z), are initialized as zeros.
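For illustration, the differencing, masking, and segmentation loops above can be written in a few vectorized lines. This is a sketch under the assumption that the image sets are held as NumPy arrays; the default threshold is only the exemplary empirical value mentioned above.

import numpy as np

def difference_and_segment(I, I0, T=10.0):
    # I:  array of shape (K, S, Y, X), post-contrast sets I_k(x, y, z)
    # I0: array of shape (S, Y, X), pre-contrast reference set
    # Returns the difference sets, the binary masks, and the segmented sets.
    dI = I - I0[np.newaxis, ...]          # δI_k = I_k - I_0
    M = (dI > T).astype(np.uint8)         # mask: 1 where enhancement exceeds T
    S = np.where(M == 1, I, 0)            # keep enhanced pixels, zero elsewhere
    return dI, M, S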
Referring now to Figure 4, there is shown a chart that is a replica of the chart shown in Figure 1, except that Figure 4 includes the insertion of a step function, f(t), curve 302, and the removal of the normal and fat tissue curves.
It is the intention of the present invention to detect abnormal tissues and, more importantly, to differentiate malignant from benign tissues. (Note: the step function, f(t), is defined as f(t < 0) = 0; f(t > 0) = |λ|; λ ≠ 0.) Pixels that belong to normal and fat tissues are set to zeros in images S_k(x, y, z) in the segmentation step 208. The remaining pixels in images S_k(x, y, z) belong to either malignant or benign tissues. It is practically difficult, if not impossible, to differentiate malignant tissue from benign tissue by just assessing the pixel brightness (intensity) in a static form, that is, in individual images. However, in a dynamic form, the brightness changes present a distinction between these two types of tissues. As shown in Figure 4, starting from time zero, the brightness (contrast) curve 304, m(t), of the malignant tissue rises quickly above the step function curve 302 and then asymptotically approaches the step function curve 302; while the brightness (contrast) curve 306, b(t), of the benign tissue rises slowly underneath the step function curve 302 and then asymptotically approaches the step function curve, f(t), 302. Persons skilled in the art can recognize that the brightness (contrast) curve 304, m(t), resembles a step response of an underdamped dynamic system, while the brightness (contrast) curve 306, b(t), resembles a step response of an overdamped dynamic system.
An exemplary generic approach to identifying a dynamic system behavior is generally depicted in Figure 5. For an unknown dynamic system 404, a step function 402 is used as an excitation. A response 406 to the step function 402 from the dynamic system 404 is fed to a system identification step 408 in order to estimate dynamic parameters of system 404.
An exemplary realization of the dynamic system modeling 212 (of Figure 3) is shown in Figure 6, where an ARX (autoregressive with exogenous input) model 500 is shown (refer to "System Identification Toolbox," by Lennart Ljung, The MathWorks).
A general ARX model can be expressed as the equation:
y(t) = G(q)u(t) + H(q)ε(t)    (1)
where G(q) (506) and H(q) (504) are the system transfer functions as shown in Figure 6, u(t) (502) is the excitation, ε(t) (508) is the disturbance, and y(t) (510) is the system output. It is known that the transfer functions G(q) (506) and H(q) (504) can be specified in terms of rational functions of q^{-1}, with the numerator and denominator coefficients in the forms:
G(q) = q^{-nk} B(q) / A(q)    (2)

H(q) = 1 / A(q)    (3)

wherein A and B are polynomials in the delay operator q^{-1}:

A(q) = 1 + a_1 q^{-1} + ... + a_{na} q^{-na}    (4)

B(q) = b_1 + b_2 q^{-1} + ... + b_{nb} q^{-nb+1}    (5)
When A and B are polynomials, the ARX model of the system can be explicitly rewritten as:

y(t) = -a_1 y(t-1) - ... - a_{na} y(t-na) + b_1 u(t-nk) + ... + b_{nb} u(t-nk-nb+1) + ε(t)    (6)
Equation (6) can be further rewritten as a regression as follows:

y(t) = φ(t)^T θ    (7)

where φ(t) = [-y(t-1), ..., -y(t-na), u(t-nk), ..., u(t-nk-nb+1)]^T and θ = [a_1, ..., a_{na}, b_1, ..., b_{nb}]^T.
The system identification solution for the coefficient vector θ is

θ̂ = (Φ^T Φ)^{-1} Φ^T Y    (8)

where

Φ = [φ(t_0), φ(t_0+1), ..., φ(t_0+N_1)]^T    (9)

and

Y = [y(t_0), y(t_0+1), ..., y(t_0+N_1)]^T    (10)
In Equations (9) and (10), t_0 is the data sampling starting time and N_1 is the number of samples.
In relation to the brightness (contrast) curve m(t) 304 and the brightness (contrast) curve b(t) 306, the regression of Equation (7) takes the forms

m(t) = φ(t)^T θ_m and b(t) = φ(t)^T θ_b,

respectively.
In this particular case, u(t) is a step function, and the corresponding solutions are θ_m and θ_b. The computation of θ realizes the step of dynamic system identification 210 (also step 408).
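A compact numerical sketch of this least-squares ARX identification for a single sampled contrast curve is given below. It is illustrative only; the model orders, the function name, and the data layout are assumptions, and u(t) is taken to be a unit step as described above.

import numpy as np

def fit_arx(y, u, na=2, nb=2, nk=0):
    # Build the regression of Eq. (7) and solve Eq. (8) by least squares.
    # y: sampled brightness (contrast) curve, e.g. m(t) or b(t)
    # u: sampled excitation, here a unit step function
    start = max(na, nk + nb - 1)
    rows, targets = [], []
    for t in range(start, len(y)):
        past_y = [-y[t - i] for i in range(1, na + 1)]
        past_u = [u[t - nk - i] for i in range(nb)]
        rows.append(past_y + past_u)
        targets.append(y[t])
    Phi = np.asarray(rows)
    Y = np.asarray(targets)
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta  # [a_1..a_na, b_1..b_nb]

# Example: theta_m = fit_arx(m_samples, np.ones_like(m_samples))
#          theta_b = fit_arx(b_samples, np.ones_like(b_samples))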
Referring again to Figure 3, in order to classify (step 214) a region with high contrast brightness in MRI images as a benign or malignant tumor, a supervised learning step 218 is needed.
Supervised learning is defined as a learning process in which the exemplar set consists of pairs of inputs and desired outputs. In this MRI image breast tissue classification case, the exemplar inputs are θ_m and θ_b (or the known curves), and the exemplar desired outputs are indicators O_m and O_b for malignant and benign tumors, respectively. In Figure 3, step 218 receives M sample breast MRI dynamic curves with known characteristics (benign or malignant) from step 216. An exemplary value for M could be 100. Within the M curves, there are M_m curves belonging to malignant tumors and M_b curves belonging to benign tumors. Exemplary values for M_m and M_b could be 50 and 50. In step 218, applying Equation (8) to all the sample curves generates M coefficient vectors θ, among which M_m coefficient vectors (denoted by θ_m^i, i = 1...M_m) represent malignant tumors with indicator O_m, and M_b coefficient vectors (denoted by θ_b^i, i = 1...M_b) represent benign tumors with indicator O_b. These learned coefficient vectors θ_m^i and θ_b^i are used to train a classifier that in turn is used to classify a dynamic contrast curve in a detection or diagnosis process.
To increase the specificity (accuracy in differentiating benign tumors from malignant tumors), other factors (step 220) can be incorporated into the training (learning) and classification process. It is known that factors such as the speed of administration of the contrast agent, the timing of contrast administration with imaging, the acquisition time, and the slice thickness affect sensitivity and specificity (refer to "Contrast-enhanced breast MRI: factors affecting sensitivity and specificity," by C.W. Piccoli, Eur. Radiol. 7 (Suppl. 5), S281-S288 (1997)).
Denote the speed of administration of the contrast agent by α, the timing of contrast administration with imaging by β, the acquisition time by γ, and the slice thickness by δ. These exemplary factors are to be used in conjunction with the coefficient vectors θ_m^i and θ_b^i to train the classifier that in turn is used to classify a region in the MRI breast image into malignant or benign tumor classes. Note that these exemplary factors should be quantified in a range comparable to that of the coefficient vectors θ_m^i and θ_b^i.
For the learning and training purpose, construct the training data set

{p_j, τ_j}, j = 1...l, τ_j ∈ {-1, 1}, p_j ∈ ℝ^d    (11)

wherein τ_j are the class labels.
For example, if the tumor is malignant, τ_j = 1; otherwise, τ_j = -1. The vector p_j = [θ, α, β, γ, δ] is traditionally called a feature vector in the computer vision literature. The notation ℝ^d represents a domain, and d is the domain dimension. For this exemplary case, assuming that the coefficient vector θ has five elements, then d = 9. The data format in Equation (11) is used in learning step 218 as well as in classification step 214. Persons skilled in the art understand that the data vector p_j can be constructed in a different manner and augmented with different physical or non-physical numerical elements (factors) other than the ones aforementioned. One possible construction of such feature vectors is sketched below.
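The snippet below illustrates one possible way to assemble feature vectors and labels in the form of Equation (11); the scaling step reflects the note above that the factors should be comparable in range to the coefficient vector, and all names are hypothetical.

import numpy as np

def build_feature_vector(theta, alpha, beta, gamma, delta, factor_scale=1.0):
    # p_j = [theta, alpha, beta, gamma, delta]; the four acquisition factors
    # are scaled so their range is comparable to the ARX coefficients.
    factors = factor_scale * np.array([alpha, beta, gamma, delta])
    return np.concatenate([np.asarray(theta), factors])

def build_training_set(malignant_samples, benign_samples):
    # Each sample is a tuple (theta, alpha, beta, gamma, delta).
    P = [build_feature_vector(*s) for s in malignant_samples + benign_samples]
    tau = [1] * len(malignant_samples) + [-1] * len(benign_samples)
    return np.vstack(P), np.array(tau)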
There are known types of classifiers that can be used to accomplish the task of differentiating malignant tumors from benign tumors with the use of dynamic contrast curves along with other physical or non-physical factors. An exemplary classifier is an SVM (support vector machine) (refer to "A Tutorial on Support Vector Machines for Pattern Recognition," by C. Burges, Data Mining and Knowledge Discovery, 2(2), 1-47, 1998, Kluwer Academic Publishers, Boston, with information available at the website: http://ava.technion.ac.il/karniel/CMCC/SVM-tutorial.pdf).
An example case of an SVM classifier would be training and classification of data representing two classes that are separable by a hyper-plane. A hyper-plane that separates the data satisfies

w · p + σ = 0    (12)

where · is a dot product.
The goal of training the SVM is to determine the free parameters w and σ. A scaling can always be applied to w and σ such that all the data obey the paired inequalities:

τ_j (w · p_j + σ) - 1 ≥ 0, ∀j    (13)
Equation (13) can be solved by minimizing a Lagrangian function

L(w, σ, ξ) = (1/2) ||w||^2 - Σ_j ξ_j [τ_j (w · p_j + σ) - 1]    (14)

with respect to the parameter w, and maximizing it with respect to the undetermined multipliers ξ_j ≥ 0.
After the optimization problem has been solved, the expression for w in Equation (13) can be rewritten in terms of the support vectors with non-zero coefficients, and plugged into the equation for the classifying hyper-plane to give the SVM decision function:

Ψ(p_new) = w · p_new + σ = Σ_{j=1}^{l_s} τ_j ξ_j (p_j · p_new) + σ    (15)
wherein l_s is the number of support vectors. Classification of a new vector p_new into one of the two classes (malignant and benign) is based on the sign of the decision function. Persons skilled in the art will recognize that in the non-separable case, non-linear SVMs can be used.
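By way of illustration, the training and classification steps described above could be carried out with an off-the-shelf SVM implementation. The sketch below assumes scikit-learn is available and reuses the hypothetical build_training_set helper from the earlier snippet; it is not the specific classifier of the invention.

import numpy as np
from sklearn.svm import SVC

def train_tumor_classifier(P, tau):
    # P: matrix of feature vectors p_j; tau: labels (+1 malignant, -1 benign)
    clf = SVC(kernel="linear")   # separable case; use an RBF kernel otherwise
    clf.fit(P, tau)
    return clf

def classify_region(clf, p_new):
    # The sign of the decision function Ψ(p_new) selects the class;
    # with labels {-1, +1}, a positive value corresponds to the +1 class.
    score = clf.decision_function(np.asarray(p_new).reshape(1, -1))[0]
    return "malignant" if score > 0 else "benign"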
The above-described method of tissue property inspection of a set of images (also steps 804 and 808) is applied to all the cross-time image sequences, such as 704 and 724, for cross-time tissue property inspection. It is understood that in the present invention, the cross-time image sequences go through the steps of intra-registration and inter-registration before entering step 808. One exemplary execution procedure of the steps of intra-registration and inter-registration for the exemplary sequences is applying intra-registration to sequence 704 first, then applying inter-registration to sequences 704 and 724. People skilled in the art should know that the roles of sequences 704 and 724 are exchangeable. For intra-registering sequence 704 in this particular exemplary execution procedure, select arbitrarily a set of images as the reference image set, e.g. set 706. Images of set 706 are input to terminal B (1034), and the other image sets (708 and 710) are input to terminal A (1032). The registered images of image sets (708 and 710) are obtained at terminal D (1036). For inter-registration in this particular exemplary execution procedure, images of sequence 724 are input to terminal A (1032), images of sequence 704 are input to terminal B (1034), and the registered images of sequence 724 are obtained at output terminal D (1036).
Upon the completion of step 808, multiple dynamic curves (two curves in the current exemplary case) are generated reflecting tissue properties captured in multiple cross-time image sequences (two sequences, 704 and 724, for the current exemplary case) at multiple time instances (two for the current exemplary case). It is well known that these dynamic curves provide medical professionals with valuable information regarding disease conditions (or progressions) for patients. In step 810, visualization tools are employed for medical professionals to examine concerned regions of the object (regions of interest in the images) for better diagnosis. One embodiment of such a visualization facility is illustrated in Figure 9.
There is shown in Figure 9 a computer monitor screen 900 (also 104 in Figure 2) hooked up to an image processor (102) that executes the previously described steps. On screen 900, two representative image slices 712 and 732 are shown on the left. For example, slice 712 is the first image of I_k(x, y, 1), k ∈ [1, 2, 3], across three sets (706, 708 and 710) at spatial location 1; slice 732 is the first image of I_k(x, y, 1), k ∈ [1, 2, 3], across three sets (726, 728 and 730) at spatial location 1. Breast images 902 and 912 are shown in slices 712 and 732. Breast images 902 and 912 are the images of the same cross-section of a breast.

In operation, a medical professional moves a computer mouse 906 (as a user interface) over a location 908 in slice 712. Simultaneously, a ghost mouse 916 appears at the same spatial location 918 in slice 732 as 908 in slice 712. The user can also move a computer mouse 916 (as a user interface) over a location 918 in slice 732. Simultaneously, a ghost mouse 906 appears at the same spatial location 908 in slice 712 as 918 in slice 732. In either case, two dynamic curves (924, solid, and 926, dashed) appear at the left side (922) of the screen. The exemplary curves 924 and 926 reflect different tissue properties for the same spot of a breast at two different times. For example, the image sequence containing slice 712 may be taken 6 months prior to capturing the sequence containing slice 732. The medical professional can move the mouse to other locations to examine the change of the tissue properties over time (6 months). With this visualization facility, disease progression can be readily analyzed. A simple sketch of such a linked-cursor display is given below.

People skilled in the art should understand that tissue properties could be represented by other means besides the dynamic curve plots 924 and 926. For example, tissue properties could be represented by colored angiogenesis maps. People skilled in the art also understand that multiple cross-time image sequences can be processed by the method of the current invention and multiple dynamic curves can be displayed simultaneously for medical diagnosis. The subject matter of the present invention relates to digital image processing and computer vision technologies, which is understood to mean technologies that digitally process a digital image to recognize and thereby assign useful meaning to human understandable objects, attributes or conditions, and then to utilize the results obtained in the further processing of the digital image.
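The following matplotlib sketch suggests how the linked-cursor behavior of Figure 9 might be prototyped; it is purely illustrative, and the curve_lookup function is a hypothetical placeholder for the dynamic curves produced in step 808.

import numpy as np
import matplotlib.pyplot as plt

def show_cross_time_viewer(slice_a, slice_b, curve_lookup):
    # slice_a, slice_b: corresponding 2D slices from the two sequences
    # curve_lookup(x, y) -> (curve_time1, curve_time2) for that pixel
    fig, (ax_a, ax_b, ax_c) = plt.subplots(1, 3, figsize=(12, 4))
    ax_a.imshow(slice_a, cmap="gray")
    ax_b.imshow(slice_b, cmap="gray")
    ghost_a, = ax_a.plot([], [], "r+")
    ghost_b, = ax_b.plot([], [], "r+")

    def on_move(event):
        if event.inaxes not in (ax_a, ax_b) or event.xdata is None:
            return
        x, y = event.xdata, event.ydata
        ghost_a.set_data([x], [y])   # cursor and its ghost share coordinates
        ghost_b.set_data([x], [y])
        c1, c2 = curve_lookup(int(x), int(y))
        ax_c.clear()
        ax_c.plot(c1, "-", label="time 1")
        ax_c.plot(c2, "--", label="time 2")
        ax_c.legend()
        fig.canvas.draw_idle()

    fig.canvas.mpl_connect("motion_notify_event", on_move)
    plt.show()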

Claims

CLAIMS:
1. A method for cross-time medical image inspection, comprising: accessing a plurality of medical image cross-time sequences; performing intra-registration of the plurality of medical image cross-time sequences with respect to spatial coordinates; performing inter-registration of the plurality of medical image cross-time sequences with respect to spatial coordinates; classifying tissues of different properties for the registered plurality of medical image cross-time sequences; and displaying the classified tissues.
2. A method for cross-time medical image analysis, comprising: accessing a first set of medical images of a subject captured at a first time period; accessing a second set of medical images of the subject captured at a second time period, the first and second sets each being comprised of a plurality of medical images; performing image registration by mapping the plurality of medical images of the first and second sets to predetermined spatial coordinates; performing cross-time image mapping of the first and second sets; and providing means for interactive cross-time medical image analysis.
3. The method of claim 2, wherein the step of performing image registration comprises: performing intra-registration of the plurality of medical images of the first and second sets; and performing inter-registration of the plurality of medical images of the first and second sets.
4. The method of claim 2, further comprising performing tissue property inspection of at least one of the images of the first and second sets.
5. The method of claim 2, further comprising: accessing a reference set of medical images of the subject; differencing the first and second sets with the reference set to generate a difference image set comprised of a plurality of images; segmenting the plurality of images of the difference image set to generate a plurality of images having segmented intensity pixels; applying a system identification to the plurality of images having segmented intensity pixels to generate a plurality of system parameters; and classifying the plurality of system parameters.
6. The method of claim 5, further comprising, prior to classifying the plurality of system parameters, augmenting the system parameters with physical or non-physical factors.
7. The method of claim 2, further comprising, after performing image registration, classifying tissues of different properties.
8. A method for tissue analysis of MRI contrast enhanced mammography images, comprising: accessing a mammography image set comprised of a plurality of MRI contrast enhanced mammography images taken sequentially in time; mapping the plurality of MRI images to a predetermined spatial coordinate; accessing a reference MRI mammography image set; differencing the mammography image set with the reference MRI mammography set to generate a difference image set; segmenting the difference image set to generate a plurality of images having segmented intensity pixels; applying a system identification to the plurality of images having segmented intensity pixels to generate a plurality of system parameters; and classifying the plurality of system parameters.
9. The method of claim 8, further comprising, prior to classifying the plurality of system parameters, augmenting the system parameters with physical or non-physical factors.
10. The method of claim 8, wherein the step of accessing a mammography image set comprised of a plurality of MRI contrast enhanced mammography images taken sequentially in time is accomplished by: acquiring a first plurality of MRI mammography images in a spatial order prior to injection of a contrast agent; acquiring a second plurality of MRI mammography images in a spatial order after injection of a contrast agent, the first and second plurality of MRI images having an equal number of images; and organizing the first and second plurality of MRI mammography images in a temporal order.
11. A pattern recognition method for human tissue, comprising: accessing a mammography image set comprised of a plurality of
MRI contrast enhanced mammography images taken sequentially in time; mapping the plurality of MRI mammography images to a predetermined spatial coordinate; accessing a reference MRI mammography image set; differencing the mammography image set with the reference MRI mammography set to generate a difference image set; segmenting the difference image set to generate a plurality of images having segmented intensity pixels; applying a system identification to the plurality of images having segmented intensity pixels to generate a plurality of system parameters; and classifying the plurality of system parameters into classes to detect abnormal tissue.
12. The method of claim 11, further comprising providing means for indicating a region of interest in one of the plurality of images.
13. The method of claim 12, further comprising highlighting the region of interest in the other images of the plurality of images.
PCT/US2006/049329 2005-12-30 2006-12-27 Cross-time inspection method for medical diagnosis WO2008002325A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP06851504A EP1969563A2 (en) 2005-12-30 2006-12-27 Cross-time inspection method for medical diagnosis
JP2008548693A JP2009522004A (en) 2005-12-30 2006-12-27 Follow-up inspection method for medical diagnosis

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US75515605P 2005-12-30 2005-12-30
US60/755,156 2005-12-30
US11/616,316 2006-12-27
US11/616,316 US20070160276A1 (en) 2005-12-29 2006-12-27 Cross-time inspection method for medical image diagnosis

Publications (2)

Publication Number Publication Date
WO2008002325A2 true WO2008002325A2 (en) 2008-01-03
WO2008002325A3 WO2008002325A3 (en) 2008-04-03

Family

ID=38846142

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/049329 WO2008002325A2 (en) 2005-12-30 2006-12-27 Cross-time inspection method for medical diagnosis

Country Status (3)

Country Link
EP (1) EP1969563A2 (en)
JP (1) JP2009522004A (en)
WO (1) WO2008002325A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5478832B2 (en) * 2008-03-21 2014-04-23 株式会社東芝 Medical image processing apparatus and medical image processing program
EP3307173B1 (en) * 2015-06-12 2021-08-18 Koninklijke Philips N.V. System for identifying cancerous tissue
JP6933498B2 (en) * 2016-06-06 2021-09-08 キヤノンメディカルシステムズ株式会社 Medical information processing equipment, X-ray CT equipment and medical information processing program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040167395A1 (en) * 2003-01-15 2004-08-26 Mirada Solutions Limited, British Body Corporate Dynamic medical imaging

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6317617B1 (en) * 1997-07-25 2001-11-13 Arch Development Corporation Method, computer program product, and system for the automated analysis of lesions in magnetic resonance, mammogram and ultrasound images

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040167395A1 (en) * 2003-01-15 2004-08-26 Mirada Solutions Limited, British Body Corporate Dynamic medical imaging

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FABER T L ET AL: "SPATIAL AND TEMPORAL REGISTRATION OF CARDIAC SPECT AND MR IMAGES: METHODS AND EVALUATION" RADIOLOGY, OAK BROOK,IL, US, vol. 179, no. 3, 1991, pages 857-861, XP008028789 ISSN: 0033-8419 *
LEUNG K K ET AL: "Analysis of serial MR images of joints" BIOMEDICAL IMAGING: MACRO TO NANO, 2004. IEEE INTERNATIONAL SYMPOSIUM ON ARLINGTON,VA, USA APRIL 15-18, 2004, PISCATAWAY, NJ, USA,IEEE, 15 April 2004 (2004-04-15), pages 221-224, XP010773748 ISBN: 0-7803-8389-3 *
See also references of EP1969563A2 *
SKARPATHIOTAKIS M ET AL: "DEVELOPMENT OF CONTRAST DIGITAL MAMMOGRAPHY" MEDICAL PHYSICS, AIP, MELVILLE, NY, US, vol. 29, no. 10, October 2000 (2000-10), pages 2419-2426, XP001151616 ISSN: 0094-2405 *

Also Published As

Publication number Publication date
JP2009522004A (en) 2009-06-11
WO2008002325A3 (en) 2008-04-03
EP1969563A2 (en) 2008-09-17

Similar Documents

Publication Publication Date Title
US7317821B2 (en) Automatic abnormal tissue detection in MRI images
US20070237372A1 (en) Cross-time and cross-modality inspection for medical image diagnosis
US20070160276A1 (en) Cross-time inspection method for medical image diagnosis
EP1966762A2 (en) Cross-time and cross-modality medical diagnosis
US7738683B2 (en) Abnormality detection in medical images
US7653263B2 (en) Method and system for volumetric comparative image analysis and diagnosis
Lladó et al. Automated detection of multiple sclerosis lesions in serial brain MRI
Cárdenes et al. A multidimensional segmentation evaluation for medical image data
US9235887B2 (en) Classification of biological tissue by multi-mode data registration, segmentation and characterization
US9256966B2 (en) Multiparametric non-linear dimension reduction methods and systems related thereto
EP2100275B1 (en) Comparison workflow automation by registration
Ayachi et al. Brain tumor segmentation using support vector machines
US10388017B2 (en) Advanced treatment response prediction using clinical parameters and advanced unsupervised machine learning: the contribution scattergram
US20080298657A1 (en) Computer-Aided Method for Detection of Interval Changes in Successive Whole-Body Bone Scans and Related Computer Program Program Product and System
US20070003118A1 (en) Method and system for projective comparative image analysis and diagnosis
Liu Symmetry and asymmetry analysis and its implications to computer-aided diagnosis: A review of the literature
US20070014448A1 (en) Method and system for lateral comparative image analysis and diagnosis
US8682051B2 (en) Smoothing of dynamic data sets
EP1952340A1 (en) Superimposing brain atlas images and brain images with delineation of infarct and penumbra for stroke diagnosis
Behrenbruch et al. Fusion of contrast-enhanced breast MR and mammographic imaging data
WO2008002325A2 (en) Cross-time inspection method for medical diagnosis
CN1836258B (en) Method and system for using structure tensors to detect lung nodules and colon polyps
Pathak et al. Quantitative image analysis: software systems in drug development trials
CN101390127A (en) Cross-time inspection method for medical diagnosis
Abd Hamid et al. Incorporating attention mechanism in enhancing classification of alzheimer’s disease

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06851504

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2008548693

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006851504

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 200680053361.6

Country of ref document: CN