US6246782B1 - System for automated detection of cancerous masses in mammograms - Google Patents
- Publication number
- US6246782B1 (application US08/870,709; US87070997A)
- Authority
- US
- United States
- Prior art keywords
- roi
- context data
- mammogram
- neural net
- brightness
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/92—Computer assisted medical diagnostics
- Y10S128/925—Neural network
Definitions
- the present invention relates generally to the field of computer-aided diagnosis of medical images. More specifically, the present invention discloses an automated system for detecting cancerous masses in digital mammograms.
- the prior art includes a variety of systems to detect both microcalcifications and mass lesions.
- the prior art techniques include either the use of features extracted by human observers or computer-extracted features.
- the latter features for microcalcifications include shape analysis such as compactness, Fourier descriptors of image boundaries, and average distance between calcifications applied to the extracted features.
- features used include spiculations or irregular masses that are identified by local radiating structures or analysis of gradient histograms generated by seed growing and local thresholding methods, morphologically-based features, and image texture. These features are classified using either neural networks or binary decision trees. Common to all of these approaches is limited testing, with the computationally-intensive nature of the process implied as one reason for less than full, comprehensive testing.
- Davies et al. have described a system for detection of clusters of calcifications using image analysis techniques. Their method was based on a local area thresholding process, and was also able to identify other breast structures relative to the clusters of calcification. The results of their feasibility trial were a 100% true positive classification (for clusters), with an 8% false positive rate and 0% false negatives.
- Kegelmeyer et al. have reported promising results when applying algorithms using binary decision trees and a dense feature map approach in identifying spiculated lesions and calcifications. Kegelmeyer et al. have achieved 100% sensitivity and 82% specificity when merging edge information identifying spicules with local texture measures, thus eliminating false-positive detections.
- the present approach is based on an innovative concept to detect patterns in medical images (e.g., mammograms) using a Fourier transform optical correlator for image analysis followed by a series of neural networks hosted on a digital computer to analyze the results.
- An optical processor allows higher-order bandwidth, multi-resolution, multi-orientation approaches for feature extraction and enhancement that are not feasible in real time on a digital computer.
- a hybrid optical/digital computer approach ensures sufficient processing power at a moderate cost to accommodate discriminating algorithms and yet analyze a mammogram in a matter of seconds.
- This invention provides a system for automated detection of cancerous masses in mammograms.
- the mammogram is digitized and regions of interest (ROIs) are detected using Fourier analysis (e.g., by means of an optical correlator).
- the pixels in the ROI are averaged together to create a smaller array of super-pixels that are input into a first neural net.
- Context data is extracted from the mammogram for each ROI, such as size, location, ranking, brightness, density, and relative isolation from other ROIs.
- a second neural net receives the output values from the first neural net and the context data as inputs and generates an output score indicating whether the ROI contains a cancerous mass.
- the second neural net can also be provided with context data from another view of the same breast, the same view of the other breast, or a previous mammogram for the same patient.
- FIG. 1 is a simplified block diagram of the present invention.
- FIG. 2 is a flow chart of the present invention.
- FIG. 3 is a section of a mammogram showing a region of interest that is potentially cancerous.
- FIG. 4 is a section of a mammogram corresponding to FIG. 3 that has been overlaid with a super-pixel grid.
- FIG. 5 is a section of a mammogram corresponding to FIG. 3 after the region of interest has been reduced to super-pixels.
- FIG. 6 is a section of a mammogram corresponding to FIG. 3 after the region of interest has been enhanced.
- FIG. 7 is an example of a report consisting of two views in a mammogram with the suspicious regions of interest marked.
- FIG. 1 provides a simplified block diagram of the overall system. A corresponding flow chart of the process used in the present invention is illustrated in FIG. 2 .
- the first step in the process shown in FIG. 2 is acquisition of a digital mammogram 20 for analysis.
- a wide variety of hardware 11 can be used for image acquisition.
- a high-resolution scanner can be used to scan a conventional mammogram film, or a direct digital image can be acquired.
- Mammograms are typically paired by views (mediolateral and craniocaudal).
- raw digitized images 20 are received by the workstation. These images are raw, single byte images in the 35-70 micron resolution range. Either a fixed size is used for the image or information on the pixel width and height of the image is included. A header is added to the image that allows the system to know the size and data type of the image.
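The header step above can be sketched in Python. The field layout (magic tag, width, height, bytes per pixel) and the `MAMO` tag are hypothetical; the patent says only that the header must let the system recover the size and data type of the raw image.

```python
import struct

# Hypothetical header layout: 4-byte magic, 16-bit width and height,
# 8-bit bytes-per-pixel. Little-endian, 9 bytes total.
HEADER_FMT = "<4sHHB"

def add_header(raw, width, height, bpp=1):
    """Prepend a minimal header to a raw, single-byte mammogram image."""
    return struct.pack(HEADER_FMT, b"MAMO", width, height, bpp) + raw

def read_header(blob):
    """Recover image size and data type, plus the raw pixel bytes."""
    magic, w, h, bpp = struct.unpack_from(HEADER_FMT, blob)
    assert magic == b"MAMO", "not a headered mammogram image"
    return w, h, bpp, blob[struct.calcsize(HEADER_FMT):]
```

A round trip through `add_header`/`read_header` recovers the dimensions without any external database lookup.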
- a database is used to track the image, its size, and orientation (left breast, right breast, craniocaudal, or mediolateral). The quality of imagery is also examined. This is primarily a check that the image is a mammogram (i.e., it has the features of a mammogram) and that it meets specifications indicative of a mammogram ready to be read by a radiologist.
- Image processing hardware 12 reduces the scale of the image to a resolution of approximately 230 microns. This reduces the processing requirement while maintaining sufficient resolution to identify regions of interest (ROIs).
- a blurring/contrast reduction algorithm is used where the image is superimposed on the background, to reduce “edge effects” (i.e., erroneous high frequency signals created by a sudden drop-off or edge).
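The two steps above — scale reduction and edge softening — can be sketched as follows. The integer downsampling factor, the feather margin, and the linear blend toward the background level are assumptions; the patent specifies only the target resolution (about 230 microns) and the goal of suppressing spurious high-frequency edge signals.

```python
import numpy as np

def downsample(img, factor):
    """Block-average an image by an integer factor (e.g. roughly 3x to go
    from ~70 micron scans toward the ~230 micron working resolution)."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor          # crop to a multiple of factor
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def feather_edges(img, margin, background):
    """Blend a border of `margin` pixels toward the background level so a
    hard image edge does not inject erroneous high frequencies into the
    Fourier analysis."""
    out = img.astype(float).copy()
    h, w = out.shape
    for i in range(margin):
        alpha = (i + 1) / (margin + 1)             # ~0 at the border, ~1 inside
        out[i, :] = alpha * out[i, :] + (1 - alpha) * background
        out[h - 1 - i, :] = alpha * out[h - 1 - i, :] + (1 - alpha) * background
        out[:, i] = alpha * out[:, i] + (1 - alpha) * background
        out[:, w - 1 - i] = alpha * out[:, w - 1 - i] + (1 - alpha) * background
    return out
```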
- Regions of interest are identified in the image by Fourier analysis, as shown at reference number 21 in FIG. 2 . This is done by comparing four bandpassed images with the original image and looking for “peaks”. Bandpass filtering can be done in the digital domain using conventional computer hardware and signal processing algorithms. However, an optical correlator 13 such as that shown in U.S. Pat. No. 5,418,380 (Simon et al.) is used in the preferred embodiment of the present invention to decrease processing time and increase through-put. This approach selects bright spots of specific sizes and passes them on to the neural networks 14 , 15 for analysis.
- bandpass filters are used to detect feature sizes. These bandpass filters are as follows:
- the 230-micron images are correlated with each of the bandpass filters. For each of these four bandpasses, the strongest peaks are detected, then the sizes of the peaks are estimated by measuring the number of pixels around the peak that are:
- Peaks that are too small or too large relative to the scale of the bandpass are immediately rejected. A more detailed size determination is made on the remaining peaks.
- a radial size is calculated by measuring the distance from the peak in several directions until the pixel brightness is:
- FIG. 3 is an example of a section of a mammogram showing a region of interest that is potentially cancerous. These ROIs are extracted from the original high-resolution image in squares approximately 2-3 times the feature size as measured by the first criterion and scaled to a fixed size (256×256 pixels). There are eight possible ROI resolutions, ranging from a minimum size of 9.7 mm (38 micron resolution) to a maximum size of 7.8 cm (305 micron resolution).
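The stated endpoints are consistent with a 256-pixel square: 256 × 38 µm ≈ 9.7 mm and 256 × 305 µm ≈ 7.8 cm. A sketch of choosing among the eight resolutions follows; the geometric spacing between the two stated endpoints and the 2.5× span factor are assumptions, since the patent does not enumerate the intermediate resolutions.

```python
# Eight candidate ROI resolutions in microns. Only the endpoints (38 and
# 305 um) come from the text; geometric spacing between them is assumed.
RESOLUTIONS_UM = [38.0 * (305.0 / 38.0) ** (k / 7.0) for k in range(8)]

def pick_roi_resolution(feature_mm, roi_pixels=256, span_factor=2.5):
    """Pick the finest resolution whose 256-pixel square still spans
    roughly 2-3x the measured feature size (2.5x assumed here)."""
    for res in RESOLUTIONS_UM:
        if roi_pixels * res / 1000.0 >= span_factor * feature_mm:
            return res
    return RESOLUTIONS_UM[-1]
```

Small features thus stay at the finest 38-micron scale, while a 2 cm mass forces one of the coarser scales so the whole feature and its surround fit in the fixed 256×256 window.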
- Each region of interest is analyzed by means of a neural network using a super-pixeled image as an input.
- This grid is a radial-polar grid with the angles evenly spaced and constant radial increments.
- the grid consists of 10 equally-spaced radial and 32 equally-spaced angular bands.
- the pixels in each grid space are averaged together to create 320 “super-pixels” (reference numeral 24 in FIG. 2 ).
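The radial-polar averaging above can be sketched as follows. The grid has 10 equally spaced radial bands and 32 equally spaced angular bands as stated; centering the grid on the ROI center and the exact binning conventions are assumptions.

```python
import numpy as np

def superpixels(roi, n_rad=10, n_ang=32):
    """Average ROI pixels over a radial-polar grid: n_rad radial bands x
    n_ang angular bands -> n_rad * n_ang (here 320) super-pixel values."""
    h, w = roi.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.hypot(yy - cy, xx - cx)
    theta = np.arctan2(yy - cy, xx - cx) % (2 * np.pi)
    r_max = min(cy, cx)                       # largest inscribed circle
    r_idx = np.minimum((r / r_max * n_rad).astype(int), n_rad - 1)
    a_idx = np.minimum((theta / (2 * np.pi) * n_ang).astype(int), n_ang - 1)
    inside = r <= r_max
    sums = np.zeros((n_rad, n_ang))
    counts = np.zeros((n_rad, n_ang))
    np.add.at(sums, (r_idx[inside], a_idx[inside]), roi[inside])
    np.add.at(counts, (r_idx[inside], a_idx[inside]), 1)
    return sums / np.maximum(counts, 1)       # empty bins (if any) stay 0
```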
- the super-pixels, along with two inputs determined from the approximate size of the feature, are used as the inputs to the first neural net 14 .
- FIG. 4 is a section of a mammogram corresponding to FIG. 3 that has been overlaid with a super-pixel grid.
- FIG. 5 is a section of a mammogram corresponding to FIG. 3 after the region of interest has been reduced to super-pixels.
- the first neural network 14 is trained on both cancerous and non-cancerous regions of interest.
- Regions of Interest are selected from known (verified normal or verified cancerous) mammograms. These ROIs are ranked according to their appearance, with +1 being most normal-like and −1 being most lesion-like.
- the ROIs are then converted to super-pixels for input to the first neural network. A random scaling, rotation, and translation of the super-pixel grid is performed, and half of the image (angularly) is chosen in a way that excludes any parts of the ROI outside the original area of the mammogram (these edges are generated by superimposing an odd-sized image on a fixed-size background).
- the values of the half-grid are normalized in two steps: A stretching of the values over a fixed range increases the contrast within the half-grid, while a normalization of the sum of all the half-grid inputs assures consistency in brightness over all possible half-grids. A statistical thresholding of the ROI is used to determine the approximate “size” of the main feature in the ROI. This size measurement is converted to an equivalent diameter, perturbed by the same scaling perturbation used on the super-pixel grid, then converted into a pair of inputs, using a sine-cosine transformation. These two values, along with the grid super-pixels, are input to the first neural network using a randomly initialized set of neural network weights.
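The two-step normalization and the sine-cosine size encoding can be sketched as follows. The stretch range [0, 1], the unit-sum brightness target, and the quarter-turn mapping of the diameter are assumptions; the patent states the steps but not the exact constants.

```python
import numpy as np

def normalize_half_grid(values, eps=1e-9):
    """Two-step normalization: stretch to [0, 1] to increase contrast within
    the half-grid, then rescale so the inputs sum to a fixed total (1 here)
    for consistent brightness across all possible half-grids."""
    v = np.asarray(values, dtype=float)
    v = (v - v.min()) / max(v.max() - v.min(), eps)   # contrast stretch
    return v / max(v.sum(), eps)                      # brightness normalization

def size_to_inputs(diameter_mm, max_mm=40.0):
    """Encode the equivalent diameter as a (sin, cos) pair. Mapping the
    diameter onto a quarter turn, capped at an assumed 40 mm maximum, is
    one plausible sine-cosine transformation; the patent gives no formula."""
    phase = 0.5 * np.pi * min(diameter_mm / max_mm, 1.0)
    return np.sin(phase), np.cos(phase)
```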
- the output of the first neural network (restricted between +1 and −1) is compared with the chosen ranking.
- An error value is calculated and used to correct the neural network weights using a back-propagation learning algorithm known as REM (Recursive Error Minimization).
- the REM algorithm uses the derivatives of the propagated error to create a more stable convergence than conventional back-propagation.
- a new set of perturbations is then applied to the next ROI, creating a new half-grid, which is again normalized and applied to the network. This process is repeated until the RMS of the error values drops to a small value, or the change in the error over several iterations becomes very small, whichever occurs first. This signifies that the network is now trained.
- the number of iterations depends on several conditions, but in general will take about 1000-3000 cycles through the complete set of ROIs.
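The stopping rule described above — stop when the RMS error is small, or when it stops changing over several iterations, whichever occurs first — can be sketched as below. The numeric tolerances and window length are assumptions.

```python
def should_stop(rms_history, rms_tol=0.05, delta_tol=1e-4, window=5):
    """Training is done when the latest RMS error is below rms_tol, or when
    the RMS error has varied by less than delta_tol over the last `window`
    iterations, whichever occurs first. Thresholds are assumed values."""
    if not rms_history:
        return False
    if rms_history[-1] < rms_tol:            # error has dropped to a small value
        return True
    if len(rms_history) >= window:           # error has plateaued
        recent = rms_history[-window:]
        return max(recent) - min(recent) < delta_tol
    return False
```

In the loop described in the text, this check would run once per full pass through the set of ROIs, typically terminating after about 1000-3000 cycles.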
- a set of ROIs selected from “unknown” mammograms is processed through the neural network (reference numeral 25 in FIG. 2 ). No perturbations are done on these inputs, although the grid is rotated through all possible angles, one angular grid space at a time, such that no half-grid includes any “edges”. Each half-grid in turn is normalized as above, and applied to the trained network along with the two size inputs from the ROI. The outputs for the rotations are combined, producing two or three statistical outputs for each ROI. These outputs are then used as part of the set of inputs for the second neural network 15 , as discussed below.
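Combining the per-rotation outputs into "two or three statistical outputs" might look like the following; the specific statistics (mean, minimum as the most lesion-like rotation, and spread) are an assumed choice, as the patent does not name them.

```python
import numpy as np

def rotation_statistics(outputs):
    """Reduce the first network's outputs over all grid rotations of one ROI
    to three summary statistics: mean response, most lesion-like response
    (minimum, since -1 is lesion-like), and spread across rotations."""
    o = np.asarray(outputs, dtype=float)
    return o.mean(), o.min(), o.std()
```

These summary values then join the context data as inputs to the second neural network.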
- FIG. 6 is a section of a mammogram corresponding to FIG. 3 after the ROI has been enhanced to emphasize the contours and lines of the features in the image.
- these enhanced images are used, along with the original ROIs, to generate context inputs for the second neural net 15 (or “context network”).
- the context inputs for the context network 15 include the location, ranking, brightness and density profiles, sizes of inner and outer features, relative isolation from other ROIs, roughness of contours, fuzziness, and the number and length of any spicules.
- the distance of the center of each ROI from the nipple is determined by:
- the context inputs to the second neural net 15 are calculated by using sine and cosine transformations of the parameters defined above.
- the context neural net 15 is initially trained, again using verified cancerous and non-cancerous cases with random deviations for each input to generate training examples.
- the context inputs for each ROI are determined, including the outputs from the trained first neural network 14 .
- Each input is perturbed by a random value (e.g., ±10%) and then converted into a pair of inputs using a sine-cosine transformation.
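This perturb-then-encode step for generating training examples can be sketched as below. The ±10% jitter follows the example in the text; the quarter-turn sine-cosine mapping and the per-input maximum are assumptions.

```python
import math
import random

def perturb_and_encode(value, max_value, jitter=0.10, rng=None):
    """Perturb a context input by a random factor in [1-jitter, 1+jitter]
    (the text's e.g. +/-10%), then encode it as a (sin, cos) pair on an
    assumed quarter-turn mapping, clamped to the valid range."""
    rng = rng or random.Random()
    v = value * rng.uniform(1 - jitter, 1 + jitter)
    phase = 0.5 * math.pi * min(max(v / max_value, 0.0), 1.0)
    return math.sin(phase), math.cos(phase)
```

Because sin and cos are bounded and jointly determine the phase, the pair gives the network a smooth, range-limited representation of each perturbed scalar.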
- the full set of inputs is applied to the second neural network 15 using a randomly initialized set of neural network weights.
- the output of the second neural network 15 is compared to the same ranking value used for the first neural network 14 .
- the weights are corrected by the method cited above, and the second network 15 is trained to convergence using about 1000-3000 cycles through the set of ROIs.
- the “unknown” ROIs are passed once through the trained second network 15 without perturbations.
- the output of this network is then combined with that of other ROIs belonging to the same mammogram (including other views, other breast and/or other years) to characterize the ROI and the mammogram.
- the trained context neural network 15 can then be used to evaluate unknown cases (reference numeral 26 in FIG. 2 ).
- the context neural net 15 is a second network separate from the first neural network 14 used in step 3.
- the context neural net 15 could be substituted.
- a single neural net could receive both the ROI super-pixels and context data as inputs.
- an ROC curve is generated from the output score produced by the second neural network in response to sets of known mammograms.
- An appropriate threshold output score can be determined from the resulting ROC curve which will achieve the desired probability of detection and false alarm rate.
- This threshold can then be used to classify unknown cases during normal operation of the system (reference numeral 27 in FIG. 2 ).
- the threshold score is used by the decision software 16 , as depicted in FIG. 1, to generate a report for the radiologist.
- FIG. 7 is an example of a report consisting of two views in a mammogram with the suspicious regions of interest marked. This report typically shows both views of the mammogram, along with a classification determined by the analysis of the neural networks. The locations of suspect masses are marked on the display.
- the final classifications might include the following:
- a “Suspect” mammogram has at least one ROI detected on either view with a value less than a predetermined threshold (i.e., more lesion-like) but greater than a fixed level (e.g., 0.2) below the threshold.
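The banding rule for "Suspect" can be sketched as below. Only the "Suspect" band is defined in the text; treating scores more than the fixed level below the threshold as "Cancer" and everything at or above the threshold as "Normal" is an inference about the other classifications.

```python
def classify_roi(score, threshold, band=0.2):
    """Band an ROI score (more negative = more lesion-like). 'Suspect'
    follows the text: below the threshold but within `band` of it. The
    'Cancer' and 'Normal' bands are inferred, not quoted."""
    if score < threshold - band:
        return "Cancer"
    if score < threshold:
        return "Suspect"
    return "Normal"
```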
- the present system can be used by a radiologist in either of two modes of operation.
- the system merely aids the radiologist by flagging any ROIs falling into the “Suspect” or “Cancer” classifications.
- the radiologist continues to thoroughly review each mammogram and remains responsible for diagnosis.
- the system is equipped with an image archive 18 shown in FIG. 1 .
- the archive can store each mammogram or only those mammograms designated by the radiologist.
- in screening mode, the system analyzes each mammogram and only those classified as “Suspect” or “Cancer” are brought to the attention of the radiologist for further review. “Normal” mammograms are sent directly to the image archive 18 for archival storage.
- a number of additional types of context data can be input into the context neural net to enhance the accuracy of the system.
- the ROIs can be analyzed in the context of information from the other view of same breast. This step can be combined with the previous step using the second neural network 15 as described above, or performed separately using a third neural network.
- the additional context data can include information on the size and brightness of the ROI under question as well as other ROIs in the same breast and ROIs in the alternate view which have nearly the same perpendicular distance to the nipple. For training purposes, each value is allowed to vary over a specified range to gain additional test cases.
- the second neural network 15 is then trained using both cancerous and non-cancerous ROIs. The resulting connections are then used to evaluate ROIs outside of the training set.
- the ROIs can be analyzed in the context of the same view of other breast for the same patient.
- This step is implemented by performing the same operations on the other breast, then cross-correlating the information to determine asymmetries between the two breasts.
- additional context inputs can be input into the context neural net 15 corresponding to the sizes and density network results for ROIs in similar locations on the other breast. These additions should produce a significant reduction in the false alarm rate, since it is very unlikely that cancers would symmetrically develop in both breasts.
- This implementation is conceptually identical to the use of context inputs derived from alternative views of the same breast as previously discussed, but requires additional sets of mammograms for adequate training of the context neural net 15 .
- the ROIs can also be analyzed in the context of prior-year mammograms for the same patient. Additional context inputs can be input into the context neural net 15 corresponding to ROIs in similar locations in prior-year mammograms for the same patient. This is similar to the use of context inputs derived from contemporaneous alternate views, as described above, but instead relies on detecting changes that have occurred since previous mammograms. These additional context inputs should produce a significant reduction in the false alarm rate in light of the temporal development properties of cancers in contrast to non-cancers. Here again, this implementation requires additional sets of mammograms for adequate training of the context neural net 15.
- the present invention could be employed to detect abnormalities in a wide variety of medical images.
- the present system could be used to detect cancer or other types of lesions in a chest x-ray film; or cancerous and pre-cancerous cells in a pap smear or biopsy.
Landscapes
- Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Image Analysis (AREA)
Abstract
A system for automated detection of cancerous masses in mammograms initially identifies regions of interest (ROIs) using Fourier analysis (e.g., by means of an optical correlator). Context data is extracted from the mammogram for each ROI, such as size, location, ranking, brightness, density, and relative isolation from other ROIs. The pixels in the ROI are averaged together to create a smaller array of super-pixels, which are input into a first neural net. A second neural net receives the output values from the first neural net and the context data as inputs and generates an output score indicating whether the ROI contains a cancerous mass. The second neural net can also be provided with context data from another view of the same breast, the same view of the other breast, or a previous mammogram for the same patient.
Description
1. Field of the Invention
The present invention relates generally to the field of computer-aided diagnosis of medical images. More specifically, the present invention discloses an automated system for detecting cancerous masses in digital mammograms.
2. Statement of the Problem
While the problem of reducing breast cancer mortality is substantial, mammography provides an important tool for early detection. False negative rates are high, however, resulting largely from the varying patterns of breast tissue and their ability to disguise a cancer. Systems that can rapidly scan and analyze many mammograms can help radiologists reduce such errors. However, to date, such systems have not been practical for clinical use because they have neither achieved sufficient performance (sensitivity and specificity) nor the required processing speed for analyzing mammograms in a cost-effective, near real-time manner.
Since no two breast lesions look the same, computer-aided diagnosis of mammograms must be capable of generalizing to make correct decisions on data patterns of lesions that the computer has never before experienced. This is analogous in many ways to locating camouflaged targets in that neither “target” is well defined, and detection may require looking for large scale features and distortions of the overall scene (or total breast).
The prior art in the field includes the following:
Doi et al. “Computer-aided Diagnosis: Development of Automated Schemes for Quantitative Analysis of Radiographic Images,” Seminars in Ultrasound, CT, and MRI 1992, 13:140-152.
Nishikawa et al., “Performance of Automated CAD Schemes for the Detection and Classification of Clustered Microcalcifications,” In: Digital Mammography, Elsevier Science B. V., The Netherlands, 1994, 13-20.
Davies et al., “Automatic Computer Detection of Clustered Calcifications in Digital Mammograms,” Phy. Med. Biol. 1990, 35:8:1111-1118.
Ng et al., “Automated Detection and Classification of Breast Tumors,” Computers and Biomedical Research 1992, 25:218-237.
Kegelmeyer, Jr. et al., “Dense Feature Maps for Detection of Calcifications,” In: Digital Mammography, Elsevier Science B. V., The Netherlands, 1994, 3-12.
Brettle et al., “Automatic Microcalcification Localisation using Matched Fourier Filtering,” Digital Mammography, Elsevier Science B. V., The Netherlands, 1994, pp. 21-30.
The prior art includes a variety of systems to detect both microcalcifications and mass lesions. For microcalcification or mass classification, the prior art techniques include either the use of features extracted by human observers or computer-extracted features. The latter features for microcalcifications include shape analysis such as compactness, Fourier descriptors of image boundaries, and average distance between calcifications applied to the extracted features. For mass classification methods, features used include spiculations or irregular masses that are identified by local radiating structures or analysis of gradient histograms generated by seed growing and local thresholding methods, morphologically-based features, and image texture. These features are classified using either neural networks or binary decision trees. Common to all of these approaches is limited testing, with the computationally-intensive nature of the process implied as one reason for less than full, comprehensive testing.
Researchers associated with the University of Chicago (or Arch Development Corp.) have been major contributors in the development of computer-aided diagnosis for mammogram analysis (e.g., Doi et al.). This group has developed an automatic scheme for detecting clustered microcalcifications and is evaluating a method for classification of clusters. One approach involves using a linear filter to improve the signal-to-noise ratio of microcalcifications. Signal-extraction criteria are imposed to distinguish true lesions from artifacts. The computer then indicates locations that may contain clusters of microcalcifications on the film. These investigators were able to demonstrate that use of the program improved radiologists' accuracy in detecting clustered microcalcifications. In the detection computer-aided diagnosis scheme, 78 digitized mammograms (half with, half without clusters) were processed and achieved 87% sensitivity to detection of clusters.
Davies et al. have described a system for detection of clusters of calcifications using image analysis techniques. Their method was based on a local area thresholding process, and was also able to identify other breast structures relative to the clusters of calcification. The results of their feasibility trial were a 100% true positive classification (for clusters), with an 8% false positive rate and 0% false negatives.
A more complex automated model is described by Ng et al. for the detection and classification of stellate lesions and circumscribed lesions. The relative strength of this system is that it handles both stellate and circumscribed lesions, which appear as circular, bright masses with fuzzy boundaries and may therefore be difficult for an automated system to distinguish. Their pilot trial showed a high detection rate with a low false positive rate.
Kegelmeyer et al. have reported promising results when applying algorithms using binary decision trees and a dense feature map approach in identifying spiculated lesions and calcifications. Kegelmeyer et al. have achieved 100% sensitivity and 82% specificity when merging edge information identifying spicules with local texture measures, thus eliminating false-positive detections.
Brettle et al. demonstrated a true positive success rate of 100% and false positive and false negative rates of 0% each when using matched Fourier filtering in the frequency domain of mammographic images for detecting micro-calcification clusters. This application only used 15 segments of images, seven containing microcalcifications and eight without microcalcifications.
Other efforts by the National Institutes of Health and the National Science Foundation include: (i) gray scale image processing for better presentation of the image to the radiologist; (ii) applications of solutions of large-scale constrained optimization problems; (iii) adapting eye dwell time as a cuer; (iv) spatial filters and signal extraction techniques for detection of microcalcifications; and (v) noise smoothing, edge enhancement, and structured background correction methods for detection of the mass boundary, with characteristics such as size, density, edge sharpness, calcifications, shape, lobulation, and spiculation extracted from the mass and used for classification.
All of the aforementioned reports indicate some level of success, though most use single-scale techniques rather than the multiscale approach that is essential to a high-confidence result. From the data presented, it is difficult to compare the various results since there is no common data set. It is also difficult to assess the complexity of each data set.
3. Solution to the Problem
The present approach is based on an innovative concept to detect patterns in medical images (e.g., mammograms) using a Fourier transform optical correlator for image analysis followed by a series of neural networks hosted on a digital computer to analyze the results. An optical processor allows higher-order bandwidth, multi-resolution, multi-orientation approaches for feature extraction and enhancement that are not feasible in real time on a digital computer. A hybrid optical/digital computer approach ensures sufficient processing power at a moderate cost to accommodate discriminating algorithms and yet analyze a mammogram in a matter of seconds.
This invention provides a system for automated detection of cancerous masses in mammograms. The mammogram is digitized and regions of interest (ROIs) are detected using Fourier analysis (e.g., by means of an optical correlator). The pixels in the ROI are averaged together to create a smaller array of super-pixels that are input into a first neural net. Context data is extracted from the mammogram for each ROI, such as size, location, ranking, brightness, density, and relative isolation from other ROIs. A second neural net receives the output values from the first neural net and the context data as inputs and generates an output score indicating whether the ROI contains a cancerous mass. The second neural net can also be provided with context data from another view of the same breast, the same view of the other breast, or a previous mammogram for the same patient.
These and other advantages, features, and objects of the present invention will be more readily understood in view of the following detailed description and the drawings.
The present invention can be more readily understood in conjunction with the accompanying drawings, in which:
FIG. 1 is a simplified block diagram of the present invention.
FIG. 2 is a flow chart of the present invention.
FIG. 3 is a section of a mammogram showing a region of interest that is potentially cancerous.
FIG. 4 is a section of a mammogram corresponding to FIG. 3 that has been overlaid with a super-pixel grid.
FIG. 5 is a section of a mammogram corresponding to FIG. 3 after the region of interest has been reduced to super-pixels.
FIG. 6 is a section of a mammogram corresponding to FIG. 3 after the region of interest has been enhanced.
FIG. 7 is an example of a report consisting of two views in a mammogram with the suspicious regions of interest marked.
FIG. 1 provides a simplified block diagram of the overall system. A corresponding flow chart of the process used in the present invention is illustrated in FIG. 2.
1. Acquire Digital Image of Mammogram.
The first step in the process shown in FIG. 2 is acquisition of a digital mammogram 20 for analysis. A wide variety of hardware 11 can be used for image acquisition. For example, a high-resolution scanner can be used to scan a conventional mammogram film, or a direct digital image can be acquired. However, it is important that certain minimum standards be met. It is anticipated that 70 micron, 12 bit (reduced to 8 bit) digitization will be required if the present system is used for the purpose of aiding a radiologist in evaluating mammograms, and that 35 micron, 12 bit (reduced to 8 bit) digitization will be required for automated screening of mammograms.
Mammograms are typically paired by views (mediolateral and craniocaudal). As illustrated in FIG. 2, raw digitized images 20 are received by the workstation. These images are raw, single byte images in the 35-70 micron resolution range. Either a fixed size is used for the image or information on the pixel width and height of the image is included. A header is added to the image that allows the system to know the size and data type of the image. A database is used to track the image, its size, and orientation (left breast, right breast, craniocaudal, or mediolateral). The quality of imagery is also examined. This is primarily a check that the image is a mammogram (i.e., it has the features of a mammogram) and that it meets specifications indicative of a mammogram ready to be read by a radiologist.
2. Identify Regions of Interest.
Regions of interest are identified in the image by Fourier analysis, as shown at reference number 21 in FIG. 2. This is done by comparing four bandpassed images with the original image and looking for “peaks”. Bandpass filtering can be done in the digital domain using conventional computer hardware and signal processing algorithms. However, an optical correlator 13 such as that shown in U.S. Pat. No. 5,418,380 (Simon et al.) is used in the preferred embodiment of the present invention to decrease processing time and increase throughput. This approach selects bright spots of specific sizes and passes them on to the neural networks 14, 15 for analysis.
Four bandpass filters are used to detect feature sizes. These bandpass filters are as follows:
(a) 38 micron resolution—detects features approximately in the 0.5-5 mm range. The peak detector eliminates anything smaller than about 2.5 mm.
(b) 76 micron resolution—detects feature sizes approximately in the 2.5 mm-1 cm range.
(c) 152 micron resolution—detects feature sizes approximately in the 5 mm-2 cm range.
(d) 305 micron resolution—detects feature sizes approximately in the 1 cm-4 cm range.
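The patent performs these bandpasses with an optical correlator and does not specify digital filter kernels. Purely as an illustrative digital stand-in, a difference-of-Gaussians filter can realize each band; the sigma-to-feature-size ratio of 4 below is an assumption, not taken from the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_mm(image, pixel_mm, lo_mm, hi_mm):
    """Difference-of-Gaussians bandpass: keeps features roughly between
    lo_mm and hi_mm in size (sigma ~ feature size / 4 is an assumption)."""
    lo_sigma = lo_mm / pixel_mm / 4.0
    hi_sigma = hi_mm / pixel_mm / 4.0
    return gaussian_filter(image, lo_sigma) - gaussian_filter(image, hi_sigma)

# The four bands from the text: (working pixel size in mm, feature range in mm).
BANDS = [(0.038, 0.5, 5.0), (0.076, 2.5, 10.0),
         (0.152, 5.0, 20.0), (0.305, 10.0, 40.0)]

def bandpass_pyramid(image, pixel_mm):
    """Apply all four bands to an image already resampled to pixel_mm."""
    return [bandpass_mm(image, pixel_mm, lo, hi)
            for px, lo, hi in BANDS if px == pixel_mm]
```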
The 230-micron images are correlated with each of the bandpass filters. For each of these four bandpasses, the strongest peaks are detected, then the sizes of the peaks are estimated by measuring the number of pixels around the peak that are:
(a) at least 30% of the peak brightness of the bandpassed image; and
(b) at least 70% of the brightness of the original image measured at the peak location; and
(c) at least 20% of the maximum brightness of the original image.
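The three brightness criteria above translate directly into a pixel mask. The sketch below is illustrative only; the ±16-pixel search window around the peak is an assumed detail not stated in the text:

```python
import numpy as np

def peak_size_pixels(bandpassed, original, peak_rc, window=16):
    """Estimate peak size by counting pixels near the peak that meet all
    three brightness criteria from the text."""
    r, c = peak_rc
    peak_bp = bandpassed[r, c]
    peak_orig = original[r, c]
    mask = ((bandpassed >= 0.30 * peak_bp) &        # (a) 30% of peak brightness
            (original >= 0.70 * peak_orig) &        # (b) 70% of original at peak
            (original >= 0.20 * original.max()))    # (c) 20% of image maximum
    # Restrict to a window around the peak so distant bright areas don't count.
    r0, r1 = max(r - window, 0), r + window + 1
    c0, c1 = max(c - window, 0), c + window + 1
    return int(mask[r0:r1, c0:c1].sum())
```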
Peaks that are too small or too large relative to the scale of the bandpass are immediately rejected. A more detailed size determination is made on the remaining peaks. A radial size is calculated by measuring the distance from the peak in several directions until the pixel brightness is:
(a) less than 1/exp(2) of the peak brightness; or
(b) less than 70% of the brightness of the original image measured at the peak location; or
(c) greater than 5% higher than the next pixel closer to the peak (“peak overlap”).
Peaks that are too small or too large are eliminated, as are peaks lower in actual brightness than the surrounding area (“dark spots”), elongated features (maximum extent significantly greater than minimum extent), and peaks which are too close to other peaks. Peaks that survive the selection criteria are then used to select regions of interest (ROIs). FIG. 3 is an example of a section of a mammogram showing a region of interest that is potentially cancerous. These ROIs are extracted from the original high-resolution image in squares approximately 2-3 times feature size as measured by the first criteria and scaled to a fixed size (256×256 pixels). There are eight possible ROI resolutions, ranging from a minimum size of 9.7 mm (38 micron resolution) to a maximum size of 7.8 cm (305 micron resolution).
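The extraction step can be sketched as a crop-and-rescale. The 2.5× crop factor (within the stated 2-3× range) and the nearest-neighbour resampling are assumptions made for brevity:

```python
import numpy as np

def extract_roi(image, center_rc, feature_size_px, out_size=256):
    """Cut a square about 2.5x the feature size around a surviving peak and
    rescale it to a fixed out_size x out_size ROI."""
    half = int(round(1.25 * feature_size_px))
    r, c = center_rc
    h, w = image.shape
    r0, r1 = max(r - half, 0), min(r + half, h)
    c0, c1 = max(c - half, 0), min(c + half, w)
    crop = image[r0:r1, c0:c1]
    # Nearest-neighbour resample to the fixed ROI size.
    rows = np.linspace(0, crop.shape[0] - 1, out_size).round().astype(int)
    cols = np.linspace(0, crop.shape[1] - 1, out_size).round().astype(int)
    return crop[np.ix_(rows, cols)]
```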
3. Analyze Regions of Interest.
Each region of interest is analyzed by a neural network using a super-pixeled image as input. The super-pixels are defined by overlaying the ROI with a radial-polar grid having evenly spaced angles and constant radial increments. In the preferred embodiment, the grid consists of 10 equally-spaced radial and 32 equally-spaced angular bands. The pixels in each grid space are averaged together to create 320 “super-pixels” (reference numeral 24 in FIG. 2). The super-pixels, along with two inputs determined from the approximate size of the feature, are used as the inputs to the first neural net 14. For example, FIG. 4 is a section of a mammogram corresponding to FIG. 3 that has been overlaid with a super-pixel grid, and FIG. 5 shows the same region after reduction to super-pixels.
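The super-pixel averaging can be sketched as below. The grid center (ROI center) and the equal-width binning conventions are assumptions; the patent states only the 10 × 32 radial-polar layout:

```python
import numpy as np

def radial_polar_superpixels(roi, n_radial=10, n_angular=32):
    """Average ROI pixels within each radial-polar grid cell
    (10 x 32 = 320 super-pixels in the preferred embodiment)."""
    h, w = roi.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.indices(roi.shape)
    r = np.hypot(y - cy, x - cx)
    theta = np.arctan2(y - cy, x - cx) % (2 * np.pi)
    # Assign every pixel to a (radial, angular) bin.
    r_bin = np.minimum((r / r.max() * n_radial).astype(int), n_radial - 1)
    t_bin = np.minimum((theta / (2 * np.pi) * n_angular).astype(int),
                       n_angular - 1)
    sums = np.zeros((n_radial, n_angular))
    counts = np.zeros((n_radial, n_angular))
    np.add.at(sums, (r_bin, t_bin), roi)
    np.add.at(counts, (r_bin, t_bin), 1)
    return sums / np.maximum(counts, 1)  # mean brightness per grid cell
```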
The first neural network 14 is trained on both cancerous and non-cancerous regions of interest. Regions of Interest (ROIs) are selected from known (verified normal or verified cancerous) mammograms. These ROIs are ranked according to their appearance, with +1 being most normal-like and −1 being most lesion-like. The ROIs are then converted to super-pixels for input to the first neural network. A random scaling, rotation and translation of the super-pixel grid is performed, and half of the image (angularly) is chosen in a way that excludes any parts of the ROI outside the original area of the mammogram (These edges are generated by superimposing an odd-sized image on a fixed-sized background). The values of the half-grid are normalized in two steps: A stretching of the values over a fixed range increases the contrast within the half-grid, while a normalization of the sum of all the half-grid inputs assures consistency in brightness over all possible half-grids. A statistical thresholding of the ROI is used to determine the approximate “size” of the main feature in the ROI. This size measurement is converted to an equivalent diameter, perturbed by the same scaling perturbation used on the super-pixel grid, then converted into a pair of inputs, using a sine-cosine transformation. These two values, along with the grid super-pixels, are input to the first neural network using a randomly initialized set of neural network weights. The output of the first neural network (restricted between +1 and −1) is compared with the chosen ranking. An error value is calculated and used to correct the neural network weights using a back-propagation learning algorithm known as REM (Recursive Error Minimization). The REM algorithm uses the derivatives of the propagated error to create a more stable convergence than conventional back-propagation. A new set of perturbations is then applied to the next ROI, creating a new half-grid, which is again normalized and applied to the network. 
This process is repeated until the RMS of the error values drops to a small value, or the change in the error over several iterations becomes very small, whichever occurs first. This signifies that the network is now trained. The number of iterations depends on several conditions, but in general will take about 1000-3000 cycles through the complete set of ROIs.
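The stopping rule just described can be sketched with a plain backpropagation loop. This is a generic stand-in: the patent's REM algorithm, which uses derivatives of the propagated error for more stable convergence, is not reproduced here, and the network sizes, learning rate, and tolerances are all assumed:

```python
import numpy as np

def train_until_converged(X, y, hidden=8, lr=0.1, max_epochs=3000,
                          eps=1e-3, plateau=1e-6, window=50, seed=0):
    """Train a one-hidden-layer tanh network, stopping when the RMS error
    is small or when it stops changing over several iterations."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0, 0.5, (hidden, 1))
    history = []
    for epoch in range(max_epochs):
        h = np.tanh(X @ W1)
        out = np.tanh(h @ W2)          # output restricted between +1 and -1
        err = out - y
        rms = float(np.sqrt(np.mean(err ** 2)))
        history.append(rms)
        if rms < eps:                  # error dropped to a small value
            break
        if len(history) > window and abs(history[-window] - rms) < plateau:
            break                      # error change over several iterations tiny
        # Standard backpropagation of the error.
        d_out = err * (1 - out ** 2)
        d_h = (d_out @ W2.T) * (1 - h ** 2)
        W2 -= lr * h.T @ d_out / len(X)
        W1 -= lr * X.T @ d_h / len(X)
    return W1, W2, history
```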
Once trained, a set of ROIs selected from “unknown” mammograms is processed through the neural network (reference numeral 25 in FIG. 2). No perturbations are done on these inputs, although the grid is rotated through all possible angles, one angular grid space at a time, such that no half-grid includes any “edges”. Each half-grid in turn is normalized as above, and applied to the trained network along with the two size inputs from the ROI. The outputs for the rotations are combined, producing two or three statistical outputs for each ROI. These outputs are then used as part of the set of inputs for the second neural network 15, as discussed below.
4. Analyze Regions of Interest in Context of Mammogram.
A combination of 16 wavelet filters and two bandpass filters is applied to each region of interest to create enhanced ROIs at reference numeral 22 in FIG. 2. FIG. 6 is a section of a mammogram corresponding to FIG. 3 after the ROI has been enhanced to emphasize the contours and lines of the features in the image. At reference numeral 23 in FIG. 2, these enhanced images are used, along with the original ROIs, to generate context inputs for the second neural net 15 (or “context network”). The context inputs for the context network 15 include the location, ranking, brightness and density profiles, sizes of inner and outer features, relative isolation from other ROIs, roughness of contours, fuzziness, and the number and length of any spicules. The distance of the center of each ROI from the nipple is determined by:
(a) For the mediolateral view, locating the chest wall (if possible), and measuring the distance of the ROI from the nipple perpendicular to the chest wall;
(b) For the craniocaudal view, measuring the horizontal distance of the ROI from the nipple.
For each ROI, up to five ROIs in the alternate view are selected that are within 50% of the same distance from the nipple. Sizes and output values from the first neural network 14 are also used as inputs to the context neural net 15. The context inputs to the second neural net 15 are calculated by using sine and cosine transformations of the parameters defined above.
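The alternate-view matching and the sine-cosine encoding can be sketched as below. The patent does not give the normalization used in the transformation, so the quarter-turn mapping is an assumption:

```python
import numpy as np

def sincos_pair(value, max_value):
    """Encode a scalar context parameter as a (sin, cos) input pair.
    Mapping the normalized value onto a quarter turn is an assumed detail."""
    theta = 0.5 * np.pi * min(max(value / max_value, 0.0), 1.0)
    return float(np.sin(theta)), float(np.cos(theta))

def match_alternate_view(roi_dist, alt_dists, max_matches=5):
    """Select up to five alternate-view ROIs whose nipple distance is within
    50% of this ROI's distance, closest first."""
    cands = [(abs(d - roi_dist), i) for i, d in enumerate(alt_dists)
             if abs(d - roi_dist) <= 0.5 * roi_dist]
    return [i for _, i in sorted(cands)[:max_matches]]
```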
The context neural net 15 is initially trained, again using verified cancerous and non-cancerous cases with random deviations for each input to generate training examples. The context inputs for each ROI are determined, including the outputs from the trained first neural network 14. Each input is perturbed by a random value (e.g., +/−10%) and then converted into a pair of inputs using a sine-cosine transformation. The full set of inputs is applied to the second neural network 15 using a randomly initialized set of neural network weights. The output of the second neural network 15 is compared to the same ranking value used for the first neural network 14. The weights are corrected by the method cited above, and the second network 15 is trained to convergence using about 1000-3000 cycles through the set of ROIs.
The “unknown” ROIs are passed once through the trained second network 15 without perturbations. The output of this network is then combined with that of other ROIs belonging to the same mammogram (including other views, other breast and/or other years) to characterize the ROI and the mammogram. After training is complete, the trained context neural network 15 can then be used to evaluate unknown cases (reference numeral 26 in FIG. 2).
In the preferred embodiment of the present invention, the context neural net 15 is a second network separate from the first neural network 14 used in step 3. However, it should be expressly understood that other configurations could be substituted. For example, a single neural net could receive both the ROI super-pixels and context data as inputs.
5. Threshold Regions of Interest
After training is complete, an ROC curve is generated from the output score produced by the second neural network in response to sets of known mammograms. An appropriate threshold output score can be determined from the resulting ROC curve which will achieve the desired probability of detection and false alarm rate.
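Threshold selection from the ROC sweep can be sketched as follows, assuming the output convention stated earlier (scores near −1 are lesion-like, near +1 normal-like); the tie-breaking behavior is an assumed detail:

```python
def pick_threshold(scores, labels, min_sensitivity=1.0):
    """Sweep candidate thresholds over known-case output scores and return
    (threshold, false_alarm_rate) for the threshold that reaches the desired
    probability of detection with the lowest false alarm rate.
    labels: 1 = verified cancer, 0 = verified normal."""
    cancers = [s for s, l in zip(scores, labels) if l == 1]
    normals = [s for s, l in zip(scores, labels) if l == 0]
    best = None
    for t in sorted(set(scores)):
        sensitivity = sum(s <= t for s in cancers) / len(cancers)
        false_alarm = sum(s <= t for s in normals) / len(normals)
        if sensitivity >= min_sensitivity and (best is None
                                               or false_alarm < best[1]):
            best = (t, false_alarm)
    return best
```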
This threshold can then be used to classify unknown cases during normal operation of the system (reference numeral 27 in FIG. 2). For example, the threshold score is used by the decision software 16, as depicted in FIG. 1, to generate a report for the radiologist.
6. Create Report
A user-friendly graphical report is provided to the radiologist on a CRT or other display 17 (reference numeral 28 in FIG. 2). FIG. 7 is an example of a report consisting of two views in a mammogram with the suspicious regions of interest marked. This report typically shows both views of the mammogram, along with a classification determined by the analysis of the neural networks. The locations of suspect masses are marked on the display. For example, the final classifications might include the following:
(a) A “Suspect” mammogram has at least one ROI detected on either view with a value less than a predetermined threshold (i.e., more lesion-like), but greater than a fixed level (e.g., 0.2) below the threshold. A mammogram with such a designation needs further review by the radiologist.
(b) A “Cancer” mammogram has at least one ROI detected on either view with a value less than the minimum defined above for a “suspect.” A mammogram with such a designation needs further review by the radiologist.
(c) A “Normal” mammogram has no ROIs on either view with values less than the threshold. This mammogram needs no further review.
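The three-way classification above reduces to a simple rule over the minimum (most lesion-like) ROI score in a mammogram. Boundary handling (≤ versus <) is an assumed detail:

```python
def classify_mammogram(roi_scores, threshold, margin=0.2):
    """Classify a mammogram from its ROI scores: below the threshold is
    lesion-like; below (threshold - margin) escalates to 'Cancer'."""
    if not roi_scores or min(roi_scores) >= threshold:
        return "Normal"    # no ROI below the threshold
    if min(roi_scores) <= threshold - margin:
        return "Cancer"    # well below the threshold
    return "Suspect"       # below threshold, but within the margin
```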
7. Results
A round-robin test was run in which over 100 different pairs (two views per breast) of mammograms were analyzed. The results of this test showed that, when the threshold was set to detect 100% of the cancers, approximately 50% of the mammograms were screened, with less than 0.7 false positives per image.
Application
The present system can be used by a radiologist in either of two modes of operation. In the first mode, the system merely aids the radiologist by flagging any ROIs falling into the “Suspect” or “Cancer” classifications. The radiologist continues to review each mammogram thoroughly and remains responsible for diagnosis. The system is equipped with an image archive 18 shown in FIG. 1. The archive can store each mammogram or only those mammograms designated by the radiologist. In the second mode (“screening mode”), the system analyzes each mammogram and only those classified as “Suspect” or “Cancer” are brought to the attention of the radiologist for further review. “Normal” mammograms are sent directly to the image archive 18 for archival storage.
Other Context Inputs
In addition to the embodiment described above, a number of additional types of context data can be input into the context neural net to enhance the accuracy of the system. For example, the ROIs can be analyzed in the context of information from the other view of the same breast. This step can be combined with the previous step using the second neural network 15 as described above, or performed separately using a third neural network. The additional context data can include information on the size and brightness of the ROI in question as well as other ROIs in the same breast and ROIs in the alternate view which have nearly the same perpendicular distance to the nipple. For training purposes, each value is allowed to vary over a specified range to gain additional test cases. The second neural network 15 is then trained using both cancerous and non-cancerous ROIs. The resulting connections are then used to evaluate ROIs outside of the training set.
Similarly, the ROIs can be analyzed in the context of the same view of the other breast for the same patient. This step is implemented by performing the same operations on the other breast, then cross-correlating the information to determine asymmetries between the two breasts. For example, additional context inputs can be input into the context neural net 15 corresponding to the sizes and density network results for ROIs in similar locations on the other breast. These additions should produce a significant reduction in the false alarm rate, since it is very unlikely that cancers would develop symmetrically in both breasts. This implementation is conceptually identical to the use of context inputs derived from alternative views of the same breast as previously discussed, but requires additional sets of mammograms for adequate training of the context neural net 15.
The ROIs can also be analyzed in the context of prior-year mammograms for the same patient. Additional context inputs can be input into the context neural net 15 corresponding to ROIs in similar locations in prior-year mammograms for the same patient. This is similar to the use of context inputs derived from contemporaneous alternate views, as described above, but instead relies on detecting changes that have occurred since previous mammograms. These additional context inputs should produce a significant reduction in the false alarm rate in light of the temporal development properties of cancers in contrast to non-cancers. Here again, this implementation requires additional sets of mammograms for adequate training of the context neural net 15.
The previous discussion has been directed primarily to detection of cancerous masses in mammograms. It should be expressly understood that the present invention could be employed to detect abnormalities in a wide variety of medical images. For example, the present system could be used to detect cancer or other types of lesions in a chest x-ray film; or cancerous and pre-cancerous cells in a pap smear or biopsy.
The above disclosure sets forth a number of embodiments of the present invention. Other arrangements or embodiments, not precisely set forth, could be practiced under the teachings of the present invention and as set forth in the following claims.
Claims (47)
1. A method for automated analysis of a digitized mammogram to detect the presence of a possible cancerous mass comprising:
detecting a region of interest (ROI) using said digitized mammogram and Fourier spatial bandpass analysis, said ROI corresponding with a possible cancerous mass, wherein a plurality of spatially bandpassed images of different resolutions corresponding with said digitized mammogram are employed to identify at least one brightness peak corresponding with the ROI;
extracting context data for said ROI from said digitized mammogram, said context data comprising attribute information determined in specific relation to the ROI; and
inputting image data corresponding with said ROI and said context data to a neural net trained in relation to the attributes of cancerous tissue regions, and generating an output from said neural net indicating whether a possible cancerous tissue mass is present in said ROI.
2. The method of claim 1 wherein the step of detecting a ROI is performed by an optical processor separate from said neural net.
3. The method of claim 1 further comprising the step of averaging pixels corresponding with only said ROI to create a smaller number of super-pixels that are input to said neural net.
4. The method of claim 3 wherein said super-pixels are arranged in a radial-polar pattern.
5. The method of claim 4, wherein said radial-polar pattern comprises a first predetermined number of equally spaced radial bins and a second predetermined number of equally spaced angular bins, said radial and angular bins defining a plurality of grid spaces, and wherein for each grid space the included ROI pixels are averaged to obtain said super-pixels.
6. The method of claim 1 wherein said context data comprises the location of said ROI relative to a standard anatomical landmark in said digitized mammogram.
7. The method of claim 1 wherein said context data comprises the size of said ROI.
8. The method of claim 1 wherein said context data comprises brightness and density profiles of said ROI.
9. The method of claim 1 wherein said context data comprises the relative isolation of said ROI from other ROIs in said digitized mammogram.
10. The method of claim 1 wherein said context data comprises the number and length of any spicules in said ROI.
11. The method of claim 1 wherein said context data comprises information about other ROIs in said digitized mammogram.
12. The method of claim 1 further comprising the step of enhancing the contours of said ROI using spatial bandpass and spatial wavelet filters before extracting said context data.
13. The method of claim 1 wherein said neural net comprises:
a first neural net receiving image data corresponding with said ROI as an input and generating at least one output value; and
a second neural net receiving said output values from said first neural net and said context data as inputs and generating an output indicating whether a possible cancerous mass is present in said ROI.
14. The method of claim 13, further comprising the substep of averaging pixels in said digitized mammogram corresponding with said ROI to obtain the image data, and wherein said input to said first neural net includes information corresponding with a size of said ROI.
15. The method of claim 14, wherein said context data comprises at least two types of information selected from a group consisting of: ROI location information, ROI size-related information, ROI brightness and density profile information, ROI isolation information relative to other detected regions of interest, and ROI information relating to spicules therewithin.
16. The method of claim 15, wherein said context data is calculated from sine and cosine transformation of said at least two types of information.
17. The method of claim 1 wherein said neural net is initially trained using perturbations of a set of training images, said perturbations being generated by at least one of random scaling, rotation and translation of training images comprising said set of training images.
18. The method of claim 1, further comprising the step of reducing a resolution of said digitized mammogram for use in the detecting step.
19. The method of claim 1, further comprising the step of spatial bandpass filtering said digitized mammogram image to obtain said plurality of spatially bandpassed images, wherein a corresponding plurality of bandpass filters having differing resolutions are employed.
20. The method of claim 1, wherein said detecting step includes the substep of estimating, for at least one of said plurality of differing bandpassed images, the size of said at least one brightness peak based upon a determination of the number of pixels around said at least one brightness peak having a brightness within each of the following: a first predetermined percentage of a brightness of the peak in at least one bandpassed image; a second predetermined percentage of a brightness of the digitized mammogram measured at the corresponding peak location; and a third predetermined brightness percentage of a maximum brightness of the digitized mammogram.
21. The method of claim 20, wherein a resolution of said digitized mammogram is reduced.
22. The method of claim 20, wherein said estimating substep comprises the additional substep of calculating a radial size of said at least one brightness peak by measuring a distance from said peak in a plurality of directions until corresponding pixel brightness is one of the following: less than 1/exp(2) of the corresponding peak brightness; less than a predetermined percentage of a brightness of the digitized mammogram measured at the corresponding peak location; or greater than a predetermined percentage higher than the next pixel closer to the peak.
23. The method of claim 1, wherein said detecting step includes the substep of estimating, for at least one of said plurality of differing bandpassed images, a size of said at least one brightness peak by comparing said at least one of said plurality of differing bandpassed images with said digitized mammogram.
24. The method of claim 1, further comprising the step of applying a predetermined plurality of spatial bandpass and spatial wavelet filters to obtain an enhanced image of said ROI.
25. The method of claim 24, wherein said enhanced image is employed to obtain at least a portion of said context data.
26. A system for automated detection of cancerous masses in mammograms comprising:
means for inputting a digital mammogram;
an optical processor for detecting a region of interest (ROI) using said digital mammogram and Fourier spatial bandpass analysis, said ROI corresponding with a possible cancerous mass, wherein a plurality of spatially bandpassed images of different resolutions corresponding with said digital mammogram are employed to identify at least one brightness peak corresponding with the ROI;
means for extracting context data for said ROI from said mammogram, said context data comprising attribute information determined in specific relation to the ROI; and
a neural net trained in relation to the attributes of cancerous tissue regions, for receiving image data corresponding with said ROI and said context data as inputs and for generating an output indicating whether said ROI contains a possible cancerous mass.
27. The system of claim 26 further comprising means for averaging pixels corresponding with only said ROI to create a smaller number of super-pixels that are input to said neural net.
28. The system of claim 27 wherein said super-pixels are arranged in a radial-polar pattern.
29. The system of claim 26 wherein said context data comprises the location of said ROI relative to the nipple in said mammogram.
30. The system of claim 26 wherein said context data comprises the size of said ROI.
31. The system of claim 26 wherein said context data comprises the brightness and density profiles of said ROI.
32. The system of claim 26 wherein said context data comprises the relative isolation of said ROI from other ROIs in said mammogram.
33. The system of claim 26 wherein said context data comprises the number and length of any spicules in said ROI.
34. The system of claim 26 wherein said context data comprises information about other ROIs in said mammogram.
35. The system of claim 26 wherein said mammogram includes a plurality of views of the same breast, and wherein said context data comprises information from another view of the same breast.
36. The system of claim 26 wherein said mammogram includes views of both breasts, and wherein said context data comprises information about the other breast.
37. The system of claim 26 wherein said context data includes information from a previous mammogram for the same patient.
38. The system of claim 26 further comprising filter means for enhancing the contours of said ROI before extracting said context data.
39. The system of claim 26 wherein said neural net comprises:
a first neural net receiving said ROI as an input and generating at least one output value; and
a second neural net receiving said output values from said first neural net and said context data as inputs and generating an output indicating whether said ROI contains a cancerous mass.
40. The system of claim 26 wherein said neural net is initially trained using perturbations of a set of training images, said perturbations being generated by at least one of random scaling, rotation and translation of training images comprising said set of training images.
41. A system for automated detection of cancerous masses in mammograms comprising:
means for inputting a digital mammogram containing a first array of pixels;
an optical processor to detect a region of interest (ROI) using said digital mammogram and Fourier spatial bandpass analysis, said ROI corresponding with a possible cancerous mass, wherein a plurality of spatially bandpassed images of different resolutions corresponding with said digital mammogram are employed to identify at least one brightness peak corresponding with the ROI;
means for averaging pixels of said first array that correspond with only said ROI to create a second array of super-pixels;
means for extracting context data for said ROI from said mammogram, said context data comprising attribute information determined in specific relation to the ROI;
a first neural net, trained in relation to the attributes of cancerous tissue regions, for receiving said second array of super-pixels as an input and generating at least one output value; and
a second neural net, trained in relation to the attributes of cancerous tissue regions, for receiving said at least one output value from said first neural net and at least a portion of said context data as inputs and generating an output indicating whether said ROI contains a possible cancerous mass.
42. The system of claim 41 wherein said super-pixels are arranged in a radial-polar pattern.
43. The system of claim 41 wherein said context data comprises information about other ROIs in said mammogram.
44. The system of claim 41 wherein said mammogram includes a plurality of views of the same breast, and wherein said context data comprises information from another view of the same breast.
45. The system of claim 41 wherein said mammogram includes views of both breasts, and wherein said context data comprises information about the other breast.
46. The system of claim 41 wherein said context data includes information from a previous mammogram for the same patient.
47. The system of claim 41 further comprising spatial bandpass and spatial wavelet filter means for enhancing the contours of said ROI before extracting said context data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/870,709 US6246782B1 (en) | 1997-06-06 | 1997-06-06 | System for automated detection of cancerous masses in mammograms |
Publications (1)
Publication Number | Publication Date |
---|---|
US6246782B1 true US6246782B1 (en) | 2001-06-12 |
Family
ID=25355948
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/870,709 Expired - Lifetime US6246782B1 (en) | 1997-06-06 | 1997-06-06 | System for automated detection of cancerous masses in mammograms |
Country Status (1)
Country | Link |
---|---|
US (1) | US6246782B1 (en) |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020028006A1 (en) * | 2000-09-07 | 2002-03-07 | Novak Carol L. | Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images |
US6453058B1 (en) * | 1999-06-07 | 2002-09-17 | Siemens Corporate Research, Inc. | Computer-assisted diagnosis method using correspondence checking and change detection of salient features in digital images |
US20020196967A1 (en) * | 2001-06-13 | 2002-12-26 | Fuji Photo Film Co., Ltd. | Prospective abnormal shadow detecting system |
US20030007598A1 (en) * | 2000-11-24 | 2003-01-09 | U-Systems, Inc. | Breast cancer screening with adjunctive ultrasound mammography |
US20030039385A1 (en) * | 2001-08-20 | 2003-02-27 | Fuji Photo Film Co., Ltd. | Anomalous shadow detection apparatus |
US20030165262A1 (en) * | 2002-02-21 | 2003-09-04 | The University Of Chicago | Detection of calcifications within a medical image |
US20030231790A1 (en) * | 2002-05-02 | 2003-12-18 | Bottema Murk Jan | Method and system for computer aided detection of cancer |
WO2004047006A1 (en) * | 2002-11-20 | 2004-06-03 | Ostankovich Anatoly Aleksandro | Method for producing an information pattern of breast cancer upon mammograms |
US20040133083A1 (en) * | 2002-11-13 | 2004-07-08 | Siemens Corporate Research Inc. | System and method for real-time feature sensitivity analysis based on contextual information |
1997
- 1997-06-06 US US08/870,709 patent/US6246782B1/en not_active Expired - Lifetime
Patent Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4845762A (en) | 1984-03-07 | 1989-07-04 | Fuji Photo Film Co., Ltd. | Frequency processing method and apparatus for radiation image |
US4769850A (en) | 1985-11-26 | 1988-09-06 | International Business Machines Corporation | Pattern recognition system |
US4907156A (en) | 1987-06-30 | 1990-03-06 | University Of Chicago | Method and system for enhancement and detection of abnormal anatomic regions in a digital image |
US4839807A (en) | 1987-08-03 | 1989-06-13 | University Of Chicago | Method and system for automated classification of distinction between normal lungs and abnormal lungs with interstitial disease in digital chest radiographs |
US4851984A (en) * | 1987-08-03 | 1989-07-25 | University Of Chicago | Method and system for localization of inter-rib spaces and automated lung texture analysis in digital chest radiographs |
US4961425A (en) | 1987-08-14 | 1990-10-09 | Massachusetts Institute Of Technology | Morphometric analysis of anatomical tomographic data |
US4918534A (en) | 1988-04-22 | 1990-04-17 | The University Of Chicago | Optical image processing method and system to perform unsharp masking on images detected by an I.I./TV system |
US4912647A (en) | 1988-12-14 | 1990-03-27 | Gte Laboratories Incorporated | Neural network training tool |
US4941122A (en) | 1989-01-12 | 1990-07-10 | Recognition Equipment Incorp. | Neural network image processing system |
US5003979A (en) | 1989-02-21 | 1991-04-02 | University Of Virginia | System and method for the noninvasive identification and display of breast lesions and the like |
US5836872A (en) * | 1989-04-13 | 1998-11-17 | Vanguard Imaging, Ltd. | Digital optical visualization, enhancement, quantification, and classification of surface and subsurface features of body surfaces |
US5133020A (en) | 1989-07-21 | 1992-07-21 | Arch Development Corporation | Automated method and system for the detection and classification of abnormal lesions and parenchymal distortions in digital medical images |
US5046019A (en) | 1989-10-13 | 1991-09-03 | Chip Supply, Inc. | Fuzzy data comparator with neural network postprocessor |
US5212637A (en) | 1989-11-22 | 1993-05-18 | Stereometrix Corporation | Method of investigating mammograms for masses and calcifications, and apparatus for practicing such method |
US5463548A (en) | 1990-08-28 | 1995-10-31 | Arch Development Corporation | Method and system for differential diagnosis based on clinical and radiological information using artificial neural networks |
US5754693A (en) * | 1991-02-18 | 1998-05-19 | Sumitomo Osaka Cement Company Limited | Method of optical recognition and classification of patterns |
US5224177A (en) | 1991-10-31 | 1993-06-29 | The University Of Chicago | High quality film image correction and duplication method and system |
US5343390A (en) | 1992-02-28 | 1994-08-30 | Arch Development Corporation | Method and system for automated selection of regions of interest and detection of septal lines in digital chest radiographs |
US5574799A (en) | 1992-06-12 | 1996-11-12 | The Johns Hopkins University | Method and system for automated detection of microcalcification clusters in mammograms |
US5537485A (en) | 1992-07-21 | 1996-07-16 | Arch Development Corporation | Method for computer-aided detection of clustered microcalcifications from digital mammograms |
US5319549A (en) | 1992-11-25 | 1994-06-07 | Arch Development Corporation | Method and system for determining geometric pattern features of interstitial infiltrates in chest images |
US5359513A (en) | 1992-11-25 | 1994-10-25 | Arch Development Corporation | Method and system for detection of interval change in temporally sequential chest images |
US5365429A (en) | 1993-01-11 | 1994-11-15 | North American Philips Corporation | Computer detection of microcalcifications in mammograms |
US5513273A (en) | 1993-03-18 | 1996-04-30 | Fuji Photo Film Co., Ltd. | Method for obtaining information about interstitial patterns of the lungs |
US5491627A (en) | 1993-05-13 | 1996-02-13 | Arch Development Corporation | Method and system for the detection of microcalcifications in digital mammograms |
US5469353A (en) | 1993-11-26 | 1995-11-21 | Access Radiology Corp. | Radiological image interpretation apparatus and method |
US5513101A (en) | 1993-11-26 | 1996-04-30 | Access Radiology Corporation | Radiological image interpretation apparatus and method |
US5526394A (en) | 1993-11-26 | 1996-06-11 | Fischer Imaging Corporation | Digital scan mammography apparatus |
US5452367A (en) | 1993-11-29 | 1995-09-19 | Arch Development Corporation | Automated method and system for the segmentation of medical images |
US5825910A (en) * | 1993-12-30 | 1998-10-20 | Philips Electronics North America Corp. | Automatic segmentation and skinline detection in digital mammograms |
US5418380A (en) | 1994-04-12 | 1995-05-23 | Martin Marietta Corporation | Optical correlator using ferroelectric liquid crystal spatial light modulators and Fourier transform lenses |
US5734739A (en) * | 1994-05-31 | 1998-03-31 | University Of Washington | Method for determining the contour of an in vivo organ using multiple image frames of the organ |
US5579360A (en) | 1994-12-30 | 1996-11-26 | Philips Electronics North America Corporation | Mass detection by computer using digital mammograms of the same breast taken from different viewing directions |
US5572565A (en) | 1994-12-30 | 1996-11-05 | Philips Electronics North America Corporation | Automatic segmentation, skinline and nipple detection in digital mammograms |
US5586160A (en) | 1995-03-20 | 1996-12-17 | The Regents Of The University Of California | Automated analysis for microcalcifications in high resolution digital mammograms |
US5713364A (en) * | 1995-08-01 | 1998-02-03 | Medispectra, Inc. | Spectral volume microprobe analysis of materials |
US5790691A (en) * | 1996-03-12 | 1998-08-04 | The Regents Of The University Of Colorado | Method and apparatus for robust shape detection using a hit/miss transform |
US5799100A (en) * | 1996-06-03 | 1998-08-25 | University Of South Florida | Computer-assisted method and apparatus for analysis of x-ray images using wavelet transforms |
US5815591A (en) * | 1996-07-10 | 1998-09-29 | R2 Technology, Inc. | Method and apparatus for fast detection of spiculated lesions in digital mammograms |
Non-Patent Citations (8)
Title |
---|
Cardenosa; "Mammography: An Overview"; Digital Mammography '96, Proceedings of the 3rd International Workshop on Digital Mammography, Jun. 9-12, 1996; pp. 3-10. *
D.H. Davies and D.R. Dance; "Automated Computer Detection of Clustered Calcifications in Digital Mammograms"; Apr. 12, 1990; pp. 1111-1118; Phys. Med. Biol., 1990, vol. 35, No. 8.
D.S. Brettle, et al.; "Automatic Micro-calcification Localisation using Matched Fourier Filtering"; 1994; pp. 21-30; Elsevier Science B.V.
Hara Takeshi, et al.; Digital Mammography '96, Proceedings of the 3rd International Workshop on Digital Mammography, Jun. 9-12, 1996; pp. 257-262.
Kunio Doi, et al.; "Computer-Aided Diagnosis: Development of Automated Schemes for Quantitative Analysis of Radiographic Images"; Apr. 1992; pp. 140-152; Seminars in Ultrasound, CT, and MRI, vol. 13, No. 2.
R.M. Nishikawa, et al.; "Performance of Automated CAD Schemes for the Detection and Classification of Clustered Microcalcifications"; 1994; pp. 13-20; Elsevier Science B.V.
Shun Leung Ng and Walter F. Bischof; "Automated Detection and Classification of Breast Tumors"; Mar. 4, 1991; pp. 218-237; Computers and Biomedical Research 25.
W.P. Kegelmeyer, Jr. and M.C. Allmen; "Dense Feature Maps for Detection of Calcifications"; 1994; pp. 3-13; Elsevier Science B.V.
Cited By (97)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6453058B1 (en) * | 1999-06-07 | 2002-09-17 | Siemens Corporate Research, Inc. | Computer-assisted diagnosis method using correspondence checking and change detection of salient features in digital images |
US7574053B2 (en) | 1999-12-14 | 2009-08-11 | Definiens Ag | Method for processing data structures |
US20070112823A1 (en) * | 1999-12-14 | 2007-05-17 | Definiens Ag | Method for processing data structures |
US20020028006A1 (en) * | 2000-09-07 | 2002-03-07 | Novak Carol L. | Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images |
US6944330B2 (en) * | 2000-09-07 | 2005-09-13 | Siemens Corporate Research, Inc. | Interactive computer-aided diagnosis method and system for assisting diagnosis of lung nodules in digital volumetric medical images |
US8162833B2 (en) | 2000-11-24 | 2012-04-24 | U-Systems, Inc. | Thick-slice ultrasound images derived from ultrasonic scans of a chestwardly-compressed breast of a supine patient |
US20110098566A1 (en) * | 2000-11-24 | 2011-04-28 | U-Systems, Inc. | Thick-slice ultrasound images derived from ultrasonic scans of a chestwardly-compressed breast of a supine patient |
US9861342B2 (en) | 2000-11-24 | 2018-01-09 | U-Systems, Inc. | Adjunctive ultrasound processing and display for breast cancer screening |
US8184882B2 (en) | 2000-11-24 | 2012-05-22 | U-Systems, Inc | Full-field breast image data processing and archiving |
US7556602B2 (en) * | 2000-11-24 | 2009-07-07 | U-Systems, Inc. | Breast cancer screening with adjunctive ultrasound mammography |
US20090240150A1 (en) * | 2000-11-24 | 2009-09-24 | U-Systems, Inc. | Breast cancer screening with adjunctive ultrasound mammography |
US7940966B2 (en) | 2000-11-24 | 2011-05-10 | U-Systems, Inc. | Full-field breast image data processing and archiving |
US7615008B2 (en) * | 2000-11-24 | 2009-11-10 | U-Systems, Inc. | Processing and displaying breast ultrasound information |
US20050171430A1 (en) * | 2000-11-24 | 2005-08-04 | Wei Zhang | Processing and displaying breast ultrasound information |
US20100040274A1 (en) * | 2000-11-24 | 2010-02-18 | U-Systems, Inc. | Processing and displaying breast ultrasound information |
US20030007598A1 (en) * | 2000-11-24 | 2003-01-09 | U-Systems, Inc. | Breast cancer screening with adjunctive ultrasound mammography |
US7828733B2 (en) | 2000-11-24 | 2010-11-09 | U-Systems Inc. | Coronal and axial thick-slice ultrasound images derived from ultrasonic scans of a chestwardly-compressed breast |
US7828732B2 (en) | 2000-11-24 | 2010-11-09 | U-Systems, Inc. | Breast cancer screening with adjunctive ultrasound mammography |
US20020196967A1 (en) * | 2001-06-13 | 2002-12-26 | Fuji Photo Film Co., Ltd. | Prospective abnormal shadow detecting system |
US7242795B2 (en) * | 2001-06-13 | 2007-07-10 | Fujifilm Corporation | Prospective abnormal shadow detecting system |
US20030039385A1 (en) * | 2001-08-20 | 2003-02-27 | Fuji Photo Film Co., Ltd. | Anomalous shadow detection apparatus |
US20030165262A1 (en) * | 2002-02-21 | 2003-09-04 | The University Of Chicago | Detection of calcifications within a medical image |
US20030231790A1 (en) * | 2002-05-02 | 2003-12-18 | Bottema Murk Jan | Method and system for computer aided detection of cancer |
US7593562B2 (en) | 2002-09-24 | 2009-09-22 | Carestream Health, Inc. | Method and system for computer aided detection (CAD) cued reading of medical images |
US20060122467A1 (en) * | 2002-09-24 | 2006-06-08 | Scott Harrington | Method and system for computer aided detection (cad) cued reading of medical images |
US20040136577A1 (en) * | 2002-10-11 | 2004-07-15 | University Of Massachusetts | Optical fourier systems and methods for medical image processing |
US7508966B2 (en) * | 2002-10-11 | 2009-03-24 | University Of Massachusetts | Optical fourier systems and methods for medical image processing |
US7873223B2 (en) | 2002-10-15 | 2011-01-18 | Definiens Ag | Cognition integrator and language |
US7467159B2 (en) | 2002-10-15 | 2008-12-16 | Definiens Ag | Generating applications that analyze image data using a semantic cognition network |
US9245337B2 (en) | 2002-10-15 | 2016-01-26 | Definiens Ag | Context driven image mining to generate image-based biomarkers |
US20080008349A1 (en) * | 2002-10-15 | 2008-01-10 | Definiens Ag | Analyzing pixel data using image, thematic and object layers of a computer-implemented network structure |
US7801361B2 (en) | 2002-10-15 | 2010-09-21 | Definiens Ag | Analyzing pixel data using image, thematic and object layers of a computer-implemented network structure |
US20070036440A1 (en) * | 2002-10-15 | 2007-02-15 | Definiens Ag | Generating applications that analyze image data using a semantic cognition network |
US20040147840A1 (en) * | 2002-11-08 | 2004-07-29 | Bhavani Duggirala | Computer aided diagnostic assistance for medical imaging |
US7244230B2 (en) | 2002-11-08 | 2007-07-17 | Siemens Medical Solutions Usa, Inc. | Computer aided diagnostic assistance for medical imaging |
US7087018B2 (en) * | 2002-11-13 | 2006-08-08 | Siemens Medical Solutions Usa, Inc. | System and method for real-time feature sensitivity analysis based on contextual information |
US20040133083A1 (en) * | 2002-11-13 | 2004-07-08 | Siemens Corporate Research Inc. | System and method for real-time feature sensitivity analysis based on contextual information |
WO2004047006A1 (en) * | 2002-11-20 | 2004-06-03 | Ostankovich Anatoly Aleksandro | Method for producing an information pattern of breast cancer upon mammograms |
US20080130833A1 (en) * | 2002-11-29 | 2008-06-05 | Shih-Ping Wang | Thick-slice display of medical images |
US20060153434A1 (en) * | 2002-11-29 | 2006-07-13 | Shih-Ping Wang | Thick-slice display of medical images |
US20040146190A1 (en) * | 2003-01-23 | 2004-07-29 | Konica Minolta Holdings, Inc. | Medical image processing system |
US20040258291A1 (en) * | 2003-06-20 | 2004-12-23 | Gustafson Gregory A. | Method and system for tracking abnormality data |
US20080212864A1 (en) * | 2004-02-13 | 2008-09-04 | Sectra Mamea Ab | Method and Arrangement Relating to X-Ray Imaging |
US20050259857A1 (en) * | 2004-05-21 | 2005-11-24 | Fanny Jeunehomme | Method and apparatus for classification of pixels in medical imaging |
US7298884B2 (en) * | 2004-05-21 | 2007-11-20 | General Electric Company | Method and apparatus for classification of pixels in medical imaging |
US20050283076A1 (en) * | 2004-06-18 | 2005-12-22 | Hangiandreou Nicholas J | Non-invasive diagnosis of breast cancer using real-time ultrasound strain imaging |
US20060082595A1 (en) * | 2004-10-15 | 2006-04-20 | Fujitsu Limited | Device part assembly drawing image search apparatus |
EP3391828A1 (en) * | 2004-11-29 | 2018-10-24 | Senorx, Inc. | Graphical user interface for tissue biopsy system |
US10687733B2 (en) | 2004-11-29 | 2020-06-23 | Senorx, Inc. | Graphical user interface for tissue biopsy system |
US20060149162A1 (en) * | 2004-11-29 | 2006-07-06 | Derek Daw | Graphical user interface for tissue biopsy system |
US8795195B2 (en) * | 2004-11-29 | 2014-08-05 | Senorx, Inc. | Graphical user interface for tissue biopsy system |
EP2263547A3 (en) * | 2004-11-29 | 2012-07-18 | Senorx, Inc. | Graphical user interface for tissue biopsy system |
KR100646828B1 (en) | 2005-10-08 | 2006-11-23 | 건국대학교 산학협력단 | System for measurement mass size on breast mammogram and method therefor |
US20070118384A1 (en) * | 2005-11-22 | 2007-05-24 | Gustafson Gregory A | Voice activated mammography information systems |
US20080255849A9 (en) * | 2005-11-22 | 2008-10-16 | Gustafson Gregory A | Voice activated mammography information systems |
US20070280525A1 (en) * | 2006-06-02 | 2007-12-06 | Basilico Robert F | Methods and Apparatus for Computer Automated Diagnosis of Mammogram Images |
US7865002B2 (en) * | 2006-06-02 | 2011-01-04 | Carolina Imaging Specialists, Inc. | Methods and apparatus for computer automated diagnosis of mammogram images |
US20110122138A1 (en) * | 2006-08-28 | 2011-05-26 | Guenter Schmidt | Context driven image mining to generate image-based biomarkers |
US8594410B2 (en) | 2006-08-28 | 2013-11-26 | Definiens Ag | Context driven image mining to generate image-based biomarkers |
US8019134B2 (en) | 2006-11-16 | 2011-09-13 | Definiens Ag | Automatic image analysis and quantification for fluorescence in situ hybridization |
US8391575B2 (en) | 2006-11-16 | 2013-03-05 | Definiens Ag | Automatic image analysis and quantification for fluorescence in situ hybridization |
US20080137937A1 (en) * | 2006-11-16 | 2008-06-12 | Definiens Ag | Automatic image analysis and quantification for fluorescence in situ hybridization |
US8542899B2 (en) | 2006-11-30 | 2013-09-24 | Definiens Ag | Automatic image analysis and quantification for fluorescence in situ hybridization |
US8208700B2 (en) * | 2007-05-15 | 2012-06-26 | Three Palm Software | Mass spicules detection and tracing from digital mammograms |
US20080285825A1 (en) * | 2007-05-15 | 2008-11-20 | Three Palm Software | Mass spicules detection and tracing from digital mammograms |
US8687867B1 (en) * | 2008-09-16 | 2014-04-01 | Icad, Inc. | Computer-aided detection and classification of suspicious masses in breast imagery |
US9089307B2 (en) | 2009-10-30 | 2015-07-28 | Koninklijke Philips N.V. | Three-dimensional analysis of lesions represented by image data |
US20110137132A1 (en) * | 2009-11-24 | 2011-06-09 | Gustafson Gregory A | Mammography Information System |
US8687860B2 (en) | 2009-11-24 | 2014-04-01 | Penrad Technologies, Inc. | Mammography statistical diagnostic profiler and prediction system |
US20110125526A1 (en) * | 2009-11-24 | 2011-05-26 | Greg Gustafson | Multiple modality mammography image gallery and clipping system |
US8799013B2 (en) | 2009-11-24 | 2014-08-05 | Penrad Technologies, Inc. | Mammography information system |
US9183355B2 (en) | 2009-11-24 | 2015-11-10 | Penrad Technologies, Inc. | Mammography information system |
US9171130B2 (en) | 2009-11-24 | 2015-10-27 | Penrad Technologies, Inc. | Multiple modality mammography image gallery and clipping system |
US9256941B2 (en) | 2010-04-30 | 2016-02-09 | Vucomp, Inc. | Microcalcification detection and classification in radiographic images |
US20130208956A1 (en) * | 2010-04-30 | 2013-08-15 | Vucomp, Inc. | Spiculated Malignant Mass Detection and Classification in Radiographic Image |
US8958625B1 (en) | 2010-04-30 | 2015-02-17 | Vucomp, Inc. | Spiculated malignant mass detection and classification in a radiographic image |
US8923594B2 (en) * | 2010-04-30 | 2014-12-30 | Vucomp, Inc. | Spiculated malignant mass detection and classification in radiographic image |
US8855388B2 (en) | 2010-04-30 | 2014-10-07 | Vucomp, Inc. | Microcalcification detection classification in radiographic images |
US9256799B2 (en) | 2010-07-07 | 2016-02-09 | Vucomp, Inc. | Marking system for computer-aided detection of breast abnormalities |
US20130094726A1 (en) * | 2011-10-18 | 2013-04-18 | Olympus Corporation | Image processing device, image processing method, and computer readable storage device |
US9299137B2 (en) * | 2011-10-18 | 2016-03-29 | Olympus Corporation | Image processing device, image processing method, and computer readable storage device |
US9367765B2 (en) | 2011-12-05 | 2016-06-14 | University Of Lincoln | Method and apparatus for automatic detection of features in an image and method for training the apparatus |
GB2497516A (en) * | 2011-12-05 | 2013-06-19 | Univ Lincoln | Generating training data for automation of image analysis |
US20150254846A1 (en) * | 2012-03-14 | 2015-09-10 | Samsung Electronics Co., Ltd. | Diagnostic imaging apparatus and method |
US20130245426A1 (en) * | 2012-03-14 | 2013-09-19 | Samsung Electronics Co., Ltd. | Diagnostic imaging apparatus and method |
US9084578B2 (en) * | 2012-03-14 | 2015-07-21 | Samsung Electronics Co., Ltd. | Diagnostic imaging apparatus and method |
US20160314587A1 (en) * | 2014-01-10 | 2016-10-27 | Canon Kabushiki Kaisha | Processing apparatus, processing method, and non-transitory computer-readable storage medium |
US10102622B2 (en) * | 2014-01-10 | 2018-10-16 | Canon Kabushiki Kaisha | Processing apparatus, processing method, and non-transitory computer-readable storage medium |
US10595805B2 (en) | 2014-06-27 | 2020-03-24 | Sunnybrook Research Institute | Systems and methods for generating an imaging biomarker that indicates detectability of conspicuity of lesions in a mammographic image |
US20170150941A1 (en) * | 2014-07-02 | 2017-06-01 | Koninklijke Philips N.V. | Lesion signature to characterize pathology for specific subject |
US10639415B2 (en) * | 2014-09-04 | 2020-05-05 | Samsung Electronics Co., Ltd. | Medical imaging apparatus and controlling method thereof |
US20160067402A1 (en) * | 2014-09-04 | 2016-03-10 | Samsung Electronics Co., Ltd. | Medical imaging apparatus and controlling method thereof |
US10314563B2 (en) | 2014-11-26 | 2019-06-11 | Devicor Medical Products, Inc. | Graphical user interface for biopsy device |
US11137462B2 (en) * | 2016-06-10 | 2021-10-05 | Board Of Trustees Of Michigan State University | System and method for quantifying cell numbers in magnetic resonance imaging (MRI) |
US20190303567A1 (en) * | 2018-03-28 | 2019-10-03 | Nvidia Corporation | Detecting data anomalies on a data interface using machine learning |
US11934520B2 (en) * | 2018-03-28 | 2024-03-19 | Nvidia Corporation | Detecting data anomalies on a data interface using machine learning |
CN117481672A (en) * | 2023-10-25 | 2024-02-02 | 深圳医和家智慧医疗科技有限公司 | Intelligent method for rapid screening in early stage of mammary tissue hardening |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6246782B1 (en) | System for automated detection of cancerous masses in mammograms | |
US7646902B2 (en) | Computerized detection of breast cancer on digital tomosynthesis mammograms | |
US7903861B2 (en) | Method for classifying breast tissue density using computed image features | |
US5657362A (en) | Automated method and system for computerized detection of masses and parenchymal distortions in medical images | |
US6760468B1 (en) | Method and system for the detection of lung nodule in radiological images using digital image processing and artificial neural network | |
US5133020A (en) | Automated method and system for the detection and classification of abnormal lesions and parenchymal distortions in digital medical images | |
US4907156A (en) | Method and system for enhancement and detection of abnormal anatomic regions in a digital image | |
US5003979A (en) | System and method for the noninvasive identification and display of breast lesions and the like | |
AU2005207310B2 (en) | System and method for filtering a medical image | |
JP4184842B2 (en) | Image discrimination device, method and program | |
Ayres et al. | Characterization of architectural distortion in mammograms | |
Qian et al. | Image feature extraction for mass detection in digital mammography: Influence of wavelet analysis | |
US6738500B2 (en) | Method and system for detecting small structures in images | |
MacMahon et al. | Computer-aided diagnosis in chest radiology | |
JP2001510360A (en) | Lump detection in digital radiographic images using a two-stage classifier | |
JPH09508815A (en) | Improved Computerized Automated Method and System for Mammography Finding and Classification | |
JPH11501538A (en) | Method and system for detecting lesions in medical images | |
Schilham et al. | Multi-scale nodule detection in chest radiographs | |
Rizzi et al. | A supervised method for microcalcification cluster diagnosis | |
Lado et al. | Real and simulated clustered microcalcifications in digital mammograms. ROC study of observer performance | |
WO2000005677A1 (en) | System for automated detection of cancerous masses in mammograms | |
WO2002015113A2 (en) | Mammography screening to detect and classify microcalcifications | |
Haldorai et al. | Survey of Image Processing Techniques in Medical Image Assessment Methodologies | |
Nguyen et al. | Detect abnormalities in mammograms by local contrast thresholding and rule-based classification | |
Sample | Computer assisted screening of digital mammogram images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: LOCKHEED MARTIN CORPORATION, MARYLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHAPIRO, GARY LEE;OLIVER, DAVIS REGOALT, JR.;REEL/FRAME:008629/0083 Effective date: 19970606 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| CC | Certificate of correction | |
| FPAY | Fee payment | Year of fee payment: 4 |
| FPAY | Fee payment | Year of fee payment: 8 |
| FPAY | Fee payment | Year of fee payment: 12 |