WO2001069286A2 - System and method for data analysis of x-ray images - Google Patents
System and method for data analysis of x-ray images
- Publication number
- WO2001069286A2 (PCT/US2001/008692)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wavelet
- image
- dimensional
- scale
- circular symmetric
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/52—Scale-space analysis, e.g. wavelet analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
Definitions
- This invention relates to computerized image processing, and more specifically to the matching of computerized images
- Wavelet transforms use various high-pass and low-pass filters to filter out either the high-frequency or the low-frequency portion of the signal
- the signal is a row or column of pixels
- the transform of the image is the combined transform of the rows and columns. This procedure is repeated, and each time some portion of the signal, corresponding to some band of frequencies, is removed from the signal
- a signal has frequencies up to 1000 Hz
- at the first stage the signal is split into two parts by passing it through a high-pass and a low-pass filter, which results in two versions of the same signal: the portion corresponding to 0-500 Hz (the low-pass portion) and the portion corresponding to 500-1000 Hz (the high-pass portion)
- the process may continue in this way until the signal is decomposed to a predefined level. That provides a collection of signals which all represent the same signal but correspond to different frequency bands.
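The repeated two-band split described above can be sketched as follows. Haar filters are used here as a simple stand-in; the patent does not fix a particular filter pair, so treat the filter choice as an assumption.

```python
import numpy as np

def haar_step(signal):
    """One level of a 1-D wavelet filter bank (Haar filters as a simple
    stand-in): split the signal into a low-pass half (lower frequency
    band) and a high-pass half (upper band). Assumes even length."""
    even, odd = signal[0::2], signal[1::2]
    low = (even + odd) / np.sqrt(2.0)    # lower half of the current band
    high = (even - odd) / np.sqrt(2.0)   # upper half of the current band
    return low, high

def decompose(signal, levels):
    """Repeat the split on the low-pass part, as described above,
    until the predefined level is reached."""
    bands = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        bands.append(detail)
    bands.append(approx)
    return bands  # detail bands (high band first) plus final approximation
```

For a signal with content up to 1000 Hz, the first detail band covers 500-1000 Hz, the next 250-500 Hz, and so on; the orthonormal filters preserve the signal's total energy across the bands.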
- a complete transform is usually not necessary.
- the image signals are bandlimited, and therefore, computation of the transform for a limited interval of scales is usually adequate.
- the one-dimensional wavelet transform described above is easily extended to the two-dimensional wavelet transform.
- the two-dimensional wavelet transform is useful for image analysis.
- Conventional 2-dimensional wavelet transforms apply a one-dimensional wavelet transform to each row of an image; then the same transform is applied to each column. However, the one-dimensional wavelet transform is performed sequentially, level by level.
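The row-then-column procedure can be sketched as one level of a separable 2-D transform, again using Haar filters as a hypothetical stand-in for whichever one-dimensional wavelet is chosen:

```python
import numpy as np

def haar_step_1d(x):
    """One level of a 1-D wavelet split along the last axis (Haar
    filters as a simple stand-in; assumes even length)."""
    even, odd = x[..., 0::2], x[..., 1::2]
    return (even + odd) / np.sqrt(2.0), (even - odd) / np.sqrt(2.0)

def dwt2_level(img):
    """One level of a separable 2-D wavelet transform: transform every
    row, then transform every column of both row results."""
    lo_r, hi_r = haar_step_1d(img)        # each row -> low/high halves
    ll, lh = haar_step_1d(lo_r.T)         # columns of the row-low band
    hl, hh = haar_step_1d(hi_r.T)         # columns of the row-high band
    return ll.T, lh.T, hl.T, hh.T         # approximation + 3 detail bands
```

Repeating `dwt2_level` on the `ll` band gives the usual level-by-level pyramid; each level quarters the approximation and preserves total energy.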
- a wavelet matching method fits a mother wavelet to a signal of interest, such that: (1) the wavelet generates an orthogonal multi-resolution analysis (MRA) and (2) the wavelet is as close as possible to the given signal in the sense of minimizing the mean squared errors between their squared amplitude spectra and their group delay spectra respectively. This generates a wavelet that is close to the given signal in shape.
- MRA orthogonal multi-resolution analysis
- the invention provides a two dimensional smooth circular symmetric wavelet that is rotatable.
- the wavelet is a modification of the Mexican hat wavelet.
- the wavelet is used in computer imaging systems and programs to process images.
- Reference images of a target, for example x-ray images of an apple, are processed to provide one or more filter wavelets that are matched to the reference image(s).
- Each filter wavelet may be chosen for its desired characteristic.
- the wavelet that has the highest coefficient of convolution provides the most likely filter wavelet for use in detecting images of the object in target images.
- the target images may be photographs or x-ray images.
- the wavelet operates on the image to recognize portions of the image that correlate to the filter wavelet. As such, the target image may include a number of images of the target object and other objects.
- the invention is implemented in a computer that matches a target image signal to one or more known or reference image signals.
- the computer has a processor and a main memory connected to the processor.
- the memory holds programs that the processor executes and stores data relevant to the operation of the computer and the computer programs.
- the computer has an output display subsystem connected to the processor and data entry subsystem connected to the processor.
- An input device such as a keyboard, mouse or scanner enters commands and data.
- a software program stored in the memory operates the computer to perform a number of steps for creating a matched wavelet of an object image and for processing a target image to detect the target object in the target image.
- the program begins by matching a plurality of two dimensional circular symmetric daughter wavelets to an original pixel image to produce a filter matched wavelet that operates on target images.
- Fig.4. illustrates a wavelet matching procedure.
- Fig. 5. is a flowchart of a general algorithm for wavelet matching.
- Fig. 7 is a flowchart of a shrink-convolution-restore algorithm.
- Fig. 14a is a matching wavelet filter for an apple.
- This 2-D circular symmetric wavelet function is defined as:
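The definition itself does not survive in this excerpt. Since the wavelet is described as a modification of the Mexican hat, the standard, unmodified 2-D circular symmetric Mexican hat can serve as a hedged illustration of the general form:

```python
import numpy as np

def mexican_hat_2d(size, sigma):
    """Standard 2-D circular symmetric Mexican hat wavelet sampled on a
    size x size grid (use odd size so the peak sits on a pixel). The
    patent's wavelet is a *modification* of this, so treat this exact
    form as an assumption, not the patented definition."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = (x**2 + y**2) / (2.0 * sigma**2)          # depends on radius only
    return (1.0 / (np.pi * sigma**4)) * (1.0 - r2) * np.exp(-r2)
```

Because the function depends only on x² + y², it is circularly symmetric, which is the property that later enables the rotation shortcut: rotating the wavelet leaves it unchanged.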
- the central aim of each loop is to find the maximum coefficient and its location.
- the flowchart of Fig. 6 shows the steps in finding the maximum coefficient M and the coordinates (x, y) where the value M is taken.
- the basic approach to calculating the 2D wavelet function and performing the wavelet transform requires substantial computing time and computer memory. It was desired to have one or more methods for reducing the time and computing capacity for a wavelet transform without substantially sacrificing accuracy.
- the invention provides several methods for quickly calculating the maximum coefficient M.
- the invention's matching wavelet possesses two important special properties: its behavior under dilation and rotation. These properties make the matching method more useful and efficient in the optimal detection of a signal when in the presence of signal dilations and noise.
- Fig. 9 shows a sample result of matching an object (a "pear").
- the result of matching is a sequence of coefficients, including the coordinates, the values, and their dilations. Once this information is calculated, the pattern image is expressed as the sum of circular wavelet units.
- the invention varies the matched result by simply making changes to the coefficients. One such change is dilation.
- Fig. 10 shows the above matching result being dilated at 2 scales, and the enlarged (shrunk) images are placed below each dilation for comparison.
Behavior Under Rotation
- the result of matching has the property of rotation invariance, which means that only one angular position per object needs to be calculated through the matching process; the matched wavelet at any other angle can be quickly obtained by simply rotating the one matched result to the desired angle.
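Because each matched unit is circularly symmetric, rotating a matched result amounts to rotating only the coordinates of its coefficients about a center point; no new matching pass is needed. A sketch, in which the (x, y, value, scale) coefficient format is an assumption of the illustration:

```python
import numpy as np

def rotate_coeffs(coeffs, angle_rad, center):
    """Rotate matched-wavelet coefficients about `center`. Each unit is
    circularly symmetric, so rotating its position is sufficient; the
    value and scale of each coefficient are unchanged."""
    cx, cy = center
    cos_a, sin_a = np.cos(angle_rad), np.sin(angle_rad)
    out = []
    for x, y, value, scale in coeffs:
        dx, dy = x - cx, y - cy
        out.append((cx + cos_a * dx - sin_a * dy,   # rotated x
                    cy + sin_a * dx + cos_a * dy,   # rotated y
                    value, scale))
    return out
```

This is why a single matching pass per object suffices: the matched result at any angle is just the same coefficient list with rotated positions.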
- Fig. 11 shows the procedure of matching the pear at varying rotation angles.
- the invention's matching algorithm has been applied to many different images to extract features about edge, shape, and texture: x-ray images of fruit, luggage, mechanical parts, and CT slices, as well as samples of ultrasonic images and thermal images.
- Fig. 12 shows part of a mechanical fuse, with a spring cut out as the pattern to be matched. The matching result was then applied to sample images to detect the possible defects caused by improper positioning of the spring.
- Figs. 13a-d show two sample images and the results after applying the above matching filter.
- the relative position of the spring 10 with respect to the block 20 can be detected by analyzing the values along the lines 10a-10d in the figures on the right, which are the results of applying the matching result of Fig. 12.
- the image in row one (left) has the spring touching the block closely and corresponds to the transform result at right, which has less blue line above the red line.
- the spring doesn't touch the block, and correspondingly its transformation has one solid blue line above the red line.
- Fig. 14 shows one filter created using the matching wavelet method for an apple image and the result of applying that filter to the entire image of the apple. The filter was calculated using a small patch on the pulpy area of the apple.
- Fig. 15 shows original images of a cherry (upper row) and a piece of candy (lower row) and demonstrates the use of the method to distinguish between the two by applying a matching wavelet for texture pattern recognition.
- a feature extraction method that uses the class labels of the data may miss important structure that is not exhibited in the class labels, and therefore be more biased to the training data than a feature extractor that relies on the high-dimensional structure. This suggests that an unsupervised feature extraction method may have better generalization properties in high-dimensional problems.
- the Bienenstock, Cooper and Munro (BCM) neuron performs exploratory projection pursuit using a projection index that measures multi-modality. Sets of these neurons are organized in a lateral inhibition architecture which forces different neurons in the network to find different projections (i.e., features).
- a network implementation which can find several projections in parallel while retaining its computational efficiency, can be used for extracting features from very high dimensional vector space.
- the invention includes an Unsupervised Feature Extraction Network (UFENET) based on the BCM Neuron.
- UFENET Unsupervised Feature Extraction Network
- the BCM neurons are constructed such that their properties are determined by a nonlinear function θm which is called the modification threshold.
- the inhibited activity of neuron k is defined as c̄k = ck - η Σ(j≠k) cj
- Fig. 17 presents the flowchart for the UFENET adaptation algorithm.
- M = I - η(E - I), where I is the unit matrix and E is the matrix with every entry equal to one
- G′(x) is the derivative of G(x).
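A minimal single-neuron BCM sketch, assuming G = tanh and a running average of c² for the modification threshold; both of these choices, and the learning-rate values, are assumptions of the sketch rather than the patent's exact forms:

```python
import numpy as np

def train_bcm_neuron(data, eta=0.05, tau=0.9, epochs=200, seed=0):
    """Single BCM neuron sketch (a simplification of the UFENET):
    activity c = G(w.d) with G = tanh, modification threshold theta
    tracked as a running average of c^2, and the BCM update
    dw = eta * c * (c - theta) * G'(w.d) * d."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=data.shape[1])
    theta = 0.0
    for _ in range(epochs):
        for d in data:
            a = w @ d
            c = np.tanh(a)
            g_prime = 1.0 - c**2                        # derivative of tanh
            w += eta * c * (c - theta) * g_prime * d    # BCM weight update
            theta = tau * theta + (1.0 - tau) * c**2    # moving threshold
    return w, theta
```

In the full UFENET, several such neurons run with the lateral-inhibition matrix M above replacing each activity by its inhibited version, forcing the neurons toward different projections.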
- the invention uses the following method to judge the status of a BCM network.
- the invention's goal here is only to find a set of good features such that the critical information for discriminating among different clusters is kept. For the sake of efficiency, therefore, precise convergence of the network is not necessary, and a looser standard for judging the network results is used: observing the clustering of the features extracted using the neurons obtained.
- the single-neuron network is used in two ways: it is simple, which makes it useful for analyzing and adjusting the network, and it has a single output when applied to the data, which allows its use as a simple classifier.
- the 6 pictures in Fig. 21a have small inner products (-0.72 to 0.14), corresponding to the six images with no gap (i.e., touching), while the 6 in Fig. 21b have large inner products (7.9 to 11.0), corresponding to the other six images with a wider gap (i.e., not touching).
- in Figs. 22a,b, samples of cherries and hard candy have a matching wavelet applied, and the results are sent to the network.
- Fig. 22b shows that the cherry feature vectors (represented by the dotted lines) and the candy features (represented by the plain lines) are separated (almost) after training.
- the invention is useful for a variety of different types of image analysis.
- the matched wavelet of a sample can be applied to a given image and the locations of high similarity (a good match) will show strong distinguishable peaks.
- the invention can thus examine images individually and manually, or it can automate the analysis of images through the use of the UFENET and possibly a back-propagation network.
- the analysis can be done in a spatial context or in a textural context. See Fig. 23.
- the invention creates a matched wavelet of a shape, object or arrangement of objects being examined.
- the invention applies this filter to samples that may or may not contain the desired shape.
- the filter returns a strong signal on samples that match well.
- the invention reduces the dimensionality of the output to allow better examination of the data.
- the invention may then pass this data to a back-propagation network if fully automatic grouping is desired. This allows each sample processed to be labeled as group A/group B or good/bad, etc. Processing the texture of samples is another application. Often the differences between a "good" sample and a "bad" sample are hidden in the texture. In this case the invention subtracts the background from each of the samples (Fig.
- the invention creates a matched wavelet of one of the classes to be differentiated by applying the matching algorithm (described below) on a sample.
- the matched wavelet returned by the matching procedure can be applied as a whole, or as is the case with texture, it is sometimes helpful to use only a small piece of the matched wavelet as the filter.
- the basic flow of the matching procedure is to apply a number of different-scale circular wavelets to the input image (using convolution) on each pass of the main loop and to choose the one coefficient from the pool of all the transforms that has the greatest absolute value. The location and value of this coefficient are determined and stored along with the scale of the wavelet that produced it. Next, the determined value multiplied by a circular wavelet (of the stored scale) is subtracted from the original image and added to an accumulator image (the matched wavelet). The next time through the loop the process is repeated using the remainder of the previous loop. This process continues until one of two conditions is met: the maximum number of coefficients is reached, or the correlation between the original image and the matched wavelet is above a desired level.
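This greedy loop can be sketched as follows. The particular circular-wavelet form, the unit normalization, and the FFT-based circular correlation are assumptions of the sketch, not details taken from the patent:

```python
import numpy as np

def circular_wavelet(shape, scale):
    """Unit-norm circular symmetric wavelet (Mexican-hat-style, an
    assumed form) centred at (0, 0) on a periodic grid."""
    h, w = shape
    y = np.minimum(np.arange(h), h - np.arange(h)).astype(float)[:, None]
    x = np.minimum(np.arange(w), w - np.arange(w)).astype(float)[None, :]
    r2 = (x**2 + y**2) / (2.0 * scale**2)
    psi = (1.0 - r2) * np.exp(-r2)
    psi -= psi.mean()                    # enforce zero mean
    return psi / np.linalg.norm(psi)     # unit L2 norm

def match_image(img, scales, max_coeffs=30, corr_target=0.95):
    """Greedy matching: pick the largest-|value| coefficient over all
    scales, subtract that unit from the remainder, accumulate it into
    the matched wavelet, and repeat until either stop condition."""
    img = np.asarray(img, dtype=float)
    remainder = img.copy()
    matched = np.zeros_like(img)
    units = {s: circular_wavelet(img.shape, s) for s in scales}
    F = {s: np.fft.fft2(units[s]) for s in scales}
    coeffs = []
    for _ in range(max_coeffs):
        R = np.fft.fft2(remainder)
        best = None
        for s in scales:
            # circular cross-correlation of remainder with the wavelet
            resp = np.real(np.fft.ifft2(R * np.conj(F[s])))
            yx = np.unravel_index(np.argmax(np.abs(resp)), resp.shape)
            if best is None or abs(resp[yx]) > abs(best[0]):
                best = (resp[yx], yx, s)
        val, (y, x), s = best
        unit = val * np.roll(units[s], (y, x), axis=(0, 1))
        remainder -= unit
        matched += unit
        coeffs.append((y, x, val, s))
        if np.corrcoef(img.ravel(), matched.ravel())[0, 1] >= corr_target:
            break
    return matched, remainder, coeffs
```

With unit-norm wavelets, each subtraction removes exactly the energy of the chosen coefficient, so the remainder's energy decreases monotonically.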
- Step 2 (Fig. 24 Reference 2): Main Loop
- Step 4: If the current-scale wavelet is smaller than a determined scale (baseScale), calculate the coefficients in a straightforward manner (go to Step 5); otherwise, shrink the current remainder and the wavelet to speed up the process and get an approximate location of the maximum, then return to the current remainder and calculate more exact values in the area where the true maximum has been determined to lie (go to Step 6). Step 5 (Fig. 24 Reference 12): Convolve the circular wavelet at the current scale with the current remainder and store the result in tempImg (go to Step 13). Step 6 (Fig. 24 Reference 6):
- Step 7 (Fig. 24 Reference 9): Convolve the new shrunken remainder with the wavelet at baseScale.
- Step 14 (Fig. 24 Reference 18): Increment curScale and go to Step 3
- Step 15 (Fig. 24 Reference 7):
- Step 16 (Fig. 24 Reference 10) :
- Step 17 (Fig. 24 Reference 13): Subtract the subtraction image from remainder to create a new remainder.
- Step 19 (Fig. 24 Reference 21): return matched wavelet (original - remainder)
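The shrink-convolution-restore idea of Steps 4-7 can be sketched as a coarse-to-fine search. Block-averaging for the shrink and the window size are assumptions; for brevity the full-resolution correlation map is computed whole here, whereas a real implementation would evaluate only the small window around the approximate location:

```python
import numpy as np

def correlate_full(img, kern):
    """Circular cross-correlation via FFT, with the kernel centred at
    the origin of the periodic grid."""
    K = np.zeros_like(img, dtype=float)
    kh, kw = kern.shape
    K[:kh, :kw] = kern
    K = np.roll(K, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(K))))

def shrink(img, k):
    """Downsample by block-averaging with factor k (one simple way to
    'shrink'; the patent's exact shrink method is not specified here)."""
    h, w = (img.shape[0] // k) * k, (img.shape[1] // k) * k
    return img[:h, :w].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def coarse_to_fine_max(remainder, kernel, k=2, halo=3):
    """Locate the maximum-|value| coefficient: first on shrunken copies
    for speed, then exactly within a small full-resolution window."""
    small = correlate_full(shrink(remainder, k), shrink(kernel, k))
    sy, sx = np.unravel_index(np.argmax(np.abs(small)), small.shape)
    cy, cx = sy * k, sx * k                       # approximate location
    full = correlate_full(remainder, kernel)      # see note in lead-in
    y0, y1 = max(cy - halo, 0), min(cy + halo + 1, full.shape[0])
    x0, x1 = max(cx - halo, 0), min(cx + halo + 1, full.shape[1])
    win = full[y0:y1, x0:x1]
    wy, wx = np.unravel_index(np.argmax(np.abs(win)), win.shape)
    return y0 + wy, x0 + wx, win[wy, wx]
```

The coarse pass costs roughly 1/k² of the full correlation, and the exact pass touches only a (2·halo+1)² window, which is the source of the speedup claimed for large-scale wavelets.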
- the matched wavelet can now be applied to individual images for analysis or to groups of images for processing by the UFENET.
- Each wavelet transform returned by this process is then reshaped to a one-dimensional vector, and they are all grouped together into a matrix for use with the UFENET.
- Each row in the matrix is the one-dimensional version of the wavelet transform for a single sample.
- the resulting matrix is 40 × N².
- UFENET Training and/or Feature Extraction: Once we have our matrix (we will call it d), we can use the UFENET (described below) to reduce the dimensionality of the data. Instead of dealing with N² values per sample, a UFENET with b neurons will reduce the number of values per sample to b. We can then use a smaller number of features to classify our samples.
- the output of training is an array of weights that, when applied to our matrix d, will give us a new matrix of fewer dimensions.
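The reshaping into the 40 × N² matrix d and the application of the trained weights can be sketched as follows; the weight-matrix layout (N² × b) is an assumption of the sketch:

```python
import numpy as np

def transforms_to_matrix(transforms):
    """Reshape each N x N wavelet transform to a 1-D row vector and
    stack them: 40 samples of size N x N become a 40 x N^2 matrix d."""
    return np.stack([t.ravel() for t in transforms])

def extract_features(d, weights):
    """Apply the trained UFENET weight array (assumed N^2 x b layout)
    to reduce each sample from N^2 values to b feature values."""
    return d @ weights
```

Each row of the result is the b-dimensional feature vector for one sample, ready for clustering or for a back-propagation classifier.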
- Step 1
- Step 14 Set the current inhibited activity ckbar as the sel-th column ckbars(:,sel) of the whole inhibited activities matrix ckbars.
- Step 21
- Input Parameters: a - the scale of the desired 2D circularly symmetric wavelet to return
- suitcase may be quickly transformed to locate one or more contraband items, such as weapons or illegal substances.
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2001269679A AU2001269679A1 (en) | 2000-03-16 | 2001-03-16 | System and method for data analysis of x-ray images |
US10/221,879 US20040022436A1 (en) | 2001-03-16 | 2001-03-16 | System and method for data analysis of x-ray images |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18973600P | 2000-03-16 | 2000-03-16 | |
US60/189,736 | 2000-03-16 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2001069286A2 true WO2001069286A2 (en) | 2001-09-20 |
WO2001069286A3 WO2001069286A3 (en) | 2002-03-14 |
WO2001069286A9 WO2001069286A9 (en) | 2002-12-19 |
Family
ID=22698561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2001/008692 WO2001069286A2 (en) | 2000-03-16 | 2001-03-16 | System and method for data analysis of x-ray images |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU2001269679A1 (en) |
WO (1) | WO2001069286A2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5598481A (en) * | 1994-04-29 | 1997-01-28 | Arch Development Corporation | Computer-aided method for image feature analysis and diagnosis in mammography |
US5619998A (en) * | 1994-09-23 | 1997-04-15 | General Electric Company | Enhanced method for reducing ultrasound speckle noise using wavelet transform |
-
2001
- 2001-03-16 AU AU2001269679A patent/AU2001269679A1/en not_active Abandoned
- 2001-03-16 WO PCT/US2001/008692 patent/WO2001069286A2/en active Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7412103B2 (en) | 2003-10-20 | 2008-08-12 | Lawrence Livermore National Security, Llc | 3D wavelet-based filter and method |
WO2006074571A1 (en) * | 2005-01-17 | 2006-07-20 | Pixartis Sa | Temperature mapping on structural data |
US9322807B2 (en) | 2014-04-16 | 2016-04-26 | Halliburton Energy Services, Inc. | Ultrasonic signal time-frequency decomposition for borehole evaluation or pipeline inspection |
CN104933724A (en) * | 2015-06-25 | 2015-09-23 | 中国计量学院 | Automatic image segmentation method of trypetid magnetic resonance image |
CN104933724B (en) * | 2015-06-25 | 2019-07-26 | 中国计量学院 | The Automatic image segmentation method of trypetid magnetic resonance image |
CN113160080A (en) * | 2021-04-16 | 2021-07-23 | 桂林市啄木鸟医疗器械有限公司 | CR image noise reduction method, device, equipment and medium |
CN113160080B (en) * | 2021-04-16 | 2023-09-22 | 桂林市啄木鸟医疗器械有限公司 | CR image noise reduction method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
WO2001069286A9 (en) | 2002-12-19 |
WO2001069286A3 (en) | 2002-03-14 |
AU2001269679A1 (en) | 2001-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Berbar | Hybrid methods for feature extraction for breast masses classification | |
Letexier et al. | Noise removal from hyperspectral images by multidimensional filtering | |
Gross et al. | Multiscale image texture analysis in wavelet spaces | |
US20040022436A1 (en) | System and method for data analysis of x-ray images | |
WO2006064239A1 (en) | Method of identifying features within a dataset | |
Dash et al. | Multi-resolution Laws’ Masks based texture classification | |
Celik et al. | Bayesian texture classification and retrieval based on multiscale feature vector | |
Luo et al. | Fingerprint classification combining curvelet transform and gray-level cooccurrence matrix | |
Bala et al. | Wavelet and curvelet analysis for the classification of microcalcifiaction using mammogram images | |
Çelik et al. | Multiscale texture classification and retrieval based on magnitude and phase features of complex wavelet subbands | |
Noiboar et al. | Anomaly detection based on wavelet domain GARCH random field modeling | |
Ramakrishnan et al. | Image texture classification using wavelet based curve fitting and probabilistic neural network | |
Lee et al. | ECG-based biometrics using a deep network based on independent component analysis | |
WO2001069286A2 (en) | System and method for data analysis of x-ray images | |
Sharma et al. | Performance evaluation of 2D face recognition techniques under image processing attacks | |
Kara et al. | Using wavelets for texture classification | |
Meade et al. | Comparative performance of principal component analysis, Gabor wavelets and discrete wavelet transforms for face recognition | |
NAYEBİ et al. | Dorsal Hand Veins Based Biometric Identification System Using Deep Learning | |
Wang et al. | Texture image retrieval using dual-tree complex wavelet transform | |
Aro et al. | Enhanced Gabor features based facial recognition using ant colony optimization algorithm | |
Yifan et al. | Contourlet-based feature extraction on texture images | |
Tang | Status of pattern recognition with wavelet analysis | |
Alterson et al. | Object recognition with adaptive Gabor features | |
AU2021101325A4 (en) | Methodology for the Detection of Bone Fracture in Humans Using Neural Network | |
Hill et al. | Rotationally invariant texture classification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
AK | Designated states |
Kind code of ref document: C2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: C2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
COP | Corrected version of pamphlet |
Free format text: PAGES 36-42, CLAIMS, REPLACED BY NEW PAGES 36-42; AFTER RECTIFICATION OF OBVIOUS ERRORS AS AUTHORIZED BY THE INTERNATIONAL SEARCHING AUTHORITY; PAGES 1/21-21/21, DRAWINGS, REPLACED BY NEW PAGES 1/21-21/21 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10221879 Country of ref document: US |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: RULE 69(1) EPC |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |