WO2003003303A2 - Image segmentation - Google Patents
Image segmentation Download PDFInfo
- Publication number
- WO2003003303A2 WO2003003303A2 PCT/GB2002/002945 GB0202945W WO03003303A2 WO 2003003303 A2 WO2003003303 A2 WO 2003003303A2 GB 0202945 W GB0202945 W GB 0202945W WO 03003303 A2 WO03003303 A2 WO 03003303A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- grey
- pixel
- pixel unit
- image
- level intensity
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/155—Segmentation; Edge detection involving morphological operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/267—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20152—Watershed segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20156—Automatic seed setting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30008—Bone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- the present invention relates to a process for segmenting images.
- CT X-ray Computed Tomography
- a computer stores a large amount of data from a selected region of the scanned object, for example, a human body, making it possible to determine the spatial relationship of radiation-absorbing structures within the scanning x-ray beam. Once an image has been acquired by scanning it is then subjected to segmentation, which is a technique for delineating the various organs within the scanned area.
- Segmentation can be defined as the process which partitions an input image into its relevant constituent parts or objects, using image attributes such as pixel intensity, spectral values and textural properties.
- the output of this process is an image represented in terms of edges, regions and their interrelationships. Segmentation is a key step in image processing and analysis, but it is one of the most difficult and intricate tasks. Many methods have been proposed to overcome image segmentation problems, but all of them are application dependent and problem specific.
- Radiotherapy Treatment Planning RTP
- PTV Planning Target Volume
- the present invention seeks to provide an improved method of segmentation of an image.
- the present invention provides a method of segmenting an image comprising:
- a pixel unit from a first group of pixel units in which the pixel units all have substantially the same grey-level intensity; comparing the grey-level intensity of said first pixel unit with the grey-level intensity of each of a plurality of selected pixel units of said image;
- each said selected pixel unit as a pixel unit of the same region as said first pixel unit in response to the grey-level intensity of said adjacent pixel unit falling within a preselected grey-level intensity range;
- the present invention also provides a method of segmenting an image comprising the steps of:
- said first group of pixel units is the largest group of pixel units in the image and said further group of pixel units is the next largest group of pixel units.
- the term "pixel unit" is used herein to refer to a single pixel or a group of adjacent pixels which are treated as a single pixel.
- the method further comprises the steps of building a mosaic image, deriving the gradient of the mosaic image and applying a watershed transform to said gradient to provide said segmented image.
- the method further comprises the step of applying a merging operation to said segmented image to reduce segmentation of the image.
- each said pixel unit is a single pixel.
- Figure 1 is a view of an image produced by a CT scan
- Figure 1 a is a flow chart of an image processing technique according to the present invention which can be applied to the image of Figure 1;
- Figure 2 is an image produced from the image of Figure 1 by application of a Watershed transform
- Figure 3 is a mosaic image generated from the image of Figure 1;
- Figure 4 is an image produced by a Watershed transformation of the image of Figure 3;
- Figures 5A and 5B are frequency histograms of two of a set of image "slices" similar to that of Figure 1;
- Figure 6 is a frequency histogram showing a Gaussian distribution curve and a non-Gaussian distribution curve superimposed on one another;
- Figure 7 is a simplified flowchart showing the process of operation of a preferred method according to the present invention;
- Figure 8 is a detailed flowchart of part A of the process of Figure 7;
- Figure 9 is a detailed flowchart of part B of the process of Figure 7.
- Figure 10 is a chart of histograms illustrating the effect of a couch and background on the histogram of Figure 9.
- Figure 1 shows an original grey scale image which is produced by a CT scan.
- Figure 1 a is a flow chart of an image processing technique according to the present invention which can be applied to the image of Figure 1.
- the image is transformed into a mosaic image and the gradient image obtained. It is the magnitude of the gradient which is used in order to avoid negative peaks.
- a morphological gradient operator avoids the production of negative values and produces an image which can be used directly by a Watershed transform.
- the Watershed transform followed by a merging process is then applied to provide the final image of Figure 2.
- the number of discrete regions in the image of Figure 2 is considerable and would normally be of the order of several thousands. In this particular example the number of regions is seven thousand nine hundred and sixty-eight.
- This image would then need to be processed manually by a skilled operator in order to produce a reasonable image for viewing by the medical practitioner (given the large number of regions this may become prohibitive in terms of time).
- the original image is digitally coded and stored with each unit (byte) of the digitally stored image representing the grey scale level of a pixel of the original image.
- although the loss of information which occurs when the original image of Figure 1 is transformed into the mosaic image of Figure 3 is important, the main contours of the initial image of Figure 1 are preserved. In such a simplified image, regions with identical grey levels may actually include different structures due to overgrowing.
- the simplified image is further transformed.
- the pixels of the image are stored in a temporary list (the boundary list) of pixels which are to be analysed. This list contains spatial information (x and y co-ordinates) and the intensity value of the pixels (grey-level).
- a multi-region growing algorithm is used. This starts with a seed pixel which can be provided by the user who selects a seed point in the original image of Figure 1. This has previously been effected manually, for example by using a pointing device such as a mouse. The seed point chosen would normally be inside a region of interest in the image.
- a frequency histogram of the grey-levels of the original image is first of all determined. In this way, each grey-level is referenced to each pixel within the original image which belongs to that particular level.
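The frequency histogram described above can be sketched in a few lines. This is an illustrative reconstruction, not the patent's code, and assumes an 8-bit grey-scale image held in a NumPy array; the function name is hypothetical.

```python
import numpy as np

def grey_level_histogram(image, levels=256):
    # For each grey level 0..levels-1, count how many pixels hold that level.
    # bincount over the flattened image gives exactly the frequency histogram.
    return np.bincount(image.ravel(), minlength=levels)

# Example on a tiny 2x2 "image"
img = np.array([[0, 0], [1, 255]], dtype=np.uint8)
hist = grey_level_histogram(img)
```

Each histogram bin then indexes the set of pixels belonging to that grey level, which is what the seed-selection step operates on.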
- Figures 5A and 5B show histograms of two image slices similar to that of Figure 1 in which it can be seen that various parts of the body such as muscles, organs and bone structures are characterised by or exhibit different grey-levels and therefore different distributions in the histogram.
- a predetermined grey-level in each distribution is taken as corresponding to the intensity value of a representative pixel of the region which is represented by that distribution.
- the pixels of each distribution which form the representative pixels are selected as the seed pixels for each growing operation.
- Each distribution of the histogram may be a Gaussian or non-Gaussian distribution and Figure 6 shows a diagrammatic representation of two distribution curves 10, 12 of a frequency histogram.
- the curves represent two different regions of the histogram but are superimposed on one another to illustrate the differences between a Gaussian and a non-Gaussian distribution.
- Curve 10 shows a Gaussian distribution with the threshold minimum and maximum grey levels for the region represented by the curve 10 being chosen at L min and L max (points 14 and 16 on the curves).
- Curve 12 shows a non-Gaussian distribution superimposed on curve 10 with the minimum and maximum grey levels for the region represented by the curve also being chosen at L min and L max .
- the threshold grey levels would be different values, but they are shown here having the same values for ease of explanation.
- the predetermined grey level used to define the representative pixel (seed pixel) for each region is the average grey level in each distribution.
- the average grey level in the distribution will not be equal to the peak of the distribution (curve 12).
- the predetermined grey level used to define the representative pixel (seed pixel) for each region could be the average grey level, the grey level corresponding to the peak of the distribution or the grey level corresponding to the central position between the thresholds L min and L max .
- the grey level values of the pixels are sorted according to frequency in descending order, ie the pixels having an intensity value which occurs most frequently are placed first in the sorting order.
- the effect of this is that the representative pixels will occur at the beginning of the ordered boundary list. It will be appreciated, therefore, that the region that occupies the largest portion of the image is grown first, the region occupying the second largest portion is grown second and so on.
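The frequency-descending ordering described above can be sketched as follows; this is an illustrative reconstruction (the helper name is hypothetical), assuming the histogram from the earlier step is a NumPy array of per-level pixel counts.

```python
import numpy as np

def seed_order(hist):
    # Sort grey levels by how many pixels hold them, most frequent first,
    # so the region occupying the largest portion of the image is grown first.
    return np.argsort(hist)[::-1]

hist = np.array([5, 1, 9, 3])   # pixel counts per grey level 0..3
order = seed_order(hist)        # grey level 2 (count 9) comes first
```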
- the growing process for the first region begins with the first pixel at the head of the ordered boundary list.
- the first pixel in the list is scanned in order to determine whether or not the grey-level of the pixel lies within a certain intensity range. If the scanned pixel meets the requirement it is transferred to a further store in a new list (the region list). If the pixel does not meet the requirement then it is ignored.
- the eight immediately adjacent, surrounding pixels (which may or may not belong to distributions other than the one currently being created) of the image are tested to determine if they also meet the requirement and can therefore be included in the region being grown. If a neighbour pixel being tested has already been assigned to a region then it is ignored. If the neighbour pixel has not already been assigned to a region and passes a statistical test for homogeneity criteria (ie if the pixel grey-level lies within a certain intensity range) it is inserted in the region list and its identifier value in the original image is changed to the region value. This procedure is repeated until all the pixels in the image belong to one of the regions. It will be appreciated that whilst the scanning refers to eight adjacent pixels, the scan may be effected using other connectivities e.g. four or six.
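The growing step above can be sketched as a queue-based flood fill over the eight-connected neighbourhood. This is a simplified illustrative reconstruction with hypothetical names, not the patent's implementation: label 0 is assumed to mean "not yet assigned", and the similarity test is the intensity-window criterion defined below.

```python
from collections import deque
import numpy as np

def grow_region(image, labels, seed, region_id, l_ave, t_w):
    # Grow one region from `seed`: include 8-connected pixels whose grey level
    # lies within [l_ave - t_w, l_ave + t_w] and which are not yet assigned.
    h, w = image.shape
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if labels[y, x] != 0:
            continue                            # already assigned to a region
        if abs(int(image[y, x]) - l_ave) > t_w:
            continue                            # fails the similarity criterion
        labels[y, x] = region_id
        for dy in (-1, 0, 1):                   # scan the eight surrounding pixels
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == 0:
                    queue.append((ny, nx))
    return labels

# Example: grow region 1 from the top-left seed with L_ave = 11, T_w = 4;
# the bright column (200, 210, 205) falls outside the window and is excluded.
img = np.array([[10, 12, 200],
                [11, 13, 210],
                [ 9, 12, 205]])
labels = np.zeros_like(img)
grow_region(img, labels, (0, 0), region_id=1, l_ave=11, t_w=4)
```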
- the following test is used as a basis for including a pixel in a region and applies for Gaussian distributions. It also applies for non Gaussian distributions where the average grey level intensity L ave is used to determine the seed pixel.
- a pixel p xy of intensity L (xy) is included in the region list if it passes the similarity criteria, i.e., if the following condition is satisfied: L ave - T w ≤ L (xy) ≤ L ave + T w
- L ave is the average grey intensity level and T w is a threshold "window" control parameter
- L ave is equal to the peak value grey level and is midway between L max and L min .
- T w is equal to ( L max - L min )/2.
- the parameter L ave acts as a central value for growing the region, and the parameter T w acts as a thresholding distance in pixel intensity units from the central value.
- the values of the level parameter L ave and window control parameter T w must be set appropriately.
- the value of L ave may be set to the intensity value of the seed pixel, which in turn represents the central value of the region to be grown.
- it may be obtained from a previous processing step, which includes a statistical analysis of pixels around the region of interest. In this case L ave can be set equal to the mean of the sample region.
- a 20 x 20 pixel matrix is taken for the sample, but larger samples introduce a degree of data smoothing and may give more accurate calculation of the region statistics.
- if the sample area is too large then the computational time can become too long.
- the values of the parameter T w can be set interactively or automatically.
- the user can specify the value in a window which forms part of the GUI (graphical user interface) control panel for the algorithm.
- a range of results can be quickly observed simply by setting the threshold value T w at different levels in order to extract different regions from the original image. As will be appreciated, if the seed pixel remains the same, a higher value for the threshold T w will normally result in larger regions being grown. Changing the seed pixel for the same threshold value T w will also produce a different grown region pattern.
- the process produces good results with high contrasting objects within the image, such as pelvic bones and body contour. However, this is not the case when segmenting soft tissues such as the bladder and seminal vesicles where the contrasts are relatively low between objects.
- T w threshold value
- T w results in a relatively small number of regions being produced (typically several hundred) which results in a loss of structures.
- T w it is possible to obtain segmentation of just the bones and the body contour.
- the threshold value T w can be computed by the region growing algorithm which examines the statistics of the pixels within a sample region R of about 20 pixels in size (the figure of 20 may, of course, be varied as required). This sample region R is located centrally over the seed point of the region.
- the window threshold parameter T w is computed by multiplying the standard deviation of the sample region with a scaling factor K which is dependent on the signal to noise ratio in the image. A scaling factor K of value 2.0 has been found to give reasonable results for CT and Magnetic Resonance (MR) images.
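The statistic-based threshold above can be sketched directly; an illustrative reconstruction assuming NumPy, where `half=10` approximates the 20 x 20 sample described and the function name is hypothetical.

```python
import numpy as np

def window_threshold(image, seed, half=10, k=2.0):
    # T_w = K * standard deviation of a sample region centred on the seed.
    # K = 2.0 is the value reported to work for CT and MR images.
    y, x = seed
    sample = image[max(0, y - half):y + half, max(0, x - half):x + half]
    return k * float(sample.std())

# A perfectly uniform sample has zero spread, so the window collapses to zero.
flat = np.full((30, 30), 7.0)
t_w = window_threshold(flat, (15, 15))
```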
- the threshold value T w for each region is calculated automatically by taking into account the histogram information.
- the threshold value T w for each region is calculated prior to and independently of the growing process and is effected firstly by looking for sequences of pixels in the histogram that follow a "peak-like" pattern. To avoid identifying false peaks because of noise, the process ignores peaks which have a pixel width less than a preselected number, typically seven pixels. If the grey-level spacing between adjacent peaks is relatively large then the threshold value T w for the region being grown can also be large. Where the adjacent peaks are close together on the grey-level scale then the threshold value T w will need to be relatively small.
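The peak search can be sketched with simple relational operators. This is an illustrative reconstruction: the patent only states a minimum peak width (typically seven bins), so measuring that width at half the peak height is an assumption of this sketch, as is the function name.

```python
import numpy as np

def find_peaks(hist, min_width=7):
    # A bin is a candidate peak if it exceeds its left neighbour and is not
    # exceeded by its right neighbour (relational operators). Peaks narrower
    # than `min_width` bins (width taken here at half height) are treated as
    # noise and discarded.
    peaks = []
    for i in range(1, len(hist) - 1):
        if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]:
            half = hist[i] / 2.0
            left = i
            while left > 0 and hist[left - 1] > half:
                left -= 1
            right = i
            while right < len(hist) - 1 and hist[right + 1] > half:
                right += 1
            if right - left + 1 >= min_width:
                peaks.append(i)
    return peaks

# One broad peak at bin 6 and one single-bin noise spike at bin 14
hist = np.array([0, 3, 6, 9, 12, 13, 14, 13, 12, 9, 6, 3, 0, 0, 9, 0])
peaks = find_peaks(hist)
```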
- the segmented image may still contain some false regions that are produced as a result of CT artifacts. These are undesired regions which are not wanted by the clinicians and are removed through a merging process.
- the merging process looks at adjacent regions and will merge a first region into an adjacent second region if the number of elements of the first region are:
- An element is a preselected area of a region and is typically a single pixel.
- the intensity level of each of the pixels is adjusted to that of the pixels of the second region.
- the resulting image is the mosaic image shown in Figure 3. It is a simplified image made of a mosaic of homogeneous pieces of constant grey-levels and is a homotopy modification of the original image.
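Building the mosaic image from a completed label map can be sketched as follows; an illustrative NumPy reconstruction (hypothetical name), where each region's pixels are flattened to the region's mean grey level as described.

```python
import numpy as np

def mosaic(image, labels):
    # Replace every pixel by the mean grey level of its region, producing a
    # simplified image of homogeneous constant-grey pieces.
    out = np.zeros_like(image, dtype=float)
    for region_id in np.unique(labels):
        mask = labels == region_id
        out[mask] = image[mask].mean()
    return out

# Two regions: the top row averages to 2.0, the bottom row to 15.0
img = np.array([[1.0, 3.0], [10.0, 20.0]])
labels = np.array([[1, 1], [2, 2]])
m = mosaic(img, labels)
```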
- the boundaries of the grey scale areas in the image are differentiated to provide boundary ridges to which a Watershed transform can be applied.
- the above process can be applied in different domains without previous knowledge of the regions of interest within the original image.
- the preferred method is based on homotopy modification of the original image prior to applying the Watershed transformation.
- the homotopy modification of the original image produces a mosaic image.
- Figure 7 illustrates a flow chart showing the steps which are carried out in order to obtain the image of Figure 4.
- Figure 8 is a flow chart showing in more detail the steps for region growing of Figure 7 and
- Figure 9 shows in more detail the steps for obtaining the gradient of the mosaic image of Figure 3 with Gaussian smoothing. It will be appreciated that other ways of obtaining the gradient used by the Watershed transform can be used, for example, morphological gradient operators.
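As one example of the morphological gradient operators mentioned, a 3 x 3 dilation-minus-erosion is non-negative by construction, so its output can feed a watershed transform directly. An illustrative NumPy sketch (hypothetical name), not the patent's implementation:

```python
import numpy as np

def morphological_gradient(image):
    # Morphological gradient with a 3x3 structuring element:
    # (grey dilation) - (grey erosion) = local max - local min,
    # which is always >= 0, avoiding the negative peaks mentioned above.
    padded = np.pad(image, 1, mode='edge')
    h, w = image.shape
    # Stack the nine shifted views covering each pixel's 3x3 neighbourhood
    stack = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    return stack.max(axis=0) - stack.min(axis=0)

# The gradient is zero on flat areas and ridges along the intensity step
step = np.array([[0.0, 0.0, 5.0],
                 [0.0, 0.0, 5.0]])
g = morphological_gradient(step)
```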
- the technique of analysing histograms aims to determine a seed pixel and a threshold.
- Figure 10 shows three different histograms 20, 22 and 24 similar to those of Figures 5 A and 5B of a pelvic CT image.
- Graph 20 is from the original CT image
- graph 22 is graph 20 with the couch removed
- graph 24 is graph 20 without the couch and background.
- this contains four distinct peaks 30, 32, 34 and 36. These have been found automatically using relational operators to define peaks in the histogram and a minimum height to allow small peaks to be disregarded.
- the first peak 30 is by far the largest, typically being composed of about half of all the image pixels. It is located at the low intensity end of the histogram and analysis of the image shows that this represents mainly air with some background counts.
- the second peak 32, very close to the first, is much smaller, with only about 1.5% of pixels at the peak grey-level. This represents much of the image of the couch on which the patient lies, although this will vary between couches.
- the final two peaks 34, 36 are located further along the histogram and very close together. This indicates a degree of overlap in intensities between regions. These are separated by finding the local minimum between the peaks using a similar method to that used to find peaks automatically.
- the darker peak 34 represents fat and soft tissue.
- the brighter peak 36 represents muscle and organs. These pixels include the bladder and prostate.
- the bones and rectum region which include a wide range of grey-levels are not represented by peaks but by valleys or plateaux.
- the interior of the rectum is located at the grey-levels between peaks 32 and 34 as depicted in the top left image in Figure 10.
- the bones can be found at grey-levels above the fourth peak 36.
- the threshold value T w = (L max.A - L min.A )/2
- the seed point = (L max.A + L min.A )/2
- the threshold value T w = (L max.D - L min.D )/2
- the seed point = (L max.D + L min.D )/2
- the threshold value T w = (L max.B - L min.B )/2
- the seed point = (L max.B + L min.B )/2
- the threshold value T w = (L max.C - L min.C )/2
- the seed point = (L max.C + L min.C )/2
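The per-peak formulas above reduce to a two-line helper; an illustrative sketch (hypothetical name), applicable to any of the regions A to D given that region's grey-level bounds.

```python
def region_parameters(l_min, l_max):
    # Window threshold and seed grey level from a peak's distribution bounds
    t_w = (l_max - l_min) / 2.0    # half-width of the peak's grey-level span
    seed = (l_max + l_min) / 2.0   # central grey level of the peak
    return t_w, seed

# Example with bounds L_min = 10, L_max = 30
t_w, seed = region_parameters(10, 30)
```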
- the original code was modified such that the rectum can be identified from the sharp cut-off, below which no pixels are found.
- This cut-off grey-level has been used to define the start of the lowest threshold region in a modified image.
- the result of applying the multi-region growing gives us a simplified image made of a mosaic of homogeneous pieces of constant grey-levels (mean grey-level of the growth region) with the same properties as the mosaic image.
- This produces a homotopy modification of the original image and consequently of the gradient image.
- By applying the watershed transform to this simplified image, the number of watershed lines and the computational process in terms of time and memory requirements are optimised.
- the method of the present invention produces a segmented image with less overgrowing of regions while reducing the number of regions which would be produced by watershed alone.
- the invention has application outside of the medical field, such as military applications, robotics or any application which involves pattern recognition schemes.
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02748982A EP1399888A2 (en) | 2001-06-27 | 2002-06-27 | Image segmentation |
US10/482,196 US20040258305A1 (en) | 2001-06-27 | 2002-06-27 | Image segmentation |
AU2002319397A AU2002319397A1 (en) | 2001-06-27 | 2002-06-27 | Image segmentation |
CA002468456A CA2468456A1 (en) | 2001-06-27 | 2002-06-27 | Image segmentation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0115615.7 | 2001-06-27 | ||
GBGB0115615.7A GB0115615D0 (en) | 2001-06-27 | 2001-06-27 | Image segmentation |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2003003303A2 true WO2003003303A2 (en) | 2003-01-09 |
WO2003003303A3 WO2003003303A3 (en) | 2003-09-18 |
Family
ID=9917385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2002/002945 WO2003003303A2 (en) | 2001-06-27 | 2002-06-27 | Image segmentation |
Country Status (7)
Country | Link |
---|---|
US (1) | US20040258305A1 (en) |
EP (1) | EP1399888A2 (en) |
AU (1) | AU2002319397A1 (en) |
CA (1) | CA2468456A1 (en) |
GB (1) | GB0115615D0 (en) |
PL (1) | PL367727A1 (en) |
WO (1) | WO2003003303A2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005057493A1 (en) * | 2003-12-10 | 2005-06-23 | Agency For Science, Technology And Research | Methods and apparatus for binarising images |
US7091974B2 (en) * | 2001-11-30 | 2006-08-15 | Eastman Kodak Company | Method for selecting and displaying a subject or interest in a still digital image |
GB2463141A (en) * | 2008-09-05 | 2010-03-10 | Siemens Medical Solutions | Medical image segmentation |
CN106651885A (en) * | 2016-12-31 | 2017-05-10 | 中国农业大学 | Image segmentation method and apparatus |
Families Citing this family (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6985612B2 (en) * | 2001-10-05 | 2006-01-10 | Mevis - Centrum Fur Medizinische Diagnosesysteme Und Visualisierung Gmbh | Computer system and a method for segmentation of a digital image |
US8275091B2 (en) | 2002-07-23 | 2012-09-25 | Rapiscan Systems, Inc. | Compact mobile cargo scanning system |
US7963695B2 (en) | 2002-07-23 | 2011-06-21 | Rapiscan Systems, Inc. | Rotatable boom cargo scanning system |
US8804899B2 (en) | 2003-04-25 | 2014-08-12 | Rapiscan Systems, Inc. | Imaging, data acquisition, data transmission, and data distribution methods and systems for high data rate tomographic X-ray scanners |
GB0309371D0 (en) * | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-Ray tubes |
US10483077B2 (en) | 2003-04-25 | 2019-11-19 | Rapiscan Systems, Inc. | X-ray sources having reduced electron scattering |
US8451974B2 (en) | 2003-04-25 | 2013-05-28 | Rapiscan Systems, Inc. | X-ray tomographic inspection system for the identification of specific target items |
GB0812864D0 (en) | 2008-07-15 | 2008-08-20 | Cxr Ltd | Coolign anode |
US9208988B2 (en) | 2005-10-25 | 2015-12-08 | Rapiscan Systems, Inc. | Graphite backscattered electron shield for use in an X-ray tube |
US8094784B2 (en) | 2003-04-25 | 2012-01-10 | Rapiscan Systems, Inc. | X-ray sources |
US9113839B2 (en) | 2003-04-25 | 2015-08-25 | Rapiscon Systems, Inc. | X-ray inspection system and method |
GB0309385D0 (en) | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-ray monitoring |
US8837669B2 (en) | 2003-04-25 | 2014-09-16 | Rapiscan Systems, Inc. | X-ray scanning system |
GB0309387D0 (en) * | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-Ray scanning |
US8243876B2 (en) | 2003-04-25 | 2012-08-14 | Rapiscan Systems, Inc. | X-ray scanners |
GB0309379D0 (en) * | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-ray scanning |
GB0309374D0 (en) * | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-ray sources |
US8223919B2 (en) | 2003-04-25 | 2012-07-17 | Rapiscan Systems, Inc. | X-ray tomographic inspection systems for the identification of specific target items |
US7949101B2 (en) | 2005-12-16 | 2011-05-24 | Rapiscan Systems, Inc. | X-ray scanners and X-ray sources therefor |
GB0309383D0 (en) | 2003-04-25 | 2003-06-04 | Cxr Ltd | X-ray tube electron sources |
GB0525593D0 (en) | 2005-12-16 | 2006-01-25 | Cxr Ltd | X-ray tomography inspection systems |
US6928141B2 (en) | 2003-06-20 | 2005-08-09 | Rapiscan, Inc. | Relocatable X-ray imaging system and method for inspecting commercial vehicles and cargo containers |
US7327880B2 (en) * | 2004-03-12 | 2008-02-05 | Siemens Medical Solutions Usa, Inc. | Local watershed operators for image segmentation |
US7394933B2 (en) * | 2004-11-08 | 2008-07-01 | Siemens Medical Solutions Usa, Inc. | Region competition via local watershed operators |
US7689038B2 (en) * | 2005-01-10 | 2010-03-30 | Cytyc Corporation | Method for improved image segmentation |
US7894568B2 (en) * | 2005-04-14 | 2011-02-22 | Koninklijke Philips Electronics N.V. | Energy distribution reconstruction in CT |
US7471764B2 (en) | 2005-04-15 | 2008-12-30 | Rapiscan Security Products, Inc. | X-ray imaging system having improved weather resistance |
US8184119B2 (en) * | 2005-07-13 | 2012-05-22 | Siemens Medical Solutions Usa, Inc. | Fast ambient occlusion for direct volume rendering |
EP1945596B8 (en) | 2005-09-15 | 2015-11-04 | Anuvia Plant Nutrients Holdings LLC | Organic containing sludge to fertilizer alkaline conversion process |
US9046465B2 (en) | 2011-02-24 | 2015-06-02 | Rapiscan Systems, Inc. | Optimization of the source firing pattern for X-ray scanning systems |
US8036423B2 (en) * | 2006-10-11 | 2011-10-11 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Contrast-based technique to reduce artifacts in wavelength-encoded images |
US8068668B2 (en) | 2007-07-19 | 2011-11-29 | Nikon Corporation | Device and method for estimating if an image is blurred |
US8260048B2 (en) * | 2007-11-14 | 2012-09-04 | Exelis Inc. | Segmentation-based image processing system |
GB0803641D0 (en) | 2008-02-28 | 2008-04-02 | Rapiscan Security Products Inc | Scanning systems |
GB0803644D0 (en) | 2008-02-28 | 2008-04-02 | Rapiscan Security Products Inc | Scanning systems |
GB0809110D0 (en) | 2008-05-20 | 2008-06-25 | Rapiscan Security Products Inc | Gantry scanner systems |
GB0816823D0 (en) | 2008-09-13 | 2008-10-22 | Cxr Ltd | X-ray tubes |
US9013596B2 (en) * | 2008-09-24 | 2015-04-21 | Nikon Corporation | Automatic illuminant estimation that incorporates apparatus setting and intrinsic color casting information |
WO2010036249A1 (en) * | 2008-09-24 | 2010-04-01 | Nikon Corporation | Autofocus technique utilizing gradient histogram distribution characteristics |
US8860838B2 (en) | 2008-09-24 | 2014-10-14 | Nikon Corporation | Automatic illuminant estimation and white balance adjustment based on color gamut unions |
WO2010036240A1 (en) * | 2008-09-24 | 2010-04-01 | Nikon Corporation | Image segmentation from focus varied images using graph cuts |
WO2010036247A1 (en) * | 2008-09-24 | 2010-04-01 | Nikon Corporation | Principal components analysis based illuminant estimation |
GB0901338D0 (en) | 2009-01-28 | 2009-03-11 | Cxr Ltd | X-Ray tube electron sources |
WO2012160511A1 (en) | 2011-05-24 | 2012-11-29 | Koninklijke Philips Electronics N.V. | Apparatus and method for generating an attenuation correction map |
WO2012160520A1 (en) | 2011-05-24 | 2012-11-29 | Koninklijke Philips Electronics N.V. | Apparatus for generating assignments between image regions of an image and element classes |
US9008372B2 (en) * | 2011-05-31 | 2015-04-14 | Schlumberger Technology Corporation | Method for determination of spatial distribution and concentration of contrast components in a porous and/or heterogeneous sample |
US9218933B2 (en) | 2011-06-09 | 2015-12-22 | Rapidscan Systems, Inc. | Low-dose radiographic imaging system |
US8781187B2 (en) * | 2011-07-13 | 2014-07-15 | Mckesson Financial Holdings | Methods, apparatuses, and computer program products for identifying a region of interest within a mammogram image |
MX350070B (en) | 2013-01-31 | 2017-08-25 | Rapiscan Systems Inc | Portable security inspection system. |
CN112950747A (en) | 2013-09-13 | 2021-06-11 | 斯特拉克斯私人有限公司 | Method and system for assigning color to image, computer readable storage medium |
US9626476B2 (en) | 2014-03-27 | 2017-04-18 | Change Healthcare Llc | Apparatus, method and computer-readable storage medium for transforming digital images |
US9235903B2 (en) * | 2014-04-03 | 2016-01-12 | Sony Corporation | Image processing system with automatic segmentation and method of operation thereof |
US9773325B2 (en) * | 2015-04-02 | 2017-09-26 | Toshiba Medical Systems Corporation | Medical imaging data processing apparatus and method |
JP2018126389A (en) * | 2017-02-09 | 2018-08-16 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
CN109840914B (en) * | 2019-02-28 | 2022-12-16 | 华南理工大学 | Texture segmentation method based on user interaction |
US11551903B2 (en) | 2020-06-25 | 2023-01-10 | American Science And Engineering, Inc. | Devices and methods for dissipating heat from an anode of an x-ray tube assembly |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2827060B1 (en) * | 2001-07-05 | 2003-09-19 | Eastman Kodak Co | METHOD FOR IDENTIFYING THE SKY IN AN IMAGE AND IMAGE OBTAINED BY MEANS OF THIS METHOD |
2001
- 2001-06-27 GB GBGB0115615.7A patent/GB0115615D0/en not_active Ceased

2002
- 2002-06-27 EP EP02748982A patent/EP1399888A2/en not_active Withdrawn
- 2002-06-27 CA CA002468456A patent/CA2468456A1/en not_active Abandoned
- 2002-06-27 WO PCT/GB2002/002945 patent/WO2003003303A2/en not_active Application Discontinuation
- 2002-06-27 PL PL02367727A patent/PL367727A1/en not_active Application Discontinuation
- 2002-06-27 US US10/482,196 patent/US20040258305A1/en not_active Abandoned
- 2002-06-27 AU AU2002319397A patent/AU2002319397A1/en not_active Abandoned
Non-Patent Citations (5)
Title |
---|
BUENO G ET AL: "Watershed transform for segmenting medical images" SYSTEMS SCIENCE, 1997, WROCLAW TECH. UNIV. PRESS, POLAND, vol. 23, no. 3, pages 95-106, XP008020084 ISSN: 0137-1223 * |
GONZALEZ R C ET AL: "Digital Image Processing" , DIGITAL IMAGE PROCESSING, XX, XX, PAGE(S) 458-465 , READING, MASSACHUSETTS XP002248957 sections 7.4.1, 7.4.2 * |
VICKERS J P ET AL: "Histogram-based segmentation of pelvic computed tomographic images" WORKSHOP ON EUROPEAN SCIENTIFIC AND INDUSTRIAL COLLABORATION. WESIC '99. PROMOTING: ADVANCED TECHNOLOGIES IN MANUFACTURING, pages 291-298, XP008020078 1999, Newport, South Wales, UK, Univ. Wales College, Newport, UK ISBN: 1-899274-23-5 * |
ZEUGE, W.: "Skripte zur Mathematik - Wahrscheinlichkeitsrechnung, Statistik, Ausgleichsrechnung" 1998, UNIVERSITÄT HAMBURG, HAMBURG WANDSBEK, PAGES 49-53, XP002248958 page 53, section 2.3.4 figures 2.1,2.2 * |
ZUCKER S W: "REGION GROWING: CHILDHOOD AND ADOLESCENCE" COMPUTER GRAPHICS AND IMAGE PROCESSING, ACADEMIC PRESS. NEW YORK, US, vol. 5, no. 3, September 1976 (1976-09), pages 382-399, XP001149042 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7091974B2 (en) * | 2001-11-30 | 2006-08-15 | Eastman Kodak Company | Method for selecting and displaying a subject or interest in a still digital image |
WO2005057493A1 (en) * | 2003-12-10 | 2005-06-23 | Agency For Science, Technology And Research | Methods and apparatus for binarising images |
GB2463141A (en) * | 2008-09-05 | 2010-03-10 | Siemens Medical Solutions | Medical image segmentation |
GB2463141B (en) * | 2008-09-05 | 2010-12-08 | Siemens Medical Solutions | Methods and apparatus for identifying regions of interest in a medical image |
US9349184B2 (en) | 2008-09-05 | 2016-05-24 | Siemens Medical Solutions Usa, Inc. | Method and apparatus for identifying regions of interest in a medical image |
CN106651885A (en) * | 2016-12-31 | 2017-05-10 | 中国农业大学 | Image segmentation method and apparatus |
Also Published As
Publication number | Publication date |
---|---|
CA2468456A1 (en) | 2003-01-09 |
US20040258305A1 (en) | 2004-12-23 |
WO2003003303A3 (en) | 2003-09-18 |
EP1399888A2 (en) | 2004-03-24 |
GB0115615D0 (en) | 2001-08-15 |
AU2002319397A1 (en) | 2003-03-03 |
PL367727A1 (en) | 2005-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003003303A2 (en) | Image segmentation | |
US7536041B2 (en) | 3D image segmentation | |
EP0965104B1 (en) | Autosegmentation/autocontouring methods for use with three-dimensional radiation therapy treatment planning | |
US7796790B2 (en) | Manual tools for model based image segmentation | |
US8577115B2 (en) | Method and system for improved image segmentation | |
EP2252204B1 (en) | Ct surrogate by auto-segmentation of magnetic resonance images | |
US7388973B2 (en) | Systems and methods for segmenting an organ in a plurality of images | |
RU2589292C2 (en) | Device and method for formation of attenuation correction map | |
US8527244B2 (en) | Generating model data representing a biological body section | |
WO2012072129A1 (en) | Longitudinal monitoring of pathology | |
Sivewright et al. | Interactive region and volume growing for segmenting volumes in MR and CT images | |
CN106537452A (en) | Device, system and method for segmenting an image of a subject. | |
US8094895B2 (en) | Point subselection for fast deformable point-based imaging | |
Tan et al. | An approach to extraction midsagittal plane of skull from brain CT images for oral and maxillofacial surgery | |
CN105678711B (en) | A kind of attenuation correction method based on image segmentation | |
US20080285822A1 (en) | Automated Stool Removal Method For Medical Imaging | |
CN106780492A (en) | A kind of extraction method of key frame of CT pelvises image | |
CN114187293B (en) | Oral cavity palate part soft and hard tissue segmentation method based on attention mechanism and integrated registration | |
Krawczyk et al. | YOLO and morphing-based method for 3D individualised bone model creation | |
Sun et al. | Stepwise local synthetic pseudo-CT imaging based on anatomical semantic guidance | |
US20220180525A1 (en) | Organ segmentation method and system | |
Stough et al. | Clustering on local appearance for deformable model segmentation | |
Bhise et al. | Lung Segmentation and Nodule Detection based on CT Images using Image Processing Method | |
Liamsuwan et al. | CTScanTool, a semi-automated organ segmentation tool for radiotherapy treatment planning | |
Bacher et al. | Model-based segmentation of anatomical structures in MR images of the head and neck area |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2002748982 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2002748982 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
ENP | Entry into the national phase |
Ref document number: 2004115104 Country of ref document: RU Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2468456 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10482196 Country of ref document: US |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2002748982 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: JP |