US7689038B2 - Method for improved image segmentation
- Publication number
- US7689038B2 (application US11/328,354)
- Authority
- US
- United States
- Prior art keywords
- image
- identify
- per
- features
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/267—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- The present invention relates generally to the field of image analysis. More specifically, the present invention is related to a novel image segmentation method.
- In the analysis of objects in digital images it is essential that the objects be distinguished from the background of the image. To characterize cells or objects, the objects must first be located. The process of locating objects within the digital image is known as “segmentation.” A variety of techniques are used in the segmentation process to locate the objects of interest so that subsequent computer analysis can characterize the objects. For example, segmentation of an image containing cells might allow the cell's nucleus and/or cytoplasm to be located.
- A traditional approach to the task of locating and classifying objects within an image involves several stages: first, segmenting the image to create a binary mask of the objects; then labeling the objects in this mask, with each connected set of pixels assigned a different label; and finally, measuring various features of the labeled objects.
- A threshold value of image brightness is chosen and each pixel in the image is then compared with this threshold value. Pixels with a brightness value above this threshold are considered background pixels; pixels with values below the threshold are considered object pixels.
- The threshold value for locating objects may be chosen based on an image histogram, which is a frequency distribution of the darkness values found within an image.
- A thresholding algorithm may find a single threshold value using such a histogram. For instance, the threshold value might be half-way between the darkest and lightest pixels. Alternatively, the threshold value might be chosen as an inflection point between the abundant “background” pixels and the rarer “object” pixels. Finding an ideal threshold for each object in an image is a difficult task; often a single threshold value is not optimal for multiple objects with varying darkness values within an entire image.
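- A minimal sketch of this histogram-driven thresholding follows, assuming an 8-bit grayscale image held in a NumPy array; the function names and the midpoint rule used here are illustrative choices, not a prescribed implementation.

```python
import numpy as np

def midpoint_threshold(image: np.ndarray) -> int:
    """Pick a single threshold half-way between the darkest and lightest pixels."""
    counts, _ = np.histogram(image, bins=256, range=(0, 256))
    present = np.flatnonzero(counts)                 # grey levels that actually occur
    darkest, lightest = int(present[0]), int(present[-1])
    return (darkest + lightest) // 2

def object_mask(image: np.ndarray, threshold: int) -> np.ndarray:
    """Pixels darker than the threshold are treated as object pixels."""
    return image < threshold

# Example: a dark square on a bright background.
img = np.full((64, 64), 200, dtype=np.uint8)
img[20:40, 20:40] = 50
mask = object_mask(img, midpoint_threshold(img))     # True inside the square
```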
- The “object” pixels can form a binary mask of the objects in the image.
- A boundary around the mask might be used to represent each object.
- The boundary might or might not reflect the object accurately.
- Many methods have been developed to refine the boundary once it is located. Such methods may use darkness information near the boundary, or constraints such as gradient, curvature, “closeness to a circle,” etc. to refine boundaries.
- The present invention provides for a method to identify each object in an image, the method comprising the steps of: (a) sorting pixels based on a range of attribute values of the image; (b) adding the sorted pixels, one by one, to a “labeling image” to locate objects in the labeling image, starting with an extreme point in the range of attribute values; (c) outputting the objects onto an output image if features of the objects match pre-defined acceptance criteria; and (d) performing steps b and c repeatedly until a stopping point is reached, the stopping point representing another extreme point in the range of attribute values.
- The present invention also provides for an article of manufacture comprising a computer readable medium having computer readable program code embodied therein which identifies each object in an image, the medium comprising: (a) computer readable program code sorting pixels based on a range of attribute values of the image; (b) computer readable program code adding the sorted pixels, one by one, to a labeling image to locate objects in the labeling image, starting with an extreme point in the range of attribute values; (c) computer readable program code outputting the objects onto an output image if features of the objects match pre-defined acceptance criteria; and (d) computer readable program code performing steps b and c repeatedly until a stopping point is reached, the stopping point representing another extreme point in the range of attribute values.
- The present invention further provides for a method to identify each object in an image under a plurality of threshold values, the method comprising the steps of: (a) sorting pixels in the image based on a range of attribute values of the pixels, wherein the range of attribute values corresponds to the plurality of threshold values; (b) adding pixels, one by one, to a labeling image to create new objects or update old objects, starting with an extreme point in the range of attribute values; (c) calculating features of the created new objects and the updated old objects; (d) matching the calculated features of the created new objects and the updated old objects with pre-defined criteria; (e) outputting the created new objects and the updated old objects on an output image if the acceptance criteria are satisfied for the features; and (f) performing steps b-e repeatedly until a stopping point is reached, wherein the stopping point is chosen from any of the following: another extreme point in the range of values, a point representing background pixel values in the range of attribute values, or a point representing pixel values not related to the new objects and the updated old objects.
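- The claimed steps can be illustrated with a short, self-contained Python sketch. The sketch assumes dark objects on a bright background, 4-connectivity, an area-only acceptance criterion, and a fixed stopping value; these choices, and the function name, are illustrative assumptions rather than anything prescribed by the claims.

```python
import numpy as np

def identify_objects(image, min_area=50, max_area=500, stop_value=200):
    """Steps (a)-(f): sort pixels darkest-first, add them one by one to a
    labeling image, and output an object whenever its features (here, just
    its area) satisfy the acceptance criteria."""
    h, w = image.shape
    label_of = {}                                    # (row, col) -> object label
    members = {}                                     # label -> set of member pixels
    output = np.zeros((h, w), dtype=np.int32)
    next_label = 1
    for flat in np.argsort(image, axis=None):        # (a) sort by attribute value
        r, c = divmod(int(flat), w)
        if int(image[r, c]) >= stop_value:           # (f) stopping point reached
            break
        # (b) create a new object, or update/merge old objects
        touching = {label_of[(rr, cc)]
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if (rr, cc) in label_of}
        if not touching:
            lab = next_label                         # isolated pixel: new object
            next_label += 1
            members[lab] = set()
        else:
            lab, *rest = sorted(touching)
            for other in rest:                       # pixel joins two objects: merge
                members[lab] |= members.pop(other)
            for p in members[lab]:
                label_of[p] = lab
        members[lab].add((r, c))
        label_of[(r, c)] = lab
        # (c)-(e) calculate features and output on acceptance
        if min_area <= len(members[lab]) <= max_area:
            for rr, cc in members[lab]:
                output[rr, cc] = lab
    return output
```

- In this sketch the acceptance test is evaluated after every pixel addition; the preferred embodiment described below additionally compares each object's current features against the best state it has attained at previous thresholds (see FIG. 5).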
- FIG. 1 illustrates a brightness “contour map” superimposed on cells in a cell group.
- FIG. 2 illustrates steps of the image segmentation algorithm, as per one embodiment of the present invention.
- FIGS. 3a-3n collectively illustrate the processing during addition of pixels into a labeling image, as per one embodiment of the present invention.
- FIG. 4 illustrates steps of the image segmentation algorithm, as per a preferred embodiment of the present invention.
- FIG. 5 depicts the growth of objects in an image at five different threshold values, as per one embodiment of the present invention.
- FIG. 6 illustrates the output image showing objects that were located at different threshold values where only the best state for an object is retained, as per one embodiment of the present invention.
- An improved automated image segmentation technique is described herein for identification of object boundaries in a digital two dimensional image. While the image segmentation technique described herein identifies nuclei, the technique itself can be applied to identifying any object in a digital image, such as cytoplasms or tissue structures in biological applications, or objects for non-biological applications such as different components on a circuit board, or man-made and natural features on a satellite image. This technique could be extended to a three dimensional image, such as those produced by X-Ray, CAT (Computed Axial Tomography) scan or MRI (Magnetic Resonance Imaging) devices. Three dimensional pixel elements in such three dimensional images are known as “voxels.” Clusters of voxels might represent an organ, or a tumor in three dimensions.
- A digital two dimensional image may be thought of as a three-dimensional surface, where the height dimension represents the grey value (i.e., brightness) of each pixel.
- Identifying the nuclei might then be done by finding the contours that are within a certain size range and round enough to be nuclei. If one contour is contained within another, the “better” of the two should be chosen.
- FIG. 1 shows such a contour map superimposed on cells in a cell group.
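- This contour-map view can be illustrated with a short sketch: thresholding the surface at a given grey level and labeling the connected dark regions yields the areas enclosed by the contour drawn at that height. The array names and the use of SciPy's connected-component labeling are assumptions made for illustration.

```python
import numpy as np
from scipy import ndimage

def regions_at_level(image: np.ndarray, level: int):
    """Connected sets of pixels at or below `level` correspond to the areas
    enclosed by the brightness contour drawn at that height."""
    mask = image <= level
    labels, count = ndimage.label(mask)   # default 4-connectivity
    return labels, count

# Lowering `level` shrinks regions toward the darkest cores (e.g. nuclei);
# raising it grows them and eventually merges neighbouring regions, as in FIG. 1.
```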
- The objects of interest in the image are sets of connected pixels.
- FIG. 2 illustrates an image segmentation algorithm as per one embodiment of the present invention that identifies each object found in an image by performing the following steps.
- First, the algorithm sorts pixels based on a range of attribute values of the image (step 202). It then adds the sorted pixels one by one to a “labeling image” for the purpose of locating objects in the labeling image, starting with an extreme point in the range of attribute values (step 204).
- A labeling image is an image wherein the pixels of each distinct object are assigned a unique value, or “label.” Also, the extreme point used for the starting value may be at the lowest or the highest end of the range of attribute values.
- The pixels can be sorted based on attributes such as, but not limited to, brightness, hue, gradient, etc.
- Objects located in the labeling image are output onto an output image if their features match pre-defined acceptance criteria (step 206).
- A stopping point is also defined for the image. This stopping point represents a point in the range of the attribute values at which the algorithm should stop performing steps 204 and 206 repeatedly (step 208). This stopping point may be the other extreme point in the range of attribute values. In one embodiment, this stopping point is a point representing background pixel values in the range of attribute values. In another embodiment, the stopping point is a point representing pixel values not related to the objects being located.
- FIG. 4 illustrates an algorithm, as per a preferred embodiment of the present invention.
- In step 402, pixels in an image are sorted based on a range of attribute values of the image.
- The range of attribute values corresponds to a plurality of threshold values in the image.
- The threshold values are determined from a histogram of the attribute values being used.
- The pixels can be sorted based on attributes such as, but not limited to, brightness, hue, gradient, etc. Also, the indexing of the histogram could be done by brightness, hue, gradient, etc.
- Sorted pixels are added to a blank labeling image one by one, starting with an extreme point in the range of attribute values, as shown in step 404, wherein new objects are created or old objects are updated in the process.
- If an added pixel is not adjacent to any existing object, a new object is created (see FIGS. 3a, 3b and 3e). If the added pixel is adjacent to an old object, the pixel is combined with the old object to update the old object (see FIGS. 3c, 3d, 3f, 3g, 3h, 3i, 3j, 3k, 3m and 3n). Further, if the added pixel joins two old objects, the two old objects are updated and merged into one object (see FIG. 3l). In step 406, features of these new or updated objects are calculated.
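- The three cases of FIGS. 3a-3n can be summarized by inspecting the already-labeled neighbours of the pixel being added. The sketch below assumes 4-connectivity and a labeling image in which 0 means “not yet added”; both are assumptions made for illustration.

```python
import numpy as np

def addition_case(labels: np.ndarray, r: int, c: int):
    """Classify what happens when the pixel at (r, c) is added to the labeling
    image: a new object, an update of one old object, or a merge of two old
    objects (mirroring FIGS. 3a-3n)."""
    h, w = labels.shape
    neighbours = {int(labels[rr, cc])
                  for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                  if 0 <= rr < h and 0 <= cc < w and labels[rr, cc] != 0}
    if not neighbours:
        return ("new object",)
    if len(neighbours) == 1:
        return ("update object", neighbours.pop())
    return ("merge objects", sorted(neighbours))

labels = np.array([[1, 0, 2],
                   [0, 0, 0],
                   [0, 0, 0]], dtype=np.int32)
print(addition_case(labels, 2, 2))   # ('new object',)
print(addition_case(labels, 1, 0))   # ('update object', 1)
print(addition_case(labels, 0, 1))   # ('merge objects', [1, 2])
```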
- If the calculated features match the acceptance criteria in step 408, the new or updated object (representing a located nucleus) is output onto the output image in step 410.
- The algorithm performs steps 404-410 repeatedly until a stopping point is reached (step 412).
- This stopping point may be the other extreme point in the range of attribute values.
- In one embodiment, this stopping point is a point representing background pixel values in the range of attribute values.
- In another embodiment, the stopping point is a point representing pixel values not related to the objects being located.
- The algorithm tracks the objects created or updated in the steps described earlier, along with their features.
- The objects are also assigned labels (for example, see FIGS. 3a and 3b). Whenever a pixel is added that touches an existing object, that object's label is assigned to it and the features of that object are updated to reflect the addition of the new pixel (for example, see FIGS. 3d and 3g). When a pixel is added that joins two objects, the labels and features of the two are merged into one (for example, see FIG. 3l).
- FIGS. 3a-3n show how pixels are added into an originally blank mask/labeling image one at a time and labels are assigned as needed to each pixel.
- An object is output to the output image only if its current features, calculated at the current threshold, are a better match to the acceptance criteria than its features calculated at all previous thresholds.
- Otherwise, data from the previous best match is used.
- To compute an optimal set of objects that best matches the acceptance criteria out of all objects at all thresholds, a method called dynamic programming is used. The program passes through the thresholds one at a time, keeping track of the best set of objects that can be formed using only the objects located so far. Each object “remembers” the best state it has attained so far.
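- A sketch of this best-state bookkeeping follows; the score function, the dictionary layout, and the helper name are assumptions made for illustration.

```python
def update_best_state(best, label, score, state, merged_from=()):
    """Each object remembers the best state it has attained so far.

    `best` maps a label to a (score, state) pair.  When two objects merge,
    the merged object inherits the better of its parents' best states, and
    its new state only needs to be compared against that inherited one."""
    candidates = [best[p] for p in merged_from if p in best]
    if label in best:
        candidates.append(best[label])
    inherited = max(candidates, key=lambda entry: entry[0],
                    default=(float("-inf"), None))
    if score > inherited[0]:
        best[label] = (score, state)   # current state beats every previous state
        return True                    # caller outputs (or replaces) this object
    best[label] = inherited            # keep pointing at the previous best state
    return False
```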
- FIG. 5 depicts the growth of objects in an image at five different threshold values. Individual objects that were separate at one threshold may be joined at another threshold.
- Each object in the figure “points to” its previous best state. Objects with no arrows point to themselves (not shown). This means that the current state of these objects is better than any of their previous states. Thus, the new state of an object only needs to be compared to the state to which its arrow points, since this state represents the best state achieved so far. If two objects merge at a given threshold, the arrow for the new, merged object points to the better of the two objects that form the merged object.
- FIG. 6 shows the output image as it is growing. First it contains the objects from a first threshold. Then it contains the best objects from the first and second threshold, then the best objects from the first three thresholds, and so on. The final output image contains all of the objects that were marked as best objects and not later rejected in favor of a better one (compare to FIG. 5 ).
- When an object matches the acceptance criteria, instead of being directly output to an output image, it may be conditionally output based upon a comparison between the object's current state and its previous state or states.
- Other acceptance criteria may also be used, such as size, shape, texture, color, density, contrast, etc.
- Multiple criteria sets may allow different kinds of objects in a single image, such as nuclei, nuclei of different shapes, cytoplasms, nuclear or cytoplasmic “inclusions,” and clustered cells, to be located at the same time.
- The present invention is not limited to the use of these features or acceptance criteria.
- Ellipticity (comparison of the measured object to the ellipse defined by the moment of inertia matrix) may be used as a feature to identify nuclei. Segmentation based on ellipticity may yield better segmentation results, since it would distinguish elongated nuclei from irregularly shaped artifacts.
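- As an illustration of such a feature, the sketch below compares an object's pixel area with the area of the ellipse defined by its moment-of-inertia (second central moment) matrix; values near 1.0 indicate an approximately elliptical object. The exact formula is an assumption, since the description does not spell one out.

```python
import numpy as np

def ellipticity(rows: np.ndarray, cols: np.ndarray) -> float:
    """Ratio of the object's pixel area to the area of the ellipse having the
    same second central moments (moment-of-inertia matrix)."""
    area = rows.size
    coords = np.stack([rows, cols]).astype(float)
    cov = np.cov(coords)                       # 2x2 moment-of-inertia (covariance) matrix
    det = float(np.linalg.det(cov))
    if det <= 0.0:
        return 0.0                             # degenerate object (e.g. a line of pixels)
    ellipse_area = 4.0 * np.pi * np.sqrt(det)  # ellipse with the same second moments
    return area / ellipse_area

# Example: a filled disc of radius 10 scores close to 1.0.
rr, cc = np.nonzero(np.hypot(*np.mgrid[-12:13, -12:13]) <= 10)
print(round(ellipticity(rr, cc), 2))
```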
- The algorithm of the present invention provides various improvements: reduction in processing time for segmentation of images; elimination of time-consuming pre-processing to locate potential objects of interest followed by establishing regions of interest to more accurately define an object in a secondary process; handling of images with varying darkness and contrast to minimize false negative cases; handling of images with abnormal clusters to minimize false negative cases; and identification of multiple objects in a single image at the same time using multiple acceptance criteria.
- The algorithm of the present invention differs from other published iterative image analysis techniques such as region growing algorithms and active contours (also called “snakes”).
- Region growing algorithms work by first dividing an image into many separate regions (individual pixels, small groups of pixels, contiguous pixels with the same grey level, etc.). Next, regions “grow” by merging with other regions that touch them and have similar features. This is usually an iterative process that stops once no more merges of similar regions can be made. The goal of the technique is for the final set of regions to correspond to the objects being segmented.
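- For contrast with the present algorithm, the sketch below shows a generic region-growing iteration; the mean-grey-value merge rule and the tolerance are illustrative assumptions, not a reproduction of any specific published algorithm.

```python
import numpy as np

def region_grow(image: np.ndarray, tolerance: float = 10.0) -> np.ndarray:
    """Generic region growing: every pixel starts as its own region, and touching
    regions merge while their mean grey values differ by less than `tolerance`.
    The measured feature (the region mean) decides which merges happen next,
    unlike the present algorithm, in which the pixel-addition order is fixed
    and the features only decide what is output."""
    h, w = image.shape
    labels = np.arange(h * w).reshape(h, w)           # one region per pixel initially
    means = image.astype(float).flatten()             # mean grey value per region label
    changed = True
    while changed:                                    # iterate until no merge is possible
        changed = False
        for r in range(h):
            for c in range(w):
                for rr, cc in ((r + 1, c), (r, c + 1)):
                    if rr < h and cc < w:
                        a, b = int(labels[r, c]), int(labels[rr, cc])
                        if a != b and abs(means[a] - means[b]) < tolerance:
                            labels[labels == b] = a   # merge the two regions
                            means[a] = float(image[labels == a].mean())
                            changed = True
    return labels
```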
- In the algorithm of the present invention, by contrast, the way that the objects will “grow” is predetermined: there is only one path that the growth can take, and the algorithm simply “watches” the features of the objects changing as they travel along that path. The features that are measured do not influence which pixel is added next or which regions are merged. After a single pass through the image, the “best objects” observed along that path are reported.
- In region growing, the features of the regions determine which regions are grown or merged.
- The region growing technique is thus like a search through a tree with many branches. Object features are used to determine which branch to take. Usually multiple passes are made through the image, and the final state of the regions is reported. Region growing algorithms do not store the best states attained by the objects; rather, the algorithm may have to “back up” to return to a better state.
- Active contours, also called “snakes,” are object outlines, represented either by polygons or by parametric equations, overlaid on the image.
- These lines “evolve” to improve their own shape and their correspondence with the underlying image. Again, this is an iterative process ending when the outlines cannot be improved by further evolution.
- The present invention, on the other hand, operates on individual pixels and performs operations using only the immediate neighbors of each pixel.
- In active contour techniques, the contours are represented not by pixels but by mathematical functions.
- Otherwise, the differences between the present algorithm and active contour techniques are the same as the differences noted for the region growing technique.
- A key difference that distinguishes the algorithm of the present invention from other segmentation algorithms is that other algorithms involve an iterative process seeking the best path through a tree or graph, using the object features to choose which direction to move next. Sometimes this requires backtracking as the algorithm works to end up with the best possible result.
- The present algorithm instead follows a linear, predetermined path, passing by and remembering all the “best” objects along the way.
- The present invention also provides for an article of manufacture comprising computer readable program code, contained within a computer readable medium, implementing one or more modules to segment images and identify objects.
- The present invention further includes a computer program code-based product, which is a storage medium having program code stored therein that can be used to instruct a computer to perform any of the methods associated with the present invention.
- The computer storage medium includes any of, but is not limited to, the following: CD-ROM, DVD, magnetic tape, optical disc, hard drive, floppy disk, ferroelectric memory, flash memory, ferromagnetic memory, optical storage, charge coupled devices, magnetic or optical cards, smart cards, EEPROM, EPROM, RAM, ROM, DRAM, SRAM, SDRAM, or any other appropriate static or dynamic memory or data storage devices.
Abstract
Description
-
- a) sorting pixels based on a range of attribute values of an image;
- b) adding the sorted pixels, one by one, to a labeling image to locate objects in the labeling image, starting with an extreme point in the range of attribute values;
- c) outputting the objects onto an output image if features of the objects match pre-defined acceptance criteria; and
- d) performing steps (b) and (c) repeatedly until a stopping point is reached, said stopping point representing another extreme point in said range of attribute values.
Claims (33)
Priority Applications (16)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/328,354 US7689038B2 (en) | 2005-01-10 | 2006-01-09 | Method for improved image segmentation |
AU2006205061A AU2006205061B2 (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
PT06717873T PT1836680E (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
DK06717873.1T DK1836680T3 (en) | 2005-01-10 | 2006-01-10 | Process for improved image segmentation |
DE602006016066T DE602006016066D1 (en) | 2005-01-10 | 2006-01-10 | PROCESS FOR IMPROVED IMAGE SEGMENTATION |
KR1020077015436A KR101255865B1 (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
TW095100920A TWI307058B (en) | 2005-01-10 | 2006-01-10 | Method for identifying objects in an image and computer readable medium |
CN2006800020193A CN101103373B (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
JP2007550545A JP4825222B2 (en) | 2005-01-10 | 2006-01-10 | Improved image segmentation method |
BRPI0606705-0A BRPI0606705B1 (en) | 2005-01-10 | 2006-01-10 | METHOD FOR IDENTIFYING OBJECTS IN AN IMAGE |
EP06717873A EP1836680B1 (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
AT06717873T ATE477556T1 (en) | 2005-01-10 | 2006-01-10 | METHOD FOR IMPROVED IMAGE SEGMENTATION |
MX2007008363A MX2007008363A (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation. |
PCT/US2006/000726 WO2006076312A2 (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
CA2591241A CA2591241C (en) | 2005-01-10 | 2006-01-10 | Method for improved image segmentation |
US12/712,023 US7881532B2 (en) | 2005-01-10 | 2010-02-24 | Imaging device with improved image segmentation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US64211005P | 2005-01-10 | 2005-01-10 | |
US11/328,354 US7689038B2 (en) | 2005-01-10 | 2006-01-09 | Method for improved image segmentation |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/712,023 Continuation US7881532B2 (en) | 2005-01-10 | 2010-02-24 | Imaging device with improved image segmentation |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070036436A1 US20070036436A1 (en) | 2007-02-15 |
US7689038B2 true US7689038B2 (en) | 2010-03-30 |
Family
ID=36593741
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/328,354 Active 2028-07-21 US7689038B2 (en) | 2005-01-10 | 2006-01-09 | Method for improved image segmentation |
US12/712,023 Active US7881532B2 (en) | 2005-01-10 | 2010-02-24 | Imaging device with improved image segmentation |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/712,023 Active US7881532B2 (en) | 2005-01-10 | 2010-02-24 | Imaging device with improved image segmentation |
Country Status (15)
Country | Link |
---|---|
US (2) | US7689038B2 (en) |
EP (1) | EP1836680B1 (en) |
JP (1) | JP4825222B2 (en) |
KR (1) | KR101255865B1 (en) |
CN (1) | CN101103373B (en) |
AT (1) | ATE477556T1 (en) |
AU (1) | AU2006205061B2 (en) |
BR (1) | BRPI0606705B1 (en) |
CA (1) | CA2591241C (en) |
DE (1) | DE602006016066D1 (en) |
DK (1) | DK1836680T3 (en) |
MX (1) | MX2007008363A (en) |
PT (1) | PT1836680E (en) |
TW (1) | TWI307058B (en) |
WO (1) | WO2006076312A2 (en) |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7480412B2 (en) * | 2003-12-16 | 2009-01-20 | Siemens Medical Solutions Usa, Inc. | Toboggan-based shape characterization |
US20070177799A1 (en) * | 2006-02-01 | 2007-08-02 | Helicos Biosciences Corporation | Image analysis |
US9020964B1 (en) * | 2006-04-20 | 2015-04-28 | Pinehill Technology, Llc | Generation of fingerprints for multimedia content based on vectors and histograms |
US8175390B2 (en) * | 2008-03-28 | 2012-05-08 | Tandent Vision Science, Inc. | System and method for illumination invariant image segmentation |
TWI385595B (en) * | 2008-08-22 | 2013-02-11 | Univ Ishou | Image segmentation method using image region merging algorithm |
US8260002B2 (en) * | 2008-09-26 | 2012-09-04 | Axis Ab | Video analytics system, computer program product, and associated methodology for efficiently using SIMD operations |
TWI405145B (en) | 2008-11-20 | 2013-08-11 | Ind Tech Res Inst | Pixel region-based image segmentation method, system and machine-readable storage medium |
US8379960B2 (en) * | 2009-03-30 | 2013-02-19 | Ge Healthcare Bio-Sciences Corp. | System and method for distinguishing between biological materials |
US8335374B2 (en) | 2009-08-12 | 2012-12-18 | Genetix Corporation | Image segmentation |
TWI419061B (en) | 2010-01-18 | 2013-12-11 | Pixart Imaging Inc | Method for recognizing multiple objects |
US8593457B2 (en) * | 2010-05-27 | 2013-11-26 | National Tsing Hua University | Method of three-dimensional image data processing |
CN102096816B (en) * | 2011-01-28 | 2012-12-26 | 武汉大学 | Multi-scale multi-level image segmentation method based on minimum spanning tree |
CN102103744A (en) * | 2011-01-28 | 2011-06-22 | 武汉大学 | Image segmentation method based on minimum spanning trees and statistical learning theory |
US8964171B2 (en) | 2011-07-22 | 2015-02-24 | Roche Diagnostics Hematology, Inc. | Identifying and measuring reticulocytes |
US9292763B2 (en) * | 2013-07-25 | 2016-03-22 | Analog Devices Global | System, method, and medium for image object and contour feature extraction |
TWI496112B (en) * | 2013-09-13 | 2015-08-11 | Univ Nat Cheng Kung | Cell image segmentation method and a nuclear-to-cytoplasmic ratio evaluation method using the same |
CN107079112B (en) * | 2014-10-28 | 2020-09-29 | 惠普发展公司,有限责任合伙企业 | Method, system and computer readable storage medium for dividing image data |
EP3264362A4 (en) | 2015-02-23 | 2018-01-03 | Konica Minolta, Inc. | Image processing device, image processing method, and image processing program |
WO2016172612A1 (en) | 2015-04-23 | 2016-10-27 | Cedars-Sinai Medical Center | Automated delineation of nuclei for three dimensional (3-d) high content screening |
WO2016190129A1 (en) * | 2015-05-22 | 2016-12-01 | コニカミノルタ株式会社 | Image processing device, image processing method, and program for image processing |
US9754378B2 (en) * | 2016-01-21 | 2017-09-05 | Molecular Devices, Llc | System and method for segmentation of three-dimensional microscope images |
DE102016105102A1 (en) * | 2016-03-18 | 2017-09-21 | Leibniz-Institut für Photonische Technologien e. V. | Method for examining distributed objects |
EP3465610B1 (en) * | 2016-06-03 | 2020-04-15 | Koninklijke Philips N.V. | Biological object detection |
CN110892247B (en) | 2017-08-17 | 2023-08-25 | 雅培医护站股份有限公司 | Apparatus, systems, and methods for performing optical and electrochemical assays |
US11244459B2 (en) * | 2018-12-16 | 2022-02-08 | Masahiko Sato | Method for segmentation of grayscale images and segmented area tracking |
KR102260169B1 (en) * | 2019-10-08 | 2021-06-02 | 주식회사 수아랩 | Method to generate data |
CN112862829B (en) * | 2019-11-27 | 2024-03-12 | 武汉Tcl集团工业研究院有限公司 | Label picture segmentation method, device and storage medium |
CN114445421B (en) * | 2021-12-31 | 2023-09-29 | 中山大学肿瘤防治中心(中山大学附属肿瘤医院、中山大学肿瘤研究所) | Identification and segmentation method, device and system for nasopharyngeal carcinoma lymph node region |
CN117522760B (en) * | 2023-11-13 | 2024-06-25 | 书行科技(北京)有限公司 | Image processing method, device, electronic equipment, medium and product |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999057683A1 (en) * | 1998-05-04 | 1999-11-11 | The Johns Hopkins University | Method and apparatus for segmenting small structures in images |
US7167583B1 (en) * | 2000-06-28 | 2007-01-23 | Landrex Technologies Co., Ltd. | Image processing system for use with inspection systems |
US7187389B2 (en) * | 2001-04-12 | 2007-03-06 | International Business Machines Corporation | System and method for simultaneous display of multiple object categories |
US7689038B2 (en) * | 2005-01-10 | 2010-03-30 | Cytyc Corporation | Method for improved image segmentation |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4484081A (en) * | 1980-09-19 | 1984-11-20 | Trw Inc. | Defect analysis system |
US6021213A (en) * | 1996-06-13 | 2000-02-01 | Eli Lilly And Company | Automatic contextual segmentation for imaging bones for osteoporosis therapies |
WO2001011564A1 (en) | 1999-08-09 | 2001-02-15 | Smithkline Beecham P.L.C. | Image processing methods, programs and systems |
US20040258305A1 (en) * | 2001-06-27 | 2004-12-23 | Burnham Keith J. | Image segmentation |
US20030068074A1 (en) * | 2001-10-05 | 2003-04-10 | Horst Hahn | Computer system and a method for segmentation of a digital image |
US20030152262A1 (en) * | 2002-02-11 | 2003-08-14 | Fei Mao | Method and system for recognizing and selecting a region of interest in an image |
US20060098858A1 (en) * | 2002-11-18 | 2006-05-11 | Qinetiq Limited | Measurement of mitotic activity |
Non-Patent Citations (13)
Title |
---|
Carvalho, Marco A.G. et al., "Segmentation of Images of Yeast Cells by Scale-Space Analysis," Proceedings on the XVI Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI'03), XP-010664309, IEEE Computer Society, Oct. 2003, 376-380, 5 pages. |
Fisher et al., "Hierarchical Image Segmentation Using a Watershed Scale-Space Tree," University of East Anglia, UK paper, Image Processing and its Applications, XP-006501128, IEE, UK, vol. 2, Jul. 1999, pp. 522-526. |
Jones, Ronald, "Connected Filtering and Segmentation Using Component Trees," XP-004444612, Computer Vision and Image Understanding, vol. 75, No. 3, Sep. 1999, pp. 215-228. |
Kao Ming-Yih, "Use of 3-D Region Growing Method in Segmentation of Magnetic Resonance Cerebral Imaging", Master's Candidate Thesis Paper with English translation, Jul. 2003, p. 14 and 20 (6 pages). |
Kupinski, Matthew et al., "Automated Seeded Lesion Segmentation on Digital Mammograms, " XP-011035755, IEEE Transactions on Medical Imaging, vol. 17, No. 4, Aug. 1998, pp. 510-517. |
Matas, J. et al., "Robust Wide Baseline Stereo from Maximally Stable External Regions," XP-002390323, Electronic Proceedings of the BMVC, 2002, pp. 384-393. |
Meijster, Arnold et al., "A Comparison of Algorithms for Connected Set Openings and Closings," XP-001144048, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 4, Apr. 2002, pp. 484-494. |
PCT International Search Report for PCT/US2006/000726, Applicant: Cytyc Corporation, Form PCT/ISA/210, dated Jul. 27, 2006 (5 pages). |
PCT Written Opinion of the International Searching Authority for PCT/US2006/000726, Applicant Cytyc Corporation, Form PCT/ISA/237) dated Jul. 27, 2006 (7 pages). |
Shwu-Huey Yen et al., "Segmentation on Color Images Based on Watershed Algorithm", Proceedings of the 10th International Multimedia Modelling Conference (MMM '04), 2004, pp. 227-232 (6 pages). |
Sonka, M. et al., "Image Processing, Analysis and Machine Vision", (Passage), XP-002390329, 1998, pp. 232-235 (p. 233). |
Tai An-Chi, "Study of Watershed Algorithm-Based Segmentation of Color Images", Master's Candidate Thesis Paper with English translation, Jun. 2006, Chapter III, pp. 33-36 (10 pages). |
Taiwan Office Action with English translation for Taiwan Patent Application No. 95100920, dated Aug. 28, 2008, Applicant Cytyc Corporation (30 pages). |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7881532B2 (en) * | 2005-01-10 | 2011-02-01 | Cytyc Corporation | Imaging device with improved image segmentation |
US20100150443A1 (en) * | 2005-01-10 | 2010-06-17 | Cytyc Corporation | Method for improved image segmentation |
US20080056577A1 (en) * | 2006-08-29 | 2008-03-06 | Leo Grady | Seed segmentation using L-infinity minimization |
US7773807B2 (en) * | 2006-08-29 | 2010-08-10 | Siemens Medical Solutions Usa, Inc. | Seed segmentation using l∞ minimization |
US20100098317A1 (en) * | 2007-02-13 | 2010-04-22 | Tomoharu Kiyuna | Cell feature amount calculating apparatus and cell feature amount calculating method |
US8280139B2 (en) * | 2007-02-13 | 2012-10-02 | Nec Corporation | Cell feature amount calculating apparatus and cell feature amount calculating method |
US20090087058A1 (en) * | 2007-09-28 | 2009-04-02 | Satoshi Ihara | Image analysis apparatus, image processing apparatus, image analysis program storage medium, image processing program storage medium, image analysis method and image processing method |
US8265368B2 (en) * | 2007-09-28 | 2012-09-11 | Fujifilm Corporation | Image analysis apparatus, image processing apparatus, image analysis program storage medium, image processing program storage medium, image analysis method and image processing method |
US8815537B2 (en) | 2008-04-25 | 2014-08-26 | Roche Diagnostics Hematology, Inc. | Method for determining a complete blood count on a white blood cell differential count |
US20090269799A1 (en) * | 2008-04-25 | 2009-10-29 | Constitutional Medical Investors, Inc. | Method of determining a complete blood count and a white blood cell differential count |
US9083857B2 (en) | 2008-04-25 | 2015-07-14 | Roche Diagnostics Hematology, Inc. | Systems and methods for analyzing body fluids |
US9602777B2 (en) | 2008-04-25 | 2017-03-21 | Roche Diagnostics Hematology, Inc. | Systems and methods for analyzing body fluids |
US20100284602A1 (en) * | 2008-04-25 | 2010-11-11 | Constitution Medical Investors, Inc. | Method for determining a complete blood count on a white blood cell differential count |
US9217695B2 (en) | 2008-04-25 | 2015-12-22 | Roche Diagnostics Hematology, Inc. | Method for determining a complete blood count on a white blood cell differential count |
US10764538B2 (en) | 2008-04-25 | 2020-09-01 | Roche Diagnostics Hematology, Inc. | Systems and methods for analyzing body fluids |
US20110014645A1 (en) * | 2008-04-25 | 2011-01-20 | Constitution Medical Investors, Inc. | Method for determining a complete blood count on a white blood cell differential count |
US9017610B2 (en) | 2008-04-25 | 2015-04-28 | Roche Diagnostics Hematology, Inc. | Method of determining a complete blood count and a white blood cell differential count |
US10094764B2 (en) | 2008-04-25 | 2018-10-09 | Roche Diagnostics Hematology, Inc. | Systems and methods for determining a complete blood count and a white blood cell differential count |
US20100278389A1 (en) * | 2009-04-30 | 2010-11-04 | Industrial Technology Research Institute | Method for image recombination of a plurality of images and image identification and system for image acquiring and identification |
US20150086135A1 (en) * | 2009-04-30 | 2015-03-26 | Industrial Technology Research Institute | Method for Image Recombination of a Plurality of Images and Image Identification and System for Image Acquiring and Identification |
US9286533B2 (en) * | 2009-04-30 | 2016-03-15 | Industrial Technology Research Institute | Method for image recombination of a plurality of images and image identification and system for image acquiring and identification |
US8630509B2 (en) * | 2009-11-03 | 2014-01-14 | Samsung Electronics Co., Ltd. | Structured grids for label propagation on a finite number of layers |
US9047674B2 (en) * | 2009-11-03 | 2015-06-02 | Samsung Electronics Co., Ltd. | Structured grids and graph traversal for image processing |
US20110103712A1 (en) * | 2009-11-03 | 2011-05-05 | Samsung Electronics Co., Ltd. | Structured grids and graph traversal for image processing |
US20110103711A1 (en) * | 2009-11-03 | 2011-05-05 | Samsung Electronics Co., Ltd. | Structured grids for label propagation on a finite number of layers |
US20130156287A1 (en) * | 2010-08-30 | 2013-06-20 | Sanyo Electric Co., Ltd | Observation device, observation program, and observation system |
US9578220B2 (en) | 2010-08-30 | 2017-02-21 | Panasonic Healthcare Holdings Co., Ltd. | Observation device, observation program, and observation system |
US9060684B2 (en) * | 2010-08-30 | 2015-06-23 | Panasonic Healthcare Holdings Co., Ltd. | Observation device, observation program, and observation system |
US9280699B2 (en) | 2011-01-18 | 2016-03-08 | Roche Diagnostics Hematology, Inc. | Microscope slide coordinate system registration |
US9111343B2 (en) | 2011-01-18 | 2015-08-18 | Roche Diagnostics Hematology, Inc. | Microscope slide coordinate system registration |
US10068126B2 (en) | 2011-01-18 | 2018-09-04 | Roche Diagnostics Hematology, Inc. | Microscope slide coordinate system registration |
WO2012142496A1 (en) | 2011-04-15 | 2012-10-18 | Constitution Medical, Inc. | Measuring volume and constituents of cells |
EP3904859A1 (en) | 2011-04-15 | 2021-11-03 | Roche Diagnostics Hematology, Inc. | System and method for determining a platelet volume for a blood sample |
US9123120B2 (en) * | 2012-08-24 | 2015-09-01 | DR Vision Technologies LLC | Progressive decision for cellular process selection |
US20140056504A1 (en) * | 2012-08-24 | 2014-02-27 | Drvision Technologies Llc | Progressive decision for cellular process selection |
US10402623B2 (en) | 2017-11-30 | 2019-09-03 | Metal Industries Research & Development Centre | Large scale cell image analysis method and system |
Also Published As
Publication number | Publication date |
---|---|
CA2591241A1 (en) | 2006-07-20 |
US20100150443A1 (en) | 2010-06-17 |
AU2006205061B2 (en) | 2010-08-12 |
US20070036436A1 (en) | 2007-02-15 |
CN101103373A (en) | 2008-01-09 |
CN101103373B (en) | 2010-06-02 |
TWI307058B (en) | 2009-03-01 |
WO2006076312A2 (en) | 2006-07-20 |
ATE477556T1 (en) | 2010-08-15 |
AU2006205061A1 (en) | 2006-07-20 |
CA2591241C (en) | 2014-07-22 |
DE602006016066D1 (en) | 2010-09-23 |
WO2006076312A3 (en) | 2006-09-21 |
EP1836680B1 (en) | 2010-08-11 |
JP2008527546A (en) | 2008-07-24 |
BRPI0606705A2 (en) | 2009-07-07 |
US7881532B2 (en) | 2011-02-01 |
TW200703148A (en) | 2007-01-16 |
DK1836680T3 (en) | 2010-12-06 |
KR20070097053A (en) | 2007-10-02 |
MX2007008363A (en) | 2007-09-06 |
PT1836680E (en) | 2010-11-16 |
JP4825222B2 (en) | 2011-11-30 |
BRPI0606705B1 (en) | 2018-06-19 |
EP1836680A2 (en) | 2007-09-26 |
KR101255865B1 (en) | 2013-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7689038B2 (en) | Method for improved image segmentation | |
CN113454733A (en) | Multi-instance learner for prognostic tissue pattern recognition | |
CN111931811B (en) | Calculation method based on super-pixel image similarity | |
Ghosh et al. | Unsupervised grow-cut: cellular automata-based medical image segmentation | |
US11699224B2 (en) | Neural network training device, system and method | |
CN112150477B (en) | Full-automatic segmentation method and device for cerebral image artery | |
CN114998220B (en) | Tongue image detection and positioning method based on improved Tiny-YOLO v4 natural environment | |
Lv et al. | Nuclei R-CNN: improve mask R-CNN for nuclei segmentation | |
US20210150688A1 (en) | Neural network training device, system and method | |
Xiaojuan et al. | Top-Down Approach to the Automatic Extraction of Individual Trees from Scanned Scene Point Cloud Data. | |
Sarathi et al. | Automated Brain Tumor segmentation using novel feature point detector and seeded region growing | |
Mayerich et al. | Hardware accelerated segmentation of complex volumetric filament networks | |
Mousavi et al. | Machine-assisted annotation of forensic imagery | |
ES2351052T3 (en) | METHOD FOR IMPROVED SEGMENTATION OF IMAGES. | |
Mantau et al. | Detecting ellipses in embryo images using arc detection method with particle swarm for Blastomere-quality measurement system | |
Sui et al. | Point supervised extended scenario nuclear analysis framework based on LSTM-CFCN | |
Ram et al. | Segmentation and classification of 3-D spots in FISH images | |
CN106203384A (en) | A kind of cell division identification method of multiresolution | |
CN114820454A (en) | Feature extraction and matching method applied to MRA image aneurysm | |
Qian et al. | Coarse-to-fine particle segmentation in microscopic urinary images | |
CN118552570A (en) | X-ray ore image segmentation method, device, equipment and medium | |
Dunn | An Introduction to Image Analysis of Hemocyte Serial Sections | |
Waller et al. | Texture segmentation by evidence gathering | |
Li | Locality sensitive modelling approach for object detection, tracking and segmentation in biomedical images | |
Tajoddin | Semi-Automatic Segmentation for Serial Section Electron Microscopy Images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CYTYC CORPORATION,MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZAHNISER, MICHAEL;DIAGNOSTIC VISION CORPORATION;REEL/FRAME:017438/0804 Effective date: 20060407 Owner name: CYTYC CORPORATION, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZAHNISER, MICHAEL;DIAGNOSTIC VISION CORPORATION;REEL/FRAME:017438/0804 Effective date: 20060407 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS CREDIT PARTNERS L.P., CALIFORNIA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:CYTYC CORPORATION;REEL/FRAME:020018/0529 Effective date: 20071022 Owner name: GOLDMAN SACHS CREDIT PARTNERS L.P.,CALIFORNIA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:CYTYC CORPORATION;REEL/FRAME:020018/0529 Effective date: 20071022 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS CREDIT PARTNERS L.P., AS COLLATERAL Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:CYTYC CORPORATION;REEL/FRAME:021301/0879 Effective date: 20080717 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: CYTYC PRENATAL PRODUCTS CORP., MASSACHUSETTS Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: CYTYC SURGICAL PRODUCTS III, INC., MASSACHUSETTS Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: BIOLUCENT, LLC, CALIFORNIA Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: SUROS SURGICAL SYSTEMS, INC., INDIANA Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: DIRECT RADIOGRAPHY CORP., DELAWARE Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: CYTYC SURGICAL PRODUCTS LIMITED PARTNERSHIP, MASSA Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: CYTYC SURGICAL PRODUCTS II LIMITED PARTNERSHIP, MA Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: R2 TECHNOLOGY, INC., CALIFORNIA Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: CYTYC CORPORATION, MASSACHUSETTS Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: HOLOGIC, INC., MASSACHUSETTS Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 Owner name: THIRD WAVE TECHNOLOGIES, INC., WISCONSIN Free format text: TERMINATION OF PATENT SECURITY AGREEMENTS AND RELEASE OF SECURITY INTERESTS;ASSIGNOR:GOLDMAN SACHS CREDIT PARTNERS, L.P., AS COLLATERAL AGENT;REEL/FRAME:024892/0001 Effective date: 20100819 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS BANK USA, NEW YORK Free format text: SECURITY AGREEMENT;ASSIGNORS:HOLOGIC, INC.;BIOLUCENT, LLC;CYTYC CORPORATION;AND OTHERS;REEL/FRAME:028810/0745 Effective date: 20120801 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP, MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: SUROS SURGICAL SYSTEMS, INC., MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: HOLOGIC, INC., MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: GEN-PROBE INCORPORATED, MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: CYTYC CORPORATION, MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: THIRD WAVE TECHNOLOGIES, INC., MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP, MASS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 Owner name: BIOLUCENT, LLC, MASSACHUSETTS Free format text: SECURITY INTEREST RELEASE REEL/FRAME 028810/0745;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:035820/0239 Effective date: 20150529 |
|
AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNORS:HOLOGIC, INC.;BIOLUCENT, LLC;CYTYC CORPORATION;AND OTHERS;REEL/FRAME:036307/0199 Effective date: 20150529 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: SECURITY AGREEMENT;ASSIGNORS:HOLOGIC, INC.;BIOLUCENT, LLC;CYTYC CORPORATION;AND OTHERS;REEL/FRAME:036307/0199 Effective date: 20150529 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552) Year of fee payment: 8 |
|
AS | Assignment |
Owner name: CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP, MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: GOLDMAN SACHS BANK USA, NEW YORK Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 028810 FRAME: 0745. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY AGREEMENT;ASSIGNORS:HOLOGIC, INC.;BIOLUCENT, LLC;CYTYC CORPORATION;AND OTHERS;REEL/FRAME:044432/0565 Effective date: 20120801 Owner name: SUROS SURGICAL SYSTEMS, INC., MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP, MASS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: BIOLUCENT, LLC, MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: GEN-PROBE INCORPORATED, MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: HOLOGIC, INC., MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: CYTYC CORPORATION, MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 Owner name: THIRD WAVE TECHNOLOGIES, INC., MASSACHUSETTS Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE INCORRECT PATENT NO. 8081301 PREVIOUSLY RECORDED AT REEL: 035820 FRAME: 0239. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST RELEASE;ASSIGNOR:GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT;REEL/FRAME:044727/0529 Effective date: 20150529 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |