WO1997043732A1 - Method and apparatus for automatically detecting malignancy-associated changes - Google Patents
- Publication number
- WO1997043732A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pixels
- intensity
- pixel
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/48—Biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
- G01N33/5005—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells
- G01N33/5008—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics
- G01N33/5014—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing toxicity
- G01N33/5017—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing toxicity for testing neoplastic activity
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1429—Signal processing
- G01N15/1433—Signal processing using image recognition
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/48—Biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
- G01N33/5005—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells
- G01N33/5091—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing the pathological state of an organism
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
Definitions
- the present invention relates to image cytometry systems in general, and in particular to automated systems for detecting malignancy-associated changes in cell nuclei.
- the most common method of diagnosing cancer in patients is by obtaining a sample of the suspect tissue and examining it under a microscope for the presence of obviously malignant cells. While this process is relatively easy when the location of the suspect tissue is known, it is not so easy when there is no readily identifiable tumor or pre-cancerous lesion. For example, to detect the presence of lung cancer from a sputum sample requires one or more relatively rare cancer cells to be present in the sample. Therefore, patients having lung cancer may not be diagnosed properly if the sample does not accurately reflect the conditions of the lung.
- malignancy-associated changes (MACs) are subtle changes that are known to take place in the nuclei of apparently normal cells found near cancer tissue.
- MACs have been detected in tissue found near pre-cancerous lesions. Because the cells exhibiting MACs are more numerous than the malignant cells, MACs offer an additional way of diagnosing the presence of cancer, especially in cases where no cancerous cells can be located.
- MACs have not yet achieved wide acceptance as a screening tool to determine whether a patient has or will develop cancer.
- MACs have been detected by carefully selecting a cell sample from a location near a tumor or pre-cancerous lesion and viewing the cells under relatively high magnification.
- the malignancy-associated changes that take place in the cells are too subtle to be reliably detected by a human pathologist working with conventional microscopic equipment, especially when the pathologist does not know beforehand if the patient has cancer or not.
- a malignancy-associated change may be indicated by the distribution of DNA within the nucleus coupled with slight variations in the shape of the nucleus edge.
- nuclei from normal cells may exhibit similar types of changes but not to the degree that would signify a MAC. Because human operators cannot easily quantify such subtle cell changes, it is difficult to determine which cells exhibit MACs. Furthermore, the changes which indicate a MAC may vary between different types of cancer, thereby increasing the difficulty of detecting them.
- the present invention is a system for automatically detecting malignancy- associated changes in cell samples.
- the system includes a digital microscope having a CCD camera that is controlled by and interfaced with a computer system. Images captured by the digital microscope are stored in an image processing board and manipulated by the computer system to detect the presence of malignancy-associated changes (MACs).
- a cell sample is obtained and stained to identify the nuclear material of the cells and is imaged by the microscope.
- the stain is stoichiometric and specific to DNA only.
- the computer system analyzes the image to compute a histogram of all pixels comprising the image.
- an intensity threshold is set that divides the background pixels from those comprising the objects in the image. All pixels having an intensity value less than the threshold are identified as possible objects of interest while those having an intensity value greater than the threshold are identified as background and are ignored.
- the computer system calculates the area, shape and optical density of the object. Those objects that could not possibly be cell nuclei are ignored.
- the image is decalibrated, i.e., corrected by subtracting an empty frame captured before the scanning of the slide from the current frame and adding back an offset value equal to the average background light level. This process corrects for any shading of the system, uneven illumination, and other imperfections of the image acquisition system.
- the images of all remaining objects must be captured in a more precise focus. This is achieved by moving the microscope in the stage z-direction in multiple focal planes around the approximate frame focus.
- a contrast function (a texture feature) is calculated.
- the contrast function has a peak value at the exact focus of the object. Only the image at the highest contrast value is retained in the computer memory, and any object that did not reach such a peak value is discarded from further consideration.
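The patent does not give the contrast function's formula at this point; a common texture measure used for autofocus, offered here only as an assumption, is the summed squared difference between horizontally adjacent pixels, which peaks when edges in the image are sharpest:

```python
def contrast(image):
    """Assumed focus measure: higher values indicate a sharper image.

    image is a list of rows of 0-255 intensities.
    """
    return sum(
        (row[i + 1] - row[i]) ** 2
        for row in image
        for i in range(len(row) - 1)
    )
```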
- Each remaining in-focus object on the image is further compensated for local absorbency of the materials surrounding the object.
- This is a local decalibration similar to the frame decalibration described above, except that only a small subset of pixels, having an area equal to that of a square into which the object will fit, is corrected using an equivalent square of the empty frame.
- the edge of the object is calculated, i.e., the boundary which determines which pixels in the square belong to the object and which belong to the background.
- the edge determination is achieved by the edge-relocation algorithm.
- the edge of the original mask of the first contoured frame of each surviving object is dilated by several pixels inward and outward.
- a gradient value is calculated for each pixel in the rim, i.e., the sum of the differences in intensity between the pixel in question and all of its touching neighbor pixels. Then the lowest gradient value pixel is removed from the rim, subject to the condition that the rim is not ruptured. The process continues until a single-pixel rim remains.
- this edge may be again dilated as before, and the process repeated until such time as the new edge is identical to the previous edge. In this way the edge is calculated along the highest local gradient.
- the computer system then calculates a set of feature values for each object. For some feature calculations the edge along the highest gradient value is corrected by either dilating the edge by one or more pixels or eroding the edge by one or more pixels. This is done such that each feature achieves a greater discriminating power between classes of objects and is thus object specific.
- These feature values are then analyzed by a classifier that uses the feature values to determine whether the object is an artifact or is a cell nucleus.
- the feature values are further analyzed by the classifier to determine whether the nucleus exhibits malignancy-associated changes. Based on the number of objects found in the sample that appear to have malignancy-associated changes and/or an overall malignancy-associated score, a determination can be made whether the patient from whom the cell sample was obtained is healthy or harbors a malignant growth.
- FIGURE 1 is a block diagram of the MAC detection system according to the present invention;
- FIGURES 2A-2C are a series of flow charts showing the steps performed by the present invention to detect MACs;
- FIGURE 3 is an illustrative example of a histogram used to separate objects of interest from the background of a slide;
- FIGURE 4 is a flow chart of the preferred staining procedure used to prepare a cell sample for the detection of MACs;
- FIGURES 5 and 6 are illustrations of objects located in an image;
- FIGURES 7A-7F illustrate how the present invention operates to locate the edge of an object;
- FIGURES 8 and 9 are diagrammatic illustrations of a classifier that separates artifacts from cell nuclei and MAC nuclei from non-MAC nuclei;
- FIGURE 10 is a flow chart of the steps performed by the present invention to determine whether a patient is normal or abnormal based on the presence of MACs.
- the present invention is a system for automatically detecting malignancy-associated changes (MACs) in the nuclei of cells obtained from a patient. From the presence or absence of MACs, a determination can be made whether the patient has a malignant cancer.
- a block diagram of the MAC detection system according to the present invention is shown in FIGURE 1.
- the system 10 includes a digital microscope 12 that is controlled by and interfaced with a computer system 30.
- the microscope 12 preferably has a digital CCD camera 14 employing a scientific CCD having square pixels of approximately 0.3 μm by 0.3 μm in size.
- the scientific CCD has a 100% fill factor and at least a 256 gray level resolution.
- the CCD camera is preferably mounted in the primary image plane of a planar objective lens 22 of the microscope 12.
- a cell sample is placed on a motorized stage 20 of the microscope whose position is controlled by the computer system 30.
- the motorized stage preferably has an automatic slide loader so that the process of analyzing slides can be completely automated.
- the lens 22 placed between the sample 16 and the CCD camera 14 is preferably a 20x/0.75 objective that provides a depth of field in the range of 1-2 μm, yielding a distortion-free image.
- the digital CCD camera 14 used is the Microimager™ produced by Xillix Technologies Corp. of Richmond, B.C., Canada.
- the images produced by the CCD camera are received by an image processing board 32 that serves as the interface between the digital camera 14 and the computer system 30.
- the digital images are stored in the image processing board and manipulated to facilitate the detection of MACs.
- the image processing board creates a set of analog video signals from the digital image and feeds the video signals to an image monitor 36 in order to display an image of the objects viewed by the microscope.
- the computer system 30 also includes one or more input devices 38, such as a keyboard and mouse, as well as one or more peripherals 42, such as a mass digital storage device, a modem or a network card for communicating with a remotely located computer, and a monitor 40.
- FIGURES 2A-2C show the steps performed by the system of the present invention to determine whether a sample exhibits MACs or not.
- a cell sample is obtained.
- Cells may be obtained by any number of conventional methods such as biopsy, scraping, etc.
- the cells are affixed to a slide and stained using a modified Feulgen procedure at a step 52 that identifies the nuclear DNA in the sample. The details of the staining procedure are shown in FIGURE 4 and described in detail below.
- an image of a frame from the slide is captured by the CCD camera and is transferred into the image processor.
- the CCD sensor within the camera is cleared and a shutter of the camera is opened for a fixed period that is dependent on the intensity of the light source 18.
- the stage then moves to a new position on the slide such that another image of the new frame can be captured by the camera and transferred into the computer memory. Because the cell sample on the slide occupies a much greater area than the area viewed by the microscope, a number of slide images are used to determine whether the sample is MAC-positive or negative. The position of each captured image on the slide is recorded in the computer system so that the objects of interest in the image can be found on the slide if desired.
- the computer system determines whether the image produced by the CCD camera is devoid of objects. This is performed by scanning the digital image for dark pixels. If the number of dark pixels, i.e., those pixels having an intensity less than the background intensity minus a predetermined offset value, is fewer than a predetermined minimum, the computer system assumes that the image is blank; the microscope stage is moved to a new position at step 60 and a new image is captured at step 54.
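The blank-frame test just described can be sketched as follows; the function and parameter names are illustrative rather than taken from the patent, and the offset and minimum-count values are placeholder assumptions:

```python
def is_blank_frame(pixels, background_intensity, offset=30, min_dark_pixels=50):
    """Return True when too few pixels are darker than the background
    intensity minus a predetermined offset (placeholder values).

    pixels is a list of rows of 0-255 intensities.
    """
    threshold = background_intensity - offset
    dark = sum(1 for row in pixels for p in row if p < threshold)
    return dark < min_dark_pixels
```

When a frame is judged blank, the stage would move on and a new frame would be captured, as in steps 60 and 54.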
- the computer system attempts to globally focus the image.
- the objects of interest in the image have a maximum darkness. Therefore, for focus determination the height of the stage is adjusted and a new image is captured. The darkness of the object pixels is determined and the process repeats until the average darkness of the pixels in the image is a maximum.
- the computer system assumes that global focus has been obtained.
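The global-focus search of the preceding steps can be sketched as follows, assuming a hypothetical `capture_image(z)` stand-in for the stage and camera interface:

```python
def find_global_focus(z_positions, capture_image):
    """Return the stage height whose frame has the greatest average darkness
    (255 minus intensity), per the focus criterion described above."""
    best_z, best_darkness = None, -1.0
    for z in z_positions:
        pixels = capture_image(z)  # flat list of 0-255 intensities
        darkness = sum(255 - p for p in pixels) / len(pixels)
        if darkness > best_darkness:
            best_z, best_darkness = z, darkness
    return best_z
```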
- the computer system computes a histogram of all pixels. As shown in FIGURE 3, a histogram is a plot of the number of pixels at each intensity level.
- each pixel can have an intensity ranging from 0 (maximum darkness) to 255 (maximum brightness).
- the histogram typically contains a first peak 90 that represents the average intensity of the background pixels.
- a second, smaller peak 92 represents the average intensity of the pixels that comprise the objects.
- the computer system computes the threshold that separates objects in the image from the background at step 68.
- at a step 72, all pixels in the cell image having an intensity less than the threshold value are identified.
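The thresholding of steps 66-72 can be sketched as follows; the exact rule for placing the threshold between the two peaks is not stated here, so this sketch assumes the least-populated bin between the darker object peak and the brighter background peak of the bimodal histogram of FIGURE 3:

```python
def histogram(pixels):
    """Count of pixels at each of the 256 gray levels."""
    h = [0] * 256
    for p in pixels:
        h[p] += 1
    return h

def threshold_from_histogram(h):
    """Assumed rule: the valley (least-populated bin) between the darker
    object peak and the brighter background peak."""
    # assumes the background peak is the larger and brighter of the two
    background_peak = max(range(256), key=lambda i: h[i])
    object_peak = max(range(background_peak), key=lambda i: h[i])
    return min(range(object_peak, background_peak + 1), key=lambda i: h[i])

def object_mask(pixels, threshold):
    """Pixels darker than the threshold are possible objects of interest."""
    return [p < threshold for p in pixels]
```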
- the results of step 72 are shown in FIGURE 5.
- the frame image 200 contains numerous objects of interest 202, 204, 206 . . . 226. Some of these objects are cell nuclei, which will be analyzed for the presence of MACs, while other objects are artifacts such as debris, dirt particles, white blood cells, etc., and should be removed from the cell image.
- the computer system calculates the area, shape (sphericity) and optical density of each object according to formulas that are described in further detail below.
- the computer system removes from memory any objects that cannot be cell nuclei. In the present embodiment of the invention, those objects that cannot possibly be cell nuclei are identified as having an area greater than 2,000 μm², an optical density less than 1c (i.e., less than 1/2 of the overall chromosome count of a normal individual), or a shape or sphericity greater than 4.
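A minimal sketch of this coarse filter, using the limits quoted above (the function name and argument units are illustrative, not from the patent):

```python
def keep_as_possible_nucleus(area_um2, optical_density_c, sphericity):
    """Keep only objects that could plausibly be cell nuclei:
    area <= 2,000 um^2, optical density >= 1c, shape factor <= 4."""
    return area_um2 <= 2000 and optical_density_c >= 1 and sphericity <= 4
```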
- the results of step 76 are shown in FIGURE 6, where only a few of the previously identified objects of interest remain. Each of the remaining objects is more likely to be a cell nucleus that is to be examined for a malignancy-associated change.
- the computer system determines whether there are any objects remaining by scanning for dark pixels at step 78. If no objects remain, the computer system returns to step 54, a new image on the slide is captured and steps 54-76 are repeated.
- the computer system then compensates the image for variations in illumination intensity at step 80. To do this, the computer system recalls a calibration image that was obtained by scanning in a blank slide for the same exposure time that was used for the image of the cells under consideration. The computer system then begins a pixel-by-pixel subtraction of the intensity values of the pixels in the calibration image obtained from the blank slide from the corresponding pixels found in the image obtained from the cell sample. The computer system then adds a value equal to the average illumination of the pixels in the calibration image obtained from the blank slide to each pixel of the cell image. The result of the addition illuminates the cell image with a uniform intensity.
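The illumination compensation of step 80 amounts to a pixel-wise subtraction plus a constant offset; a sketch over flat pixel lists (a simplification of the two-dimensional frame):

```python
def decalibrate(cell_image, calibration_image):
    """Subtract the blank-slide calibration frame pixel by pixel, then add
    back the calibration frame's average illumination so the cell image is
    uniformly lit."""
    mean_illumination = sum(calibration_image) / len(calibration_image)
    return [c - b + mean_illumination
            for c, b in zip(cell_image, calibration_image)]
```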
- the computer system attempts to refine the focus of each object of interest in the image at step 82 (FIGURE 2C).
- the optimum focus is obtained when the object has a minimum size and maximum darkness.
- the computer system therefore causes the stage to move a predefined amount above the global focus position and then moves in a sequence of descending positions.
- the CCD camera captures an image of the frame and calculates the area and the intensity of the pixels comprising the remaining objects. Only one image of each object is eventually stored in the computer memory, coming from the position in which the pixels comprising the object have the maximum darkness and occupy a minimum area. If the optimum focus is not obtained after a predetermined number of stage positions, then the object is removed from the computer memory and is ignored. Once the optimum focus of the object is determined, the image received from the CCD camera overwrites those pixels that comprise the object under consideration in the computer's memory. The result of the local focusing produces a pseudo-focused image in the computer's memory whereby each object of interest is ultimately recorded at its best possible focus.
- the computer system determines whether any in-focus objects in the cell image were found. If not, the computer system returns to step 54 shown in FIGURE 2A, whereby the slide is moved to another position and a new image is captured.
- the computer system then compensates for local absorbency of light near the object at a step 85. To do this, the computer system analyzes a number of pixels within a box having an area that is larger than the object by two pixels on all sides.
- An example of such a box is the box 207 shown in FIGURE 6.
- the computer system then performs a pixel-by-pixel subtraction of the intensity values from a corresponding square in the calibration image obtained from the blank slide.
- the average illumination intensity of the calibration image is added to each pixel in the box surrounding the object.
- the average intensity value for those pixels that are in the box but are not part of the object is determined, and this local average value is then subtracted from each pixel in the box that encloses the object.
- the computer system determines a more precise edge of each remaining object in the cell image at step 86. The steps required to compute the edge are discussed in further detail below.
- the computer system calculates a set of features for each remaining object at a step 87. These feature values are used to further separate artifacts from cell nuclei as well as to identify nuclei exhibiting MACs. The details of the feature calculation are described below.
- the computer system runs a classifier that compares the feature values calculated for each object and determines whether the object is an artifact and, if not, whether the object is a nucleus that exhibits MACs.
- the pseudo-focus digital image, the feature calculations and the results of the classifier for each in-focus object are stored in the computer's memory.
- the computer system determines whether further scans of the slide are required. As indicated above, because the size of each cell image is much less than the size of the entire slide, a number of cell images are captured to ensure that the slide has been adequately analyzed. Once a sufficient number of cell images have been analyzed, processing stops at step 94. Alternatively, if further scans are required, the computer system loops back to step 54 and a new image of the cell sample is captured. As indicated above, before the sample can be imaged by the digital microscope, the sample is stained to identify the nuclear material.
- FIGURE 4 is a flow chart of the steps used to stain the cell samples.
- the cell sample is placed on a slide, air dried and then soaked in a 50% glycerol solution for four minutes. The cell is then washed in distilled water for two minutes at a step 102.
- the sample is bathed in a 50% ethanol solution for two minutes and again washed with distilled water for two minutes at a step 106.
- the sample is then soaked in a Bohm-Springer solution for 30 minutes at a step 108 followed by washing with distilled water for one minute at a step 110.
- the sample is soaked in a 5N HCl solution for 45 minutes and rinsed with distilled water for one minute at a step 114.
- the sample is then stained in a thionine stain for 60 minutes at a step 116 and rinsed with distilled water for one minute at a step 118.
- the sample is soaked in a bisulfite solution for six minutes followed by a rinse for one minute with distilled water at a step 122.
- the sample is dehydrated in solutions of 50%, 75% and 100% ethanol for approximately 10 seconds each at a step 124.
- the sample is then soaked in a final bath of xylene for one minute at a step 126 before a cover slip is applied at a step 128.
- the cell sample After the cell sample has been prepared, it is ready to be imaged by the digital microscope and analyzed as described above.
- FIGURES 7A-7F illustrate the manner in which the present invention calculates the precise edge of an object.
- an object 230 is comprised of those pixels having an intensity value less than the background/object threshold which is calculated from the histogram and described above.
- the pixels lying at the original edge of the object are dilated to form a new edge region 242.
- a second band of pixels lying inside the original edge is also selected to form a second edge region 244.
- the computer system assumes that the true edge is somewhere within the annular ring bounded by the edge regions 242 and 244.
- the annular ring has a width of approximately ten pixels.
- the computer calculates a gradient for each pixel contained in the annular ring.
- the gradient for each pixel is defined as the sum of the differences in intensity between each pixel and its surrounding eight neighbors. Those pixels having neighbors with similar intensity levels will have a low gradient while those pixels at the edge of the object will have a high gradient.
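The gradient defined above can be computed directly; absolute differences are assumed here, since the text says only "differences":

```python
def pixel_gradient(image, r, c):
    """Sum of absolute intensity differences between pixel (r, c) and its
    eight surrounding neighbors (interior pixels only)."""
    g = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            g += abs(image[r][c] - image[r + dr][c + dc])
    return g
```

A pixel in a flat region scores zero, while a pixel on an intensity edge scores high, matching the behavior described above.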
- the computer system divides the range of gradients into multiple thresholds and begins removing pixels having lower gradient values from the ring.
- the computer scans the object under consideration in a raster fashion. As shown in FIGURE 7C, the raster scan begins at a point A and continues to the right until reaching a point B. During the first scan, only pixels on the outside edge, i.e., pixels on the edge region 242, are removed.
- the computer system then scans in the opposite direction by starting, for example, at point D and continuing upwards to point B returning in a raster fashion while only removing pixels on the inside edge region 244 of the annular ring.
- the computer system then scans in another orthogonal direction—for example, starting at point C and continuing in the direction of point D in a raster fashion, this time only removing pixels on the outside edge region 242. This process continues until no more pixels at that gradient threshold value can be removed.
- Pixels are removed from the annular ring subject to the conditions that no pixel can be removed that would break the chain of pixels around the annular ring. Furthermore, adjacent pixels cannot be removed during the same pass of pixel removal. Once all the pixels are removed having a gradient that is less than or equal to the first gradient threshold, the threshold is increased and the process starts over. As shown in FIGURE 7D, the pixel-by-pixel removal process continues until a single chain of pixels 240' encircles the object in question.
- the intensity of each pixel that comprises the newly found edge is compared with its eight neighbors. As shown in FIGURE 7E, for example, the intensity of a pixel 246 is compared with its eight surrounding pixels. If the intensity of pixel 246 is less than the intensity of pixel 250, then the pixel 246 is removed from the pixel chain as it belongs to the background. To complete the chain, pixels 248 and 252 are added so that the edge is not broken as shown in FIGURE 7F. After completing the edge relocation algorithm and determining whether each pixel should be included in the object of interest, the system is ready to compute the feature values for the object.
- the computer system must make a determination whether the object is a cell nucleus that should be analyzed for malignancy-associated changes or is an artifact that should be ignored. As discussed above, the system removes obvious artifacts based on their area, shape (sphericity) and optical density. However, other artifacts may be more difficult for the computer to recognize. To further remove artifacts, the computer system uses a classifier that interprets the values of the features calculated for the object.
- a classifier 290 is a computer program that analyzes an object based on its feature values.
- the first database 275 contains feature values of objects that have been imaged by the system shown in FIGURE 1 and that have been previously identified by an expert pathologist as non-nuclei, i.e., artifacts.
- a second database 285 contains the features calculated for objects that have been imaged by the system and that have been previously identified by an expert as cell nuclei.
- the data in each of these databases is fed into a statistical computer program which uses a stepwise linear discriminant function analysis to derive a discriminant function that can distinguish cell nuclei from artifacts.
- the classifier is then constructed as a binary decision tree based on thresholds and/or the linear discriminant functions. The binary tree answers a series of questions based on the feature values to determine the identity of an object.
- the particular thresholds used in the binary tree are set by statisticians who compare histograms of feature values calculated on known objects. For example, white blood cells typically have an area less than 50 µm². Because the present invention treats white blood cells as artifacts, the binary decision tree can contain a node that compares the area of an object to the 50 µm² threshold. Objects with an area less than the threshold are ignored, while those with a greater area are further analyzed to determine if they are possible MAC cells or artifacts.
- the discriminant functions that separate types of objects are generated by the BMDP program available from BMDP Statistical Software, Inc., of Los Angeles, California. Given the discriminant functions and the appropriate thresholds, the construction of the binary tree classifier is considered routine for one of ordinary skill in the art.
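A sketch of how such a tree of thresholds and discriminant functions might be applied to one object; the 50 µm² area node follows the example above, while the discriminant coefficients and the `classify_object` helper are hypothetical, not the BMDP-derived functions:

```python
def classify_object(features):
    # Toy two-node binary decision tree: an area screen followed by a
    # linear discriminant score. The 50 um^2 area threshold follows the
    # text; the discriminant coefficients below are invented for
    # illustration only.
    if features["area"] < 50.0:
        return "artifact"
    score = 0.8 * features["sphericity"] - 0.02 * features["area"] + 0.5
    return "nucleus" if score > 0.0 else "artifact"
```

Each internal node asks one question of the feature vector; a leaf gives the identity of the object, mirroring the question-and-answer structure described above.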
- Once the binary tree classifier has been developed, it can be supplied with a set of feature values 292 taken from an unknown object and will provide an indication 294 of whether the object associated with the feature data is most likely an artifact or a cell nucleus.
- FIGURE 9 shows how a classifier is used to determine whether a slide exhibits malignancy-associated changes or not.
- the classifier 300 is constructed using a pair of databases.
- a first database 302 contains feature values obtained from apparently normal cells that have been imaged by the digital microscope system shown in FIGURE 1 and are known to have come from healthy patients.
- a second database 304 contains feature values calculated from apparently normal cells that were imaged by the digital microscope system described above and were known to have come from abnormal (i.e., cancer) patients.
- the classifier 300 used in the presently preferred embodiment of the invention is a binary decision tree made up of discriminant functions and/or thresholds that can separate the two groups of cells.
- FIGURE 10 is a flow chart of the steps performed by the present invention to determine whether a patient potentially has cancer.
- the computer system recalls the features calculated for each in-focus nucleus on the slide.
- the computer system runs the classifier that identifies MACs based on these features.
- the computer system provides an indication of whether the nucleus in question is MAC-positive or not. If the answer to step 332 is yes, then an accumulator that totals the number of MAC-positive nuclei for the slide is increased at a step 334.
- the computer system determines whether all the nuclei for which features have been calculated have been analyzed.
- the computer system determines whether the frequency of MAC-positive cells on the slide exceeds a predetermined threshold. For example, in a particular preparation of cells (air dried, as is the practice in British Columbia, Canada) to detect cervical cancer, it has been determined that if the total number of MAC-positive epithelial cells divided by the total number of epithelial cells analyzed exceeds 0.45 per slide, then there is an 85% chance that the patient has or will develop cancer. Depending on whether the frequency of cells exhibiting MACs exceeds the threshold, the computer system indicates either that the patient is healthy at a step 342 or that the patient likely has or will develop cancer at a step 344.
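The slide-level decision can be sketched as a simple ratio test; `slide_mac_score` is a hypothetical helper, and the 0.45 default reflects the cervical preparation example in the text:

```python
def slide_mac_score(mac_positive, total_epithelial, threshold=0.45):
    # Ratio of MAC-positive epithelial cells to all epithelial cells
    # analyzed on the slide, compared against the preparation-specific
    # threshold (0.45 in the cervical example from the text).
    ratio = mac_positive / total_epithelial
    return ratio, ratio > threshold
```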
- the threshold above which it is likely that a patient exhibiting MACs has or will develop cancer is determined by comparing the MAC scores of a large number of patients who did develop cancer and those who did not.
- the particular threshold used will depend on the type of cancer to be detected, the equipment used to image the cells, etc.
- the MAC detection system of the present invention can also be used to determine the efficacy of cancer treatment. For example, patients who have had a portion of a lung removed as a treatment for lung cancer can be asked to provide a sample of apparently normal cells taken from the remaining lung tissue. If a strong MAC presence is detected, there is a high probability that the cancer will return. Conversely, the inventors have found that the number of MAC cells decreases when a cancer treatment is effective.
- the ability of the present invention to detect malignancy-associated changes depends on the values of the features computed. The following is a list of the features that are currently calculated for each in-focus object.
- Each image is a rectangular array of square pixels that contains within it the image of an (irregularly shaped) object, surrounded by background.
- Each pixel P_ij is an integer representing the photometric value (gray scale) of a corresponding small segment of the image, and may range from 0 (completely opaque) to 255 (completely transparent).
- the image rectangle is larger than the smallest rectangle that can completely contain the object by at least two rows, top and bottom, and two columns left and right, ensuring that background exists all around the object.
- the region of the image that is the object is denoted by its characteristic function, Ω; this is also sometimes called the "object mask" or, simply, the "mask."
- the object mask is a binary function: Ω_ij = 1 if the pixel (i, j) belongs to the object, and Ω_ij = 0 otherwise (Equation 1).
- Morphological features estimate the image area, shape, and boundary variations of the object.
- the area, A, is defined as the total number of pixels belonging to the object, as defined by the mask, Ω: A = Σ_{i,j} Ω_ij (Equation 2).
- the x_centroid and y_centroid are the coordinates of the geometrical center of the object, defined with respect to the image origin (upper-left hand corner).
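A minimal sketch of the area and centroid computations from the object mask; the `area_and_centroid` helper is hypothetical and assumes a NumPy binary mask with the origin at the upper-left corner:

```python
import numpy as np

def area_and_centroid(mask):
    # Area (Equation 2): total count of object pixels in the binary mask.
    # Centroid: geometric center, measured from the image origin
    # (upper-left corner); x is the column index, y the row index.
    ys, xs = np.nonzero(np.asarray(mask))
    return len(xs), xs.mean(), ys.mean()
```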
- the mean_radius and max_radius features are the mean and maximum values of the length of the object's radial vectors from the object centroid to its 8 connected edge pixels.
- mean_radius = (1/N) Σ_{k=1..N} r_k (5); max_radius = max_k (r_k) (6)
- r_k is the k-th radial vector
- N is the number of 8 connected pixels on the object edge.
- the var_radius feature is the variance of length of the object's radius vectors, as defined in Section II 3
- var_radius = (1/(N-1)) Σ_{k=1..N} (r_k - mean_radius)², where r_k is the k-th radius vector, mean_radius is as defined in Equation 5, and N is the number of 8 connected edge pixels. II.5 sphericity
- the sphericity feature is a shape measure, calculated as a ratio of the radii of two circles centered at the object centroid (defined in Section II.2 above).
- One circle is the largest circle that is fully inscribed inside the object perimeter, corresponding to the absolute minimum length of the object's radial vectors.
- the other circle is the minimum circle that completely circumscribes the object's perimeter, corresponding to the absolute maximum length of the object's radial vectors.
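Because the inscribed radius equals the minimum radial vector length and the circumscribing radius equals the maximum, sphericity can be sketched as follows; the `sphericity` helper and its edge-point-list input format are assumptions made here:

```python
import numpy as np

def sphericity(edge_points, centroid):
    # Ratio of the largest fully inscribed circle radius (minimum radial
    # vector length) to the smallest circumscribing circle radius
    # (maximum radial vector length); equals 1.0 for a perfect circle.
    r = np.linalg.norm(np.asarray(edge_points, dtype=float)
                       - np.asarray(centroid, dtype=float), axis=1)
    return r.min() / r.max()
```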
- the eccentricity feature is a shape function calculated as the square root of the ratio of the maximal and minimal eigenvalues of the second central moment matrix of the object's characteristic function, Ω: eccentricity = √(λ1 / λ2).
- Eccentricity may be interpreted as the ratio of the major axis to the minor axis of the "best fit" ellipse which describes the object, and gives the minimal value 1 for circles.
- the inertia shape feature is a measure of the "roundness" of an object calculated as the moment of inertia of the object mask, normalized by the area squared, to give the minimal value 1 for circles.
- inertia_shape = (2π / A²) Σ_{i,j} R_ij² Ω_ij (11), where R_ij is the distance of the pixel, P_ij, to the object centroid (defined in Section II.2), A is the object area, and Ω is the mask defined by Equation 1.
- the compactness feature is another measure of the object's "roundness." It is calculated as the perimeter squared divided by the object area, giving the minimal value 1 for circles.
- Perimeter is calculated from the boundary pixels (which are themselves 8 connected) by considering their 4 connected neighborhood, where N1 is the number of pixels on the edge with 1 non-object neighbor, N2 is the number of pixels on the edge with 2 non-object neighbors, and N3 is the number of pixels on the edge with 3 non-object neighbors.
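A sketch of the compactness computation given a perimeter and area; the 4π normalization is an assumption made here so that a perfect circle gives the stated minimal value 1 (perimeter squared over area alone would give 4π for a circle):

```python
import math

def compactness(perimeter, area):
    # Perimeter squared over area, normalized by 4*pi (an assumption)
    # so that a perfect circle (P = 2*pi*r, A = pi*r^2) scores 1.0;
    # less round shapes score higher.
    return perimeter ** 2 / (4.0 * math.pi * area)
```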
- the cell_orient feature represents the object orientation measured as a deflection of the main axis of the object from the y direction:
- y_moment2 and xy_crossmoment2 are the second central moments of the characteristic function Ω defined by Equation 1 above
- λ1 is the maximal eigenvalue of the second central moment matrix of that function (see Section II.6 above).
- the main axis of the object is defined by the eigenvector corresponding to the maximal eigenvalue.
- a geometrical interpretation of cell_orient is that it is the angle (measured in a clockwise sense) between the y axis and the "best fit" ellipse major axis.
- this feature should be meaningless, as there should not be any a priori preferred cellular orientation.
- this feature may have value. In smears, for example, debris may be preferentially elongated along the slide long axis.
- the features of Sections II.10 to II.13 are calculated by sweeping the radius vector of the object through 128 discrete, equally spaced angular steps.
- the function is interpolated from an average of the object edge pixel locations at each of the 128 angles.
- the elongation feature is another measure of the extent of the object along the principal direction (corresponding to the major axis) versus the direction normal to it. These lengths are estimated from the second-harmonic Fourier Transform coefficients of the radial function of the object (Equation 15).
- a_2, b_2 are the second-harmonic Fourier Transform coefficients of the radial function of the object, r(θ), defined by a_n = (1/π) ∫ r(θ) cos(nθ) dθ and b_n = (1/π) ∫ r(θ) sin(nθ) dθ.
- freq_low_fft gives an estimate of the coarse boundary variation, measured as the energy of the lower harmonics of the Fourier spectrum of the object's radial function (from 3rd to 11th harmonics): freq_low_fft = Σ_{n=3..11} (a_n² + b_n²).
- freq_high_fft gives an estimate of the fine boundary variation, measured as the energy of the high frequency Fourier spectrum (from 12th to 32nd harmonics) of the object's radial function: freq_high_fft = Σ_{n=12..32} (a_n² + b_n²).
- a_n, b_n are the Fourier Transform coefficients of the nth harmonic, defined as above.
- the harmon01_fft, ..., harmon32_fft features are estimates of boundary variation, calculated as the magnitude of the Fourier Transform coefficients of the object's radial function for each harmonic n = 1 to 32: harmon_n_fft = √(a_n² + b_n²).
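The low- and high-frequency boundary energies can be sketched with a discrete FFT of the sampled radial function; this assumes 128 samples, and the spectral power it reports matches (a_n² + b_n²) only up to a constant normalization factor:

```python
import numpy as np

def radial_harmonic_energies(radii):
    # FFT of the radial function r(theta), assumed sampled at 128 equal
    # angles; returns boundary-variation energies summed over harmonics
    # 3-11 (coarse variation) and 12-32 (fine variation).
    coeffs = np.fft.rfft(np.asarray(radii, dtype=float)) / len(radii)
    power = np.abs(coeffs) ** 2
    return power[3:12].sum(), power[12:33].sum()
```

A perfectly circular boundary (constant radial function) scores zero in both bands; ripple at a single harmonic shows up entirely in the band containing it.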
- Photometric features give estimations of the absolute intensity and optical density levels of the object, as well as their distribution characteristics.
- DNA_Amount is the "raw" (unnormalized) measure of the integrated optical density of the object, defined by a once dilated mask, Ω⁺: DNA_Amount = Σ_{i,j} Ω⁺_ij OD_ij.
- OD_ij = log10(I_B) - log10(I_ij) (21), where I_B is the intensity of the local background, and I_ij is the intensity of the ij-th pixel.
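Equation 21 translates directly into code; `optical_density` is a hypothetical helper name:

```python
import numpy as np

def optical_density(intensity, background):
    # Equation 21: OD_ij = log10(I_B) - log10(I_ij), where I_B is the
    # local background intensity and I_ij the pixel intensity.
    return np.log10(background) - np.log10(np.asarray(intensity, dtype=float))
```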
- DNA_Index is the normalized measure of the integrated optical density of the object: DNA_Index = DNA_Amount / iod_norm.
- the var_intensity and mean_intensity features are the variance and mean of the intensity function of the object, I, defined by the mask, Ω
- OD_maximum is the largest value of the optical density of the object, normalized to iod_norm, as defined in Section III.2 above
- OD_variance is the normalized variance (second moment) of the optical density function of the object:
- OD_variance = [Σ_{i,j} Ω_ij (OD_ij - OD)² / (A - 1)] / OD² (27), where Ω is the object mask as defined in Section I.2, OD is the mean value of the optical density of the object, and A is the object area (total number of pixels). The variance is divided by the square of the mean optical density in order to make the measurement independent of the staining intensity of the cell.
- the OD_skewness feature is the normalized third moment of the optical density function of the object
- Ω is the object mask as defined in Section I.2
- OD is the mean value of the optical density of the object
- A is the object area (total number of pixels)
- OD_kurtosis is the normalized fourth moment of the optical density function of the object.
- Ω is the object mask as defined in Section I.2
- OD is the mean value of the optical density of the object
- A is the object area. IV. Discrete texture features
- the discrete texture features are based on segmentation of the object into regions of low, medium and high optical density. This segmentation of the object into low, medium and high density regions is based on two thresholds: the optical density high threshold and the optical density medium threshold. These thresholds are scaled to the sample's iod_norm value, based on the DNA amount of a particular subset of objects (e.g., lymphocytes), as described in Section III.2 above.
- these thresholds have been selected such that the condensed chromatin in leukocytes is high optical density material.
- the second threshold is located halfway between the high threshold and zero.
- CHROMATIN_HIGH_THRES = 36
- CHROMATIN_MEDIUM_THRES = 18
- A_low is the area of the pixels having an optical density between 0 and 18, A_med is the area of the pixels having an optical density between 18 and 36, and A_high is the area of the pixels having an optical density greater than 36. Together the areas A_high, A_med and A_low sum to the total area of the object.
- the actual thresholds used are these parameters, divided by 100, and multiplied by the factor iod_norm/100.
- Ω_low, Ω_med, and Ω_high are masks for the low-, medium-, and high-optical density regions of the object, respectively, defined in analogy to Equation 1.
- Ω is the object mask as defined in Equation 1
- OD is the optical density as defined by Equation 21. IV.3 lowDNAcomp, medDNAcomp, hiDNAcomp, mhDNAcomp
- OD is the region optical density defined in analogy to Equation 21
- Ω is the region mask, defined in analogy to Equation 1
- A is the region area, defined in analogy to Equation 2. IV.6 low_den_obj, med_den_obj, high_den_obj
- the mean radius of the object is defined by Equation 5
- the object's centroid is defined in Section II.2, Ω is the region mask defined in analogy to Equation 1
- A is the region area defined in analogy to Equation 2.
- Markovian texture features are defined from the co-occurrence matrix, Λ_λµ, of object pixels. Each element of that matrix stands for the conditional probability of a pixel of grey level λ occurring next (via 8-connectedness) to a pixel of grey level µ, where λ and µ are the row and column indices of the matrix, respectively.
- the computational algorithm used here for the calculation of the Markovian texture features uses so-called sum and difference histograms, H^s_l and H^d_m, where H^s_l is the probability of neighboring pixels having grey levels which sum to l, and H^d_m is the probability of neighboring pixels having grey level differences of m, where an 8-connected neighborhood is assumed. The values of the grey levels, l and m, used in the sum and difference histograms are obtained by quantization of the dynamic range of each individual object into 40 levels.
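A sketch of the sum and difference histograms; for brevity only horizontal neighbor pairs (one of the 8-connected directions) are used, and the quantization into 40 levels is a simple linear mapping, which the text does not specify in detail:

```python
import numpy as np

def sum_diff_histograms(img, levels=40):
    # Quantize the object's dynamic range into `levels` grey levels,
    # then build normalized sum and difference histograms over
    # horizontal neighbor pairs (one of the 8-connected directions).
    a = np.asarray(img, dtype=float)
    lo, hi = a.min(), a.max()
    if hi == lo:
        q = np.zeros(a.shape, dtype=int)
    else:
        q = np.minimum((levels * (a - lo) / (hi - lo)).astype(int), levels - 1)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    n = left.size
    h_sum = np.bincount(left + right, minlength=2 * levels - 1) / n
    h_diff = np.bincount(np.abs(left - right), minlength=levels) / n
    return h_sum, h_diff
```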
- the energy feature gives large values for an object with a spatially organized grey scale distribution. It is the opposite of entropy, giving large values to an object with large regions, of constant grey level:
- the contrast feature gives large values for an object with frequent large grey scale variations:
- a large value for correlation indicates an object with large connected subcomponents of constant grey level and with large grey level differences between adjacent components:
- Ī is the mean intensity of the object, calculated for the grey scale quantized to 40 levels.
- the cl_shade feature gives large absolute values for objects with a few distinct clumps of uniform intensity having large contrast with the rest of the object. Negative values correspond to dark clumps against a light background while positive values indicate light clumps against a dark background:
- the feature cl_prominence measures the darkness of clusters.
- cl_prominence = Σ_{λ,µ} (λ + µ - 2Ī_q)⁴ Λ_λµ, where Ī_q is the mean intensity of the object for the quantized grey scale.
- the DNA amount value, iod_norm, is defined in Section III.2 above. The local maxima, max_ij, and minima, min_ij, values used are those from Section VI.1 above.
- the center_of_gravity feature represents the distance from the geometrical center of the object to the "center of mass" of the optical density function, normalized by the mean_radius of the object:
- center_of_gravity = (distance from the object centroid to the optical density center of mass) / mean_radius
- the fractal texture features are based on the area of the three-dimensional surface of the object's optical density represented essentially as a three-dimensional bar graph, with the vertical axis representing optical density, and the horizontal axes representing the x and y spatial coordinates.
- each pixel is assigned a unit area in the x - y plane plus the area of the sides of the three-dimensional structure proportional to the change in the pixel optical density with respect to its neighbors.
- the largest values of fractal areas correspond to large objects containing small subcomponents with high optical density variations between them.
- fractal1_area and fractal2_area are calculated on different scales: the second one is based on an image in which four pixels are averaged into a single pixel, thereby representing a change of scale of fractal1_area.
- This calculation requires an additional mask transformation.
- Ω^(1/2) represents the original mask Ω with 4 pixels mapped into one pixel; any square of 4 pixels not consisting completely of object pixels is set to zero.
- Ω* represents Ω^(1/2) expanded by 4, so that each pixel in Ω^(1/2) becomes 4 pixels in Ω*.
- the fractal_dimen feature is calculated as the difference between the logarithms of fractal1_area and fractal2_area, divided by log 2. This varies from 2 to 3 and gives a measure of the "fractal behavior" of the image, associated with the rate at which the measured surface area increases at finer and finer scales.
- fractal_dimen = [log10(fractal1_area) - log10(fractal2_area)] / log10(2)
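The fractal dimension computation translates directly into code; `fractal_dimension` is a hypothetical helper name:

```python
import math

def fractal_dimension(fractal1_area, fractal2_area):
    # Difference of the base-10 logarithms of the surface areas measured
    # at the two scales, divided by log10 of 2; ranges from 2 (smooth)
    # to 3 (highly irregular) surfaces.
    return (math.log10(fractal1_area) - math.log10(fractal2_area)) / math.log10(2)
```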
- Run length features describe texture in terms of grey level runs, representing sets of consecutive, collinear pixels having the same grey level value. The length of the run is the number of pixels in the run. These features are calculated over the image with intensity function values transformed into 8 levels.
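Grey level runs in one direction can be counted as follows; this is a sketch for the 0° (horizontal) direction only, and the hypothetical `run_lengths_0deg` helper returns counts keyed by (grey level, run length) rather than a full run length matrix:

```python
import numpy as np

def run_lengths_0deg(img):
    # For each row, count maximal runs of consecutive equal grey levels;
    # returns a dict mapping (grey_level, run_length) -> count.
    counts = {}
    for row in np.asarray(img):
        start = 0
        for j in range(1, len(row) + 1):
            if j == len(row) or row[j] != row[start]:
                key = (int(row[start]), j - start)
                counts[key] = counts.get(key, 0) + 1
                start = j
    return counts
```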
- in one image, a run length feature may be oriented at 45°, but at 90° in the next; in general, these orientations are completely equivalent.
- each element of the matrix R^θ_pq specifies the number of times that the object contains a run of length q, in a given direction, θ, consisting of pixels lying in grey level range p (out of 8 grey levels).
- let N_g = 8 be the number of grey levels, and N be the number of different run lengths that occur in the object; then the run length features are described as follows:
- VIII.4 run0_length, run45_length, run90_length, run135_length. These features estimate the nonuniformity of the run lengths, taking on their lowest values when the runs are equally distributed throughout the lengths.
- A is the object's area.
- This feature estimates the dominant orientation of the object's linear texture.
- texture_orient is calculated (Equation 70) in analogy to cell_orient, Section II.9, from the arctangent of the run length pseudo-second moments in place of the object's second central moments.
- λ1 is the maximal eigenvalue of the run length pseudo-second moment matrix (calculated in analogy to Section II.9).
- the run length pseudo-second moments are calculated as follows:
- Orientation is defined as it is for cell_orient, Section II.9, as the angle (measured in a clockwise sense) between the y axis and the dominant orientation of the image's linear structure.
- This feature amplifies the texture orientation for long runs.
- Each of the above features is calculated for each in-focus object located in the image. Certain features are used by the classifier to separate artifacts from cell nuclei and to distinguish cells exhibiting MACs from normal cells. As indicated above, it is not possible to predict which features will be used to distinguish artifacts from cells or MAC cells from non-MAC cells until the classifier has been completely trained and produces a binary decision tree or linear discriminant function.
- the ability of the system according to the present invention to distinguish cell nuclei from artifacts, or cells that exhibit MACs from those that do not, depends on the ability of the classifier to make distinctions based on the values of the features computed.
- the present invention may apply several different discriminant functions each of which is trained to identify particular types of objects. For example, the following discriminant function has been used in the presently preferred embodiment of the invention to separate intermediate cervical cells from small picnotic objects:
| feature | cervical cells | picnotic |
|---|---|---|
| max_radius | 4.56914 | 3.92899 |
| freq_low_fft | -.03624 | -.04714 |
| harmon03_fft | 1.29958 | 1.80412 |
| harmon04_fft | .85959 | 1.20653 |
| lowVSmed_DNA | 58.83394 | 61.84034 |
| energy | 6566.14355 | 6182.17139 |
| correlation | .56801 | .52911 |
| homogeneity | -920.05017 | -883.31567 |
| cl_shade | -67.37746 | -63.68423 |
| den_drk_spot | 916.69360 | 870.75739 |
- Another discriminant function that can separate cells from junk particles is:
- the linear discriminant function produced by the classifier will depend on the type of classifier used and the training sets of cells. The above examples are given merely for purposes of illustration.
- the present invention is a system that automatically detects malignancy-associated changes in a cell sample. By properly staining and imaging a cell sample, the features of each object found on the slide can be determined and used to provide an indication of whether the patient from which the cell sample was obtained is normal or abnormal.
- MACs also provide an indication of whether a given cancer treatment is effective, as well as whether a cancer is in remission.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002253850A CA2253850C (en) | 1996-05-10 | 1997-05-01 | Method and apparatus for automatically detecting malignancy-associated changes |
EP97919229A EP0901665A1 (en) | 1996-05-10 | 1997-05-01 | Method and apparatus for automatically detecting malignancy-associated changes |
AU23779/97A AU2377997A (en) | 1996-05-10 | 1997-05-01 | Method and apparatus for automatically detecting malignancy-associated changes |
JP09540332A JP2000510266A (en) | 1996-05-10 | 1997-05-01 | Method and apparatus for automatically detecting changes associated with malignancy |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/644,893 | 1996-05-10 | ||
US08/644,893 US5889881A (en) | 1992-10-14 | 1996-05-10 | Method and apparatus for automatically detecting malignancy-associated changes |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1997043732A1 true WO1997043732A1 (en) | 1997-11-20 |
Family
ID=24586780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA1997/000301 WO1997043732A1 (en) | 1996-05-10 | 1997-05-01 | Method and apparatus for automatically detecting malignancy-associated changes |
Country Status (6)
Country | Link |
---|---|
US (2) | US5889881A (en) |
EP (1) | EP0901665A1 (en) |
JP (1) | JP2000510266A (en) |
AU (1) | AU2377997A (en) |
CA (1) | CA2253850C (en) |
WO (1) | WO1997043732A1 (en) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999036094A2 (en) * | 1998-01-14 | 1999-07-22 | The Board Of Regents Of The University Of Oklahoma | Composition and method for treating metastatic tumors or cancer induced by cells expressing sv40 tumor antigen |
WO2000004497A1 (en) * | 1998-07-14 | 2000-01-27 | The Perkin-Elmer Corporation Pe Biosystems Division | Automatic masking of objects in images |
WO2001042786A2 (en) * | 1999-12-09 | 2001-06-14 | Cellomics, Inc. | System for cell based screening : cell spreading |
WO2001081895A2 (en) * | 2000-04-26 | 2001-11-01 | Cytokinetics, Inc. | Method and apparatus for predictive cellular bioinformatics |
EP1203339A1 (en) * | 1999-07-21 | 2002-05-08 | SurroMed, Inc. | System for microvolume laser scanning cytometry |
WO2002048949A1 (en) * | 2000-12-15 | 2002-06-20 | Cellavision Ab | Method and arrangment for processing digital image information |
WO2003071468A1 (en) * | 2002-02-22 | 2003-08-28 | Istituto Clinico Humanitas | Method and apparatus for analyizing biological tissue specimens |
US6651008B1 (en) | 1999-05-14 | 2003-11-18 | Cytokinetics, Inc. | Database system including computer code for predictive cellular bioinformatics |
EP1419370A4 (en) * | 2000-01-11 | 2004-05-19 | Richard A Thomas | Nuclear packing efficiency |
WO2004044237A1 (en) * | 2002-11-13 | 2004-05-27 | G6 Science Corp. | Method of identifying and assessing dna euchromatin for detecting disease |
US6743576B1 (en) | 1999-05-14 | 2004-06-01 | Cytokinetics, Inc. | Database system for predictive cellular bioinformatics |
US6876760B1 (en) | 2000-12-04 | 2005-04-05 | Cytokinetics, Inc. | Classifying cells based on information contained in cell images |
US6956961B2 (en) | 2001-02-20 | 2005-10-18 | Cytokinetics, Inc. | Extracting shape information contained in cell images |
US6986993B1 (en) | 1999-08-05 | 2006-01-17 | Cellomics, Inc. | System for cell-based screening |
US7016787B2 (en) | 2001-02-20 | 2006-03-21 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US7151847B2 (en) | 2001-02-20 | 2006-12-19 | Cytokinetics, Inc. | Image analysis of the golgi complex |
US7218764B2 (en) | 2000-12-04 | 2007-05-15 | Cytokinetics, Inc. | Ploidy classification method |
US7235353B2 (en) | 2003-07-18 | 2007-06-26 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7246012B2 (en) | 2003-07-18 | 2007-07-17 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US7323318B2 (en) | 2004-07-15 | 2008-01-29 | Cytokinetics, Inc. | Assay for distinguishing live and dead cells |
US7346200B1 (en) | 1999-11-18 | 2008-03-18 | Ikonisys, Inc. | Method and apparatus for computer controlled cell based diagnosis |
US7415148B2 (en) * | 2003-08-04 | 2008-08-19 | Raytheon Company | System and method for detecting anomalous targets including cancerous cells |
US7640112B2 (en) | 1998-05-09 | 2009-12-29 | Ikenisys, Inc. | Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis |
US7817840B2 (en) | 2003-07-18 | 2010-10-19 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7901887B2 (en) | 1998-05-09 | 2011-03-08 | Ikonisys, Inc. | Automated cancer diagnostic methods using fish |
US9528141B2 (en) | 2009-02-19 | 2016-12-27 | National University Corporation Chiba University | Nuclear localization of Src-family tyrosine kinases is required for growth factor-induced euchromatinization |
EP2973408A4 (en) * | 2013-03-15 | 2017-06-14 | Richard Harry Turner | A system and methods for the in vitro detection of particles and soluble chemical entities in body fluids |
GB2497116B (en) * | 2011-12-01 | 2017-11-15 | Inst For Medical Informatics | Micrography |
Families Citing this family (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5889881A (en) * | 1992-10-14 | 1999-03-30 | Oncometrics Imaging Corp. | Method and apparatus for automatically detecting malignancy-associated changes |
US5978497A (en) * | 1994-09-20 | 1999-11-02 | Neopath, Inc. | Apparatus for the identification of free-lying cells |
US6654505B2 (en) * | 1994-10-13 | 2003-11-25 | Lynx Therapeutics, Inc. | System and apparatus for sequential processing of analytes |
USRE43097E1 (en) | 1994-10-13 | 2012-01-10 | Illumina, Inc. | Massively parallel signature sequencing by ligation of encoded adaptors |
US6406848B1 (en) | 1997-05-23 | 2002-06-18 | Lynx Therapeutics, Inc. | Planar arrays of microparticle-bound polynucleotides |
JP2000501184A (en) * | 1995-11-30 | 2000-02-02 | クロマビジョン メディカル システムズ,インコーポレイテッド | Method and apparatus for automatic image analysis of biological specimens |
US6718053B1 (en) * | 1996-11-27 | 2004-04-06 | Chromavision Medical Systems, Inc. | Method and apparatus for automated image analysis of biological specimens |
US7286695B2 (en) * | 1996-07-10 | 2007-10-23 | R2 Technology, Inc. | Density nodule detection in 3-D digital images |
US6263092B1 (en) | 1996-07-10 | 2001-07-17 | R2 Technology, Inc. | Method and apparatus for fast detection of spiculated lesions in digital mammograms |
US6909797B2 (en) * | 1996-07-10 | 2005-06-21 | R2 Technology, Inc. | Density nodule detection in 3-D digital images |
US5836877A (en) * | 1997-02-24 | 1998-11-17 | Lucid Inc | System for facilitating pathological examination of a lesion in tissue |
US6388788B1 (en) | 1998-03-16 | 2002-05-14 | Praelux, Inc. | Method and apparatus for screening chemical compounds |
US20030036855A1 (en) * | 1998-03-16 | 2003-02-20 | Praelux Incorporated, A Corporation Of New Jersey | Method and apparatus for screening chemical compounds |
US20090111101A1 (en) * | 1998-05-09 | 2009-04-30 | Ikonisys, Inc. | Automated Cancer Diagnostic Methods Using FISH |
US20080241848A1 (en) * | 1998-05-09 | 2008-10-02 | Ikonisys, Inc. | Methods for prenatal diagnosis of aneuploidy |
AU3493800A (en) | 1999-02-17 | 2000-09-04 | Lucid, Inc. | Cassette for facilitating optical sectioning of a retained tissue specimen |
AU2882800A (en) * | 1999-02-17 | 2000-09-04 | Lucid, Inc. | Tissue specimen holder |
WO2000062247A1 (en) * | 1999-04-13 | 2000-10-19 | Chromavision Medical Systems, Inc. | Histological reconstruction and automated image analysis |
AU6093400A (en) * | 1999-07-13 | 2001-01-30 | Chromavision Medical Systems, Inc. | Automated detection of objects in a biological sample |
US6665060B1 (en) | 1999-10-29 | 2003-12-16 | Cytyc Corporation | Cytological imaging system and method |
US20060073509A1 (en) * | 1999-11-18 | 2006-04-06 | Michael Kilpatrick | Method for detecting and quantitating multiple subcellular components |
IL138123A0 (en) * | 2000-08-28 | 2001-10-31 | Accuramed 1999 Ltd | Medical decision support system and method |
US7194118B1 (en) | 2000-11-10 | 2007-03-20 | Lucid, Inc. | System for optically sectioning and mapping surgically excised tissue |
US20030054419A1 (en) * | 2001-02-07 | 2003-03-20 | Slawin Kevin M. | Method to determine prognosis after therapy for prostate cancer |
JP2005506140A (en) * | 2001-10-16 | 2005-03-03 | ザ・ユニバーシティー・オブ・シカゴ | Computer-aided 3D lesion detection method |
US8676509B2 (en) * | 2001-11-13 | 2014-03-18 | Dako Denmark A/S | System for tracking biological samples |
JP4259802B2 (en) * | 2002-02-19 | 2009-04-30 | 日本分光株式会社 | Abnormal part and degree of abnormality identification method in cancer diagnosis |
US20040133112A1 (en) * | 2002-03-08 | 2004-07-08 | Milind Rajadhyaksha | System and method for macroscopic and confocal imaging of tissue |
US6658143B2 (en) * | 2002-04-29 | 2003-12-02 | Amersham Biosciences Corp. | Ray-based image analysis for biological specimens |
US20050037406A1 (en) * | 2002-06-12 | 2005-02-17 | De La Torre-Bueno Jose | Methods and apparatus for analysis of a biological specimen |
US7272252B2 (en) * | 2002-06-12 | 2007-09-18 | Clarient, Inc. | Automated system for combining bright field and fluorescent microscopy |
US6999624B1 (en) * | 2002-07-12 | 2006-02-14 | The United States Of America As Represented By The Secretary Of The Navy | Context discriminate classification for digital images |
US6999625B1 (en) * | 2002-07-12 | 2006-02-14 | The United States Of America As Represented By The Secretary Of The Navy | Feature-based detection and context discriminate classification for digital images |
US6990239B1 (en) * | 2002-07-16 | 2006-01-24 | The United States Of America As Represented By The Secretary Of The Navy | Feature-based detection and context discriminate classification for known image structures |
US7211225B2 (en) * | 2002-08-26 | 2007-05-01 | Perceptronix Medical Inc. | Filter devices for depositing material and density gradients of material from sample suspension |
US7274809B2 (en) * | 2002-08-29 | 2007-09-25 | Perceptronix Medical, Inc. And British Columbia Cancer Agency | Computerized methods and systems related to the detection of malignancy-associated changes (MAC) to detect cancer |
US7106892B2 (en) * | 2002-09-19 | 2006-09-12 | Koninklijke Philips Electronics, N.V. | Display of image data information |
GB2396406A (en) * | 2002-12-17 | 2004-06-23 | Qinetiq Ltd | Image analysis |
US20040202357A1 (en) * | 2003-04-11 | 2004-10-14 | Perz Cynthia B. | Silhouette image acquisition |
US7480412B2 (en) * | 2003-12-16 | 2009-01-20 | Siemens Medical Solutions Usa, Inc. | Toboggan-based shape characterization |
DE10361073A1 (en) * | 2003-12-22 | 2005-07-21 | Innovatis Ag | Method and device for taking microscopic images |
JP4487180B2 (en) * | 2004-03-18 | 2010-06-23 | ソニー株式会社 | Information generating apparatus and information generating method |
US20070210183A1 (en) * | 2004-04-20 | 2007-09-13 | Xerox Corporation | Environmental system including a micromechanical dispensing device |
US7609887B2 (en) * | 2004-06-07 | 2009-10-27 | Siemens Medical Solutions Usa, Inc. | System and method for toboggan-based object segmentation using distance transform |
US7653260B2 (en) * | 2004-06-17 | 2010-01-26 | Carl Zeiss MicroImaging GmbH | System and method of registering field of view |
US8582924B2 (en) * | 2004-06-30 | 2013-11-12 | Carl Zeiss Microimaging Gmbh | Data structure of an image storage and retrieval system |
US20060018549A1 (en) * | 2004-07-20 | 2006-01-26 | Jianming Liang | System and method for object characterization of toboggan-based clusters |
US20060209063A1 (en) * | 2004-10-12 | 2006-09-21 | Jianming Liang | Toboggan-based method for automatic detection and segmentation of objects in image data |
US20060104484A1 (en) * | 2004-11-16 | 2006-05-18 | Bolle Rudolf M | Fingerprint biometric machine representations based on triangles |
US20070031043A1 (en) * | 2005-08-02 | 2007-02-08 | Perz Cynthia B | System for and method of intelligently directed segmentation analysis for automated microscope systems |
US7526116B2 (en) * | 2006-01-19 | 2009-04-28 | Luigi Armogida | Automated microscopic sperm identification |
US7864996B2 (en) * | 2006-02-17 | 2011-01-04 | Lucid, Inc. | System for macroscopic and confocal imaging of tissue |
US8805743B2 (en) * | 2006-12-27 | 2014-08-12 | International Business Machines Corporation | Tracking, distribution and management of apportionable licenses granted for distributed software products |
WO2008143849A2 (en) * | 2007-05-14 | 2008-11-27 | Historx, Inc. | Compartment segregation by pixel characterization using image data clustering |
WO2008156669A1 (en) * | 2007-06-15 | 2008-12-24 | Historx, Inc. | Method and system for standardizing microscope instruments |
US9607372B2 (en) * | 2007-07-11 | 2017-03-28 | Hernani D. Cualing | Automated bone marrow cellularity determination |
CA2604317C (en) | 2007-08-06 | 2017-02-28 | Historx, Inc. | Methods and system for validating sample images for quantitative immunoassays |
CA2596204C (en) * | 2007-08-07 | 2019-02-26 | Historx, Inc. | Method and system for determining an optimal dilution of a reagent |
WO2009029810A1 (en) * | 2007-08-31 | 2009-03-05 | Historx, Inc. | Automatic exposure time selection for imaging tissue |
WO2009094623A2 (en) * | 2008-01-24 | 2009-07-30 | Balter, Inc. | Method for discriminating between malignant and benign tissue lesions |
US8346574B2 (en) | 2008-02-29 | 2013-01-01 | Dako Denmark A/S | Systems and methods for tracking and providing workflow information |
EP2335221B8 (en) * | 2008-09-16 | 2016-05-25 | Novartis AG | Reproducible quantification of biomarker expression |
GB2466818B (en) * | 2009-01-09 | 2014-08-13 | Inst Cancer Genetics And Informatics | Optimizing the initialization and convergence of active contours for segmentation of cell nuclei in histological sections |
US8379985B2 (en) * | 2009-07-03 | 2013-02-19 | Sony Corporation | Dominant gradient method for finding focused objects |
US8930394B2 (en) | 2010-08-17 | 2015-01-06 | Fujitsu Limited | Querying sensor data stored as binary decision diagrams |
US8572146B2 (en) | 2010-08-17 | 2013-10-29 | Fujitsu Limited | Comparing data samples represented by characteristic functions |
US8874607B2 (en) | 2010-08-17 | 2014-10-28 | Fujitsu Limited | Representing sensor data as binary decision diagrams |
US9002781B2 (en) | 2010-08-17 | 2015-04-07 | Fujitsu Limited | Annotating environmental data represented by characteristic functions |
US8645108B2 (en) | 2010-08-17 | 2014-02-04 | Fujitsu Limited | Annotating binary decision diagrams representing sensor data |
US8583718B2 (en) | 2010-08-17 | 2013-11-12 | Fujitsu Limited | Comparing boolean functions representing sensor data |
US9138143B2 (en) * | 2010-08-17 | 2015-09-22 | Fujitsu Limited | Annotating medical data represented by characteristic functions |
US8909592B2 (en) | 2011-09-23 | 2014-12-09 | Fujitsu Limited | Combining medical binary decision diagrams to determine data correlations |
US9075908B2 (en) | 2011-09-23 | 2015-07-07 | Fujitsu Limited | Partitioning medical binary decision diagrams for size optimization |
US8781995B2 (en) | 2011-09-23 | 2014-07-15 | Fujitsu Limited | Range queries in binary decision diagrams |
US9177247B2 (en) | 2011-09-23 | 2015-11-03 | Fujitsu Limited | Partitioning medical binary decision diagrams for analysis optimization |
US8838523B2 (en) | 2011-09-23 | 2014-09-16 | Fujitsu Limited | Compression threshold analysis of binary decision diagrams |
US8812943B2 (en) | 2011-09-23 | 2014-08-19 | Fujitsu Limited | Detecting data corruption in medical binary decision diagrams using hashing techniques |
US9176819B2 (en) | 2011-09-23 | 2015-11-03 | Fujitsu Limited | Detecting sensor malfunctions using compression analysis of binary decision diagrams |
US8620854B2 (en) | 2011-09-23 | 2013-12-31 | Fujitsu Limited | Annotating medical binary decision diagrams with health state information |
US8719214B2 (en) | 2011-09-23 | 2014-05-06 | Fujitsu Limited | Combining medical binary decision diagrams for analysis optimization |
US9229210B2 (en) | 2012-02-26 | 2016-01-05 | Caliber Imaging And Diagnostics, Inc. | Tissue specimen stage for an optical sectioning microscope |
US10429292B2 (en) * | 2013-03-15 | 2019-10-01 | Iris International, Inc. | Dynamic range extension systems and methods for particle analysis in blood samples |
US9865086B2 (en) * | 2014-03-21 | 2018-01-09 | St. Jude Medical, Cardiology Division, Inc. | Methods and systems for generating a multi-dimensional surface model of a geometric structure |
EP3175773A4 (en) * | 2014-07-30 | 2018-10-10 | Olympus Corporation | Image processing device |
US9832366B2 (en) * | 2014-09-29 | 2017-11-28 | Biosurfit, S.A. | Focusing method |
US10430955B2 (en) * | 2016-08-31 | 2019-10-01 | The Regents Of The University Of California | High content screening workflows for microscope imaging |
CN112136071B (en) | 2018-02-26 | 2023-08-11 | 凯利博成像和诊断公司 | System and method for macroscopic and microscopic imaging of in vitro tissue |
US20220172355A1 (en) * | 2020-12-02 | 2022-06-02 | Mayo Foundation For Medical Education And Research | Cytological analysis of nuclear neat-1 expression for detection of cholangiocarcinoma |
WO2023012241A1 (en) * | 2021-08-04 | 2023-02-09 | Samantree Medical Sa | Systems and methods for providing live sample monitoring information with parallel imaging systems |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1990010277A1 (en) * | 1989-02-24 | 1990-09-07 | Cell Analysis Systems, Inc. | Method and apparatus for determining a proliferation index of a cell sample |
WO1991015826A1 (en) * | 1990-03-30 | 1991-10-17 | Neuromedical Systems, Inc. | Automated cytological specimen classification system and method |
WO1993016436A1 (en) * | 1992-02-18 | 1993-08-19 | Neopath, Inc. | Method for identifying normal biomedical specimens |
WO1993016442A1 (en) * | 1992-02-18 | 1993-08-19 | Neopath, Inc. | Method for identifying objects using data processing techniques |
EP0595506A2 (en) * | 1992-10-14 | 1994-05-04 | Xillix Technologies Corporation | Automated detection of cancerous or precancerous tissue by measuring malignancy associated changes |
EP0610916A2 (en) * | 1993-02-09 | 1994-08-17 | Cedars-Sinai Medical Center | Method and apparatus for providing preferentially segmented digital images |
WO1996009594A1 (en) * | 1994-09-20 | 1996-03-28 | Neopath, Inc. | Apparatus for automated identification of thick cell groupings on a biological specimen |
WO1996009605A1 (en) * | 1994-09-20 | 1996-03-28 | Neopath, Inc. | Apparatus for the identification of free-lying cells |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5672845A (en) * | 1979-11-19 | 1981-06-17 | Hitachi Ltd | Detecting apparatus of examination position of sample |
US4453266A (en) * | 1980-04-21 | 1984-06-05 | Rush-Presbyterian-St. Luke's Medical Center | Method and apparatus for measuring mean cell volume of red blood cells |
US4702595A (en) * | 1980-10-15 | 1987-10-27 | Smithkline Beckman Corporation | Pattern recognition system with working area detection |
JPS58154064A (en) * | 1982-03-08 | 1983-09-13 | Mitsubishi Rayon Co Ltd | Method for measuring percentage between lymph bead and t cell |
US4513438A (en) * | 1982-04-15 | 1985-04-23 | Coulter Electronics, Inc. | Automated microscopy system and method for locating and re-locating objects in an image |
US4700298A (en) * | 1984-09-14 | 1987-10-13 | Branko Palcic | Dynamic microscope image processing scanner |
US4741043B1 (en) * | 1985-11-04 | 1994-08-09 | Cell Analysis Systems Inc | Method of and apparatus for image analyses of biological specimens |
US5016283A (en) * | 1985-11-04 | 1991-05-14 | Cell Analysis Systems, Inc. | Methods and apparatus for immunoploidy analysis |
DE3718066A1 (en) * | 1987-05-29 | 1988-12-08 | Zeiss Carl Fa | Method for microinjection into cells or for suction from single cells or whole cells from cell cultures |
JP2686274B2 (en) * | 1988-03-24 | 1997-12-08 | 東亜医用電子株式会社 | Cell image processing method and apparatus |
DE3836716A1 (en) * | 1988-10-28 | 1990-05-03 | Zeiss Carl Fa | METHOD FOR EVALUATING CELL IMAGES |
US5073857A (en) * | 1989-06-01 | 1991-12-17 | Accuron Corporation | Method and apparatus for cell analysis |
US5072382A (en) * | 1989-10-02 | 1991-12-10 | Kamentsky Louis A | Methods and apparatus for measuring multiple optical properties of biological specimens |
US5317644A (en) * | 1992-06-16 | 1994-05-31 | Mcgill University | Method for the enhancement of cell images |
US5889881A (en) * | 1992-10-14 | 1999-03-30 | Oncometrics Imaging Corp. | Method and apparatus for automatically detecting malignancy-associated changes |
US6026174A (en) * | 1992-10-14 | 2000-02-15 | Accumed International, Inc. | System and method for automatically detecting malignant cells and cells having malignancy-associated changes |
- 1996
  - 1996-05-10 US US08/644,893 patent/US5889881A/en not_active Expired - Lifetime
- 1997
  - 1997-05-01 CA CA002253850A patent/CA2253850C/en not_active Expired - Fee Related
  - 1997-05-01 WO PCT/CA1997/000301 patent/WO1997043732A1/en active Application Filing
  - 1997-05-01 JP JP09540332A patent/JP2000510266A/en active Pending
  - 1997-05-01 AU AU23779/97A patent/AU2377997A/en not_active Abandoned
  - 1997-05-01 EP EP97919229A patent/EP0901665A1/en not_active Ceased
- 1999
  - 1999-03-26 US US09/277,499 patent/US6493460B1/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
BERTRAND D ET AL: "BASICS OF VIDEO IMAGE ANALYSIS", TRAC, TRENDS IN ANALYTICAL CHEMISTRY, vol. 10, no. 8, 1 September 1991 (1991-09-01), AMSTERDAM, NL, pages 237 - 243, XP000219625 * |
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999036094A2 (en) * | 1998-01-14 | 1999-07-22 | The Board Of Regents Of The University Of Oklahoma | Composition and method for treating metastatic tumors or cancer induced by cells expressing sv40 tumor antigen |
WO1999036094A3 (en) * | 1998-01-14 | 1999-11-18 | Univ Oklahoma | Composition and method for treating metastatic tumors or cancer induced by cells expressing sv40 tumor antigen |
US7945391B2 (en) | 1998-05-09 | 2011-05-17 | Ikonisys, Inc. | Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis |
US7901887B2 (en) | 1998-05-09 | 2011-03-08 | Ikonisys, Inc. | Automated cancer diagnostic methods using FISH |
US7835869B2 (en) | 1998-05-09 | 2010-11-16 | Ikonisys, Inc. | Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis |
US7640112B2 (en) | 1998-05-09 | 2009-12-29 | Ikonisys, Inc. | Method and apparatus for computer controlled rare cell, including fetal cell, based diagnosis |
WO2000004497A1 (en) * | 1998-07-14 | 2000-01-27 | The Perkin-Elmer Corporation Pe Biosystems Division | Automatic masking of objects in images |
US6651008B1 (en) | 1999-05-14 | 2003-11-18 | Cytokinetics, Inc. | Database system including computer code for predictive cellular bioinformatics |
US6743576B1 (en) | 1999-05-14 | 2004-06-01 | Cytokinetics, Inc. | Database system for predictive cellular bioinformatics |
EP1203339A4 (en) * | 1999-07-21 | 2006-09-13 | Ppd Biomarker Discovery Scienc | System for microvolume laser scanning cytometry |
EP1203339A1 (en) * | 1999-07-21 | 2002-05-08 | SurroMed, Inc. | System for microvolume laser scanning cytometry |
US8062856B2 (en) | 1999-08-05 | 2011-11-22 | Cellomics, Inc. | System for cell-based screening |
US6986993B1 (en) | 1999-08-05 | 2006-01-17 | Cellomics, Inc. | System for cell-based screening |
US7522757B2 (en) | 1999-11-18 | 2009-04-21 | Ikonisys, Inc. | Method and apparatus for computer controlled cell based diagnosis |
US7346200B1 (en) | 1999-11-18 | 2008-03-18 | Ikonisys, Inc. | Method and apparatus for computer controlled cell based diagnosis |
US6716588B2 (en) | 1999-12-09 | 2004-04-06 | Cellomics, Inc. | System for cell-based screening |
WO2001042786A2 (en) * | 1999-12-09 | 2001-06-14 | Cellomics, Inc. | System for cell based screening : cell spreading |
WO2001042786A3 (en) * | 1999-12-09 | 2002-01-10 | Cellomics Inc | System for cell based screening : cell spreading |
EP1419370A4 (en) * | 2000-01-11 | 2004-05-19 | Richard A Thomas | Nuclear packing efficiency |
EP1419370A2 (en) * | 2000-01-11 | 2004-05-19 | Richard A. Thomas | Nuclear packing efficiency |
WO2001081895A3 (en) * | 2000-04-26 | 2003-03-13 | Cytokinetics Inc | Method and apparatus for predictive cellular bioinformatics |
WO2001081895A2 (en) * | 2000-04-26 | 2001-11-01 | Cytokinetics, Inc. | Method and apparatus for predictive cellular bioinformatics |
US7218764B2 (en) | 2000-12-04 | 2007-05-15 | Cytokinetics, Inc. | Ploidy classification method |
US6876760B1 (en) | 2000-12-04 | 2005-04-05 | Cytokinetics, Inc. | Classifying cells based on information contained in cell images |
WO2002048949A1 (en) * | 2000-12-15 | 2002-06-20 | Cellavision Ab | Method and arrangment for processing digital image information |
US7016787B2 (en) | 2001-02-20 | 2006-03-21 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US6956961B2 (en) | 2001-02-20 | 2005-10-18 | Cytokinetics, Inc. | Extracting shape information contained in cell images |
US7269278B2 (en) | 2001-02-20 | 2007-09-11 | Cytokinetics, Inc. | Extracting shape information contained in cell images |
US7151847B2 (en) | 2001-02-20 | 2006-12-19 | Cytokinetics, Inc. | Image analysis of the golgi complex |
US7657076B2 (en) | 2001-02-20 | 2010-02-02 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
WO2003071469A1 (en) * | 2002-02-22 | 2003-08-28 | Istituto Clinico Humanitas | 'in vitro' diagnostic method for diseases affecting human or animal tissues |
US7596250B2 (en) | 2002-02-22 | 2009-09-29 | Humanitas Mirasole S.P.S. | Method and apparatus for analyzing biological tissue specimens |
WO2003071468A1 (en) * | 2002-02-22 | 2003-08-28 | Istituto Clinico Humanitas | Method and apparatus for analyzing biological tissue specimens |
US7153691B2 (en) * | 2002-11-13 | 2006-12-26 | G6 Science Corp. | Method of identifying and assessing DNA euchromatin in biological cells for detecting disease, monitoring wellness, assessing bio-activity, and screening pharmacological agents |
WO2004044237A1 (en) * | 2002-11-13 | 2004-05-27 | G6 Science Corp. | Method of identifying and assessing dna euchromatin for detecting disease |
US7246012B2 (en) | 2003-07-18 | 2007-07-17 | Cytokinetics, Inc. | Characterizing biological stimuli by response curves |
US7817840B2 (en) | 2003-07-18 | 2010-10-19 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7235353B2 (en) | 2003-07-18 | 2007-06-26 | Cytokinetics, Inc. | Predicting hepatotoxicity using cell based assays |
US7415148B2 (en) * | 2003-08-04 | 2008-08-19 | Raytheon Company | System and method for detecting anomalous targets including cancerous cells |
US7323318B2 (en) | 2004-07-15 | 2008-01-29 | Cytokinetics, Inc. | Assay for distinguishing live and dead cells |
US9528141B2 (en) | 2009-02-19 | 2016-12-27 | National University Corporation Chiba University | Nuclear localization of Src-family tyrosine kinases is required for growth factor-induced euchromatinization |
US10072304B2 (en) | 2009-02-19 | 2018-09-11 | National University Corporation Chiba University | Nuclear localization of Src-family tyrosine kinases is required for growth factor-induced euchromatinization |
GB2497116B (en) * | 2011-12-01 | 2017-11-15 | Inst For Medical Informatics | Micrography |
EP2973408A4 (en) * | 2013-03-15 | 2017-06-14 | Richard Harry Turner | A system and methods for the in vitro detection of particles and soluble chemical entities in body fluids |
US10032270B2 (en) | 2013-03-15 | 2018-07-24 | Richard H. Turner | System and methods for the in vitro detection of particles and soluble chemical entities in body fluids |
Also Published As
Publication number | Publication date |
---|---|
JP2000510266A (en) | 2000-08-08 |
EP0901665A1 (en) | 1999-03-17 |
US5889881A (en) | 1999-03-30 |
US6493460B1 (en) | 2002-12-10 |
CA2253850A1 (en) | 1997-11-20 |
CA2253850C (en) | 2007-09-18 |
AU2377997A (en) | 1997-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6493460B1 (en) | Method and apparatus for automatically detecting malignancy-associated changes | |
US6026174A (en) | System and method for automatically detecting malignant cells and cells having malignancy-associated changes | |
Wu et al. | Live cell image segmentation | |
US5687251A (en) | Method and apparatus for providing preferentially segmented digital images | |
Raimondo et al. | Automated evaluation of Her-2/neu status in breast tissue from fluorescent in situ hybridization images | |
US9436992B2 (en) | Method of reconstituting cellular spectra useful for detecting cellular disorders | |
Tates et al. | The present state of the automated micronucleus test for lymphocytes | |
EP1534114A2 (en) | Computerized image capture of structures of interest within a tissue sample | |
Böcker et al. | Automated comet assay analysis | |
US20090169090A1 (en) | Targeted edge detection method and apparatus for cytological image processing applications | |
EP1579366A1 (en) | Histological assessment of nuclear pleomorphism | |
CA2086785C (en) | Automated detection of cancerous or precancerous tissue by measuring malignancy associated changes (macs) | |
Nazeran et al. | Biomedical image processing in pathology: a review | |
Wang et al. | Investigation of methodologies for the segmentation of squamous epithelium from cervical histological virtual slides | |
AU770570B2 (en) | Method and apparatus for automatically detecting malignancy-associated changes | |
CA2595118A1 (en) | Method and apparatus for automatically detecting malignancy-associated changes | |
Sameti et al. | Classifying image features in the last screening mammograms prior to detection of a malignant mass | |
Loukas et al. | Automated segmentation of cancer cell nuclei in complex tissue sections | |
Marghani et al. | Automated morphological analysis approach for classifying colorectal microscopic images | |
Hajnal et al. | Classifying mammograms by density: rationale and preliminary results | |
Smolle et al. | Automated detection of connective tissue by tissue counter analysis and classification and regression trees | |
Sabino et al. | Chromatin texture characterization using multiscale fractal dimension | |
AU2002354969B2 (en) | Chromatin segmentation | |
Friedrich | Eliminating Defocused Objects From a Cell Collection Used For Early | |
Benke et al. | Application of x-ray instrumentation in medicine: discrimination of neoplasms in radiographs by digital image processing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1
Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN YU AM AZ BY KG KZ MD RU TJ TM
|
AL | Designated countries for regional patents |
Kind code of ref document: A1
Designated state(s): GH KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: 2253850
Country of ref document: CA
Kind code of ref document: A
|
WWE | Wipo information: entry into national phase |
Ref document number: 1997919229
Country of ref document: EP
|
REG | Reference to national code |
Ref country code: DE
Ref legal event code: 8642
|
WWP | Wipo information: published in national office |
Ref document number: 1997919229
Country of ref document: EP