WO2003105675A2 - Computerized image capture of structures of interest within a tissue sample - Google Patents
- Publication number: WO2003105675A2 (PCT/US2003/019206)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tissue
- image
- interest
- nuclei
- identification algorithm
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
Definitions
- tissue microarrays for high-throughput screening and analysis of hundreds of tissue specimens on a single microscope slide.
- Tissue microarrays provide benefits over traditional methods that involve processing and staining hundreds of microscope slides because a large number of specimens can be accommodated on one master microscope slide. This approach markedly reduces time, expense, and experimental error.
- a fully automated system is needed that can match or even surpass the performance of a pathologist working at the microscope.
- Existing systems for tissue identification require high-magnification or high-resolution images of the entire tissue sample before they can provide meaningful output.
- An advantageous element for a fully automated system is a device and method for capturing high-resolution images of each tissue sample that are limited to the structure-of-interest portions of the tissue sample.
- Another advantageous element for a fully automated system is an ability to work without requiring the use of special stains or specific antibody markers, which limit versatility and speed of the throughput.
- the present invention is directed to a device, system, and method.
- An embodiment of the present invention provides a computerized device and method of automatically capturing an image of a structure of interest in a tissue sample.
- the method includes receiving into a computer memory a first pixel data set representing an image of the tissue sample at a low resolution and an identification of a tissue type of the tissue sample, and selecting at least one structure-identification algorithm responsive to the tissue type from a plurality of structure-identification algorithms, at least two of the structure-identification algorithms of the plurality of algorithms being responsive to different tissue types, and each structure-identification algorithm correlating at least one cellular pattern in a given tissue type with a presence of a structure of interest for the given tissue type.
- the method also includes applying the selected at least one structure-identification algorithm to the first pixel data set to determine a presence of the structure of interest in the tissue sample, and capturing a second pixel data set at a higher resolution.
- This computerized device and method provides automated capture of high-resolution images of structures of interest.
- the high-resolution images may be further used in an automated system to understand the human genome, to study interactions between drugs and tissue, and to treat disease, or may be used without further processing.
- Figure 1A illustrates a robotic pathology microscope having a lens focused on a tissue sample of a tissue microarray mounted on a microscope slide, according to an embodiment of the invention.
- Figure 1B illustrates an auxiliary digital image of a tissue microarray that includes an array-level digital image of each tissue sample in the tissue microarray, according to an embodiment of the invention.
- Figure 1C illustrates a digital tissue sample image of the tissue sample acquired by the robotic microscope at a first resolution, according to an embodiment of the invention.
- Figure 1D illustrates a computerized image capture system providing the digital tissue image to a computing device in a form of a first pixel data set at a first resolution, according to an embodiment of the invention.
- Figure 2 is a class diagram illustrating several object class families in an image capture application that automatically captures an image of a structure of interest in a tissue sample, according to an embodiment of the invention
- Figure 3 is a diagram illustrating a logical flow of a computerized method of automatically capturing an image of a structure of interest in a tissue sample, according to an embodiment of the invention.
- Figures 4A-G illustrate steps in detecting a structure of interest in a kidney cortex, according to an embodiment of the invention.
- the process used by histologists and pathologists includes visually examining tissue samples containing cells having a fixed relationship to each other and identifying patterns that occur within the tissue.
- Different tissue types have different structures and substructures of interest to an examiner (hereafter collectively "structures of interest”), a structure of interest typically having a distinctive pattern involving constituents within a cell (intracellular), cells of a single type, or involving constituents of multiple cells, groups of cells, and/or multiple cell types (intercellular).
- the distinctive cellular patterns are used to identify tissue types, tissue structures, tissue substructures, and cell types within a tissue. Recognition of these characteristics need not require the identification of individual nuclei, cells, or cell types within the sample, although identification can be aided by use of such methods.
- Individual cell types within a tissue sample can be identified from their relationships with each other across many cells, from their relationships with cells of other types, from the appearance of their nuclei, or other intracellular components.
- Tissues contain specific cell types that exhibit characteristic morphological features, functions, and/or arrangements with other cells by virtue of their genetic programming.
- Normal tissues contain particular cell types in particular numbers or ratios, with a predictable spatial relationship relative to one another. These features tend to be within a fairly narrow range within the same normal tissues between different individuals.
- In addition to the cell types that provide a particular organ or tissue with the ability to serve its unique functions (for example, the epithelial or parenchymal cells), normal tissues also have cells that perform functions that are common across organs, such as blood vessels that contain hematologic cells, nerves that contain neurons and Schwann cells, structural cells such as fibroblasts (stromal cells) outside the central nervous system, some inflammatory cells, and cells that provide the ability for motion or contraction of an organ (e.g., smooth muscle). These cells also form patterns that tend to be reproduced within a fairly narrow range between different individuals for a particular organ or tissue.
- Histologists and pathologists typically examine specific structures of interest within each tissue type because that structure is most likely to contain any abnormal states within a tissue sample.
- a structure of interest typically includes the cell types that provide a particular organ or tissue with its unique function.
- a structure of interest can also include portions of a tissue that are most likely to be targets for treatment of drugs, and portions that will be examined for patterns of gene expression. Different tissue types generally have different structures of interest.
- a structure of interest may be any structure or substructure of tissue that is of interest to an examiner.
- cells in a fixed relationship generally means cells that are normally in a fixed relationship in the organism, such as a tissue mass. Cells that are aggregated in response to a stimulus, such as clotted blood or smeared tissue, are not considered to be in a fixed relationship.
- FIGS. 1A-D illustrate an image capture system 20 capturing a first pixel data set at a first resolution representing an image of a tissue sample of a tissue microarray, and providing the first pixel data set to a computing device 100, according to an embodiment of the invention.
- FIG. 1A illustrates a robotic pathology microscope 21 having a lens 22 focused on a tissue sample section 26 of a tissue microarray 24 mounted on a microscope slide 28.
- the robotic microscope 21 also includes a computer (not shown) that operates the robotic microscope.
- the microscope slide 28 has a label attached to it (not shown) for identification of the slide, such as a commercially available barcode label.
- the label, which will be referred to herein as a barcode label for convenience, is used to associate a database with the tissue samples on the slide.
- Tissue samples such as the tissue sample 26, can be mounted by any method onto the microscope slide 28.
- Tissues can be fresh or immersed in fixative to preserve tissue and tissue antigens, and to avoid postmortem deterioration.
- tissues that have been fresh-frozen, or immersed in fixative and then frozen, can be sectioned on a cryostat or sliding microtome and mounted onto microscope slides.
- Tissues that have been immersed in fixative can be sectioned on a vibratome and mounted onto microscope slides.
- Tissues that have been immersed in fixative and embedded in a substance such as paraffin, plastic, epoxy resin, or celloidin can be sectioned with a microtome and mounted onto microscope slides.
- a typical microscope slide has a tissue surface area of about 1,250 mm².
- the approximate number of digital images required to cover that area, using a 20X objective, is 12,500, which would require approximately 50 gigabytes of data storage space.
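The storage figure above can be reproduced with back-of-the-envelope arithmetic. In the sketch below, the per-tile field of view and tile pixel dimensions are illustrative assumptions chosen to be consistent with the numbers in the text; they are not values stated by the patent.

```python
# Rough storage estimate for imaging a full slide at high magnification.
# Field size and tile dimensions are illustrative assumptions.
slide_area_mm2 = 1250             # tissue surface area of a typical slide
tile_area_mm2 = 0.1               # assumed field of view of a 20X objective
tiles = slide_area_mm2 / tile_area_mm2
bytes_per_tile = 1024 * 1280 * 3  # assumed 24-bit RGB tile of 1024x1280 pixels
total_gb = tiles * bytes_per_tile / 1e9

print(round(tiles))     # 12500 tiles
print(round(total_gb))  # ~49, on the order of the 50 GB cited
```

This is why the method captures high resolution only where a structure of interest was found, rather than tiling the whole slide.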
- Aspects of the invention are well suited for capturing selected images from tissue samples of multicellular structures of cells in a fixed relationship from any living source, particularly animal tissue. These tissue samples may be acquired from a surgical operation, a biopsy, or similar situations where a mass of tissue is acquired.
- aspects of the invention are also suited for capturing selected images from tissue samples of smears, cell smears, and bodily fluids.
- the robotic microscope 21 includes a high-resolution translation stage (not shown).
- Microscope slide 28 containing the tissue microarray 24 is automatically loaded onto the stage of the robotic microscope 21.
- An auxiliary imaging system in the image capture system 20 acquires a single auxiliary digital image of the full microscope slide 28, and maps the auxiliary digital image to locate the individual tissue sample specimens of the tissue microarray 24 on the microscope slide 28.
- FIG. 1B illustrates an auxiliary digital image 30 of the tissue microarray 24.
- the auxiliary digital image 30 includes an auxiliary tissue sample image 36 of the tissue sample 26 and the barcode.
- the image 30 is mapped by the robotic microscope 21 to determine the location of the tissue sections within the microscope slide 28.
- the barcode image is analyzed by commercially available barcode software, and slide identification information is decoded.
- System 20 automatically generates a sequence of stage positions that allows collection of a microscopic image of each tissue sample at a first resolution. If necessary, multiple overlapping images of a tissue sample can be collected and stitched together to form a single image covering the entire tissue sample.
- Each microscopic image of tissue sample is digitized into a first pixel data set representing an image of the tissue sample at a first resolution that can be processed in a computer system.
- the first pixel data sets for each image are then transferred to a dedicated computer system for analysis.
- system 20 will acquire an identification of the tissue type of the tissue sample. The identification may be provided by data associated with the tissue microarray 24, determined by the system 20 using a method that is beyond the scope of this discussion, or by other means.
- FIG. 1C illustrates a tissue sample image 46 of the tissue sample 26 acquired by the robotic microscope 21 at a first resolution.
- the image of the tissue sample should have sufficient magnification or resolution so that features spanning many cells as they occur in the tissue are detectable in the image.
- a typical robotic pathology microscope 21 produces color digital images at magnifications ranging from 5x to 60x. The images are captured by a digital charge-coupled device (CCD) camera and may be stored as 24-bit tagged image file format (TIFF) files. The color and brightness of each pixel may be specified by three integer values in the range of 0 to 255 (8 bits), corresponding to the intensity of the red, green, and blue channels, respectively (RGB).
- the tissue sample image 46 may be captured at any magnification and pixel density suitable for use with system 20 and algorithms selected for identifying a structure of interest in the tissue sample 26.
- Magnification and pixel density may be considered related. For example, a relatively low magnification and a relatively high-pixel density can produce a similar ability to distinguish between closely spaced objects as a relatively high magnification and a relatively low-pixel density.
- An embodiment of the invention has been tested using 5x magnification and a pixel dimension of a single image of 1024 rows by 1280 columns. This provides a useful first pixel data set at a first resolution for identifying a structure of interest without placing excessive memory and storage demands on computing devices performing structure-identification algorithms.
- the tissue sample image 46 may be acquired from the tissue sample 26 by collecting multiple overlapping images (tiles) and stitching the tiles together to form the single tissue sample image 46 for processing.
- the tissue sample image 46 may be acquired using any method or device. Any process that captures an image with sufficiently high resolution can be used, including methods that utilize frequencies of electromagnetic radiation other than visible light, or scanning techniques with a highly focused beam, such as an X-ray beam or electron microscopy.
- an image of multiple cells within a tissue sample may be captured without removing the tissue from the organism. There are microscopes that can show the cellular structure of human skin without removing the skin tissue.
- the tissue sample image 46 may be acquired using a portable digital camera to take a digital photograph of a person's skin.
- endoscopic techniques may allow endoscopic acquisition of tissue sample images showing the cellular structure of the wall of the gastrointestinal tract, lungs, blood vessels and other internal areas accessible to such endoscopes.
- invasive probes can be inserted into human tissues and used for in vivo tissue sample imaging. The same methods for image analysis can be applied to images collected using these methods.
- Other in vivo image generation methods can also be used provided they can distinguish features in a multi-cellular image or distinguish a pattern on the surface of a nucleus with adequate resolution. These include image generation methods such as CT scan, MRI, ultrasound, or PET scan.
- FIG. 1D illustrates the system 20 providing the tissue image 46 to a computing device 100 in a form of a first pixel data set at a first resolution.
- the computing device 100 receives the first pixel data set into a memory over a communications link 118.
- the system 20 may also provide an identification of the tissue type from the database associated with the tissue image 46 using the barcode label.
- An application running on the computing device 100 includes a plurality of structure-identification algorithms. At least two of the structure-identification algorithms of the plurality are responsive to different tissue types, and each structure-identification algorithm correlates at least one cellular pattern in a given tissue type with a presence of a structure of interest for the given tissue type.
- the application selects at least one structure-identification algorithm responsive to the tissue type, and applies the selected algorithm to determine a presence of a structure of interest for the tissue type.
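A minimal sketch of this selection step is a registry mapping each tissue type to its responsive filters. The filter names below follow the filter subclasses described later in the text, but the registry structure and the `select_filters` helper are hypothetical, not the patent's implementation.

```python
# Hypothetical sketch of selecting structure-identification algorithms
# by tissue type. Filter names follow the filter subclasses described
# in the text; the registry itself is an illustrative assumption.
FILTER_REGISTRY = {
    "colon": ["FilterColonZone"],
    "breast": ["FilterBreastDucts", "FilterBreastMap"],
    "kidney cortex": ["FilterGlomDetector", "FilterKidneyCortexMap"],
    "kidney medulla": ["FilterDuctDetector"],
}

def select_filters(tissue_type):
    """Return the structure-identification algorithms responsive to
    the given tissue type."""
    try:
        return FILTER_REGISTRY[tissue_type.lower()]
    except KeyError:
        raise ValueError(f"no filter registered for tissue type {tissue_type!r}")

print(select_filters("Colon"))  # ['FilterColonZone']
```

A table-driven dispatch like this also makes the "at least two algorithms responsive to different tissue types" requirement explicit: each key carries its own filter list.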
- the application running on the computing device 100 and the system 20 communicate over the communications link 118 and cooperatively adjust the robotic microscope 21 to capture a second pixel data set at a second resolution.
- the second pixel data set represents an image 50 of the structure of interest.
- the second resolution provides an increased degree to which closely spaced objects in the image can be distinguished from one another over the first resolution.
- the adjustment may include moving the high-resolution translation stage of the robotic microscope 21 into a position for image capture of the structure of interest.
- the adjustment may also include selecting a lens 22 having an appropriate magnification, selecting a CCD camera having an appropriate pixel density, or both, for acquiring the second pixel data set at the higher, second resolution.
- the application running on the computing device 100 and the system 20 cooperatively capture the second data set.
- multiple second pixel data sets may be captured from the tissue image 46.
- the second pixel data set is provided by system 20 to computing device 100 over the communications link 118.
- the second pixel data set can have a structure-identification algorithm applied to it for location of a structure of interest, or be stored in the computing device 100 along with the tissue type and any information produced by the structure-identification algorithm.
- the second pixel data set representing the structure of interest 50 may be captured on a tangible visual medium, such as photosensitive film, displayed on a computer monitor, printed from the computing device 100 on an ink printer, or provided in any other suitable manner.
- the first pixel data set may then be discarded.
- the captured image can be further used in a fully automated process of localizing gene expression within normal and diseased tissue, and identifying diseases in various stages of progression. Such further uses of the captured image are beyond the scope of this discussion.
- Capturing a high-resolution image of a structure of interest 50 (second pixel data set) and discarding the low-resolution image (first pixel data set) minimizes the amount of storage required for automated processing. Only those portions of the tissue sample 26 having a structure of interest are stored. There is no need to save the low-resolution image (first pixel data set) because the relevant structures of interest have been captured in the high-resolution image (second pixel data set).
- FIG. 2 is a class diagram illustrating several object class families 150 in an image capture application that automatically captures an image of a structure of interest in a tissue sample, according to an embodiment of the invention.
- the object class families 150 include a tissue class 160, a utility class 170, and a filter class 180.
- the filter class 180 is also referred to herein as "a plurality of structure-identification algorithms." While aspects of the application and the method of performing automatic capture of an image of a structure of interest may be discussed in object-oriented terms, the aspects may also be implemented in any manner capable of running on a computing device, such as the computing device object classes CVPObject and CLSBImage that are part of an implementation that was built and tested. Alternatively, the structure-identification algorithms may be automatically developed by a computer system using artificial intelligence methods, such as neural networks, as disclosed in U.S. application No. 10/120,206, entitled Computer Methods for Image Pattern Recognition in Organic Material, filed April 9, 2002.
- FIG. 2 illustrates an embodiment of the invention that was built and tested for the tissue types, or tissue subclasses, listed in Table 1.
- the tissue class 160 includes a plurality of tissue type subclasses, one subclass for each tissue type to be processed by the image capture application.
- a portion of the tissue type subclasses illustrated in FIG. 2 are breast 161, colon 162, heart 163, and kidney cortex 164.
- the structure of interest for each tissue type consists of at least one of the tissue constituents listed in the middle column, and may include some or all of the tissue components.
- An aspect of the invention allows a user to designate which tissue constituents constitute a structure of interest.
- the right-hand column lists one or more members (structure-identification algorithms) of the filter class 180 (the plurality of structure-identification algorithms) that are responsive to the given tissue type.
- a structure of interest for the colon 162 tissue type includes at least one of Epithelium, Muscularis Mucosa, Smooth Muscle, and Submucosa tissue constituents, and the responsive filter class is FilterColonZone.
- the application will call FilterColonZone to correlate at least one cellular pattern formed by the Epithelium, Muscularis Mucosa, Smooth Muscle, and Submucosa tissue constituents to determine a presence of a structure of interest in the colon tissue 162.
- A portion of the filter subclasses of the filter class 180 is illustrated in FIG. 2 as FilterMedian 181, FilterNuclei 182, FilterGlomDetector 183, and FilterBreastMap 184.
- Table 2 provides a more complete discussion of the filter subclasses of the filter class 180 and discusses several characteristics of each filter subclass.
- the filter class 180 includes both specific tissue-type-filters and general- purpose filters.
- the "filter intermediate mask format" column describes an intermediate mask prior to operator(s) being applied to generate a binary structure mask.
- when determining the presence of a structure of interest for the colon 162 tissue type, the application will call the responsive filter class FilterColonZone.
- An aspect of the invention is that the subfilters of the filter class 180 utilize features that are intrinsic to each tissue type and do not require the use of special stains or specific antibody markers.
- Tables 3 and 4 describe additional characteristics of the filter subclasses of the filter class 180.
- This program recognizes glandular tissue (cortex and medulla), and capsule tissue, using nuclei density as the basic criterion.
- the nuclei density is computed from a nuclei mask that has been filtered to remove artifacts, and the result is morphologically processed.
- the glandular tissue area is removed from the total tissue image, and the remaining areas of tissue are tested for the correct nuclei density, followed by morphological processing.
- the resulting map is coded with blue for glandular tissue and green for capsule tissue.
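The nuclei-density criterion used here (and by several of the filters below) can be sketched as a local-mean density map computed from a binary nuclei mask and then thresholded. The window size, threshold, and toy mask below are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Minimal sketch of a nuclei-density map: the fraction of nuclei pixels
# in a square window around each pixel, thresholded to flag dense
# (glandular-like) regions. Window size and threshold are assumptions.
def nuclei_density(mask, win=3):
    h, w = mask.shape
    pad = win // 2
    padded = np.pad(mask.astype(float), pad)
    out = np.zeros((h, w))
    for r in range(h):
        for c in range(w):
            out[r, c] = padded[r:r + win, c:c + win].mean()
    return out

mask = np.zeros((7, 7), dtype=int)
mask[2:5, 2:5] = 1            # a dense clump of "nuclei" pixels
density = nuclei_density(mask)
dense = density > 0.5         # threshold picks out the clump's core
print(dense[3, 3], dense[0, 0])  # True False
```

Morphological processing (opening/closing of the thresholded map) would then clean up the region boundaries, as the text describes.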
- FilterBladderZone The program recognizes three zones: surface epithelium, smooth muscle, and lamina propria. The algorithm first segments the input image into three classes of regions: nuclei, cytoplasm, and white space.
- the "density map" for each class is calculated and used to find the potential locations of the target zones. Zones are labeled with the following gray levels: Surface Epithelium — 50, Smooth Muscle — 100, and Lamina Propria — 150.
- FilterBreastDucts
- the input is a breast image and the output is a binary mask indicating the ducts.
- the routine finds epithelium in the breast by successively filtering the nuclei. The key observation is that epithelial nuclei are very hard to separate and rather large. Building on that observation, nuclei are first discarded by size, i.e., the smallest nuclei are eliminated. Larger nuclei are discarded if they are too elongated. Isolated nuclei are also discarded. The remaining nuclei are then joined using the center-of-mass option of FilterJoinComponents. A second pass eliminates components that are thin. The remaining components are classified as ducts.
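The successive nuclei filtering can be sketched as a pipeline of predicates applied to per-nucleus measurements. The `Nucleus` record and every threshold below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the successive nuclei-filtering idea: discard
# the smallest nuclei, then overly elongated ones, then isolated ones.
# The Nucleus record and all thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Nucleus:
    area: float        # pixel area
    elongation: float  # major/minor axis ratio
    neighbors: int     # nuclei within an assumed joining radius

def duct_candidates(nuclei, min_area=40, max_elong=3.0, min_neighbors=2):
    kept = [n for n in nuclei if n.area >= min_area]          # drop smallest
    kept = [n for n in kept if n.elongation <= max_elong]     # drop elongated
    kept = [n for n in kept if n.neighbors >= min_neighbors]  # drop isolated
    return kept

nuclei = [Nucleus(60, 1.2, 4),   # large, round, clustered: kept
          Nucleus(10, 1.1, 5),   # too small
          Nucleus(80, 5.0, 3),   # too elongated
          Nucleus(70, 1.5, 0)]   # isolated
print(len(duct_candidates(nuclei)))  # 1
```

The surviving nuclei would then be joined into components and thin components eliminated, per the text.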
- FilterBreastMap The input is a breast image and the output is a color map, with blue denoting ducts, green stroma and black adipose or lumen.
- the ducts are found using FilterBreastDucts.
- the remaining area can be stroma, lumen (white space) or adipose.
- the adipose has a distinctive 'lattice-like' structure; its complement is many small lumen areas. Hence, growing such areas will encase the adipose.
- the results of this region growing together with the white space yield the complement of the stroma and ducts. Hence the stroma is determined.
- FilterColonZone This program segments the input image into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation result, the "density map" for each class is calculated. Using the density maps produced above, find the potential locations of the "target zones”: epithelium, smooth muscle, submucosa, and muscularis mucosa. Each potential target zone is then analyzed with some tools for local statistics, and morphological operations in order to get a more precise estimation of its location and boundary. Regions are labeled with the following gray levels: Epithelium — 50, Smooth muscle — 100, Submucosa — 150, and Muscularis Mucosa — 200.
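The final labeling step, writing each detected zone into a single 8-bit image at the stated gray levels, can be sketched as follows; the toy zone masks are assumptions, and only the gray-level values come from the text.

```python
import numpy as np

# Sketch of writing the colon target zones into one 8-bit label image
# using the gray levels given in the text. The zone masks are toy inputs.
ZONE_GRAY = {"epithelium": 50, "smooth_muscle": 100,
             "submucosa": 150, "muscularis_mucosa": 200}

def label_zones(shape, zone_masks):
    label = np.zeros(shape, dtype=np.uint8)
    for name, mask in zone_masks.items():
        label[mask] = ZONE_GRAY[name]  # stamp each zone's gray level
    return label

shape = (4, 4)
masks = {"epithelium": np.zeros(shape, bool),
         "submucosa": np.zeros(shape, bool)}
masks["epithelium"][0] = True   # top row
masks["submucosa"][3] = True    # bottom row
label = label_zones(shape, masks)
print(label[0, 0], label[3, 0], label[1, 1])  # 50 150 0
```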
- FilterDuctDetector This filter is designed to detect and identify candidate collecting ducts in the kidney medulla. It is completed in three major parts: image layer segmentation, shape filters applied to measure candidate object properties, and finally an analysis test to identify the ducts.
- image layer segmentation involves white-space detection with removal of small areas and then nuclei detection.
- Distance filters are applied to compute the distance between candidate lumen and the closest surrounding nuclei.
- the final analysis identifies ducts that match specific criteria for the distance between nuclei and lumen and the nuclei-to-lumen ratio.
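The distance criterion can be sketched with a brute-force nearest-neighbor search over centroids; the coordinates and the distance bounds below are illustrative assumptions, not the patent's values.

```python
import math

# Brute-force sketch of the duct test: for each candidate lumen centroid,
# find the distance to the nearest nucleus centroid and accept ducts
# whose distance falls in an assumed range.
def nearest_nucleus_distance(lumen, nuclei):
    return min(math.dist(lumen, n) for n in nuclei)

def is_duct(lumen, nuclei, lo=2.0, hi=10.0):
    d = nearest_nucleus_distance(lumen, nuclei)
    return lo <= d <= hi

nuclei = [(0, 0), (10, 0), (0, 10)]
print(is_duct((3, 0), nuclei))    # True  (nearest nucleus at distance 3)
print(is_duct((0.5, 0), nuclei))  # False (nuclei touch the lumen)
```

A real implementation would use a distance transform over the whole image rather than centroid pairs, but the accept/reject logic is the same.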
- This filter is designed to detect and identify candidate glomeruli and their corresponding Bowman's capsules. It is completed in three major parts: image layer segmentation, shape filters applied to measure candidate object properties, and finally an analysis test to identify the glomeruli.
- the segmentation involves white-space detection with removal of small areas, nuclei and Vector red detection (if present in the image).
- Shape filters such as compactness and form factor are applied next to measure these properties for each lumen. A radial ring is positioned around each lumen, and then nuclei density and Vector red density scores are computed.
- the final analysis uses criteria for compactness, form factor, nuclei density and Vector red density to identify candidate glomeruli regions.
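The patent does not spell out its formulas for compactness and form factor, so the sketch below uses the standard definitions of these shape metrics; treating them as the patent's definitions is an assumption.

```python
import math

# Standard definitions of the shape metrics named in the text; the patent
# does not give its formulas, so these common forms are an assumption.
def form_factor(area, perimeter):
    """4*pi*A / P**2 — equals 1.0 for a perfect circle, smaller otherwise."""
    return 4 * math.pi * area / perimeter ** 2

def compactness(area, perimeter):
    """P**2 / A — minimized (at 4*pi) by a circle, larger for ragged shapes."""
    return perimeter ** 2 / area

r = 5.0
circle_ff = form_factor(math.pi * r**2, 2 * math.pi * r)
print(round(circle_ff, 6))  # 1.0

square_ff = form_factor(1.0, 4.0)  # unit square
print(round(square_ff, 3))  # 0.785
```

Glomeruli candidates would pass when these scores, together with the nuclei-density and Vector red scores, fall inside their acceptance ranges.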
- FilterKidneyCortexMap This filter is designed to map the glomeruli, distal and proximal convoluted tubules of the kidney cortex. It calls FilterTubeDetector and FilterGlomDetector, and combines the results to create one structure mapped RGB image with glomeruli in blue, Bowman's capsule as magenta, distal convoluted tubules as green and proximal convoluted tubules as red.
- This filter is designed to map the collecting ducts of the kidney medulla. It calls FilterDuctDetector to create one structure mapped RGB image with ducts and Henle lumen as green and duct lumen as red.
- FilterLiverMap
- Respiratory epithelium is detected by applying a double threshold to the nuclei density and filtering out blobs of the wrong shape. The result is coded with blue for alveoli and green for epithelium.
- FilterLymphnodeMap Maps the tissue areas corresponding to lightly stained spherical lymphoid follicles surrounded by darkly stained mantle zones.
- the mantle zone of a follicle morphologically corresponds to areas of high nuclei density. Thresholding the nuclei density map is the primary filter applied to approximate the zones.
- to improve detection of the mantle zone, the areas corresponding to low nuclei density (e.g., germinal center and surrounding cortex tissue) are suppressed in the original image.
- a second segmentation and threshold are applied to the suppressed image to produce the final zones.
- the map is coded blue for mantle zone.
- Three types of substructures in nasal mucosa tissues are of interest: respiratory epithelium, sub-mucosa glands, and inflammatory cells. The latter cannot be detected at 5x magnification.
- For respiratory epithelium, the image is first segmented into three classes of regions: nuclei, cytoplasm, and white space, and the "density map" is computed for each, followed by morphological operations. Regions are labeled with the following gray levels: Epithelium — 50; Glands — 100.
- FilterPlacenta This function maps the location of all tissue by computing the complement of the texture-based white-space mask on the Vector red-suppressed image (see FilterSuppressVR).
- the output is an 8 bpp mask image.
- FilterProstateMap The input is a prostate image and the output is a color map indicating the glands as blue, stroma as green and epithelium as red.
- the glands are bounded by epithelium.
- the epithelium is found much the same as in breast. Isolated, elongated and smaller nuclei are eliminated.
- the complement of the remaining epithelium image consists of several components. A component is deemed to be a gland if the nuclear density is sufficiently low; otherwise it is classified as a stroma/smooth muscle component and intersected with the tissue mask from FilterTissueMask to give the stroma.
- This function maps the location of all tissue by computing the complement of the texture-based white-space mask on the Vector red-suppressed image (see FilterSuppressVR) after down-sampling to an equivalent magnification of 1.25x.
- the output is an 8bpp mask image.
- This program recognizes the epidermis layer by selecting tissue regions with nuclei that have a low variance texture in order to avoid "crispy" connective tissue areas. A variance-based segmentation is followed by morphological processing. Regions with too few nuclei are then discarded, giving the epidermis. The resulting mask is written into all channels.
- This program segments the input image into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation result, the "density map" for each class is calculated. Using the density maps produced above, find the potential locations of the "target zones": epithelium, smooth muscle, submucosa, and muscularis mucosa. Each potential target zone is then analyzed with some tools for local statistics, and morphological operations in order to get a more precise estimation of its location and boundary. Regions are labeled with the following gray levels: Epithelium — 50, Smooth muscle — 100, Submucosa — 150, and Muscularis Mucosa — 200.
- the mantle zone of a splenic follicle morphologically corresponds to areas of high nuclei density common in lymphoid tissues. Thresholding the nuclei density map is the primary filter applied to approximate the zones. To improve detection of the mantle zone, the areas corresponding to low nuclei density (e.g., germinal center and surrounding red pulp and other parenchyma tissue) are suppressed in the original image. A second segmentation and threshold are applied to the suppressed image to produce the final zones. The map is coded with blue for white pulp.
- FilterStomachZone
- FilterTestisMap This filter is designed to map the interstitial region and Leydig cells of the testis.
- the initial step is to segment the image into nuclei and white-space/tissue layer images.
- the nuclei density is computed from the nuclei image and then thresholded.
- the initial interstitial region is found by taking the "exclusive OR" (or the absolute difference) of the tissue/white-space image and the nuclei density image.
- the candidate Leydig cell regions are found by taking the product of the original image and the interstitial region.
- the candidate Leydig cells are found by taking the product of the previous Leydig cell region image and the nuclei density image.
- the final cells are identified by thresholding using a size criterion.
- the resulting structure map shows the interstitial regions as blue and the Leydig cells as green.
- the cortex regions are those of high nuclei density, and can be used to find lymphocytes. Because positive identification of Hassall's corpuscles at 5x magnification is currently not possible, the program produces a map of potential corpuscles for the purpose of ROI selection. Potential corpuscles are regions of low nuclei density that are not white-space and are surrounded by medulla (a region of medium nuclei density). Size and shape filtering is done to reduce false alarms. The result is coded with blue for lymphocytes and green for Hassall's corpuscles.
- FilterThyroidMap Maps the follicles in the Thyroid by selecting nuclei structures that surround areas which are devoid of nuclei and are within the proper size and shape range. An 8bpp follicle mask is produced.
- FilterTonsilMap Maps the tissue areas corresponding to lightly stained spherical lymphoid follicles surrounded by darkly stained mantle zones.
- the mantle zone of a follicle morphologically corresponds to areas of high nuclei density common in lymphoid tissues. Thresholding the nuclei density map is the primary filter applied to approximate the zones.
- Vector red suppression is applied as a pre-processing step to improve nuclei segmentation.
- the areas corresponding to low nuclei density areas e.g., germinal center and surrounding cortex tissue
- a second segmentation and threshold are applied to the suppressed image to produce the final zone.
- the map is coded with blue for mantle zone.
- FilterTubeDetector This filter is designed to detect and identify candidate distal convoluted tubules and proximal convoluted tubules in the kidney cortex. It is completed in three major parts: image layer segmentation, shape filters applied to measure candidate object properties, and finally an analysis test to identify the ducts.
- the segmentation involves white- space detection with removal of small areas and nuclei detection.
- Distance filters are applied to compute the distance between candidate lumen and the closest surrounding nuclei.
- the final analysis identifies distal convoluted tubules that match specific criteria for the distance between nuclei and lumen and the nuclei-to-lumen ratio.
- the rejected candidate tubules are identified as Proximal Convoluted tubules.
- FilterUterusZone This program segments the input image into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation result, the "density map" for each class is calculated. Using the density maps produced above, find the potential locations of the "target zones”: stroma, glands, and muscle. Each potential target zone is then analyzed with some tools for local statistics, and morphological operations in order to get a more precise estimation of its location and boundary. Regions are labeled with the following gray levels: Stroma — 50, Glands — 100, and Muscle --- 150. End of Table 3.
- Table 4 General-Purpose Filters/Algorithms FilterDistanceMap Function to compute distance transform using morphological erosion operations. Works with 8 or 32 bpp images. In the 32 bpp case, the BLUE channel is used. The output is scaled to the [0,255] range. The true maximum distance value is saved in the CLSBImage object. FilterDownSample
- This function down-samples the source image by a constant factor by averaging over pixel blocks.
- the alignment convention is upper-left.
- a sampling factor of 1 results in no down-sampling. If the source bitmap has dimensions that are not an integer multiple of the sampling factor, the remaining columns or rows are averaged to make the last column or row in the destination bitmap.
- the source bitmap can be 8 or 32 bits per pixel. The result is placed in a new bitmap with the same pixel depth and alpha channel setting as the source bitmap.
- the sampling factor can be set in the constructor or by the function SetSampling. Default is 2.
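The block-averaging scheme described above can be sketched in Python (a minimal illustration with a hypothetical function name; the actual filter operates on 8 or 32 bpp bitmaps and preserves the alpha setting):

```python
def down_sample(img, factor=2):
    """Down-sample a 2-D grayscale image by averaging over factor x factor
    blocks; partial blocks at the right/bottom edges are averaged over the
    pixels they actually contain (upper-left alignment)."""
    h, w = len(img), len(img[0])
    nh = (h + factor - 1) // factor
    nw = (w + factor - 1) // factor
    out = [[0] * nw for _ in range(nh)]
    for by in range(nh):
        for bx in range(nw):
            ys = range(by * factor, min((by + 1) * factor, h))
            xs = range(bx * factor, min((bx + 1) * factor, w))
            total = sum(img[y][x] for y in ys for x in xs)
            out[by][bx] = total // (len(ys) * len(xs))
    return out
```

Note how a 3x3 input with factor 2 yields a 2x2 output: the trailing row and column form partial blocks, matching the edge-handling convention described above.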
- FilterDSIntensity This function computes the gray-scale image by averaging red, green and blue. The result is placed in a new bitmap with 8 bits per pixel and the same alpha setting as the source. Also provides simultaneous down-sampling by a constant factor (see FilterDownSample). The sampling factor can be set in the constructor or by the function SetSampling. Default is 1 (no down-sampling). FilterEnhance
- This function enhances the image that was magnified by FilterZoom. It uses the IPL (Intel's Image Processing Library) to smooth the edges.
- This function applies a generic algorithm to segment the epithelial regions. Various parameters are empirically determined for each tissue. Output is an 8bpp mask that marks the epithelium.
- a square structural element of given size is used to erode the nuclei mask subject to a threshold. If the number of ON pixels is less than the threshold, all the pixels inside the element are turned OFF. Otherwise they are left as they were.
- the structural element size and threshold value can be passed to the constructor or set through access functions. Works with 8 or 32 bpp bitmaps. For 32 bpp, the blue channel is used.
- a square structural element of given size is used to dilate the nuclei mask subject to a threshold. If the number of ON pixels is greater than the threshold, all the pixels inside the element are turned ON. Otherwise they are left as they were.
- the structural element size and threshold value can be passed to the constructor or set through access functions. Works with 8 or 32 bpp bitmaps. For 32 bpp, the blue channel is used.
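A minimal Python sketch of these thresholded morphological operations, under the assumption that the structural element is applied over non-overlapping tiles (the text leaves the traversal order unspecified); function names are illustrative:

```python
def threshold_erode(mask, size=3, thresh=3):
    """If a size x size tile contains fewer than `thresh` ON pixels,
    clear the whole tile; otherwise leave it untouched."""
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for by in range(0, h, size):
        for bx in range(0, w, size):
            ys = range(by, min(by + size, h))
            xs = range(bx, min(bx + size, w))
            if sum(mask[y][x] for y in ys for x in xs) < thresh:
                for y in ys:
                    for x in xs:
                        out[y][x] = 0
    return out

def threshold_dilate(mask, size=3, thresh=3):
    """If a size x size tile contains more than `thresh` ON pixels,
    set the whole tile; otherwise leave it untouched."""
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for by in range(0, h, size):
        for bx in range(0, w, size):
            ys = range(by, min(by + size, h))
            xs = range(bx, min(bx + size, w))
            if sum(mask[y][x] for y in ys for x in xs) > thresh:
                for y in ys:
                    for x in xs:
                        out[y][x] = 1
    return out
```

The asymmetry matches the text: erosion clears sparse neighborhoods (fewer ON pixels than the threshold), while dilation fills dense ones (more ON pixels than the threshold).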
- Fractal descriptors measure the complexity of self-similar structures across different scales.
- the fractal density (FD) mapping measures the local non-uniformity of nuclei distribution and is often termed the fractal dimension.
- One method for implementing the FD is the box-counting approach. We implement one variation of this approach by partitioning the image into square boxes of size LxL and counting the number N(L) of boxes containing at least a portion of the shape.
- the FD can be calculated as the absolute value of the slope of the line fitted to a log(N(L)) x log(L) plot.
- FD measurements 2 > FD > 1 typically correspond to the most fractal regions, implying more complex shape information.
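The box-counting estimate described above can be sketched as follows (illustrative Python; `box_count_fd` is a hypothetical name, and the mask is assumed square with a power-of-two side):

```python
import math

def box_count_fd(mask):
    """Estimate the fractal dimension of a binary mask by box counting:
    count the boxes N(L) containing any ON pixel for box sizes L = 1, 2,
    4, ..., then take the absolute slope of log N(L) vs log L."""
    n = len(mask)  # assume an n x n mask, n a power of two
    log_l, log_n = [], []
    box = 1
    while box < n:
        count = 0
        for by in range(0, n, box):
            for bx in range(0, n, box):
                if any(mask[y][x]
                       for y in range(by, by + box)
                       for x in range(bx, bx + box)):
                    count += 1
        log_l.append(math.log(box))
        log_n.append(math.log(count))
        box *= 2
    # least-squares slope of log N(L) against log L; FD is its magnitude
    m = len(log_l)
    sx, sy = sum(log_l), sum(log_n)
    sxx = sum(x * x for x in log_l)
    sxy = sum(x * y for x, y in zip(log_l, log_n))
    slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    return abs(slope)
```

A fully filled mask gives FD near 2 and a one-pixel-thick line gives FD near 1, consistent with the 1 < FD < 2 range cited above for fractal regions.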
- This filter contains 2 methods for joining components; typically this is used to join nuclei.
- the inputs are a binary image, a size, number of passes and an output.
- Centroid Method: A square window, W, with edge 2*S + 1 is placed at each point in the image. If the center pixel of W is equal to zero, then the center of mass of the non-zero pixels is calculated and the pixel at that center of mass is set to 1.
- FilterMask Function to extract a binary image (mask) from a bitmap by applying a threshold to a given channel of the source bitmap.
- the source bitmap can be 8 or 32 bits per pixel.
- the destination bitmap is 8 bits per pixel (a single plane). Multiple constructors exist to apply thresholds in different ways (see below).
- the destination mask can be optionally inverted.
- Kernel size is specified in the form (width, height) and can be passed to the constructor or set through an access function. Default size is 5x5. A 5x5 kernel results in a square window of size 25, and center (3,3). Works with 8 or 32 bpp bitmaps.
- the program calls FilterSegment to quantize the input image into three gray levels, and applies a color test to the lowest (darkest) level to obtain the nuclei mask.
- Output is an 8 bpp bitmap.
- the constructor takes 3 (optional) parameters that are passed to
- Size Function to change the size of a binary mask. Size can be increased or decreased arbitrarily.
- When down-sampling, the bitmap is sampled at the appropriate row and column sampling factors.
- When up-sampling, the new bitmap is created by expanding each pixel into a block of appropriate size and then applying a median filter of the same size to smooth any artifacts.
- the dimensions of the new bitmap can be provided to the constructor or set by the SetNewSize function.
- FilterROISelector Function to select regions of interest (ROIs) from a binary mask bitmap.
- An averaging filter is used to create a figure of merit (FOM) image from the binary mask.
- ROI selector divides the image into a grid such that the number of grid elements is slightly larger than the number of desired ROIs. Each grid element is then subdivided and the element with the highest average FOM is subsequently subdivided until the pixel level is reached, resulting in an ROI center. If ROI dimensions are greater than zero, the centers are then shifted so that no ROI pixels fall outside the image. The ROIs are then scored by calculating the fraction of the ROI pixels that overlap the source binary mask, and then sorted by decreasing score. If either of the ROI dimensions is zero, the FOM values are used as the score. Finally, overlapping ROIs are removed, keeping the higher-scoring ones. The ROI information is placed in the CLSBImage object's ROI list
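The scoring and overlap-removal stages of the ROI selector can be sketched as follows (a simplified Python illustration that takes candidate centers as given and omits the grid-subdivision search; names are hypothetical):

```python
def score_and_filter_rois(mask, centers, rw, rh):
    """Score candidate ROIs by the fraction of their pixels overlapping the
    ON region of a binary mask, sort by decreasing score, and greedily drop
    any ROI that overlaps a higher-scoring one."""
    h, w = len(mask), len(mask[0])
    scored = []
    for cx, cy in centers:
        # shift the ROI so none of its pixels fall outside the image
        x0 = min(max(cx - rw // 2, 0), w - rw)
        y0 = min(max(cy - rh // 2, 0), h - rh)
        on = sum(mask[y][x] for y in range(y0, y0 + rh)
                            for x in range(x0, x0 + rw))
        scored.append((on / float(rw * rh), x0, y0))
    scored.sort(reverse=True)        # highest score first
    kept = []
    for score, x0, y0 in scored:
        overlaps = any(abs(x0 - kx) < rw and abs(y0 - ky) < rh
                       for _, kx, ky in kept)
        if not overlaps:             # keep only the higher-scoring ROI
            kept.append((score, x0, y0))
    return kept
```

Two candidates that resolve to overlapping windows collapse to the single higher-scoring one, mirroring the final overlap-removal step described above.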
- FilterSegment Function to segment tissue image into three gray levels using a modified k- means clustering method. Initialization is controlled by three parameters passed to the constructor or set using access functions:
- Center Parameter to skew the location of the center mean. A value of 0.5 places it at the middle point between the dark and bright means.
- VrEst1 Gray level difference between red and blue for the Vector red test.
- VrEst2 Size of expansion structural element for initial Vector red mask.
- the program also has the option of using LOCAL statistics by dividing the image into overlapping blocks and performing the segmentation on a block-by-block basis.
- the functions Local() and Global() are used to set the behavior.
- the result is returned as a color map with the dark pixels in blue, white- space pixels in green and Vector red pixels in red.
- An optional parameter in the range [0,1] sets the resulting VR level relative to the original, with 0 corresponding to complete suppression and 1 to no suppression. Note: a value of 1 will in general not produce the original image exactly.
- the output is a new RGB image.
- FilterTissueMask Function to compute mask that marks the location where tissue is present in the image.
- a texture map at 5x magnification is used to obtain an initial mask.
- the intensity image is then thresholded, producing a second mask.
- the final tissue mask is obtained by combining the re-sampled texture mask (to the original magnification) and the intensity mask so that a pixel is marked as "tissue” if both the texture is high and the intensity is low. Otherwise it is white-space.
- the gain for the intensity threshold can be set in the constructor (default is 2.0) or through an access function.
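The combination rule, tissue only where texture is high and intensity is low, might look like this (a minimal Python sketch; the texture and intensity maps are assumed precomputed, the names are illustrative, and the gain is applied to the intensity threshold as described above):

```python
def tissue_mask(texture, intensity, tex_thresh, int_thresh, gain=2.0):
    """Mark a pixel as tissue only where texture is high AND intensity is
    low; everything else is treated as white-space."""
    h, w = len(texture), len(texture[0])
    return [[1 if texture[y][x] > tex_thresh
                  and intensity[y][x] < gain * int_thresh else 0
             for x in range(w)]
            for y in range(h)]
```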
- Texture Function to mask white-space in a tissue image.
- 3Color Two methods are provided: Texture and 3Color.
- the Texture method calls FilterTissueMask and inverts the result to provide a white-space mask.
- the 3Color method calls FilterSegment and extracts the image plane associated with white-space. In both cases the output is an 8 bit per pixel bitmap.
- the method is selected by calling the member functions SetMethodTexture() or
- This function zooms the image with cubic interpolation (default). It uses the IPL (Intel's Image Processing Library).
- FilterColonZone will segment the input image into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation results, the "density map" for each class is calculated. Using the density maps produced above, find the potential locations of the "target zones": epithelium, smooth muscle, submucosa, and muscularis mucosa. Each potential target zone is then analyzed with some tools for local statistics, and morphological operations in order to get a more precise estimation of its location and boundary. Regions are labeled with the following gray levels: Epithelium — 50, Smooth muscle — 100, Submucosa — 150, and Muscularis Mucosa — 200.
- Table 5 discusses yet further characteristics of structure-identification algorithms responsive to each tissue type of the tissue class 160:
- epithelial cells which can be recognized by the spatial arrangement of the nuclei.
- Epithelial cells are often located close to each other. Together with their associated cytoplasm, the epithelial nuclei form rather "large” regions.
- the generic epithelium algorithm first segments the image to obtain a nuclei map and a cytoplasm map.
- Each "nuclei+cytoplasm” region is assigned a distinct label using a connected component labeling algorithm. Based on such labeling, it is possible to remove “nuclei+cytoplasm” regions that are too small, with the "potential epithelial” regions thus outlined.
- the region size threshold is empirically determined for different tissues. After obtaining the "potential epithelial” regions, an object shape operator is applied to remove those regions that have very “spiked” boundaries.
- the Bladder mapping algorithm recognizes three zones: surface epithelium, smooth muscle and lamina propria. The algorithm first segments the input image into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation result, the "density map" for each class is calculated.
- the nuclei density map is thresholded using the Otsu method.
- the areas where nuclei density exceeds the threshold value are labeled as potential epithelium regions.
- the background (non-tissue) regions are also labeled by thresholding the nuclei density map and retaining the areas where the nuclei density is under the threshold.
- a size-based filter is applied to clean up the spurious background areas.
- a morphological dilate operation is applied to the background blobs to "fatten" them so that they overlap with the potential epithelium.
- the potential epithelium that intersect, or are otherwise connected with, the background can now be labeled as surface epithelium.
- a size-based filter is applied to clean up the surface epithelium areas.
- the steps are: 1. Label all the tissue areas that are not epithelial regions as the potential muscle regions.
- the lamina propria regions are always between the surface epithelium and smooth muscle. They are located by labeling all tissue areas that are neither epithelium nor muscle as the potential lamina propria regions, and applying size-based filtering to remove the spurious lamina propria. What remains is the estimate for the lamina propria.
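Several of the zone algorithms threshold a density map with the Otsu method; a minimal Python implementation over a histogram (hypothetical function name) illustrates the idea:

```python
def otsu_threshold(histogram):
    """Return the bin t that maximises between-class variance when the
    histogram is split into the classes <= t and > t (Otsu's method)."""
    total = sum(histogram)
    total_sum = sum(i * h for i, h in enumerate(histogram))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(len(histogram) - 1):
        w0 += histogram[t]              # weight of the low class
        sum0 += t * histogram[t]
        w1 = total - w0                 # weight of the high class
        if w0 == 0 or w1 == 0:
            continue
        m0 = sum0 / w0                  # class means
        m1 = (total_sum - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Applied to a nuclei-density histogram, pixels whose density exceeds the returned threshold become the potential epithelium (or stroma) regions described above.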
- Ducts and Lobules are small structures that consist of a white-space region surrounded by a ring of epithelial cells. All epithelium in Breast surrounds ducts or lobules, and so they can be found by simply locating the epithelial cells. The overall strategy is to compute the nuclei mask and then separate the epithelial from the non-epithelial nuclei.
- nuclei are discarded first by size, i.e., the smallest nuclei are eliminated. Larger nuclei are discarded if they are too elongated. Isolated nuclei are also discarded. The remaining nuclei are then joined using the center of mass method. A second pass eliminates components that are thin. The remaining components are classified as ducts.
- Epithelial and non-epithelial nuclei: For tissues such as breast or prostate, the principal difference between epithelial and non-epithelial nuclei is that the latter are isolated, so that the boundary of a "typical" neighborhood (window) around a non-epithelial nucleus will not meet any other nuclei. Such a condition translates directly into an algorithm. Given a binary nuclei mask, a window is placed about each nucleus. The values on the boundary of the window are then summed. If that sum is 0 (no pixels are turned "on"), the nucleus is classified as non-epithelial; if the sum is not zero, it is classified as epithelial.
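The window-boundary test translates directly into code; a minimal Python sketch (illustrative names; odd-sized window of half-width `half`):

```python
def classify_nucleus(mask, cy, cx, half=3):
    """Classify the nucleus at (cy, cx) by summing the boundary of a
    (2*half+1)-square window around it: a zero sum (no other nuclei on
    the window's edge) means the nucleus is isolated, i.e. non-epithelial."""
    h, w = len(mask), len(mask[0])
    total = 0
    for y in range(cy - half, cy + half + 1):
        for x in range(cx - half, cx + half + 1):
            on_boundary = (y in (cy - half, cy + half)
                           or x in (cx - half, cx + half))
            if on_boundary and 0 <= y < h and 0 <= x < w:
                total += mask[y][x]
    return "epithelial" if total > 0 else "non-epithelial"
```

The window size is a tissue-dependent tuning parameter: it must be large enough that a neighboring epithelial nucleus reaches the boundary, but small enough that distant, unrelated nuclei do not.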
- the remaining area can be stroma, lumen (white space) or adipose.
- the adipose has a "lattice-like" structure; its complement is many small white- space areas. Hence, growing such areas will encase the adipose.
- the results of this region growing together with the white space yield the complement of the stroma and ducts. Hence the stroma is determined.
- target zones epithelium, smooth muscle, submucosa, and muscularis mucosa.
- the Otsu threshold technique is applied to the nuclei density map.
- the regions where the nuclei density exceeds the Otsu threshold value are classified as potential epithelium.
- To the potential epithelium regions we apply an isolated blob removal process, which removes isolated blobs within a given range of sizes and within certain ranges of "empty" neighborhood.
- the next step is to invoke a shape filter that removes the blobs that are too “elongated” based on the eigen-axes of their shapes.
- a morphological dilation then smoothes the edges of the remaining blobs. The result of this sequence of operations is the epithelium regions.
- a variance map of the gray-scale copy of the original image is first produced.
- the Otsu threshold is then applied to the variance map. It segments out the potential submucosa and epithelium regions by only retaining the portion of the variance map where the variance exceeds the Otsu threshold values. Since the submucosa regions are disjoint with epithelium, the latter can be removed, and a potential submucosa map is thus produced.
- a size-based filter is then applied to remove blobs that fall below or exceed certain size ranges. The final submucosa regions are thus obtained.
- the Otsu threshold is applied to the cytoplasm density map.
- the regions of the map where the density values exceed the threshold value are labeled as the initial estimate for potential muscle regions.
- an isolated blob remover is used to filter out the blobs that are too large or too small and with sufficiently
- the muscularis mucosa regions are always adjacent to the epithelium and submucosa regions.
- the first step is to find the boundaries of the epithelium and perform a region growing operation from these epithelium boundaries.
- the intersections between the regions grown from epithelium and the submucosa are labeled as the muscularis mucosa.
- the image is down-sampled to an equivalent magnification of 1.25x for both speed and to capture the large-scale texture information.
- the Vector red signature is suppressed to avoid false alarms in cases of nonspecific binding to the glass.
- the white-space mask is computed using the Texture-Brightness method and the mask is inverted (positive becomes negative and negative becomes positive).
- a median filter is applied to "smooth out" noisy areas such as tiny white-space holes in tissue and small specks of material in the white-space. This has the effect of improving the quality of the ROI selection results.
- the resulting mask is re-sampled to match the size of the original image.
- FIGS. 4A-G illustrate aspects of FilterKidneyCortexMap discussed below.
- Glomeruli appear in the Kidney Cortex as rounded structures surrounded by narrow Bowman's spaces. Recognition of a glomerulus requires location of the lumen that makes up the Bowman's space along with recognition of specific characteristics of the size and shape of the entire structure. In addition the quantification of CD31 (Vector red) staining information can be used because glomeruli contain capillaries and endothelial cells that typically stain VR-positive.
- CD31 Vector red
- the bulk of the parenchyma tissue between each glomerulus consists of tubules that differ from one another in diameter, size, shape and staining intensity.
- the tubules mainly consist of proximal convoluted tubules with a smaller number of distal convoluted tubules and collecting ducts.
- a DCT may be differentiated from PCT by a larger more clearly defined lumen, more nuclei per cross-section, and smaller sections of length across the parenchyma tissue. The nuclei of the DCT lie close to the lumen and tend to bulge into the lumen.
- the Kidney Cortex processing consists of the following steps: 1. Segmentation of the white-space, nuclei, and Vector red regions. Each mask image is preprocessed by applying shape descriptors to eliminate regions that meet certain criteria.
- the white-space consists of lumen located around the glomeruli (Bowman's capsule), lumen located within tubular structures such as DCT and PCT, and areas within vessels and capillaries. The size, perimeter, distance to neighborhood objects, and density per neighborhood are used to select candidate lumen objects.
- Nuclei density is used to further refine the list of candidate structures after lumen detection is complete. It is measured inside the Bowman's capsule and outside the perimeter of the tubular structures.
- VR density, if CD31 staining is applied, is measured within the Bowman's capsule and is used as the final discriminating factor for determining the existence of a glomerulus.
- Glomeruli recognition requires four individual measurements on each candidate glomerulus.
- the lumen masks obtained in the segmentation process are preprocessed by eliminating small regions that are typically associated with blood vessels, small portions within tubular structures, and within glomeruli.
- a compactness measurement is performed on each lumen object by measuring the ratio of the size of the lumen to the perimeter.
- a Bowman's ring form factor measurement is obtained by measuring the ratio of the size of a circular ring placed around the lumen to the number of lumen pixels that intersect the Bowman's ring.
- the size and diameter of the ring are based on computing bounding box measurements for the candidate lumen (e.g., width, height, and center coordinates).
- the ring is then rotated around the lumen and a form factor measurement is computed for each location. The location with the highest form factor measurement is kept.
- the nuclei density of the ring is calculated as the ratio of nuclei pixels that intersect the form factor ring to the size of the ring.
- Vector red density is calculated as the ratio of the Vector red pixels that intersect the form factor ring to the size of the ring.
- a threshold is applied to each the compactness, form factor, nuclei density, and VR density measurements to determine if the candidate lumen is categorized as a glomerulus.
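The four-measurement gate can be sketched as a simple predicate (illustrative Python; the measurement inputs and threshold values are assumed computed elsewhere, and the names and sample values are hypothetical):

```python
def is_glomerulus(lumen_area, lumen_perimeter, ring_area,
                  ring_lumen_px, ring_nuclei_px, ring_vr_px, th):
    """Gate a candidate lumen on the four measurements described above.
    `th` maps each measurement name to its minimum acceptable value;
    in practice these are empirically tuned per tissue."""
    compactness = lumen_area / float(lumen_perimeter)
    form_factor = (ring_area / float(ring_lumen_px)
                   if ring_lumen_px else 0.0)
    nuclei_density = ring_nuclei_px / float(ring_area)
    vr_density = ring_vr_px / float(ring_area)
    return (compactness >= th["compactness"]
            and form_factor >= th["form_factor"]
            and nuclei_density >= th["nuclei_density"]
            and vr_density >= th["vr_density"])
```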
- DCT recognition is accomplished by a method similar to epithelium detection.
- Each distal convoluted tubule (DCT), collecting duct (CD), and proximal convoluted tubule (PCT) can be modeled as a white space blob surrounded by a number of nuclei.
- the expected size of such a region to be related to a DCT is estimated empirically. Areas too large or too small are typically discarded during the glomeruli recognition process.
- Two methods can be used to complete the DCT recognition process. Lumen area to nuclei area ratio: To decide if a lumen region is associated with a DCT, the nuclear content of an annular region about its boundary is examined.
- the nuclear content (e.g., the ratio of nuclear area to total area) is compared against a decision threshold, which is again determined empirically.
- Lumen to nuclei distance criterion: In this method, we compute the distance matrix between each white-space object and its neighboring nuclei (within a certain radius). If the ratio between the areas of a white-space object and its neighboring nuclei is above a threshold and the total area of the nuclei is above a minimum requirement, they are classified as a DCT. This method can be used to identify PCTs by repeating the procedure with different parameters on the remaining white-space objects.
- This algorithm is designed to detect and identify candidate collecting ducts in the kidney medulla. It is completed in three major parts: image layer segmentation, shape filters applied to measure candidate object properties, and finally an analysis test to identify the ducts.
- image layer segmentation involves white-space detection with removal of small areas and then nuclei detection.
- Distance filters are applied to compute the distance between candidate lumen and the closest surrounding nuclei.
- the final analysis identifies ducts that match specific criteria for distance between nuclei and lumen and nuclei to lumen ratio.
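The distance computations used here, and FilterDistanceMap in Table 4, can be approximated by iterated binary erosion, counting how many erosion passes each ON pixel survives (a minimal Python sketch with a 4-neighbour element; names are illustrative and the [0,255] scaling step is omitted):

```python
def erode(mask):
    """One 4-neighbour binary erosion step (borders treated as background)."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (mask[y][x]
                    and y > 0 and mask[y - 1][x]
                    and y < h - 1 and mask[y + 1][x]
                    and x > 0 and mask[y][x - 1]
                    and x < w - 1 and mask[y][x + 1]):
                out[y][x] = 1
    return out

def distance_map(mask):
    """Distance transform via morphological erosion: each ON pixel's value
    is the number of erosion passes needed to remove it."""
    h, w = len(mask), len(mask[0])
    dist = [[0] * w for _ in range(h)]
    current = [row[:] for row in mask]
    while any(any(row) for row in current):
        for y in range(h):
            for x in range(w):
                dist[y][x] += current[y][x]
        current = erode(current)
    return dist
```

For a 3x3 block of ON pixels, the ring pixels get distance 1 and the center gets 2, i.e. distance increases toward the interior of each object.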
- FilterLiverMap The goal of the liver algorithm is to delineate those areas that correspond to ducts and secondly those areas that comprise a portal triad. Ducts can be determined from a "good" nuclei image. The (boundary of) ducts correspond to the large components in the nuclei image. Throwing away very elongated components filters the set of large components. The remaining components are deemed to be ducts.
- a portal triad consists of a vein, an artery and a duct, though often the artery is not clear. Since one does not generally expect to find nuclei in either the vein or artery, the algorithm finds areas of the appropriate size without nuclei that are near ducts found previously. These nuclear free areas are estimated in two ways.
- the brightness segmentation algorithm produces a white space image.
- the image is filtered for areas of the appropriate size and shape to be arteries or veins.
- Nuclei-free areas are also estimated in a manner analogous to that discussed for finding glands in the prostate (see Prostate - Glands).
- Each of the areas (the duct-area and the nuclei-free area), which are disjoint, are then expanded. The intersection of the expanded regions is taken as the center of a ROI for a portal triad.
- Alveoli detection is done by morphological filtering of the tissue mask (see FilterTissueMask). The goal of the algorithm is to filter the tissue mask so that only tissue with a web-like shape remains after processing.
- the steps are as follows: 1. The image is initially down-sampled to an effective magnification of 2.5x in order to maximize execution speed.
- the tissue mask is calculated using the Texture-Brightness method.
- a median filter is applied to suppress the noise.
- the image is inverted and a morphological close operation is performed using a disk structural element. This removes the alveolar tissue from the image.
- a guard band is placed around remaining tissue areas by dilation and a second size filter is applied.
- the resulting mask is combined with the initial tissue mask, producing the alveoli tissue.
- the image is re-sampled to its original size.
- Respiratory epithelium is detected by applying a "double threshold" to the nuclei density and filtering out areas of the wrong shape.
- the steps are as follows: 1.
- the nuclei mask is computed (see FilterNuclei) and intersected with the complement of the alveoli mask. This reduces the search for epithelium to non-alveolar tissue.
- the nuclei density map is computed using an averaging filter, and a threshold is applied to segment the areas with higher density. Because the nuclei segmentation can occasionally misestimate the nuclei, the threshold is determined relative to the trimmed mean of the density. The procedure is then repeated on the resulting mask (using a fixed threshold) to find areas that have high concentration of high nuclei density areas. These are the potential epithelial areas.
- a morphological close operation is applied to join potential epithelial areas.
- a shape filter is applied to remove areas that are outside of the desired size range, or that are too rounded.
- a more stringent shape criterion is used for areas that are closer to the top of the size range.
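The density thresholding relative to a trimmed mean (step 2 above) can be sketched as follows (illustrative Python; the gain and trim fraction are hypothetical defaults, tuned per tissue in practice):

```python
def trimmed_mean(values, trim=0.1):
    """Mean after discarding the lowest and highest `trim` fraction,
    so occasional nuclei mis-segmentation cannot skew the statistic."""
    v = sorted(values)
    k = int(len(v) * trim)
    if k:
        v = v[k:len(v) - k]
    return sum(v) / float(len(v))

def high_density_mask(density, rel_gain=1.5, trim=0.1):
    """Threshold a density map relative to its trimmed mean, segmenting
    the areas of higher nuclei density."""
    flat = [d for row in density for d in row]
    cutoff = rel_gain * trimmed_mean(flat, trim)
    return [[1 if d > cutoff else 0 for d in row] for row in density]
```

Repeating the same operation on the resulting mask (with a fixed threshold) yields the concentration-of-high-density areas described in step 2.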
- the Vector red signature is suppressed to avoid false alarms in case of non-specific antibody binding to the glass.
- the white-space mask is computed using the Texture-Brightness method and the mask is inverted (positive becomes negative and negative becomes positive).
- a median filter is applied to "smooth out" noisy areas such as tiny white-space holes in tissue and small specks of material in the white-space. This has the effect of improving the quality of the ROI selection results.
- the size of the median filter window is proportional to the image magnification, where the proportionality constant can be adjusted per tissue.
- Glands are recognized by the epithelium ring that surrounds them.
- the procedure for gland detection involves a two-step process where a sequence of morphological operations is followed by a sequence of tests.
- For the first step we start with the nuclei mask and obtain candidate gland regions by the following sequence of algorithmic operations: 1.
- the nuclei are expanded using the procedure discussed above. This has the effect of connecting nuclei that are close together.
- a clean-up algorithm is run to remove expanded nuclei that are not epithelial.
- the second step consists of labeling the morphological components and performing two tests. For each labeled component, we compute:
- Stroma detection is performed by expanding the detected glands to account for the epithelial cells and inverting the image. In principle this produces a stroma mask.
- If the above algorithm misses a gland, the area will incorrectly be labeled as stroma, which can cause the ROI selector to locate a stroma ROI in a gland.
- the initial stroma mask is intersected with the complement of the white-space (the tissue mask). This removes the white-space and the interior of any missed glands.
- This algorithm is designed to map the interstitial region and Leydig cells of the testis.
- the initial step is to segment the image into nuclei and white-space/tissue layer images.
- nuclei density is computed from the nuclei image and then thresholded.
- the initial interstitial region is found by taking the "exclusive OR" (or the absolute difference) of the tissue/white-space image and the nuclei density image.
- the candidate Leydig cell regions are found by taking the product of the original image and the interstitial region.
- the candidate Leydig cells are found by taking the product of the previous Leydig cell region image and the nuclei density image.
- the final cells are identified by thresholding using a size criterion.
- FilterThymusMap The relevant features to be recognized in the Thymus are lymphocytes and Hassall's corpuscles. Direct recognition of these features at low magnification is not feasible.
- Lymphocytes are found at high concentration in the cortex region of the thymus.
- an algorithm that identifies the cortex will also mark the lymphocytes with high probability.
- the cortex is recognized by its high nuclei density.
- a threshold is applied to the density map to obtain the high-density areas, followed by median filtering to remove noise and a morphological dilation step to improve coverage and join regions that are close together. Although this method does not ensure 100% coverage, it consistently marks sufficient cortex area to locate lymphocytes.
- Hassall's Corpuscles A map of potential corpuscles (for the purpose of ROI selection) can be obtained by finding "gaps" in the thymus medulla, followed by the application of tests to eliminate objects that are not likely to be corpuscles.
- Potential corpuscles are regions of low nuclei density that are not white-space and are surrounded by medulla (a region of medium nuclei density).
- Size and shape filtering is done to reduce false alarms.
- the algorithm steps are: 1. Find the areas of low nuclei density by thresholding the nuclei density map and apply a median filter to reduce noise.
- the single structure of interest in the Thyroid is the follicle.
- This algorithm maps the follicular cells in the Thyroid by selecting nuclei structures that surround areas which are devoid of nuclei and are within the proper size and shape range. This is accomplished in the following steps:
- the nuclei mask is obtained (see FilterNuclei).
- the nuclei are joined with an algorithm that connects components which are close together using the "line" method (see FilterJoinComponents). The result is to isolate areas that are either white-space or the interior of follicles.
- the image is inverted and morphologically opened with a large structural element to separate the individual objects. Resulting objects are either follicles or white-space.
- a shape filter is applied in order to remove objects that are not sufficiently rounded to be "normal" looking follicles.
- a guard band is created around the remaining objects using a morphological dilation operation, and the resulting image is combined with the previous image by an exclusive OR operation (XOR). This results in rings that mark the location of the follicular cells.
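The follicle steps (invert, open, shape-filter, dilate, XOR) can be sketched as below. The roundness measure (blob area divided by bounding-box area), the structuring-element size, and the guard-band width are assumed stand-ins, since the patent does not specify them here:

```python
import numpy as np
from scipy.ndimage import binary_opening, binary_dilation, label

def follicle_rings(joined_nuclei_mask, min_round=0.5, open_size=7, guard=3):
    # Invert so that white-space and follicle interiors become foreground.
    inverted = ~joined_nuclei_mask
    # Opening with a large structural element separates the individual objects.
    opened = binary_opening(inverted, structure=np.ones((open_size, open_size), bool))
    labels, n = label(opened)
    kept = np.zeros_like(opened)
    for i in range(1, n + 1):
        blob = labels == i
        ys, xs = np.nonzero(blob)
        box = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
        # Shape filter: drop objects that are not sufficiently "rounded".
        if blob.sum() / box >= min_round:
            kept |= blob
    # Guard band: dilate and XOR, leaving rings over the follicular cells.
    return np.logical_xor(binary_dilation(kept, iterations=guard), kept)
```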
- the algorithm first segments the input image into nuclei, cytoplasm, and white space. Based on the segmentation result, the "density map" for each class is calculated.
- Stroma: To determine the potential stroma regions, apply the Otsu threshold to segment the nuclei density map. Regions where nuclei density exceeds the Otsu threshold value are labeled as potential stroma regions. A size-based filter is then used to clean up the spurious stroma regions. To fill the holes within each blob in the potential stroma map, morphological closing and flood-fill operations are applied.
- That threshold value is empirically determined. This produces a map in which only the pixels on the epithelium next to glands are "on". The blobs in the map are dilated so that they will partially overlap with the potential glands estimated previously. 5. For each potential gland, calculate the perimeter (p) and the fraction r of p that overlaps with the map from the previous step. Apply a threshold and retain the glands whose r-value exceeds it. The consequence of this sequence of operations is to remove the potential glands that do not have a sufficient amount of epithelium surrounding them. This removes the vessels that could be mistaken as glands. End of Table 5.
- Table 5 provides additional discussion of how the structure-identification algorithm FilterColonZone correlates cellular patterns of colon tissue to determine the presence of one or more of epithelium, smooth muscle, submucosa, and muscularis mucosa tissue constituents.
- Table 6 provides yet further discussion of several filters of the filter class 180 for the extraction or segmentation of basic tissue constituent features.
- Let C1 be the average of the darkest D% of the pixels (where D is typically 20), C2 the average of the "middle" 45%-55%, and C3 the average of the top T% of the sorted values (where T is typically 10).
- a pixel is then placed in one of 3 groups depending on which Ci is nearest its gray value.
- the regions 0-20%, 45-55% and 90-100% above work well for many tissues, and can be adaptively (empirically) chosen on a per-tissue basis. Due to illumination variation, especially "barrel distortion", the procedure above sometimes produces better results if done locally. To do so, a small window is chosen so that the illumination within it is uniform and this window is moved across the entire image. Two variations on the algorithm have been implemented to accommodate image variability. In the first, C1 is taken to be the average of the bottom 2%, C3 the average of the top 2%, and C2 the sum:
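The main (global) variant of the three-class brightness segmentation can be sketched as follows; the class indices (0 = dark/nuclei, 1 = middle/tissue, 2 = bright/white-space) are an assumed labeling convention:

```python
import numpy as np

def brightness_segment(gray, dark_pct=20, top_pct=10):
    # Sort all pixel values and compute the three class centers:
    # mean of the darkest D%, the middle 45%-55%, and the brightest T%.
    v = np.sort(gray.ravel())
    n = v.size
    c1 = v[: max(1, int(n * dark_pct / 100))].mean()
    c2 = v[int(n * 0.45): max(int(n * 0.45) + 1, int(n * 0.55))].mean()
    c3 = v[n - max(1, int(n * top_pct / 100)):].mean()
    centers = np.array([c1, c2, c3])
    # Each pixel joins the group whose center is nearest its gray value.
    return np.argmin(np.abs(gray[..., None] - centers), axis=-1)
```

For the local variant described above, the same function would simply be applied within a sliding window small enough for the illumination to be uniform.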
- Nuclei Segmentation: Two approaches to nuclei segmentation have been developed.
- nuclei are segmented (their corresponding pixels are labeled) by a combination of brightness and color information.
- Two binary masks are computed and combined using a logical AND Boolean operation to produce the best possible nuclei mask.
- the first mask is obtained from the dark (lowest brightness level) pixels produced by the brightness segmentation discussed above.
- the second mask is obtained by performing the following test on each pixel in the image:
- B, G and R are the brightness levels for blue, green and red respectively in the
- the first is based on the brightness image segmentation.
- the second uses a texture map in combination with brightness.
- FilterWhiteSpace: This method simply extracts the top brightness level from the result of the brightness segmentation algorithm discussed above. This approach works well in tissue types that tend to have small amounts of white-space uniformly distributed through the image, such as Heart and Skeletal Muscle. Texture-Brightness Method
- tissue and non-tissue areas can be separated by a combination of their texture and brightness.
- the white-space is obtained by the combination of two binary mask images.
- the first mask results from the application of a texture threshold.
- a brightness threshold is computed and applied to obtain a second mask that is then combined with the first.
- a texture map is computed from the brightness image using a variance-based measure (see FilterTextureMap).
- a threshold, typically in the range of 0.5 to 1.0 but varying depending on pre-processing steps, is applied to the texture map, resulting in a binary image where the high-texture regions are positive and the low-texture regions negative.
- the brightness threshold is a simple statistic: the mean brightness minus twice the standard deviation.
- Application of this threshold produces a second binary image where the brighter regions are positive and the darker ones negative.
- the white space mask is obtained by creating a third binary mask where a pixel is positive (white-space) if the corresponding pixel from the texture-based mask is negative or if the corresponding pixel from the brightness-based mask is positive.
- tissue texture is distinct from that of the white-space.
- tissue types include Colon, Spleen and Lung.
- Although Vector red cannot be assumed to be consistently present in most tissues, there are cases where the antibody that is tagged with VR will always bind to certain kinds of cells, such as vessel endothelium and glomerulus cells, and so can be used to improve the results from structure recognition by other means.
- VR segmentation can be achieved by comparing the brightness of the red channel to that of the blue channel. If the red channel value for a given pixel is greater than the corresponding blue channel value by a set margin, then the pixel is marked as VR.
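The stated red-versus-blue test is direct to implement; the margin value below is an assumption for 8-bit channels:

```python
import numpy as np

def vector_red_mask(rgb, margin=30):
    # Mark a pixel as VR when its red channel exceeds its blue channel
    # by the set margin. Cast to int to avoid uint8 wrap-around.
    r = rgb[..., 0].astype(int)
    b = rgb[..., 2].astype(int)
    return (r - b) > margin
```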
- Vector Red Suppression Implementation: FilterSuppressVR High levels of Vector red (VR) in a sample can significantly affect the performance of feature extraction algorithms.
- the VR signature in the image can be digitally suppressed to any extent desired.
- the value of 0.1 is used to avoid computing the logarithm of zero while introducing negligible distortion.
- the components used in the mixture model are VR, hematoxylin, and white-space.
- the latter is represented by a "gray" signature where all the colors have equal weight.
- To suppress the VR signature, we estimate the abundance vector a using the pseudo-inverse method and multiply the element of a corresponding to VR by a factor in the range [0,1], where a value of 0 corresponds to complete VR suppression and a value of 1 corresponds to no suppression.
- a new optical depth vector is obtained by application of the first equation to the new abundance vector, and the VR-suppressed RGB image is re-formed.
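The suppression loop can be sketched as below. The signature matrix (columns for the VR, hematoxylin, and white-space optical-depth signatures) is an assumed input, since the patent's numeric signatures are not reproduced here; the offset 0.1 is the stated guard against computing the logarithm of zero:

```python
import numpy as np

def suppress_vr(rgb, signatures, vr_index=0, factor=0.0, eps=0.1):
    # Optical depth per channel: -log(value + eps) avoids log(0)
    # while introducing negligible distortion.
    od = -np.log(rgb.astype(float) + eps)
    flat = od.reshape(-1, 3).T                   # 3 x N optical-depth vectors
    # Abundance estimates via the pseudo-inverse of the signature matrix.
    a = np.linalg.pinv(signatures) @ flat
    # Scale the VR abundance: 0 = complete suppression, 1 = none.
    a[vr_index] *= factor
    # Re-form the optical depth and then the RGB image.
    od_new = (signatures @ a).T.reshape(rgb.shape)
    return np.exp(-od_new) - eps
```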
- Table 7 provides additional discussion of alternative embodiments of the tissue mapping tools of the filter class 180 that generate maps of statistics measured from segmented images.
- nuclei density is the density of nuclei at a particular point in the image.
- Two types of nuclei density measures have been developed: a simple linear density and a fractal density.
- the linear density is computed by convolving the binary nuclei mask with an averaging window of a given size. This provides a measure at each pixel of the average fraction of image area that is designated as nuclei. Such information is useful in mapping zones within tissues such as Thymus. Fractal Density
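The linear density described above is a single convolution; the window size is an assumed value:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def linear_nuclei_density(nuclei_mask, window=31):
    # Convolve the binary nuclei mask with an averaging window, giving at
    # each pixel the average fraction of image area designated as nuclei.
    return uniform_filter(nuclei_mask.astype(float), size=window)
```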
- Fractal descriptors measure the complexity of self-similar structures across different scales.
- the fractal density (FD) mapping measures the local non-uniformity of nuclei distribution and is often termed the fractal dimension.
- One method for implementing the FD is the box-counting approach. We implement one variation of this approach by partitioning the image into square boxes of size LxL and counting the number N(L) of boxes containing at least one pixel that is labeled as nuclear.
- the FD can be calculated as the absolute value of the slope of the line interpolated to a log(N(L)) versus log(L) plot.
- the sequence of box sizes, starting from a given size L over a given pattern in the image, is usually reduced by one-half from one level to the next.
- FD measurements 2 > FD > 1 typically correspond to the most fractal regions, implying more complex shape information.
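A box-counting sketch of the FD, halving the box size at each level and fitting the log-log slope; it assumes a non-empty mask so that every count is positive:

```python
import numpy as np

def fractal_dimension(mask, min_box=2):
    # Partition the image into LxL boxes, count boxes N(L) containing at
    # least one nuclear pixel, and halve L from one level to the next.
    sizes, counts = [], []
    L = min(mask.shape)
    while L >= min_box:
        n = 0
        for i in range(0, mask.shape[0], L):
            for j in range(0, mask.shape[1], L):
                if mask[i:i + L, j:j + L].any():
                    n += 1
        sizes.append(L)
        counts.append(n)
        L //= 2
    # FD is |slope| of the line fitted to log N(L) versus log L.
    slope = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    return abs(slope)
```

A completely filled mask gives FD near 2, while a one-pixel-wide line gives FD near 1, matching the stated 2 > FD > 1 range for intermediate (more fractal) patterns.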
- the contraction operation follows the same basic procedure as the expansion, but the neighborhood pixels are turned “off” if the number of non-zero pixels is less than the given threshold.
- a step is the selection of Regions of Interest (ROIs) from low-magnification images in order to provide the system with the locations within the tissue where a high-magnification image should be collected for further analysis.
- Interesting regions in tissues are those associated with structures of interest.
- the ROI selection process begins with a binary mask that has been computed to mark the locations of a particular structure type, such as glands, ducts, etc., in the tissue section.
- the algorithms used to create such masks are discussed elsewhere in this document.
- the mask image is divided into a number of approximately equal-size sections greater than the desired number of regions of interest. For each section, an optimal location is selected for the center of a candidate ROI.
- Each candidate ROI is then "scored" by computing the fraction of pixels within the ROI where the mask has a positive value, indicating to what extent the desired structure is present.
- the ROIs are then sorted by score with an overlap constraint, and the top-scoring ROIs are selected.
- a multi-resolution method is used where the image section is further sub-divided in successive steps. At each step a "best" subsection is selected and the process is repeated until the subsections are pixel-sized. This method does not ensure a globally optimum location will be selected each time, but does consistently produce good results. Selection of a "best" subsection at each step requires that a Figure of Merit (FOM) be computed for each subsection at each step.
- a FOM is a value that indicates the "goodness" of something, with a higher number always being better than a lower number. For tissue ROI selection, a reasonable FOM is obtained by filtering the binary mask with an averaging window of size matching the ROI.
- the resulting FOM image is not binary, but rather has values that range from 0 to 1, depending on the proportion of positive mask pixels within the averaging window.
- the FOM image is simply averaged over all the pixels in the subsection. Although seemingly redundant, this procedure ensures that ROI selections will be centered in areas with the broadest possible mask coverage. End of Table 7.
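The multi-resolution FOM search above can be sketched as follows; the two-way split per axis is an assumed subdivision scheme:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def best_roi_center(structure_mask, roi_size, splits=2):
    # FOM image: average the binary mask over an ROI-sized window.
    fom = uniform_filter(structure_mask.astype(float), size=roi_size)
    r0, r1, c0, c1 = 0, fom.shape[0], 0, fom.shape[1]
    # Repeatedly keep the subsection with the highest mean FOM until
    # the current subsection is pixel-sized.
    while (r1 - r0) > 1 or (c1 - c0) > 1:
        best, best_score = None, -1.0
        rs = np.linspace(r0, r1, splits + 1).astype(int)
        cs = np.linspace(c0, c1, splits + 1).astype(int)
        for i in range(splits):
            for j in range(splits):
                sub = fom[rs[i]:rs[i + 1], cs[j]:cs[j + 1]]
                if sub.size and sub.mean() > best_score:
                    best = (rs[i], rs[i + 1], cs[j], cs[j + 1])
                    best_score = sub.mean()
        r0, r1, c0, c1 = best
    return r0, c0
```

As the text notes, this greedy search is not guaranteed to find the global optimum, but averaging the FOM over each subsection biases the selection toward areas with broad mask coverage.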
- the utility class 170 of FIG. 2 includes general tools. A portion of the utility subclasses of the utility class 170 is illustrated in FIG. 2 as CBlob 171, CLogical 172, and CMorph 173. Table 8 discusses several utility subclasses of the utility class 170.
- FIG. 3 is a diagram illustrating a logical flow 200 of a computerized method of automatically capturing an image of a structure of interest in a tissue sample, according to an embodiment of the invention.
- the tissue samples typically have been stained before starting the logical flow 200.
- the tissue samples are stained with a nuclear contrast stain for visualizing cell nuclei, such as Hematoxylin, a purple-blue basic dye with a strong affinity for DNA/RNA-containing structures.
- the tissue samples may have also been stained with a red alkaline phosphatase substrate, commonly known as "fast red" stain, such as Vector® Red (VR) from Vector Laboratories. Fast red stains precipitate near known antibodies to visualize where the protein of interest is expressed.
- Such areas in the tissue are sometimes called “Vector red positive” or “fast red positive” areas.
- the fast red signal intensity at a location is indicative of the amount of probe binding at that location.
- the tissue samples often have been stained with fast red for uses of the tissue sample other than determining a presence of a structure of interest, and the fast red signature is usually suppressed by structure-identification algorithms of the invention.
- the logical flow moves to block 205, where a microscopic image of the tissue sample 26 at a first resolution is captured. Also at block 205, a first pixel data set representing the captured color image of the tissue sample at the first resolution is generated. Further, the block 205 may include adjusting an image-capture device to capture the first pixel data set at the first resolution.
- the logic flow moves to block 210, where the first pixel data set and an identification of a tissue type of the tissue sample are received into a memory of a computing device, such as the memory 104 of the computing device 100.
- the logical flow then moves to block 215 where a user designation of a structure of interest is received. For example, a user may be interested in epithelium tissue constituents of colon tissue.
- the logic flow would receive the user's designation that epithelium is the structure of interest.
- the logic flow moves to block 220, where at least one structure-identification algorithm responsive to the tissue type is selected from a plurality of stored structure-identification algorithms in the computing device. At least two of the structure-identification algorithms of the plurality are responsive to different tissue types, and each structure-identification algorithm correlates at least one cellular pattern in a given tissue type with a presence of a structure of interest for the given tissue type.
- the structure-identification algorithms may be any type of algorithm that can be run on a computer system for filtering data, such as the filter class 180 of FIG. 2.
- the logical flow moves next to block 225, where the selected at least one structure-identification algorithm is applied to the first pixel data set representing the image.
- the applied structure-identification algorithm is FilterColonZone. Tables 3 and 5 describe aspects of this filter as segmenting the first pixel data set into three classes of regions: nuclei, cytoplasm, and white space. Based on the segmentation result, a density map for each class is calculated. Using the density maps, the algorithm finds the potential locations of the "target zones" or cellular constituents of interest: epithelium, smooth muscle, submucosa, and muscularis mucosa. Each potential target zone is then analyzed with tools for local statistics, and morphological operations are performed in order to get a more precise estimate of its location and boundary. Regions in an intermediate mask are labeled with the following gray levels for the four cellular constituents: epithelium, 50; smooth muscle, 100; submucosa, 150; and muscularis mucosa, 200.
- the Otsu threshold technique is applied to the nuclei density map.
- the regions where the nuclei density exceeds the Otsu threshold value are classified as potential epithelium.
- an "isolated blob removal” process is applied, which removes the isolated blobs for a given range of sizes, and within certain ranges of "empty" neighborhood.
- the next step is to invoke a shape filter that removes the blobs that are too “elongated” based on the eigen-axes of their shapes.
- a morphological dilation then smoothes the edges of the remaining blobs.
- the result of this sequence of operations is a set of pixels that correlates closely with the epithelium regions.
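The elongation test at the shape-filter step can be sketched using the eigenvalues of each blob's pixel-coordinate covariance, which give the eigen-axes of its shape; the axis-ratio limit is an assumed value:

```python
import numpy as np
from scipy.ndimage import label

def remove_elongated_blobs(mask, max_ratio=4.0):
    labels, n = label(mask)
    kept = np.zeros_like(mask)
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size < 3:
            continue  # too few pixels for a meaningful covariance
        # Eigenvalues of the coordinate covariance define the eigen-axes.
        cov = np.cov(np.vstack([ys, xs]))
        evals = np.sort(np.linalg.eigvalsh(cov))
        # Keep blobs whose major/minor axis ratio is within the limit.
        if evals[0] > 0 and np.sqrt(evals[1] / evals[0]) <= max_ratio:
            kept |= labels == i
    return kept
```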
- a variance map of the gray-scale copy of the original image is first produced.
- the Otsu threshold is then applied to the variance map. It segments out the potential submucosa and epithelium regions by retaining the portion of the variance map where the variance exceeds the Otsu threshold value. Since the submucosa regions are disjoint from the epithelium, the latter can be removed, and a potential submucosa map is thus produced.
- a size-based filter is then applied to remove blobs under or exceeding certain ranges. A set of pixels that correlates closely with the sub-mucosa regions is thus obtained.
- the Otsu threshold is applied to the cytoplasm density map.
- the regions of the map where the density values exceed the threshold value are labeled as the initial estimate for potential muscle regions.
- an isolated blob remover is used to filter out the blobs that are too large or too small and with sufficiently "empty" neighbor regions. This sequence of operations results in a set of pixels that correlates closely with the final muscle map.
- a binary structure mask is computed from the filter intermediate mask generated by the structure-identification algorithm(s) applied to the first pixel data set.
- the binary structure mask is a binary image where a pixel value is greater than zero if a pixel lies within the structure of interest, and zero otherwise.
- the binary structure mask may be directly generated from the filter intermediate mask. If the filter intermediate mask includes cellular components that must be correlated to determine the presence of the structure of interest, a co-location operator is applied to the intermediate mask to determine whether there is a coincidence, an intersection, a proximity, or the like, between the cellular components of the intermediate mask.
- the binary structure mask will describe and determine a presence of a structure of interest by the intersection or coincidence of the locations of the cellular patterns of at least one of the four constituents constituting the structure of interest.
- the binary structure mask typically will contain a "1" for those pixels in the first data sets where the cellular patterns coincide or intersect and a "0" for the other pixels.
- if a minimum number of pixels in the binary structure mask contain a "1", a structure of interest is determined to exist. If there are no areas of intersection or coincidence, no structure of interest is present and the logical flow moves to an end block E. Otherwise, the logical flow moves to block 230 where at least one region of interest (ROI) having a structure of interest is selected for capture of the second resolution image.
- a filter, such as the FilterROISelector discussed in Tables 2, 4, and 7, uses the binary structure mask generated at block 225, which marks locations of the cellular constituents comprising the structure of interest, to determine a region of interest.
- a region of interest is a location in the tissue sample for capturing a second resolution image of the structure of interest.
- a method of generating a region of interest mask includes dividing the binary structure mask image into a number of approximately equal size sections greater in number than a predetermined number of regions of interest to define candidate regions of interest.
- each candidate region of interest is scored by computing the fraction of pixels within the region of interest where the mask has a positive value, indicating to what extent the desired structure is present.
- the candidate regions of interest are sorted by the score with an overlap constraint. Then, the top- scoring candidate regions of interest are selected as the regions of interest.
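The candidate scoring, sorting, and overlap-constrained selection can be sketched as follows. The grid of candidate centers and the non-overlap test (center separation of at least one ROI width per axis) are assumed simplifications:

```python
import numpy as np

def select_rois(structure_mask, roi_size, n_rois=3, grid=6):
    h, w = structure_mask.shape
    half = roi_size // 2
    candidates = []
    # Score each candidate ROI by the fraction of positive mask pixels
    # within it, indicating to what extent the structure is present.
    for r in np.linspace(half, h - half - 1, grid).astype(int):
        for c in np.linspace(half, w - half - 1, grid).astype(int):
            window = structure_mask[r - half:r + half + 1, c - half:c + half + 1]
            candidates.append((window.mean(), r, c))
    # Sort by score and greedily keep top scorers subject to the
    # overlap constraint against already-selected ROIs.
    candidates.sort(reverse=True)
    selected = []
    for score, r, c in candidates:
        if all(abs(r - rs) >= roi_size or abs(c - cs) >= roi_size
               for _, rs, cs in selected):
            selected.append((score, r, c))
        if len(selected) == n_rois:
            break
    return selected
```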
- Selecting the region of interest at block 230 may also include selecting optimal locations within each region of interest for capture of the second pixel data set in response to the figure-of-merit process discussed in Tables 3 and/or 7 above.
- a method of selecting optimal locations in response to a figure-of-merit includes dividing each region of interest into a plurality of subsections. Next, a "best" subsection is selected by computing a figure of merit for each subsection.
- the figure of merit is computed by filtering the binary structure mask with an averaging window whose size matches the region of interest; the resulting figure-of-merit image has values ranging from 0 to 1, depending on the proportion of positive mask pixels within the averaging window. The figure of merit for a given subsection is then obtained by averaging the figure-of-merit image over all the pixels in the subsection, with a higher number being better than a lower number. The dividing and selecting steps are repeated until the subsections are pixel-sized. The logic flow then moves to block 235, where the image-capture device is adjusted to capture a second pixel data set at a second resolution.
- the image-capture device may be the robotic microscope 21 of FIG. 1.
- the adjusting step may include moving the tissue sample relative to the image-capture device and into an alignment for capturing the second pixel data set.
- the adjusting step may include changing a lens magnification of the image-capture device to provide the second resolution.
- the adjusting step may further include changing a pixel density of the image-capture device to provide the second resolution.
- the logic flow moves to block 240, where the image-capture device captures the second pixel data set in color at the second resolution. If a plurality of regions of interest are selected, the logic flow repeats blocks 235 and 240 to adjust the image-capture device and capture a second pixel data set for each region of interest.
- the logic flow moves to block 245 where the second pixel data set may be saved in a storage device, such as in a computer memory or hard drive. Alternatively, the second pixel data set may be saved on a tangible visual medium, such as by printing on paper or exposure to photographic film.
- the logic flow 200 may be repeated until a second pixel data set is captured for each tissue sample on a microscope slide. After capture of the second pixel data set, the logic flow moves to the end block E.
- the logic flow 200 includes an iterative process to capture the second pixel data set for situations where a structure-identification algorithm responsive to the tissue type cannot determine the presence of a structure of interest at the first resolution, but can determine a presence of regions in which the structure of interest might be located.
- a selected algorithm is applied to the first pixel data set and a region of interest is selected in which the structure of interest might be located.
- the image-capture device is adjusted at block 235 to capture an intermediate pixel data set at a resolution higher than the first resolution.
- the process returns to block 210 where the intermediate pixel data set is received into memory, and a selected algorithm is applied to the intermediate pixel data set to determine the presence of the structure of interest at block 225.
- This iterative process may be repeated as necessary to capture the second resolution image of a structure of interest.
- the iterative process of this alternative embodiment may be used in detecting Leydig cells or Hassall's corpuscles, which are often not discernable at the 5X magnification typically used for capture of the first resolution image.
- the intermediate pixel data set may be captured at 20X magnification, and a further pixel data set may be captured at 40X magnification for determination whether a structure of interest is present.
- an existing tissue image database may require winnowing for structures of interest, and possible discard of all or portions of images that do not include the structures of interest.
- An embodiment of the invention similar to the logic flow 200 provides a computerized method of automatically winnowing a pixel data set representing an image of a tissue sample having a structure of interest.
- the logical flow for winnowing a pixel data set includes receiving into a computer memory a pixel data set and an identification of a tissue type of the tissue sample, similar to block 205. The logical flow would then move to blocks 220 and 225 to determine a presence of the structure of interest in the tissue sample.
- the tissue image may be saved in block 245 in its entirety, or a location of the structure of interest within the tissue sample may be saved.
- the location may be saved as a sub-set of the pixel data set representing the image that includes the structure of interest.
- the logic flow may include block 230 for selecting a region of interest, and a sub-set of the pixel data set may be saved by saving a region-of-interest pixel data sub-set.
- An embodiment of the invention was built to validate the method and apparatus of the invention for automatically determining a presence of cellular patterns, or substructures, that make up the structure of interest in a tissue sample for various tissue types.
- An application was written incorporating the embodiment of the invention discussed in conjunction with the above figures, and including the structure-identification algorithms of the filter class 180 of FIG. 2 as additionally discussed in Tables 2-7. The application was run on a computing device, and the validation testing results are contained in Table 8 as follows:
- the testing validated the structure-identification algorithms for the cellular components.
- the various embodiments of the invention may be implemented as a sequence of computer-implemented steps or program modules running on a computing system and/or as interconnected-machine logic circuits or circuit modules within the computing system.
- the implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention.
- the functions and operation of the various embodiments disclosed may be implemented in software, in firmware, in special purpose digital logic, or any combination thereof without deviating from the spirit or scope of the present invention.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004512591A JP2005530138A (en) | 2002-06-18 | 2003-06-17 | Computer-aided image capture of important structures in tissue specimens |
CA002492071A CA2492071A1 (en) | 2002-06-18 | 2003-06-17 | Computerized image capture of structures of interest within a tissue sample |
AU2003245561A AU2003245561A1 (en) | 2002-06-18 | 2003-06-17 | Computerized image capture of structures of interest within a tissue sample |
EP03739187A EP1534114A2 (en) | 2002-06-18 | 2003-06-17 | Computerized image capture of structures of interest within a tissue sample |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US38985902P | 2002-06-18 | 2002-06-18 | |
US60/389,859 | 2002-06-18 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2003105675A2 true WO2003105675A2 (en) | 2003-12-24 |
WO2003105675A3 WO2003105675A3 (en) | 2004-03-11 |
Family
ID=29736681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2003/019206 WO2003105675A2 (en) | 2002-06-18 | 2003-06-17 | Computerized image capture of structures of interest within a tissue sample |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1534114A2 (en) |
JP (1) | JP2005530138A (en) |
AU (1) | AU2003245561A1 (en) |
CA (1) | CA2492071A1 (en) |
WO (1) | WO2003105675A2 (en) |
US11406455B2 (en) | 2018-04-25 | 2022-08-09 | Carl Zeiss Meditec Ag | Microscopy system and method for operating the microscopy system |
US11469075B2 (en) | 2019-03-14 | 2022-10-11 | Applied Materials, Inc. | Identifying fiducial markers in microscope images |
US11624708B2 (en) | 2019-12-17 | 2023-04-11 | Applied Materials, Inc. | Image processing techniques in multiplexed fluorescence in-situ hybridization |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5321145B2 (en) | 2009-03-04 | 2013-10-23 | NEC Corporation | Image diagnosis support apparatus, image diagnosis support method, image diagnosis support program, and storage medium thereof |
JP5547597B2 (en) * | 2010-09-29 | 2014-07-16 | Dainippon Screen Mfg Co Ltd | Pathological diagnosis support device, pathological diagnosis support method, control program for pathological diagnosis support, and recording medium recording the control program |
US9390313B2 (en) | 2012-04-23 | 2016-07-12 | NEC Corporation | Image measurement apparatus and image measurement method measuring the cell nuclei count |
WO2016001223A1 (en) * | 2014-06-30 | 2016-01-07 | Ventana Medical Systems, Inc. | Detecting edges of a nucleus using image analysis |
CN107209937B (en) * | 2015-01-31 | 2021-10-15 | Ventana Medical Systems, Inc. | System and method for region of interest detection using slide thumbnail images |
JP7137935B2 (en) * | 2018-02-27 | 2022-09-15 | Sysmex Corporation | Image analysis method, image analysis device, program, method for manufacturing trained deep learning algorithm, and trained deep learning algorithm |
KR102111533B1 (en) * | 2018-07-11 | 2020-06-08 | Korea Institute of Industrial Technology | Apparatus and method for measuring density of metal product using optical microscope image |
WO2021014557A1 (en) | 2019-07-23 | 2021-01-28 | Nippon Telegraph and Telephone Corporation | Mesh structure facility detection device, mesh structure facility detection method, and program |
CN111458835A (en) * | 2020-04-16 | 2020-07-28 | Southeast University | Multi-view automatic focusing system of microscope and using method thereof |
WO2023195405A1 (en) * | 2022-04-04 | 2023-10-12 | Kyocera Communication Systems Co., Ltd. | Cell detection device, cell diagnosis support device, cell detection method, and cell detection program |
- 2003
- 2003-06-17 CA CA002492071A patent/CA2492071A1/en not_active Abandoned
- 2003-06-17 JP JP2004512591A patent/JP2005530138A/en not_active Withdrawn
- 2003-06-17 WO PCT/US2003/019206 patent/WO2003105675A2/en not_active Application Discontinuation
- 2003-06-17 AU AU2003245561A patent/AU2003245561A1/en not_active Abandoned
- 2003-06-17 EP EP03739187A patent/EP1534114A2/en not_active Withdrawn
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6205348B1 (en) * | 1993-11-29 | 2001-03-20 | Arch Development Corporation | Method and system for the computerized radiographic analysis of bone |
US5898797A (en) * | 1994-04-15 | 1999-04-27 | Base Ten Systems, Inc. | Image processing system and method |
Cited By (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8428887B2 (en) | 2003-09-08 | 2013-04-23 | Ventana Medical Systems, Inc. | Method for automated processing of digital images of tissue micro-arrays (TMA) |
US8515683B2 (en) | 2003-09-10 | 2013-08-20 | Ventana Medical Systems, Inc. | Method and system for automated detection of immunohistochemical (IHC) patterns |
US7979212B2 (en) | 2003-09-10 | 2011-07-12 | Ventana Medical Systems, Inc. | Method and system for morphology based mitosis identification and classification of digital images |
US7941275B2 (en) | 2003-09-10 | 2011-05-10 | Ventana Medical Systems, Inc. | Method and system for automated detection of immunohistochemical (IHC) patterns |
WO2005096225A1 (en) * | 2004-03-27 | 2005-10-13 | Bioimagene, Inc. | Method and system for automated detection of immunohistochemical (ihc) patterns |
US7576912B2 (en) | 2004-05-07 | 2009-08-18 | P.A.L.M. Microlaser Technologies Gmbh | Microscope table and insert |
US7848552B2 (en) | 2004-05-11 | 2010-12-07 | P.A.L.M. Microlaser Technologies Gmbh | Method for processing a material by means of laser irradiation and control system |
DE102004023262B8 (en) * | 2004-05-11 | 2013-01-17 | Carl Zeiss Microimaging Gmbh | Method for processing a material by means of laser irradiation and control system |
DE102004023262B4 (en) * | 2004-05-11 | 2012-08-09 | Carl Zeiss Microimaging Gmbh | Method for processing a material by means of laser irradiation and control system |
JP2006064662A (en) * | 2004-08-30 | 2006-03-09 | Anritsu Sanki System Co Ltd | Foreign matter detection method, foreign matter detection program, and foreign matter detector |
WO2006122251A3 (en) * | 2005-05-10 | 2007-05-31 | Bioimagene Inc | Method and system for automated digital image analysis of prostate neoplasms using morphologic patterns |
WO2006122251A2 (en) * | 2005-05-10 | 2006-11-16 | Bioimagene, Inc. | Method and system for automated digital image analysis of prostate neoplasms using morphologic patterns |
EP2002395A2 (en) * | 2006-03-28 | 2008-12-17 | Koninklijke Philips Electronics N.V. | Identification and visualization of regions of interest in medical imaging |
EP2143043A1 (en) * | 2007-05-07 | 2010-01-13 | Amersham Biosciences Corp. | System and method for the automated analysis of cellular assays and tissues |
EP2143043A4 (en) * | 2007-05-07 | 2011-01-12 | Ge Healthcare Bio Sciences | System and method for the automated analysis of cellular assays and tissues |
EP2130087B1 (en) * | 2007-07-19 | 2014-10-08 | Carl Zeiss Microscopy GmbH | Method and device for microscopically examining a sample, computer program, and computer program product |
WO2009010115A1 (en) | 2007-07-19 | 2009-01-22 | Carl Zeiss Imaging Solutions Gmbh | Method and device for microscopically examining a sample, computer program, and computer program product |
DE102007033793A1 (en) * | 2007-07-19 | 2009-01-22 | Carl Zeiss Imaging Solutions Gmbh | Method and apparatus for microscopically examining a sample, computer program and computer program product |
US9098736B2 (en) | 2007-09-21 | 2015-08-04 | Leica Biosystems Imaging, Inc. | Image quality for diagnostic resolution digital slide images |
US9710694B2 (en) | 2007-09-21 | 2017-07-18 | Leica Biosystems Imaging, Inc. | Image quality for diagnostic resolution digital slide images |
EP2407781A1 (en) * | 2010-07-15 | 2012-01-18 | Olympus Corporation | Cell observation apparatus and observation method |
US8957957B2 (en) | 2010-07-15 | 2015-02-17 | Olympus Corporation | Cell observation apparatus and observation method |
EP2581878A3 (en) * | 2011-10-11 | 2017-12-13 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and apparatus for quantification of damage to a skin tissue section |
DE102011084286A1 (en) * | 2011-10-11 | 2013-04-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and device for quantifying damage to a skin tissue section |
US9804144B2 (en) | 2012-10-09 | 2017-10-31 | Leica Microsystems Cms Gmbh | Method for defining a laser microdissection region, and associated laser microdissection system |
DE102012218382B4 (en) * | 2012-10-09 | 2015-04-23 | Leica Microsystems Cms Gmbh | Method for determining a laser microdissection range and associated laser microdissection system |
DE102012218382A1 (en) * | 2012-10-09 | 2014-04-10 | Leica Microsystems Cms Gmbh | Method for determining a laser microdissection range and associated laser microdissection system |
US9842391B2 (en) | 2013-05-14 | 2017-12-12 | Pathxl Limited | Method and apparatus for processing an image of a tissue sample |
GB2514197A (en) * | 2013-05-14 | 2014-11-19 | Pathxl Ltd | Method and apparatus |
JP2018507405A (en) * | 2015-02-06 | 2018-03-15 | Life Technologies Corporation | Method and system for biological instrument calibration |
US9799113B2 (en) | 2015-05-21 | 2017-10-24 | Invicro Llc | Multi-spectral three dimensional imaging system and method |
WO2017051190A1 (en) * | 2015-09-23 | 2017-03-30 | Pathxl Limited | Method and apparatus for tissue recognition |
CN108352062A (en) * | 2015-09-23 | 2018-07-31 | Koninklijke Philips N.V. | Method and apparatus for tissue identification |
US10565706B2 (en) | 2015-09-23 | 2020-02-18 | Koninklijke Philips N.V. | Method and apparatus for tissue recognition |
US10671832B2 (en) | 2015-09-23 | 2020-06-02 | Koninklijke Philips N.V. | Method and apparatus for tissue recognition |
US11068695B2 (en) | 2016-03-11 | 2021-07-20 | Nikon Corporation | Image processing device, observation device, and program |
EP3428262A4 (en) * | 2016-03-11 | 2019-11-06 | Nikon Corporation | Image processing device, observation device, and program |
US11227403B2 (en) | 2017-04-24 | 2022-01-18 | The Trustees Of Princeton University | Anisotropic twicing for single particle reconstruction using autocorrelation analysis |
US11557034B2 (en) | 2017-06-13 | 2023-01-17 | The Trustees Of Princeton University | Fully automatic, template-free particle picking for electron microscopy |
WO2019190576A3 (en) * | 2017-06-13 | 2019-12-19 | The Trustees Of Princeton University | Fully automatic, template-free particle picking for electron microscopy |
CN110060309A (en) * | 2017-12-06 | 2019-07-26 | Cognex Corporation | Local tone mapping for symbol reading |
CN110060309B (en) * | 2017-12-06 | 2023-07-11 | Cognex Corporation | Local tone mapping for symbol reading |
US11406455B2 (en) | 2018-04-25 | 2022-08-09 | Carl Zeiss Meditec Ag | Microscopy system and method for operating the microscopy system |
US11806092B2 (en) | 2018-04-25 | 2023-11-07 | Carl Zeiss Meditec Ag | Microscopy system and method for operating the microscopy system |
WO2020154203A1 (en) * | 2019-01-22 | 2020-07-30 | Applied Materials, Inc. | Capture and storage of magnified images |
CN113439227A (en) * | 2019-01-22 | 2021-09-24 | Applied Materials, Inc. | Capturing and storing magnified images |
US11232561B2 (en) | 2019-01-22 | 2022-01-25 | Applied Materials, Inc. | Capture and storage of magnified images |
US11694331B2 (en) | 2019-01-22 | 2023-07-04 | Applied Materials, Inc. | Capture and storage of magnified images |
US11255785B2 (en) | 2019-03-14 | 2022-02-22 | Applied Materials, Inc. | Identifying fiducial markers in fluorescence microscope images |
US11469075B2 (en) | 2019-03-14 | 2022-10-11 | Applied Materials, Inc. | Identifying fiducial markers in microscope images |
CN110210308B (en) * | 2019-04-30 | 2023-05-02 | Nanfang Hospital of Southern Medical University | Biological tissue image identification method and device |
CN110210308A (en) * | 2019-04-30 | 2019-09-06 | Nanfang Hospital of Southern Medical University | Biological tissue image identification method and device |
US11663722B2 (en) | 2019-09-24 | 2023-05-30 | Applied Materials, Inc. | Interactive training of a machine learning model for tissue segmentation |
US11321839B2 (en) | 2019-09-24 | 2022-05-03 | Applied Materials, Inc. | Interactive training of a machine learning model for tissue segmentation |
CN111008461A (en) * | 2019-11-20 | 2020-04-14 | China Institute for Radiation Protection | Human body digital model design method, system and model for radiation protection |
CN111008461B (en) * | 2019-11-20 | 2023-11-14 | China Institute for Radiation Protection | Human body digital model design method, system and model for radiation protection |
US11624708B2 (en) | 2019-12-17 | 2023-04-11 | Applied Materials, Inc. | Image processing techniques in multiplexed fluorescence in-situ hybridization |
US11630067B2 (en) | 2019-12-17 | 2023-04-18 | Applied Materials, Inc. | System for acquisition and processing of multiplexed fluorescence in-situ hybridization images |
US11783916B2 (en) | 2019-12-17 | 2023-10-10 | Applied Materials, Inc. | System and method for acquisition and processing of multiplexed fluorescence in-situ hybridization images |
CN111583235A (en) * | 2020-05-09 | 2020-08-25 | Central South University | Branch point identification vertex extraction method and system for detecting cellular regularity |
CN111583235B (en) * | 2020-05-09 | 2023-04-18 | Central South University | Branch point identification vertex extraction method and system for detecting cellular regularity |
WO2022034536A1 (en) * | 2020-08-12 | 2022-02-17 | Ods Medical Inc. | Automated raman signal collection device |
CN112102311B (en) * | 2020-09-27 | 2023-07-18 | Ping An Technology (Shenzhen) Co., Ltd. | Thyroid nodule image processing method and device and computer equipment |
CN112102311A (en) * | 2020-09-27 | 2020-12-18 | Ping An Technology (Shenzhen) Co., Ltd. | Thyroid nodule image processing method and device and computer equipment |
CN113975660A (en) * | 2021-10-29 | 2022-01-28 | Hefei Institutes of Physical Science, Chinese Academy of Sciences | Method and equipment for monitoring displacement of tumor target area |
CN113975660B (en) * | 2021-10-29 | 2024-04-30 | Hefei Institutes of Physical Science, Chinese Academy of Sciences | Tumor target area displacement monitoring method and equipment |
CN114812450A (en) * | 2022-04-25 | 2022-07-29 | Shandong Luqiao Group Co., Ltd. | Machine vision-based asphalt pavement construction uniformity detection and evaluation method |
CN114812450B (en) * | 2022-04-25 | 2023-10-20 | Shandong Luqiao Group Co., Ltd. | Asphalt pavement construction uniformity detection and evaluation method based on machine vision |
Also Published As
Publication number | Publication date |
---|---|
EP1534114A2 (en) | 2005-06-01 |
WO2003105675A3 (en) | 2004-03-11 |
AU2003245561A8 (en) | 2003-12-31 |
AU2003245561A1 (en) | 2003-12-31 |
CA2492071A1 (en) | 2003-12-24 |
JP2005530138A (en) | 2005-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060127880A1 (en) | Computerized image capture of structures of interest within a tissue sample | |
WO2003105675A2 (en) | Computerized image capture of structures of interest within a tissue sample | |
CN111448569B (en) | Method for storing and retrieving digital pathology analysis results | |
KR20220015368A (en) | Computer-aided review of tumor and postoperative tumor margin assessment within histological images | |
Diamond et al. | The use of morphological characteristics and texture analysis in the identification of tissue composition in prostatic neoplasia | |
US20050123181A1 (en) | Automated microscope slide tissue sample mapping and image acquisition | |
Hamilton et al. | Automated location of dysplastic fields in colorectal histology using image texture analysis | |
CN114972548A (en) | Image processing system and method for displaying multiple images of a biological specimen | |
US20090245612A1 (en) | Automated image analysis | |
US20110286654A1 (en) | Segmentation of Biological Image Data | |
EP3149700A1 (en) | An image processing method and system for analyzing a multi-channel image obtained from a biological tissue sample being stained by multiple stains | |
WO2004044845A2 (en) | Image analysis | |
KR102140385B1 (en) | Cell-zone labeling apparatus and cell-zone detecting system including the same apparatus | |
US11959848B2 (en) | Method of storing and retrieving digital pathology analysis results | |
US11615532B2 (en) | Quantitation of signal in stain aggregates | |
US20230251199A1 (en) | Identifying auto-fluorescent artifacts in a multiplexed immunofluorescent image | |
Oikawa et al. | Pathological diagnosis of gastric cancers with a novel computerized analysis system | |
Vidal et al. | A fully automated approach to prostate biopsy segmentation based on level-set and mean filtering | |
Fuchs et al. | Weakly supervised cell nuclei detection and segmentation on tissue microarrays of renal clear cell carcinoma | |
Sreelekshmi et al. | SwinCNN: An Integrated Swin Transformer and CNN for Improved Breast Cancer Grade Classification |
Pławiak-Mowna et al. | On effectiveness of human cell nuclei detection depending on digital image color representation | |
Han et al. | Multi-resolution tile-based follicle detection using color and textural information of follicular lymphoma IHC slides | |
Elmoataz et al. | Automated segmentation of cytological and histological images for the nuclear quantification: an adaptive approach based on mathematical morphology | |
Chaudry | Improving cancer subtype diagnosis and grading using clinical decision support system based on computer-aided tissue image analysis. | |
Neuman et al. | E-mail: urszula.neuman@gmail.com |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004512591 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 2492071 Country of ref document: CA |
WWE | Wipo information: entry into national phase |
Ref document number: 2003739187 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 2003739187 Country of ref document: EP |
WWW | Wipo information: withdrawn in national office |
Ref document number: 2003739187 Country of ref document: EP |