WO2006087526A1 - Apparatus and method for processing of specimen images for use in computer analysis thereof - Google Patents
- Publication number
- WO2006087526A1 (PCT/GB2006/000498)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image mask
- biological specimen
- image
- areas
- layers
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/155—Segmentation; Edge detection involving morphological operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- the present invention relates to image processing, and more particularly to the processing of digital images of a microscope specimen to identify parts of the image of the specimen for computer analysis.
- the computer analysis of biological specimens performed on digital images of a specimen acquired from scanning the specimen on a microscope slide has advanced the accuracy of diagnostic and prognostic techniques, and is increasingly used for research purposes to identify the effect of drugs on biological tissue.
- One example of such computer image analysis techniques is field fraction measurements, which are widely used to determine a quantity of stain taken up by a stained biological specimen, the quantity of stain being associated with an attribute or characteristic of the specimen.
- Another example involves counting of the number of features (e.g. cell nuclei) within a specimen.
- Such computer measurements are vastly superior to the subjective and potentially less accurate corresponding manual analysis previously performed by laboratory technicians or pathologists in analysing a specimen directly using a microscope.
- the computer analysis of a biological specimen image concerns only the analysis of a very specific type of tissue, which corresponds to only part of the specimen.
- the analysis may only be concerned with a certain region of stratified epithelial tissue, such as the basal layer of the epidermis of a skin specimen. If other regions of tissue, or other types of tissue, are present within the biological specimen image analysed, the computer analysis might produce misleading results.
- a laboratory technician will review the digital image of a specimen on a computer monitor prior to computer analysis thereof, and identify areas of the image showing the tissue region of interest for image analysis.
- conventional software applications allow a user to manually draw lines around areas of the specimen image displayed on a computer monitor using a mouse or equivalent.
- the present invention provides a method which automatically performs the above described delineation on an image of a biological specimen by, for example, distinguishing parts of the image corresponding to tissue having a certain morphological structure, such as a region of stratified epithelial tissue, from parts of the image corresponding to other regions of epithelial tissue, connective tissue and other biological and non-biological material.
- the present invention provides a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
- the corresponding image masks for the one or more layers may specifically represent the stained areas of the biological specimen or may specifically represent the non-stained areas of the biological image i.e. the corresponding image mask may be a positive or negative image.
- One or more of the image masks may or may not be displayed during the method.
- the method may comprise using autocorrelation techniques on a greyscale image of the biological specimen to determine the spatial frequency of cell nuclei and thereby determine the width dimension of one or more cell layers.
- the method may comprise determining the distance value of an object pixel from one or more object borders to determine the position of the pixel object within the object.
- the method may comprise using a distance transform algorithm to determine the distance of one or more pixels from one or more object borders to determine the position of the upper boundary of the basal layer of the epithelial tissue.
- the method may be used to distinguish a basal layer of the epithelial tissue from other areas of the biological specimen.
- the cells in a particular layer may have similar cell widths.
- the cells in different layers may have different cell widths.
- the method may comprise delineating layers of a stratified epithelial tissue using the corresponding image masks for the one or more layers.
- the step of identifying and adapting isolated areas of the image may comprise blurring regions of the image mask representing stained areas of the biological specimen by an amount to increase interconnection of clumped areas of the image mask.
- the step of identifying and adapting isolated areas of the image may comprise measuring a two dimensional area of regions of the image mask representing stained areas of the biological specimen, and adapting the image mask such that areas of the image mask below a threshold size are represented as non-stained areas of the biological specimen.
- the method may comprise adapting the image mask to remove noise from the image mask prior to the identifying and adapting process.
- the method may comprise adapting the image mask to remove areas of the image mask corresponding to isolated cells below a predetermined size prior to the identifying and adapting process.
- the method may comprise using a morphological opening operator prior to the identifying and adapting process.
- the method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove noise.
- the method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove regions of the image mask representing cells of a size below a predetermined threshold.
- the method may comprise smoothing the boundaries of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
- the method may comprise using a closing morphological operator on regions of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
- the image mask may be a binary image mask.
- the present invention provides an image mask produced by a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
- the present invention provides computer code arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
- the present invention provides apparatus arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
- Figure 1 depicts a typical image of a specimen containing stratified epithelial tissue
- Figure 2 is a flow diagram illustrating the method of an embodiment of the present invention
- Figure 3 is a flow diagram illustrating the method of identifying parts of a specimen image corresponding to different layers of stratified epithelial tissue in accordance with a preferred embodiment of the present invention
- Figure 4 is a black and white image mask for the specimen image of Figure 1 resulting from step 150 of the method of Figure 3 ;
- Figure 5 is a black and white image mask for the specimen image of Figure 1 following step 180 of the method of Figure 3;
- Figure 6 shows the specimen of Figure 1 in which parts of the image corresponding to certain layers within the stratified epithelium have been delineated in accordance with the method of Figure 3.
- Figure 1 shows an image of a typical biological specimen containing stratified epithelium, which has been prepared from a sample of human tissue, taken from an individual for pathology analysis.
- the specimen is a skin tissue section measuring approximately 2mm x 2mm taken from an individual as part of a study of the effects of a new drug.
- a skin sample is taken before, during, and at the end of the drug study and a biomarker is applied to the skin specimen using conventional pathology techniques.
- the biomarker biologically marks cells, or parts of cells such as cell nuclei, cell membranes or cytoplasm, on which the effect of the drug is being examined.
- a stain may be used to visibly reveal the presence of the biomarker under the microscope, and thus in the specimen image, if the biomarker itself is not coloured.
- the skin tissue section includes layers of structured epithelial tissue of the epidermis of the skin (to the left hand side of the image), and various underlying cells of the dermis comprising mainly connective tissue (to the right hand side of the image).
- Computer analysis is typically performed on the lower basal layer of epithelial tissue of the epidermis, where cell reproduction takes place, although other epithelial layers of the epidermis, or the whole of the epithelial part of the epidermis, may be analysed.
- the present invention is applicable to other types of stratified epithelial tissue, or other layered tissue types having an identifiable morphological structure.
- the present invention may be used for the analysis of stained biological specimens for other purposes, such as for conventional prognostic and diagnostic analysis.
- the general region of the image containing the epithelium of the epidermis is evident, and corresponds to the region that is most highly stained (i.e. the greyest) and has a high concentration of regularly arranged cell nuclei.
- the manual identification of the boundaries that correspond to the top and bottom of the epithelium is difficult.
- the task of accurately drawing the lines around the basal layer typically takes about 30 minutes per specimen.
- when manually identifying the regions of the image to undergo analysis, it is likely that such regions may include parts of the image that do not contain epithelial cells and/or epithelial cells of the basal layer of the epidermis.
- the computer analysis may be performed on parts of the specimen image which should not, ideally, be considered in the analysis. This may lead to poor results.
- the present invention aims to automatically identify and delineate parts of an image of a biological specimen, which correspond to one or more specific layers of a stratified tissue structure, for computer analysis.
- Figure 2 shows the steps performed in accordance with a method of the present invention.
- the method is typically performed by a computer program running on a computer receiving digital specimen images acquired from scanning a microscope slide containing the specimen, in accordance with conventional techniques.
- the method receives high resolution image data for the specimen which is to be subject to analysis.
- the analysis technique may be any form of computer analysis used for the purposes of prognosis or diagnosis, drug studies or other pathology related purposes.
- the high resolution image data received at step 10 may be the image data for a complete specimen obtained from the scanning of the microscope slide, or, more preferably, it may be a user or computer determined selected large part of the specimen.
- the method divides the image data received at step 10 into multiple, smaller image portions. Typically, the image is divided into image squares, which may each be more conveniently processed separately, and are of a chosen size to ensure efficient image processing using the techniques described herein.
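The division into image squares at step 20 can be sketched as follows. This is an illustrative NumPy fragment, not the patented implementation, and the 512-pixel tile size is an assumed value chosen for the example:

```python
import numpy as np

def tile_image(image, tile_size=512):
    """Divide an image array into square portions for separate processing.

    Edge tiles may be smaller than tile_size when the image dimensions
    are not exact multiples of it."""
    tiles = []
    h, w = image.shape[:2]
    for y in range(0, h, tile_size):
        for x in range(0, w, tile_size):
            tiles.append(image[y:y + tile_size, x:x + tile_size])
    return tiles

# a 1024 x 768 image divides into 2 x 2 = 4 portions at this tile size
portions = tile_image(np.zeros((768, 1024)), tile_size=512)
```

Each portion can then be processed independently, which keeps the memory footprint of the later morphological steps bounded regardless of the full scan size.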
- the method processes, in turn, each image portion to identify areas of the image contained therein which should undergo computer analysis.
- the image processing performed at step 30 identifies or delineates areas of epithelial tissue of the epidermis within a specimen image, and also a specific layer of the epithelial epidermis, namely the basal layer, using the method of a preferred embodiment of the present invention illustrated in Figure 3 and described below.
- the method performs the conventional computer analysis technique on selected, delineated areas of the image from step 30.
- the method of Figure 2 performs image analysis only on computer- determined areas of the specimen image which are relevant to the computer analysis technique.
- the results of computer analysis are optimised, and the results are consequently more useful to the pathologist.
- Figure 3 illustrates a method for image processing, to identify image areas of one or more layers of stratified epithelial tissue to undergo analysis, in accordance with a preferred embodiment of the present invention.
- the method is typically performed by a computer program running on a computer, which may form part of, or be separate from, the program performing the method illustrated in Figure 2, and may run on the same or a different computer.
- the program is typically stored on a computer readable medium such as magnetic or optical disk, which may be loaded into the memory of the computer on which it is run.
- the method receives a first portion of a high resolution image of the biological specimen.
- the image portion is one of multiple, relatively small image portions from a complete specimen, or large part thereof, as provided by step 20 of the method of Figure 2.
- the method performs image thresholding to provide a black and white (i.e. binarised) image from the original colour or greyscale image.
- the automatic thresholding of images is well known in the art and will be familiar to the skilled person. A description of the thresholding technique is given in the paper by N. Otsu entitled "A Threshold Selection Method from Gray-Level Histograms", IEEE Transactions on Systems, Man, and Cybernetics, Vol. 9, No. 1, pages 62-66, 1979.
- the thresholding of step 120 results in a black and white image mask (not shown), whereby the white areas of the mask represent the most highly marked or stained portions of the original specimen image from Figure 1.
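As a concrete illustration of the thresholding at step 120, the sketch below implements Otsu's method in plain NumPy and marks the darker (more heavily stained) pixels as the white mask pixels. The toy pixel values are invented for the example; this is not the patented code:

```python
import numpy as np

def otsu_threshold(gray):
    """Find the grey level that maximises between-class variance (Otsu)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)   # sum of all pixel values
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]                        # pixels at or below t
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# dark stain on a bright background, so stained pixels lie BELOW the threshold
gray = np.array([[20, 25, 200], [30, 210, 220]], dtype=np.uint8)
t = otsu_threshold(gray)
mask = gray <= t   # white (True) = most highly stained areas of the specimen
```

Because the stain is dark against a bright background, the white mask is taken from the pixels below the threshold; an implementation with light staining would invert the comparison.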
- the image mask includes white areas corresponding to the parts of the specimen marked with the biomarker, i.e. cells within the epithelial epidermis, and any other parts of the specimen that have been stained, e.g. connective tissue cells within the dermis.
- the mask may also include white areas corresponding to "noise" resulting from image processing (such as the thresholding step).
- the method filters out or removes objects (white areas) within the image mask which are considered to be small. This removes from the image mask objects corresponding to small cells and cell nuclei, such as small connective tissue cells of the dermis, as well as noise resulting from the thresholding process.
- This filtering technique may be performed using the "morphological opening operator" well known in the field of image processing. This results in a black and white image mask in which narrow noise lines and non-epithelial smaller cell nuclei present following step 120 are now removed.
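The small-object removal at step 130 can be illustrated with a minimal morphological opening (erosion followed by dilation with a 3x3 structuring element), written here in plain NumPy as a sketch rather than the patented code:

```python
import numpy as np

def _erode(mask):
    """3x3 erosion: a white pixel survives only if its whole 3x3 neighbourhood is white."""
    padded = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask)
    h, w = mask.shape
    for dy in range(3):
        for dx in range(3):
            out &= padded[dy:dy + h, dx:dx + w]
    return out

def _dilate(mask):
    """3x3 dilation: a pixel becomes white if any pixel in its 3x3 neighbourhood is white."""
    padded = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask)
    h, w = mask.shape
    for dy in range(3):
        for dx in range(3):
            out |= padded[dy:dy + h, dx:dx + w]
    return out

def morphological_open(mask):
    """Opening removes white features smaller than the structuring element."""
    return _dilate(_erode(mask))

mask = np.zeros((7, 7), dtype=bool)
mask[1:4, 1:4] = True      # a 3x3 "nucleus": large enough to survive the opening
mask[5, 5] = True          # single-pixel noise: eroded away and never restored
opened = morphological_open(mask)
```

The closing operator used later at step 150 is the same composition in the opposite order (dilation then erosion), which fills narrow black gaps instead of removing narrow white features.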
- the method filters out or removes from the image, objects which are not part of a regular tissue configuration comprising groups of interconnected cells (referred to generally herein as "clumped cells") typical of epithelial tissue.
- epithelial tissue such as the tissue of the lower layers of the epidermis, typically comprises cells which are clumped or joined together to take on a structured form, as shown in Figure 1.
- clumped cells, although in close proximity to one another, will not necessarily all be touching one another in the image mask.
- white regions are blurred in the x and y directions by a particular amount to expand their size.
- neighbouring cells forming part of the clumped structure which were previously not touching in the image mask, will now be touching one another.
- although the blurring process will also increase the size of regions not considered to be part of the clumped structure (i.e. isolated regions), in most (if not all) cases these regions will remain "non-clumped" (i.e. they will still be isolated regions not forming a clumped structure). This is achieved by ensuring that the blurring process, which is part of the filtering process 140, does not over-expand the size of white regions/objects to produce such errors.
- the filtering process of step 140 measures the two-dimensional area of each object (i.e. white area) within the image mask, and removes such white areas which are below a predefined threshold area.
- the threshold area is predetermined based on the type of specimen, particularly the tissue type of interest, the magnification of the objective lens used to acquire the specimen image, and other known factors.
- the threshold is chosen so that the process removes objects of a size below the normal size for the tissue of interest.
- the measurement of the area of each object within the image is achieved by first identifying each individual object within the image, and then determining the number of pixels (which is proportional to the area, and thus the size, of the object).
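The area filter described above can be sketched with a stack-based flood fill that identifies each connected object and counts its pixels. The grid values and threshold below are invented for the example; a production implementation would typically use an optimised connected-component labelling routine:

```python
import numpy as np

def remove_small_objects(mask, min_area):
    """Remove white objects whose pixel count (proportional to their area)
    falls below min_area, marking them as non-stained (black)."""
    mask = mask.copy()
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not visited[sy, sx]:
                # flood-fill one 4-connected object, collecting its pixels
                stack, pixels = [(sy, sx)], []
                visited[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) < min_area:      # below threshold: remove object
                    for y, x in pixels:
                        mask[y, x] = False
    return mask

mask = np.zeros((8, 8), dtype=bool)
mask[0:4, 0:4] = True        # 16-pixel object: kept
mask[6, 6] = True            # 1-pixel object: removed
filtered = remove_small_objects(mask, min_area=5)
```

As the text notes, the pixel count stands in directly for object area, so the threshold can be derived from the tissue type and objective magnification without converting to physical units.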
- the method next removes the fine detail from the boundaries of the image, in order to provide a smooth outline for the mask. This removes any fine black lines, extending into the epithelial tissue, which may be present as a result of the staining of the specimen and the accuracy of the image processing.
- Step 150 may be performed using the "closing morphological operator" well known in image processing. This results in the black and white image mask corresponding to a portion of the image mask shown in Figure 4, which is saved in memory as the main mask at step 150.
- the boundaries of the image mask of Figure 4 (which represents an image mask for the complete specimen of Figure 1) thus correspond to the bottom of the basal layer of the epidermis, and the top of the epithelium of the epidermis (which excludes the upper layers of the epidermis which comprise denucleated, dead cells).
- the program may proceed from step 150 to step 190 (described below) until all image portions have been processed. Then, the mask of Figure 4 may be used to delineate the relevant part of the image of Figure 1 for computer analysis.
- at step 160 the method determines, for each object pixel within the mask, i.e. each white pixel, the distance from the closest black pixel, corresponding to the boundary of the object.
- This process may be performed by any conventional technique.
- the "distance transform algorithm” well known in image processing may be used.
- step 160 produces a "distance value" for each white or object pixel, whereby the distance value is lowest for pixels close to the object boundaries, and highest at the centre of the object, between the boundaries.
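The distance values of step 160 can be computed with a classic two-pass (chamfer-style) city-block distance transform. The sketch below is a simple stand-in for whichever conventional algorithm an implementation might use, not the patented code:

```python
import numpy as np

def distance_transform(mask):
    """City-block distance of each white pixel from the nearest black pixel,
    computed with one forward and one backward raster pass."""
    h, w = mask.shape
    INF = h + w                              # larger than any possible distance
    dist = np.where(mask, INF, 0).astype(int)
    for y in range(h):                       # forward pass: top-left neighbours
        for x in range(w):
            if y > 0:
                dist[y, x] = min(dist[y, x], dist[y - 1, x] + 1)
            if x > 0:
                dist[y, x] = min(dist[y, x], dist[y, x - 1] + 1)
    for y in range(h - 1, -1, -1):           # backward pass: bottom-right neighbours
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                dist[y, x] = min(dist[y, x], dist[y + 1, x] + 1)
            if x < w - 1:
                dist[y, x] = min(dist[y, x], dist[y, x + 1] + 1)
    return dist

mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True            # one 3x3 white object
dist = distance_transform(mask)  # lowest near the boundary, highest at the centre
```

As the text describes, the resulting values are lowest for pixels adjacent to the object boundary and rise towards the object centre.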
- the method determines the approximate number of pixels corresponding to the size of epithelial cells within the object. This may be achieved by determining the spatial frequency of cells and/or cell nuclei within the epithelial areas, and is preferably performed on a greyscale version of the image, using the mask to identify the epithelial areas. Any suitable technique may be used, such as autocorrelation, which measures the distance between repeating signals corresponding to repeating features within the image, and thus provides the spatial frequency of the cell nuclei, which are the distinct, repeating features within the image. The skilled person will appreciate that other methods for determining the cell dimension in the image are possible and contemplated.
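The autocorrelation idea can be sketched on a one-dimensional intensity profile: for a row of regularly spaced nuclei, the autocorrelation peaks again at a lag equal to their spacing. The 30-pixel spacing below is a synthetic value invented for the example:

```python
import numpy as np

def dominant_period(profile):
    """Estimate the spacing of repeating features (e.g. cell nuclei) in a 1-D
    intensity profile: the first non-zero lag at which the autocorrelation
    has a local maximum is the spatial period in pixels."""
    x = profile - profile.mean()                        # remove the DC offset
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]   # lags 0 .. N-1
    for lag in range(1, len(ac) - 1):                   # first peak after lag 0
        if ac[lag] >= ac[lag - 1] and ac[lag] > ac[lag + 1]:
            return lag
    return None

# synthetic profile: "nuclei" repeating every 30 pixels
profile = np.zeros(300)
profile[::30] = 1.0
period = dominant_period(profile)
```

A two-dimensional implementation would apply the same idea to the greyscale image within the masked epithelial areas, but the principle, locating the first off-origin autocorrelation peak, is identical.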
- step 180 uses the results of steps 160 and 170 to determine the upper boundary of the basal layer of the epidermis.
- step 180 finds all white, object pixels that have a distance value determined by step 160 corresponding to the pixel distance from the boundary, equivalent to the value determined at step 170.
- if, for example, step 170 determines that the spatial frequency of cells (i.e. cell size) equates to 30 pixels, step 180 finds all pixels within the object that are 30 pixels from the object boundary, using the values determined at step 160. This may result in a portion of a mask similar to that of Figure 4, in which the object boundary is reduced by the number of pixels corresponding to the cell size, as shown in Figure 5.
- the mask of Figure 5 (which represents an image mask for the complete specimen of Figure 1), when superimposed on the mask of Figure 4, can be used to delineate the three relevant boundaries of the specimen image of Figure 1, namely the upper and lower basal layer boundaries and the upper layer of the epithelium of the epidermis. For this purpose, the upper boundary of the mask of Figure 5 is ignored.
- step 180 may simply identify and save the object pixels that are the relevant distance from the lower, basal layer boundary, thus identifying a line of pixels corresponding to the upper boundary of the basal layer. This line of pixels may then be combined with the boundaries of the mask of Figure 4 to provide the three boundaries to be superimposed on the specimen image for delineation of regions thereof.
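Given a distance map and the cell width in pixels, selecting the band of object pixels within one cell width of the boundary reduces, in a simplified sketch, to a threshold on the distance values. Note the simplification: the patent restricts the band to the lower, basal-layer boundary, whereas this toy version takes the band around the whole object:

```python
import numpy as np

def basal_band(dist, cell_px):
    """Object pixels (dist > 0) lying within cell_px of the object boundary:
    a one-cell-thick band, approximating the basal layer in this sketch."""
    return (dist > 0) & (dist <= cell_px)

# toy distance map: 0 = black/background, values grow towards the object centre
dist = np.array([
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 2, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
])
band = basal_band(dist, cell_px=1)   # the one-pixel-wide outer ring of the object
```

Restricting the band to the basal side, as the patent does, only requires intersecting this result with the pixels nearer the lower boundary than the upper one.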
- at step 190 the program considers whether there is another image portion to be processed, and if so returns to step 110. Otherwise the program ends at step 200, and computer analysis may subsequently be performed in accordance with step 40 of Figure 2.
- Figure 6 shows the specimen image of Figure 1 on which automatic delineation has been performed in accordance with the preferred embodiment of the present invention.
- the method of the present invention identifies areas of epithelium, and specifically layers of stratified epithelial tissue, more precisely and more quickly than manual delineation.
- Computer analysis may be performed on the delineated portions of the complete specimen, or large part thereof, received at step 10 of the method of Figure 2, using conventional techniques as described above in relation to step 40 of Figure 2.
- Various modifications and changes may be made to the described embodiments.
- while the described embodiment of the present invention identifies a layer of stratified epithelium that is known to be approximately one cell in thickness, it may equally be used to identify layers having a known, greater thickness. It is intended to include all such variations, modifications and equivalents which fall within the scope of the present invention.
- the present invention encompasses one or more aspects or embodiments of the present invention in all various combinations whether or not specifically claimed or described in the present specification in that combination.
Abstract
The present invention provides a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
Description
APPARATUS AND METHOD FOR PROCESSING OF SPECIMEN IMAGES FOR USE IN COMPUTER ANALYSIS THEREOF
Background
The present invention relates to image processing, and more particularly to the processing of digital images of a microscope specimen to identify parts of the image of the specimen for computer analysis.
The computer analysis of biological specimens performed on digital images of a specimen acquired from scanning the specimen on a microscope slide has advanced the accuracy of diagnostic and prognostic techniques, and is increasingly used for research purposes to identify the effect of drugs on biological tissue. One example of such computer image analysis techniques is field fraction measurements, which are widely used to determine a quantity of stain taken up by a stained biological specimen, the quantity of stain being associated with an attribute or characteristic of the specimen. Another example involves counting the number of features (e.g. cell nuclei) within a specimen. Such computer measurements are vastly superior to the subjective and potentially less accurate corresponding manual analysis previously performed by laboratory technicians or pathologists in analysing a specimen directly using a microscope.
In certain circumstances, the computer analysis of a biological specimen image concerns only the analysis of a very specific type of tissue, which corresponds to only part of the specimen. For example, the analysis may only be concerned with a certain region of stratified epithelial tissue, such as the basal layer of the epidermis of a skin specimen. If other regions of tissue, or other types of tissue, are present within the biological specimen image analysed, the computer analysis might produce misleading results. Thus, conventionally, a laboratory technician will review the digital image of a specimen on a computer monitor prior to computer analysis thereof, and identify areas of the image showing the tissue region of interest for image
analysis. Thus, conventional software applications allow a user to manually draw lines around areas of the specimen image displayed on a computer monitor using a mouse or equivalent.
Unfortunately, the manual identification or delineation of parts of the image of a biological specimen to undergo computer analysis is generally imprecise, subjective and extremely time consuming. The accuracy of identifying parts of the image containing the relevant tissue type, to undergo analysis, is dependent upon the user's skill, such as dexterity in using the mouse, and recognition of the types of cell or tissue to be analysed, as well as the time available to the user. Moreover, a technician may be required to perform delineation on several hundred specimen images at a time, which may lead to poor delineation by the user as he or she becomes tired or bored.
It would therefore be desirable to provide a method for automatically identifying tissue regions within a biological specimen image to undergo computer analysis by image processing, thereby avoiding the problems associated with performing the function manually.
Summary of the Invention
Accordingly, the present invention provides a method which automatically performs the above-described delineation on an image of a biological specimen by, for example, distinguishing parts of the image corresponding to tissue having a certain morphological structure, such as a region of stratified epithelial tissue, from parts of the image corresponding to other regions of epithelial tissue, connective tissue and other biological and non-biological material.
According to a first aspect, the present invention provides a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being
characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising : thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
The corresponding image masks for the one or more layers may specifically represent the stained areas of the biological specimen or may specifically represent the non-stained areas of the biological specimen, i.e. the corresponding image mask may be a positive or negative image. One or more of the image masks may or may not be displayed during the method.
The method may comprise using autocorrelation techniques on a greyscale image of the biological specimen to determine the spatial frequency of cell nuclei and thereby determine the width dimension of one or more cell layers.
The method may comprise determining the distance value of an object pixel from one or more object borders to determine the position of the object pixel within the object.
The method may comprise using a distance transform algorithm to determine the distance of one or more pixels from one or more object borders to determine the position of the upper boundary of the basal layer of the epithelial tissue.
The method may be used to distinguish a basal layer of the epithelial tissue from other areas of the biological specimen.
The cells in a particular layer may have similar cell widths. The cells in different layers may have different cell widths.
The method may comprise delineating layers of a stratified epithelial tissue using the corresponding image masks for the one or more layers.
The step of identifying and adapting isolated areas of the image may comprise blurring regions of the image mask representing stained areas of the biological specimen by an amount to increase interconnection of clumped areas of the image mask.
The step of identifying and adapting isolated areas of the image may comprise measuring a two dimensional area of regions of the image mask representing stained areas of the biological specimen, and adapting the image mask such that areas of the image mask below a threshold size are represented as non-stained areas of the biological specimen.
The method may comprise adapting the image mask to remove noise from the image mask prior to the identifying and adapting process.
The method may comprise adapting the image mask to remove areas of the image mask corresponding to isolated cells below a predetermined size prior to the identifying and adapting process.
The method may comprise using a morphological opening operator prior to the identifying and adapting process.
The method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove noise.
The method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove regions of the image mask representing cells of a size below a predetermined threshold.
The method may comprise smoothing the boundaries of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
The method may comprise using a closing morphological operator on regions of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
The image mask may be a binary image mask.
According to a second aspect, the present invention provides an image mask produced by a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising : thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained
areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
According to a third aspect, the present invention provides computer code arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising : thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension;
using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
According to a fourth aspect, the present invention provides apparatus arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising : thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
The present invention encompasses one or more of the above-mentioned aspects and/or embodiments in all various combinations whether or not specifically mentioned in that combination.
Description of Figures
Embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 depicts a typical image of a specimen containing stratified epithelial tissue;
Figure 2 is a flow diagram illustrating the method of an embodiment of the present invention;
Figure 3 is a flow diagram illustrating the method of identifying parts of a specimen image corresponding to different layers of stratified epithelial tissue in accordance with a preferred embodiment of the present invention;
Figure 4 is a black and white image mask for the specimen image of Figure 1 resulting from step 150 of the method of Figure 3;
Figure 5 is a black and white image mask for the specimen image of Figure 1 following step 180 of the method of Figure 3;
Figure 6 shows the specimen of Figure 1 in which parts of the image corresponding to certain layers within the stratified epithelium have been delineated in accordance with the method of Figure 3.
Specific Embodiments
Figure 1 shows an image of a typical biological specimen containing stratified epithelium, which has been prepared from a sample of human tissue, taken from an individual for pathology analysis. In particular, the specimen is a skin tissue section measuring approximately 2mm x 2mm taken from an individual as part of a study of
the effects of a new drug. A skin sample is taken before, during, and at the end of the drug study and a biomarker is applied to the skin specimen using conventional pathology techniques. The biomarker biologically marks cells, or parts of cells such as cell nuclei, cell membranes or cytoplasm, on which the effect of the drug is being examined. A stain may be used to visibly reveal the presence of the biomarker under the microscope, and thus in the specimen image, if the biomarker itself is not coloured.
As shown in Figure 1, the skin tissue section includes layers of stratified epithelial tissue of the epidermis of the skin (to the left-hand side of the image), and various underlying cells of the dermis comprising mainly connective tissue (to the right-hand side of the image). Computer analysis is typically performed on the lower basal layer of epithelial tissue of the epidermis, where cell reproduction takes place, although other epithelial layers of the epidermis, or the whole of the epithelial part of the epidermis, may be analysed.
The skilled person will appreciate that whilst the following description relates particularly to the analysis of the stratified epithelium of the epidermis of the skin, the present invention is applicable to other types of stratified epithelial tissue, or other layered tissue types having an identifiable morphological structure. In addition, the present invention may be used for the analysis of stained biological specimens for other purposes, such as for conventional prognostic and diagnostic analysis.
As is apparent from the image of Figure 1, the general region of the image containing the epithelium of the epidermis is evident, and corresponds to the region that is most highly stained (i.e. the greyest) and has a high concentration of regularly arranged cell nuclei. However, the manual identification of the boundaries that correspond to the top and bottom of the epithelium is difficult. The boundaries
(particularly the lower boundary corresponding to the bottom of the basal layer of the epidermis) are complex and intricate, requiring considerable time and skill for an individual to manually draw the boundary with the required degree of precision.
Moreover, the upper boundary of the epithelial tissue, below the upper epidermis (which contains dead and denucleated cells), is difficult to distinguish clearly, as is the top of the basal layer, which is approximately one cell in thickness.
For these reasons, the task of accurately drawing the lines around the basal layer typically takes about 30 minutes per specimen. Moreover, due to the subjective and imprecise manual delineation of the regions of the image to undergo analysis, it is likely that such regions may include parts of the image that do not contain epithelial cells and/or epithelial cells of the basal layer of the epidermis. Thus, the computer analysis may be performed on parts of the specimen image which should not, ideally, be considered in the analysis. This may lead to poor results.
Accordingly, the present invention aims to automatically identify and delineate parts of an image of a biological specimen, which correspond to one or more specific layers of a stratified tissue structure, for computer analysis.
Figure 2 shows the steps performed in accordance with a method of the present invention. The method is typically performed by a computer program running on a computer receiving digital specimen images acquired from scanning a microscope slide containing the specimen, in accordance with conventional techniques.
At step 10, the method receives high resolution image data for the specimen which is to be subject to analysis. The analysis technique may be any form of computer analysis used for the purposes of prognosis or diagnosis, drug studies or other pathology related purposes. The high resolution image data received at step 10 may be the image data for a complete specimen obtained from the scanning of the microscope slide, or, more preferably, it may be a large part of the specimen selected by a user or determined by the computer.
At step 20, the method divides the image data received at step 10 into multiple, smaller image portions. Typically, the image is divided into image squares, which may each be more conveniently processed separately, and are of a chosen size to ensure efficient image processing using the techniques described herein.
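The tiling of step 20 can be sketched as follows; the tile size is an illustrative assumption and edge handling is one reasonable choice, not taken from the patent:

```python
import numpy as np

def tile_image(img, tile):
    """Split an image array into tile x tile portions for separate
    processing; tiles at the right and bottom edges may be smaller."""
    h, w = img.shape[:2]
    return [img[y:y + tile, x:x + tile]
            for y in range(0, h, tile)
            for x in range(0, w, tile)]
```

Each portion can then be processed independently, which keeps the per-step memory and computation bounded regardless of the size of the scanned specimen.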
At step 30, the method processes, in turn, each image portion to identify areas of the image contained therein which should undergo computer analysis. In a preferred embodiment, the image processing performed at step 30 identifies or delineates areas of epithelial tissue of the epidermis within a specimen image, and also a specific layer of the epithelial epidermis, namely the basal layer, using the method of a preferred embodiment of the present invention illustrated in Figure 3 and described below.
At step 40, the method performs the conventional computer analysis technique on selected, delineated areas of the image from step 30.
Thus, the method of Figure 2 performs image analysis only on computer-determined areas of the specimen image which are relevant to the computer analysis technique. The results of the computer analysis are consequently optimised, and therefore more useful to the pathologist.
Figure 3 illustrates a method for image processing, to identify image areas of one or more layers of stratified epithelial tissue to undergo analysis, in accordance with a preferred embodiment of the present invention. The method is typically performed by a computer program running on a computer, which may form part of, or be separate from, the program performing the method illustrated in Figure 2, and may run on the same or a different computer. The program is typically stored on a computer readable medium such as magnetic or optical disk, which may be loaded into the memory of the computer on which it is run. The skilled person will appreciate, from the following description, that not all of the illustrated method steps are essential to the present invention, although optimal results may be obtained for
certain tissue types using all of the steps of the preferred embodiment illustrated in Figure 3.
At step 110, the method receives a first portion of a high resolution image of the biological specimen. Typically, the image portion is one of multiple, relatively small image portions from a complete specimen, or large part thereof, as provided by step 20 of the method of Figure 2.
At step 120, the method performs image thresholding to provide a black and white (i.e. binarised) image from the original colour or greyscale image. The automatic thresholding of images is well known in the art and will be familiar to the skilled person. One suitable thresholding technique is described in the paper by N Otsu entitled "A Threshold Selection Method from Gray-Level Histograms", IEEE Trans. Systems, Man and Cybernetics, Vol 9, No 1, pages 62-66, 1979.
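An Otsu-style thresholding of step 120 might be sketched as below in pure NumPy; the function names are illustrative, and a real pipeline would typically call a library routine rather than loop over grey levels:

```python
import numpy as np

def otsu_threshold(grey):
    """Return the grey level that maximises between-class variance
    (Otsu's criterion) for an 8-bit greyscale image array."""
    hist = np.bincount(grey.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def binarise(grey):
    # pixels at or above the threshold become white in the mask
    return grey >= otsu_threshold(grey)
```

Whether stained material maps to the bright or dark class depends on the stain and illumination, so the comparison direction in `binarise` is an assumption for illustration.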
The thresholding of step 120 results in a black and white image mask (not shown), whereby the white areas of the mask represent the most highly marked or stained portions of the original specimen image of Figure 1. Thus, the image mask includes white areas corresponding to the parts of the specimen marked with the biomarker, i.e. cells within the epithelial epidermis, and any other parts of the specimen that have been stained, e.g. connective tissue cells within the dermis. The mask may also include white areas corresponding to "noise" resulting from image processing (such as the thresholding step).
At step 130, the method filters out or removes objects (white areas) within the image mask which are considered to be small. This removes from the image mask objects corresponding to small cells and cell nuclei, such as small connective tissue cells of the dermis, as well as noise resulting from the thresholding process. This filtering technique may be performed using the "morphological opening operator" well known in the field of image processing. This results in a black and white image
mask in which narrow noise lines and non-epithelial smaller cell nuclei present following step 120 are now removed.
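The opening of step 130 (erosion followed by dilation) can be sketched in pure NumPy with a 3x3 structuring element; the element size is an illustrative assumption, and production code would normally use a library morphology routine:

```python
import numpy as np

def erode(mask):
    """3x3 erosion: a pixel survives only if its whole neighbourhood is set."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    out = np.ones_like(mask, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def dilate(mask):
    """3x3 dilation: a pixel is set if any neighbour is set."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    out = np.zeros_like(mask, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def opening(mask):
    # removes objects smaller than the structuring element,
    # leaving larger objects approximately unchanged
    return dilate(erode(mask))
```

A larger structuring element (or repeated erosion/dilation) raises the size of the smallest object that survives.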
Next, at step 140, the method filters out or removes from the image objects which are not part of a regular tissue configuration comprising groups of interconnected cells (referred to generally herein as "clumped cells") typical of epithelial tissue. In particular, epithelial tissue, such as the tissue of the lower layers of the epidermis, typically comprises cells which are clumped or joined together to take on a structured form, as shown in Figure 1.
It will be noted from Figure 5 that clumped cells, although in close proximity to one another, will not necessarily all be touching one another in the image mask. To identify clumped cells, white regions are blurred in the x and y directions by a particular amount to expand their size, so that neighbouring cells forming part of the clumped structure, which were previously not touching in the image mask, now touch one another. Although the blurring process also increases the size of regions not considered to be part of the clumped structure (i.e. isolated regions), in most (if not all) cases these regions remain "non-clumped" (i.e. they will still be isolated regions not forming a clumped structure). This is ensured by arranging that the blurring process, which is part of the filtering process of step 140, does not over-expand the size of white regions/objects so as to produce such errors.
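The blurring of step 140 can be sketched as a box blur followed by re-thresholding; the radius would be tuned so that genuinely clumped cells join while isolated regions stay separate, and the radius used here is an illustrative assumption:

```python
import numpy as np

def blur_and_rethreshold(mask, radius):
    """Sum each pixel's (2*radius+1)^2 window, then re-threshold:
    white regions expand by `radius`, joining nearby neighbours."""
    h, w = mask.shape
    p = np.pad(mask.astype(float), radius)
    out = np.zeros((h, w))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += p[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
    return out > 0
```

With the `> 0` threshold this behaves like a square dilation; a higher threshold on the blurred values would expand regions less aggressively.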
Then, the filtering process of step 140 measures the two-dimensional area of each object (i.e. white area) within the image mask, and removes such white areas which are below a predefined threshold area. Typically, the threshold area is predetermined based on the type of specimen, particularly the tissue type of interest, the magnification of the objective lens used to acquire the specimen image, and other known factors. The threshold is chosen so that the process removes objects of a size below the normal size for the tissue of interest.
The measurement of the area of each object within the image is achieved by first identifying each individual object within the image, and then determining the number of pixels (which is proportional to the area, and thus the size, of the object).
It will be appreciated that individual objects comprise all white areas whose adjacent pixels are touching. This leads to the black and white image mask containing only epithelial tissue and no disconnected (isolated) cells and cell nuclei. Note that image portions containing only connective tissue, such as image portions to the right-hand side of the image of Figure 1, may produce a completely black mask at this stage, and the program may optionally include a step which determines whether all pixels of the mask are black, and if so returns to step 110.
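The area filter at the end of step 140 (label connected white regions, then discard those below the threshold area) might look like the sketch below; the 8-connectivity and the flood-fill approach are assumptions about one reasonable implementation:

```python
import numpy as np
from collections import deque

def remove_small_objects(mask, min_area):
    """Keep only 8-connected white regions of at least min_area pixels."""
    mask = mask.astype(bool)
    seen = np.zeros_like(mask)
    out = np.zeros_like(mask)
    h, w = mask.shape
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        # flood fill one connected component
        comp, q = [], deque([(sy, sx)])
        seen[sy, sx] = True
        while q:
            y, x = q.popleft()
            comp.append((y, x))
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
        # the pixel count is proportional to the object's area
        if len(comp) >= min_area:
            for y, x in comp:
                out[y, x] = True
    return out
```

In practice the threshold `min_area` would be derived from the specimen type and objective magnification, as the description notes.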
At step 150, the method next removes the fine detail from the boundaries of the image, in order to provide a smooth outline for the mask. This removes any fine black lines, extending into the epithelial tissue, which may be present as a result of the staining of the specimen and the accuracy of the image processing. Step 150 may be performed using the "closing morphological operator" well known in image processing. This results in the black and white image mask corresponding to a portion of the image mask shown in Figure 4, which is saved in memory as the main mask. The boundaries of the image mask of Figure 4 (which represents an image mask for the complete specimen of Figure 1) thus correspond to the bottom of the basal layer of the epidermis, and the top of the epithelium of the epidermis (which excludes the upper layers of the epidermis comprising denucleated, dead cells).
If computer analysis is to be performed on the whole of the epithelium of the epidermis, the program may proceed from step 150 to step 190 (described below) until all image portions have been processed. Then, the mask of Figure 4 may be used to delineate the relevant part of the image of Figure 1 for computer analysis.
However, in most instances, precise delineation of the basal layer is desirable, in order for analysis to be performed exclusively on the basal layer of the epidermis. Thus, in the preferred embodiment illustrated in Figure 3, further image processing is
performed to identify the basal layer (and/or other relevant layers) of the stratified epithelium of the epidermis.
At step 160, the method determines, for each object pixel within the mask, i.e. each white pixel, the distance from the closest black pixel, corresponding to the boundary of the object. This process may be performed by any conventional technique. For example, the "distance transform algorithm" well known in image processing may be used. Thus, step 160 produces a "distance value" for each white or object pixel, whereby the distance value is lowest for pixels close to the object boundaries, and highest at the centre of the object, between the boundaries.
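The distance values of step 160 can be sketched with a simple two-pass chamfer transform using city-block distance; a real implementation might instead use a Euclidean distance transform from an image-processing library:

```python
import numpy as np

def distance_transform(mask):
    """For each white pixel, the city-block distance to the nearest
    black pixel (black pixels get 0). Pixels beyond the image edge
    are not treated as black, an assumption of this sketch."""
    h, w = mask.shape
    INF = h + w
    d = np.where(mask, INF, 0).astype(int)
    # forward pass: propagate distances from top-left
    for y in range(h):
        for x in range(w):
            if d[y, x]:
                if y > 0:
                    d[y, x] = min(d[y, x], d[y - 1, x] + 1)
                if x > 0:
                    d[y, x] = min(d[y, x], d[y, x - 1] + 1)
    # backward pass: propagate distances from bottom-right
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if d[y, x]:
                if y < h - 1:
                    d[y, x] = min(d[y, x], d[y + 1, x] + 1)
                if x < w - 1:
                    d[y, x] = min(d[y, x], d[y, x + 1] + 1)
    return d
```

As the description says, values are lowest next to object boundaries and greatest towards the centre of each object.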
At step 170, the method determines the approximate number of pixels corresponding to the size of epithelial cells within the object. This may be achieved by determining the spatial frequency of cells and/or cell nuclei within the epithelial areas, and is preferably performed on a greyscale version of the image, using the mask to identify the epithelial areas. Any suitable technique may be used, such as autocorrelation, which measures the distance between repeating signals corresponding to repeating features within the image, and thus provides the spatial frequency of the cell nuclei, which are the distinct, repeating features within the image. The skilled person will appreciate that other methods for determining the cell dimension in the image are possible and contemplated.
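Step 170's spatial-frequency estimate can be sketched as a 1D autocorrelation of an intensity profile taken across the epithelial region; the profile extraction and the peak-selection rule here are simplified assumptions:

```python
import numpy as np

def dominant_period(signal):
    """Estimate the repeat distance (in pixels) of a roughly periodic
    intensity profile: autocorrelate the zero-mean signal and return
    the lag of the first local maximum after lag 0."""
    s = np.asarray(signal, dtype=float)
    s = s - s.mean()
    ac = np.correlate(s, s, mode='full')[len(s) - 1:]  # lags 0..N-1
    for lag in range(1, len(ac) - 1):
        if ac[lag] > ac[lag - 1] and ac[lag] >= ac[lag + 1]:
            return lag
    return None  # no periodic structure found
```

The returned lag approximates the cell (nucleus) spacing in pixels, which step 180 then compares against the distance values.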
At step 180, the method uses the results of steps 160 and 170 to determine the upper boundary of the basal layer of the epidermis. In particular, since the basal layer is one cell in thickness, step 180 finds all white, object pixels that have a distance value determined by step 160 corresponding to the pixel distance from the boundary, equivalent to the value determined at step 170. Thus, for example, if step 170 determines that the spatial frequency of cells (i.e. cell size) equates to 30 pixels, step 180 finds all pixels within the object that are 30 pixels from the object boundary, using the values determined at step 160.
This may result in a portion of a mask similar to that of Figure 4, in which the object boundary is reduced by the number of pixels corresponding to the cell size, as shown in Figure 5. This mask portion is then saved in memory as a second mask, and the program proceeds to step 190. The mask of Figure 5 (which represents an image mask for the complete specimen of Figure 1), when superimposed on the mask of Figure 4, can be used to delineate the three relevant boundaries of the specimen image of Figure 1, namely the upper and lower basal layer boundaries and the upper layer of the epithelium of the epidermis. For this purpose, the upper boundary of the mask of Figure 5 is ignored.
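Combining steps 160 to 180: pixels whose distance value is at most the cell width (in pixels) form the basal band, and those at exactly that distance trace its upper boundary. A brute-force sketch using city-block distance (a production version would reuse the distance transform rather than scan all black pixels per white pixel):

```python
import numpy as np

def boundary_band(mask, cell_px):
    """Return the pixels of `mask` lying within cell_px of the object
    boundary; assumes the mask contains at least one black pixel."""
    ys, xs = np.nonzero(~mask)
    black = np.stack([ys, xs], axis=1)
    out = np.zeros_like(mask)
    for y, x in zip(*np.nonzero(mask)):
        # city-block distance to the nearest black (background) pixel
        d = np.abs(black - [y, x]).sum(axis=1).min()
        if d <= cell_px:
            out[y, x] = True
    return out
```

With `cell_px` set to the spacing found at step 170, the resulting band corresponds to the one-cell-thick basal layer along the lower object boundary; the band along the upper boundary would be ignored, as the description notes for Figure 5.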
Alternatively, step 180 may simply identify and save the object pixels that are the relevant distance from the lower, basal layer boundary, thus identifying a line of pixels corresponding to the upper boundary of the basal layer. This line of pixels may then be combined with the boundaries of the mask of Figure 4 to provide the three boundaries to be superimposed on the specimen image for delineation of regions thereof.
At step 190, the program considers whether there is another image portion to be processed, and if so returns to step 110. Otherwise the program ends at step 200, and computer analysis may subsequently be performed in accordance with step 40 of Figure 2.
Figure 6 shows the specimen image of Figure 1 on which automatic delineation has been performed in accordance with the preferred embodiment of the present invention. As can be seen by comparing Figure 6 with Figure 1, the method of the present invention identifies areas of epithelium, and specifically layers of stratified epithelial tissue, more precisely and more quickly than manual delineation.
Computer analysis may be performed on the delineated portions of the complete specimen, or large part thereof, received at step 10 of the method of Figure 2, using conventional techniques as described above in relation to step 40 of Figure 2.
Various modifications and changes may be made to the described embodiments. For example, whilst the described embodiment of the present invention identifies a layer of stratified epithelium that is known to be approximately one cell in thickness, it may equally be used to identify layers having a known, greater thickness. It is intended to include all such variations, modifications and equivalents which fall within the scope of the present invention.
The present invention encompasses one or more aspects or embodiments of the present invention in all various combinations whether or not specifically claimed or described in the present specification in that combination.
Claims
1. A method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising : thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
2. A method according to claim 1, comprising using autocorrelation techniques on a greyscale image of the biological specimen to determine the spatial frequency of cell nuclei and thereby determine the width dimension of one or more cell layers.
3. A method according to claim 1, comprising determining the distance value of an object pixel from one or more object borders to determine the position of the pixel within the object.
4. A method according to claim 1, comprising using a distance transform algorithm to determine the distance of one or more pixels from one or more object borders to determine the position of the upper boundary of the basal layer of the epithelial tissue.
5. A method according to claim 1, wherein the method is used to distinguish a basal layer of the epithelial tissue from other areas of the biological specimen.
6. A method according to claim 1, wherein the cells in a particular layer have similar cell widths.
7. A method according to claim 1, wherein the cells in different layers have different cell widths.
8. A method according to claim 1, comprising delineating layers of a stratified epithelial tissue using the corresponding image masks for the one or more layers.
9. A method according to claim 1, wherein the step of identifying and adapting isolated areas of the image comprises blurring regions of the image mask representing stained areas of the biological specimen by an amount sufficient to increase interconnection of clumped areas of the image mask.
10. A method according to claim 1, wherein the step of identifying and adapting isolated areas of the image comprises measuring a two dimensional area of regions of the image mask representing stained areas of the biological specimen, and adapting the image mask such that areas of the image mask below a threshold size are represented as non-stained areas of the biological specimen.
11. A method according to claim 1, comprising adapting the image mask to remove noise from the image mask prior to the identifying and adapting process.
12. A method according to claim 1, comprising adapting the image mask to remove areas of the image mask corresponding to isolated cells below a predetermined size prior to the identifying and adapting process.
13. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process.
14. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove noise.
15. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove regions of the image mask representing cells of a size below a predetermined threshold.
16. A method according to claim 1, comprising smoothing the boundaries of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
17. A method according to claim 1, comprising using a closing morphological operator on regions of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
18. A method according to claim 1, wherein the image mask is a binary image mask.
19. An image mask produced by a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
20. Computer code arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
21. Apparatus arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
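The processing steps recited in claims 1, 2, and 10-17 above (thresholding, morphological opening to remove noise and small cells, removal of isolated areas below a size threshold, closing to smooth boundaries, autocorrelation to estimate cell spacing, and a distance transform to separate a border layer one cell wide) can be illustrated with a minimal sketch. This is not the patented implementation: it assumes Python with NumPy/SciPy, and every parameter name and value (`stain_threshold`, `cell_width_px`, `min_area_px`, the 3x3 structuring elements) is an illustrative assumption.

```python
import numpy as np
from scipy import ndimage

def estimate_cell_width_px(grey):
    """Claim 2 (sketch): estimate cell spacing from the autocorrelation of a
    greyscale image; the first off-centre peak approximates the nuclear period."""
    g = grey.astype(float) - grey.mean()
    f = np.fft.fft2(g)
    acf = np.fft.ifft2(f * np.conj(f)).real   # circular autocorrelation
    row = acf[0, : acf.shape[1] // 2]         # scan one axis for this sketch
    for lag in range(1, len(row) - 1):        # first local maximum after lag 0
        if row[lag] > row[lag - 1] and row[lag] >= row[lag + 1]:
            return lag
    return None

def layer_masks(grey, stain_threshold, cell_width_px, min_area_px=50):
    """Claims 1 and 10-17 (sketch): threshold, clean the mask, then split each
    object into a border band one cell wide (e.g. basal layer) and the rest."""
    mask = grey < stain_threshold                          # stained areas are dark
    mask = ndimage.binary_opening(mask, np.ones((3, 3)))   # remove noise/small cells
    labels, n = ndimage.label(mask)                        # find connected areas
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.concatenate(([False], areas >= min_area_px)) # drop isolated areas
    mask = keep[labels]
    mask = ndimage.binary_closing(mask, np.ones((3, 3)))   # smooth boundaries
    dist = ndimage.distance_transform_edt(mask)            # distance to object border
    basal = mask & (dist <= cell_width_px)                 # outermost cell-wide band
    inner = mask & (dist > cell_width_px)
    return basal, inner
```

In use, `estimate_cell_width_px` would supply the `cell_width_px` argument of `layer_masks`, mirroring the claimed step of converting a cell width dimension into a pixel count before distinguishing layers.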
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0503087.9 | 2005-02-15 | ||
GBGB0503087.9A GB0503087D0 (en) | 2005-02-15 | 2005-02-15 | Apparatus and method for processing of specimen images for use in computer analysis thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006087526A1 true WO2006087526A1 (en) | 2006-08-24 |
Family
ID=34385481
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2006/000498 WO2006087526A1 (en) | 2005-02-15 | 2006-02-14 | Apparatus and method for processing of specimen images for use in computer analysis thereof |
Country Status (2)
Country | Link |
---|---|
GB (2) | GB0503087D0 (en) |
WO (1) | WO2006087526A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011072211A2 (en) | 2009-12-11 | 2011-06-16 | Aperio Technologies, Inc. | Improved signal to noise ratio in digital pathology image analysis |
DE102011084286A1 (en) * | 2011-10-11 | 2013-04-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | METHOD AND DEVICE FOR QUANTIFYING INJURY OF A SKIN TISSUE CUTTING |
CN109478230A (en) * | 2016-03-18 | 2019-03-15 | 光学技术注册协会莱布尼兹研究所 | The method for checking distributed objects by segmentation general view image |
JP2022116081A (en) * | 2015-05-28 | 2022-08-09 | アクソジェン コーポレーション | Quantitative structural assay of nerve graft |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK2156370T3 (en) | 2007-05-14 | 2012-01-23 | Historx Inc | Compartment separation by pixel characterization using image data clustering |
JP5593221B2 (en) | 2007-06-15 | 2014-09-17 | ヒストロックス,インコーポレイテッド. | Method and system for standardizing microscope equipment |
CA2604317C (en) | 2007-08-06 | 2017-02-28 | Historx, Inc. | Methods and system for validating sample images for quantitative immunoassays |
CA2596204C (en) | 2007-08-07 | 2019-02-26 | Historx, Inc. | Method and system for determining an optimal dilution of a reagent |
US7978258B2 (en) | 2007-08-31 | 2011-07-12 | Historx, Inc. | Automatic exposure time selection for imaging tissue |
EP2335221B8 (en) | 2008-09-16 | 2016-05-25 | Novartis AG | Reproducible quantification of biomarker expression |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5796862A (en) * | 1996-08-16 | 1998-08-18 | Eastman Kodak Company | Apparatus and method for identification of tissue regions in digital mammographic images |
US6081612A (en) * | 1997-02-28 | 2000-06-27 | Electro Optical Sciences Inc. | Systems and methods for the multispectral imaging and characterization of skin tissue |
WO2004044845A2 (en) * | 2002-11-12 | 2004-05-27 | Qinetiq Limited | Image analysis |
2005
- 2005-02-15 GB GBGB0503087.9A patent/GB0503087D0/en not_active Ceased

2006
- 2006-02-14 WO PCT/GB2006/000498 patent/WO2006087526A1/en not_active Application Discontinuation
- 2006-02-14 GB GB0602875A patent/GB2423150A/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5796862A (en) * | 1996-08-16 | 1998-08-18 | Eastman Kodak Company | Apparatus and method for identification of tissue regions in digital mammographic images |
US6081612A (en) * | 1997-02-28 | 2000-06-27 | Electro Optical Sciences Inc. | Systems and methods for the multispectral imaging and characterization of skin tissue |
WO2004044845A2 (en) * | 2002-11-12 | 2004-05-27 | Qinetiq Limited | Image analysis |
Non-Patent Citations (4)
Title |
---|
LEE E S ET AL: "Application of computerized image analysis in pigmentary skin diseases.", INTERNATIONAL JOURNAL OF DERMATOLOGY. JAN 2001, vol. 40, no. 1, January 2001 (2001-01-01), pages 45 - 49, XP002379546, ISSN: 0011-9059 * |
ONG S H ET AL: "Image analysis of tissue sections", COMPUTERS IN BIOLOGY AND MEDICINE, NEW YORK, NY, US, vol. 26, no. 3, May 1996 (1996-05-01), pages 269 - 279, XP004532242, ISSN: 0010-4825 * |
RHO N -K ET AL: "Histopathological parameters determining lesion colours in the naevus of Ota: a morphometric study using computer-assisted image analysis", BRITISH JOURNAL OF DERMATOLOGY, vol. 150, no. 6, June 2004 (2004-06-01), pages 1148 - 1153, XP002379547, ISSN: 0007-0963 * |
S.H. ONG ET AL: "Adaptive window based tracking for the detection of membrane structures in kidney electron micrographs", MACHINE VISION AND APPLICATIONS, vol. 6, no. 4, 1993, pages 215 - 223, XP009066091 *
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011072211A2 (en) | 2009-12-11 | 2011-06-16 | Aperio Technologies, Inc. | Improved signal to noise ratio in digital pathology image analysis |
EP2510494A4 (en) * | 2009-12-11 | 2017-07-19 | Leica Biosystems Imaging, Inc. | Improved signal to noise ratio in digital pathology image analysis |
DE102011084286A1 (en) * | 2011-10-11 | 2013-04-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | METHOD AND DEVICE FOR QUANTIFYING INJURY OF A SKIN TISSUE CUTTING |
EP2581878A3 (en) * | 2011-10-11 | 2017-12-13 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and apparatus for quantification of damage to a skin tissue section |
JP2022116081A (en) * | 2015-05-28 | 2022-08-09 | アクソジェン コーポレーション | Quantitative structural assay of nerve graft |
US11847844B2 (en) | 2015-05-28 | 2023-12-19 | Axogen Corporation | Quantitative structural assay of a nerve graft |
CN109478230A (en) * | 2016-03-18 | 2019-03-15 | 光学技术注册协会莱布尼兹研究所 | The method for checking distributed objects by segmentation general view image |
JP2019510223A (en) * | 2016-03-18 | 2019-04-11 | ライブニッツ−インスティトゥート ヒュア フォトニッシェ テクノロジエン エーファオ | Method for inspecting distributed objects by segmenting overview images |
US20210081633A1 (en) * | 2016-03-18 | 2021-03-18 | Leibniz-Institut Für Photonische Technologien E.V. | Method for examining distributed objects by segmenting an overview image |
US11599738B2 (en) * | 2016-03-18 | 2023-03-07 | Leibniz-Institut Für Photonische Technologien E.V. | Method for examining distributed objects by segmenting an overview image |
Also Published As
Publication number | Publication date |
---|---|
GB2423150A (en) | 2006-08-16 |
GB0503087D0 (en) | 2005-03-23 |
GB0602875D0 (en) | 2006-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006087526A1 (en) | Apparatus and method for processing of specimen images for use in computer analysis thereof | |
JP6604960B2 (en) | Medical image analysis to identify biomarker positive tumor cells | |
Byun et al. | Automated tool for the detection of cell nuclei in digital microscopic images: application to retinal images | |
US9547801B2 (en) | Methods of chromogen separation-based image analysis | |
Bjornsson et al. | Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue | |
US8175369B2 (en) | Multi-nucleated cell classification and micronuclei scoring | |
JP4948647B2 (en) | Urine particle image region segmentation method and apparatus | |
US20070135999A1 (en) | Method, apparatus and system for characterizing pathological specimen | |
US20150186755A1 (en) | Systems and Methods for Object Identification | |
CN112215790A (en) | KI67 index analysis method based on deep learning | |
EP3640837A1 (en) | System for co-registration of medical images using a classifier | |
Kelly et al. | Quantification of neuronal density across cortical depth using automated 3D analysis of confocal image stacks | |
Lezoray et al. | Segmentation of cytological images using color and mathematical morphology | |
US20150260973A1 (en) | Focus determination apparatus, focus determination method, and imaging apparatus | |
US10943350B2 (en) | Automated segmentation of histological sections for vasculature quantification | |
Marzec et al. | Efficient automatic 3D segmentation of cell nuclei for high-content screening | |
Palokangas et al. | Segmentation of folds in tissue section images | |
Khan et al. | Segmentation of single and overlapping leaves by extracting appropriate contours | |
Feng et al. | An advanced automated image analysis model for scoring of ER, PR, HER-2 and Ki-67 in breast carcinoma | |
WO2006085068A1 (en) | Apparatus and method for image processing of specimen images for use in computer analysis thereof | |
Han et al. | Multi-resolution tile-based follicle detection using color and textural information of follicular lymphoma IHC slides | |
CN114299044A (en) | Method and device for interpreting lymphocytes | |
Weingant | Quantification of nuclei in synthetic Ki-67 histology images of the breast: image analysis in digital pathology | |
Bell et al. | Fully automated screening of immunocytochemically stained specimens for early cancer detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 06709735 Country of ref document: EP Kind code of ref document: A1 |
WWW | Wipo information: withdrawn in national office |
Ref document number: 6709735 Country of ref document: EP |