GB2423150A - Distinguishing layers of epithelial tissue - Google Patents

Distinguishing layers of epithelial tissue

Info

Publication number
GB2423150A
Authority
GB
United Kingdom
Prior art keywords
image mask
image
biological specimen
areas
layers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0602875A
Other versions
GB0602875D0 (en)
Inventor
John R Maddison
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medical Solutions PLC
Original Assignee
Medical Solutions PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medical Solutions PLC filed Critical Medical Solutions PLC
Publication of GB0602875D0 publication Critical patent/GB0602875D0/en
Publication of GB2423150A publication Critical patent/GB2423150A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • G06T7/602
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The present invention provides a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.

Description

APPARATUS AND METHOD FOR PROCESSING OF SPECIMEN IMAGES
FOR USE IN COMPUTER ANALYSIS THEREOF
Background
The present invention relates to image processing, and more particularly to the processing of digital images of a microscope specimen to identify parts of the image of the specimen for computer analysis.
The computer analysis of biological specimens performed on digital images of a specimen acquired from scanning the specimen on a microscope slide has advanced the accuracy of diagnostic and prognostic techniques, and is increasingly used for research purposes to identify the effect of drugs on biological tissue. One example of such computer image analysis techniques is field fraction measurements, which are widely used to determine a quantity of stain taken up by a stained biological specimen, the quantity of stain being associated with an attribute or characteristic of the specimen. Another example involves counting of the number of features (e.g. cell nuclei) within a specimen. Such computer measurements are vastly superior to the subjective and potentially less accurate corresponding manual analysis previously performed by laboratory technicians or pathologists in analysing a specimen directly using a microscope.
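By way of illustration only (this sketch is not part of the patent disclosure; the function name and the toy mask are invented for the example), a field fraction measurement on a binary image mask reduces to counting the stained pixels:

```python
import numpy as np

def field_fraction(mask: np.ndarray) -> float:
    """Fraction of pixels flagged as stained in a binary image mask."""
    return float(np.count_nonzero(mask)) / mask.size

# Toy 4x4 mask with 4 of 16 pixels stained: field fraction 0.25.
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
print(field_fraction(mask))
```

The resulting fraction would then be related to an attribute of the specimen, as described above.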
In certain circumstances, the computer analysis of a biological specimen image concerns only the analysis of a very specific type of tissue, which corresponds to only part of the specimen. For example, the analysis may only be concerned with a certain region of stratified epithelial tissue, such as the basal layer of the epidermis of a skin specimen. If other regions of tissue, or other types of tissue, are present within the biological specimen image analysed, the computer analysis might produce misleading results. Thus, conventionally, a laboratory technician will review the digital image of a specimen on a computer monitor prior to computer analysis thereof, and identify areas of the image showing the tissue region of interest for image analysis. Thus, conventional software applications allow a user to manually draw lines around areas of the specimen image displayed on a computer monitor using a mouse or equivalent.
Unfortunately, the manual identification or delineation of parts of the image of a biological specimen to undergo computer analysis is generally imprecise, subjective and extremely time consuming. The accuracy of identifying parts of the image containing the relevant tissue type, to undergo analysis, is dependent upon the user's skill, such as dexterity in using the mouse, and recognition of the types of cell or tissue to be analysed, as well as the time available to the user. Moreover, a technician may be required to perform delineation on several hundred specimen images at a time, which may lead to poor delineation by the user as he or she becomes tired or bored.
It would therefore be desirable to provide a method for automatically identifying tissue regions within a biological specimen image to undergo computer analysis by image processing, thereby avoiding the problems associated with performing the function manually.
Summary of the Invention
Accordingly, the present invention provides a method which automatically performs the above described delineation on an image of a biological specimen by, for example, distinguishing parts of the image corresponding to tissue having a certain morphological structure, such as a region of stratified epithelial tissue, from parts of the image corresponding to other regions of epithelial tissue, connective tissue and other biological and non-biological material.
According to a first aspect, the present invention provides a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising:
thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue;
identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension;
determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension;
using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and
providing corresponding image masks for the one or more layers.
The corresponding image masks for the one or more layers may specifically represent the stained areas of the biological specimen or may specifically represent the non-stained areas of the biological image, i.e. the corresponding image mask may be a positive or negative image. One or more of the image masks may or may not be displayed during the method.
The method may comprise using autocorrelation techniques on a greyscale image of the biological specimen to determine the spatial frequency of cell nuclei and thereby determine the width dimension of one or more cell layers.
The method may comprise determining the distance value of an object pixel from one or more object borders to determine the position of the pixel within the object.
The method may comprise using a distance transform algorithm to determine the distance of one or more pixels from one or more object borders to determine the position of the upper boundary of the basal layer of the epithelial tissue.
The method may be used to distinguish a basal layer of the epithelial tissue from other areas of the biological specimen.
The cells in a particular layer may have similar cell widths. The cells in different layers may have different cell widths.
The method may comprise delineating layers of a stratified epithelial tissue using the corresponding image masks for the one or more layers.
The step of identifying and adapting isolated areas of the image may comprise blurring regions of the image mask representing stained areas of the biological specimen by an amount to increase interconnection of clumped areas of the image mask.
The step of identifying and adapting isolated areas of the image may comprise measuring a two dimensional area of regions of the image mask representing stained areas of the biological specimen, and adapting the image mask such that areas of the image mask below a threshold size are represented as non-stained areas of the biological specimen.
The method may comprise adapting the image mask to remove noise from the image mask prior to the identifying and adapting process.
The method may comprise adapting the image mask to remove areas of the image mask corresponding to isolated cells below a predetermined size prior to the identifying and adapting process.
The method may comprise using a morphological opening operator prior to the identifying and adapting process.
The method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove noise.
The method may comprise using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove regions of the image mask representing cells of a size below a predetermined threshold.
The method may comprise smoothing the boundaries of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
The method may comprise using a closing morphological operator on regions of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
The image mask may be a binary image mask.
According to a second aspect, the present invention provides an image mask produced by a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising:
thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue;
identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension;
determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension;
using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and
providing corresponding image masks for the one or more layers.
According to a third aspect, the present invention provides computer code arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising:
thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue;
identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension;
determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension;
using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and
providing corresponding image masks for the one or more layers.
According to a fourth aspect, the present invention provides apparatus arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising:
thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue;
identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension;
determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension;
using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and
providing corresponding image masks for the one or more layers.
The present invention encompasses one or more of the above-mentioned aspects and/or embodiments in all various combinations whether or not specifically mentioned in that combination.
Description of Figures
Embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 depicts a typical image of a specimen containing stratified epithelial tissue;
Figure 2 is a flow diagram illustrating the method of an embodiment of the present invention;
Figure 3 is a flow diagram illustrating the method of identifying parts of a specimen image corresponding to different layers of stratified epithelial tissue in accordance with a preferred embodiment of the present invention;
Figure 4 is a black and white image mask for the specimen image of Figure 1 resulting from step 150 of the method of Figure 3;
Figure 5 is a black and white image mask for the specimen image of Figure 1 following step 180 of the method of Figure 3;
Figure 6 shows the specimen of Figure 1 in which parts of the image corresponding to certain layers within the stratified epithelium have been delineated in accordance with the method of Figure 3.
Specific Embodiments
Figure 1 shows an image of a typical biological specimen containing stratified epithelium, which has been prepared from a sample of human tissue, taken from an individual for pathology analysis. In particular, the specimen is a skin tissue section measuring approximately 2mm x 2mm taken from an individual as part of a study of the effects of a new drug. A skin sample is taken before, during, and at the end of the drug study and a biomarker is applied to the skin specimen using conventional pathology techniques. The biomarker biologically marks cells, or part of cells such as cell nuclei, cell membranes or cytoplasm, on which the effect of the drug is being examined. A stain may be used to visibly reveal the presence of the biomarker under the microscope, and thus in the specimen image, if the biomarker itself is not coloured.
As shown in Figure 1, the skin tissue section includes layers of structured epithelial tissue of the epidermis of the skin (to the left hand side of the image), and various underlying cells of the dermis comprising mainly connective tissue (to the right hand side of the image). Computer analysis is typically performed on the lower basal layer of epithelial tissue of the epidermis, where cell reproduction takes place, although other epithelial layers of the epidermis, or the whole of the epithelial part of the epidermis, may be analysed.
The skilled person will appreciate that whilst the following description relates particularly to the analysis of the stratified epithelium of the epidermis of the skin, the present invention is applicable to other types of stratified epithelial tissue, or other layered tissue types having an identifiable morphological structure. In addition, the present invention may be used for the analysis of stained biological specimens for other purposes, such as for conventional prognostic and diagnostic analysis.
As is apparent from the image of Figure 1, the general region of the image containing the epithelium of the epidermis is evident, and corresponds to the region that is most highly stained (i.e. the greyest) and has a high concentration of regularly arranged cell nuclei. However, the manual identification of the boundaries that correspond to the top and bottom of the epithelium is difficult. The boundaries (particularly the lower boundary corresponding to the bottom of the basal layer of the epidermis) are complex and intricate, requiring considerable time and skill for an individual to manually draw the boundary with the required degree of precision.
Moreover, the upper boundary of the epithelial tissue, below the upper epidermis (which contains dead and denucleated cells), is difficult to clearly distinguish, as is the top of the basal layer, which is approximately one cell in thickness.
For these reasons, the task of accurately drawing the lines around the basal layer typically takes about 30 minutes per specimen. Moreover, due to the subjective and imprecise manual delineation of the regions of the image to undergo analysis, it is likely that such regions may include parts of the image that do not contain epithelial cells and/or epithelial cells of the basal layer of the epidermis. Thus, the computer analysis may be performed on parts of the specimen image which should not, ideally, be considered in the analysis. This may lead to poor results.
Accordingly, the present invention aims to automatically identify and delineate parts of an image of a biological specimen, which correspond to one or more specific layers of a stratified tissue structure, for computer analysis.
Figure 2 shows the steps performed in accordance with a method of the present invention. The method is typically performed by a computer program running on a computer receiving digital specimen images acquired from scanning a microscope slide containing the specimen, in accordance with conventional techniques.
At step 10, the method receives high resolution image data for the specimen which is to be subject to analysis. The analysis technique may be any form of computer analysis used for the purposes of prognosis or diagnosis, drug studies or other pathology related purposes. The high resolution image data received at step 10 may be the image data for a complete specimen obtained from the scanning of the microscope slide, or, more preferably, it may be a user or computer determined selected large part of the specimen.
At step 20, the method divides the image data received at step 10 into multiple, smaller image portions. Typically, the image is divided into image squares, which may each be more conveniently processed separately, and are of a chosen size to ensure efficient image processing using the techniques described herein.
At step 30, the method processes, in turn, each image portion to identify areas of the image contained therein which should undergo computer analysis. In a preferred embodiment, the image processing performed at step 30 identifies or delineates areas of epithelial tissue of the epidermis within a specimen image, and also a specific layer of the epithelial epidermis, namely the basal layer, using the method of a preferred embodiment of the present invention illustrated in Figure 3 and described below.
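The division of step 20 can be sketched as follows (an illustrative sketch only, not the patent's implementation; the helper name and the 64-pixel tile size are assumptions):

```python
import numpy as np

def tile_image(image: np.ndarray, tile: int):
    """Yield square image portions; edge tiles may be smaller."""
    height, width = image.shape[:2]
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield image[y:y + tile, x:x + tile]

# A 100x100 image split with a 64-pixel tile gives four portions.
image = np.arange(100 * 100).reshape(100, 100)
portions = list(tile_image(image, 64))
```

Each yielded portion could then be processed independently, as step 30 describes.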
At step 40, the method performs the conventional computer analysis technique on selected, delineated areas of the image from step 30.
Thus, the method of Figure 2 performs image analysis only on computer-determined areas of the specimen image which are relevant to the computer analysis technique. Thus, the results of computer analysis are optimised, and the results are consequently more useful to the pathologist.
Figure 3 illustrates a method for image processing, to identify image areas of one or more layers of stratified epithelial tissue to undergo analysis, in accordance with a preferred embodiment of the present invention. The method is typically performed by a computer program running on a computer, which may form part of, or be separate from, the program performing the method illustrated in Figure 2, and may run on the same or a different computer. The program is typically stored on a computer readable medium such as a magnetic or optical disk, which may be loaded into the memory of the computer on which it is run. The skilled person will appreciate, from the following description, that not all of the illustrated method steps are essential to the present invention, although optimal results may be obtained for certain tissue types using all of the steps of the preferred embodiment illustrated in Figure 3.
At step 110, the method receives a first portion of a high resolution image of the biological specimen. Typically, the image portion is one of multiple, relatively small image portions from a complete specimen, or large part thereof, as provided by step 20 of the method of Figure 2.
At step 120, the method performs image thresholding to provide a black and white (i.e. binarised) image from the original colour or greyscale image. The automatic thresholding of images is well known in the art and will be familiar to the skilled person. A description of the thresholding technique is described in a paper by N Otsu entitled "A Threshold Selection Method From Grey-Level Histograms", IEEE Trans. Systems, Man and Cybernetics, Vol. 9, No. 1, pages 62-66, 1979.
The thresholding of step 120 results in a black and white image mask (not shown), whereby the white areas of the mask represent the most highly marked or stained portions of the original specimen image from Figure 1. Thus, the image mask includes white areas corresponding to the parts of the specimen marked with the biomarker, i.e. cells within the epithelial epidermis, and any other parts of the specimen that have been stained, i.e. connective tissue cells within the dermis. The mask may also include white areas corresponding to "noise" resulting from image processing (such as the thresholding step).
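A minimal sketch of the histogram-based threshold selection of the Otsu paper cited above, applied to a synthetic two-level image (the toy image and all names are invented for this example, and are not from the patent):

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Otsu's method: pick the grey level that maximises the
    between-class variance of the image histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum_w = np.cumsum(hist)                       # class-0 weight per threshold
    cum_mean = np.cumsum(hist * np.arange(256))   # class-0 mass per threshold
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum_w[t], total - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_mean[t] / w0
        m1 = (cum_mean[-1] - cum_mean[t]) / w1
        var = w0 * w1 * (m0 - m1) ** 2            # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Synthetic bimodal image: background grey 50, stained nuclei grey 200.
gray = np.full((32, 32), 50, dtype=np.uint8)
gray[8:24, 8:24] = 200
t = otsu_threshold(gray)
binarised = gray > t     # white (True) = stained areas
```

The resulting boolean array plays the role of the black and white image mask described in step 120.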
At step 130, the method filters out or removes objects (white areas) within the image mask which are considered to be small. This removes from the image mask objects corresponding to small cells and cell nuclei, such as small connective tissue cells of the dermis, as well as noise resulting from the thresholding process. This filtering technique may be performed using the "morphological opening operator" well known in the field of image processing. This results in a black and white image mask in which narrow noise lines and non-epithelial smaller cell nuclei present following step 120 are now removed.
Next, at step 140, the method filters out or removes from the image objects which are not part of a regular tissue configuration comprising groups of interconnected cells (referred to generally herein as "clumped cells") typical of epithelial tissue. In particular, epithelial tissue, such as the tissue of the lower layers of the epidermis, typically comprises cells which are clumped or joined together to take on a structured form, as shown in Figure 1.
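The small-object removal of step 130 can be illustrated with a morphological opening, here using SciPy's `binary_opening` on a toy mask (an assumed sketch, not the patent's code; the mask contents and structuring-element size are invented):

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((20, 20), dtype=bool)
mask[2:10, 2:10] = True   # a large epithelial clump
mask[15, 15] = True       # a single-pixel noise speck

# Opening (erosion then dilation) with a 3x3 structuring element
# erases objects narrower than the element, such as specks and
# thin noise lines, while preserving the large clump.
opened = ndimage.binary_opening(mask, structure=np.ones((3, 3), dtype=bool))
```

The choice of structuring element size sets the smallest object that survives the filter.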
It will be noted in Figure 5 that clumped cells, although in close proximity to one another, will not necessarily all be touching one another in the image mask. To identify clumped cells, white regions are blurred in the x and y directions by a particular amount to expand their size. Thus, neighbouring cells forming part of the clumped structure, which were previously not touching in the image mask, will now be touching one another. However, although the blurring process will also increase the size of regions not considered to be part of the clumped structure (i.e. isolated regions), it will, in most (if not all) cases, lead to these regions remaining as "non-clumped" (i.e. they will still be isolated regions not forming a clumped structure).
This is done by ensuring that the blurring process, which is part of the filtering process 140, does not over-expand the size of white regions/objects to produce such errors.
Then, the filtering process of step 140 measures the two-dimensional area of each object (i.e. white area) within the image mask, and removes such white areas which are below a predefined threshold area. Typically, the threshold area is predetermined based on the type of specimen, particularly the tissue type of interest, the magnification of the objective lens used to acquire the specimen image, and other known factors. The threshold is chosen so that the process removes objects of a size below the normal size for the tissue of interest.
The measurement of the area of each object within the image is achieved by first identifying each individual object within the image, and then determining the number of pixels (which is proportional to the area, and thus the size, of the object). It will be appreciated that individual objects comprise all white areas where adjacent pixels are touching. This leads to the black and white image mask containing only epithelial tissue and no disconnected (isolated) cells and cell nuclei. Note that image portions containing only connective tissue, such as image portions to the right hand side of the image of Figure 1, may produce a completely black mask at this stage, and the program may optionally include a step, at this stage, which determines whether all pixels of the mask are black, and if so returns to step 110.
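The clumped-cell filter of step 140 — expanding white regions to bridge small gaps, labelling connected objects, and discarding objects below a threshold area — might be sketched as follows (binary dilation stands in for the blurring described above; the toy mask, the 50-pixel threshold and all names are illustrative assumptions, not the patent's values):

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((30, 30), dtype=bool)
mask[5:15, 5:10] = True    # two nearby epithelial cells...
mask[5:15, 12:17] = True   # ...separated by a two-pixel gap
mask[25, 25] = True        # a small isolated cell

# Dilation stands in for the blurring step: it bridges small gaps
# so that cells of the clumped structure merge into one object.
expanded = ndimage.binary_dilation(mask, iterations=2)

# Label connected objects in the expanded mask, then measure each
# object's area on the ORIGINAL mask and drop objects below threshold.
labels, n = ndimage.label(expanded)
areas = ndimage.sum(mask, labels, index=range(1, n + 1))
keep = [i + 1 for i, a in enumerate(areas) if a >= 50]
clumped = np.isin(labels, keep) & mask
```

Measuring areas on the original mask, rather than the expanded one, avoids the over-expansion errors cautioned against above.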
At step 150, the method next removes the fine detail from the boundaries of the image, in order to provide a smooth outline for the mask. This removes any fine black lines, extending into the epithelial tissue, which may be present as a result of the staining of the specimen and the accuracy of the image processing. Step 150 may be performed using the "closing morphological operator" well known in image processing. This results in the black and white image mask corresponding to a portion of the image mask shown in Figure 4, which is saved in memory as the main mask at step 150. The boundaries of the image mask of Figure 4 (which represents an image mask for the complete specimen of Figure 1) thus correspond to the bottom of the basal layer of the epidermis, and the top of the epithelium of the epidermis (which excludes the upper layers of the epidermis which comprise denucleated, dead cells).
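The boundary smoothing of step 150 can be illustrated with a morphological closing, which fills narrow black lines without shifting the outer boundary of the object (an illustrative sketch; the toy mask and structuring-element size are ours):

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((13, 13), dtype=bool)
mask[2:11, 2:11] = True    # tissue object, clear of the image frame
mask[:, 6] = False         # a thin black fissure splitting the object

# Closing (dilation then erosion) with a 3x3 structuring element
# fills the one-pixel-wide fissure while the object's outer
# boundary is left in place.
closed = ndimage.binary_closing(mask, structure=np.ones((3, 3), dtype=bool))
```

After closing, the mask is again a solid object with a smooth outline, analogous to the main mask of Figure 4.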
If computer analysis is to be performed on the whole of the epithelium of the epidermis, the program may proceed from step 150 to step 190 (described below) until all image portions have been processed. Then, the mask of Figure 4 may be used to delineate the relevant part of the image of Figure 1 for computer analysis.
However, in most instances, precise delineation of the basal layer is desirable, in order for analysis to be performed exclusively on the basal layer of the epidermis. Thus, in the preferred embodiment illustrated in Figure 3, further image processing is performed to identify the basal layer (and/or other relevant layers) of the stratified epithelium of the epidermis. At step 160, the method determines, for each object pixel within the mask, i.e. each white pixel, the distance from the closest black pixel, corresponding to the boundary of the object. This process may be performed by any conventional technique. For example, the "distance transform" algorithm well known in image processing may be used. Thus, step 160 produces a "distance value" for each white or object pixel, whereby the distance value is lowest for pixels close to the object boundaries, and highest at the centre of the object, between the boundaries.
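One conventional way to compute such a distance map is a multi-source breadth-first search seeded at every black pixel; this yields the city-block (4-connected) distance, which is one of several metrics a distance transform may use. The sketch below assumes a fully surrounded object (every white pixel can reach a black pixel) and illustrative names:

```python
from collections import deque

def distance_transform(mask):
    """City-block distance from each white pixel to the nearest black pixel.

    Breadth-first search seeded at every black pixel: a white pixel adjacent
    to black gets distance 1, and values grow toward the centre of each
    object, as described for step 160.
    """
    h, w = len(mask), len(mask[0])
    dist = [[0 if mask[y][x] == 0 else None for x in range(w)] for y in range(h)]
    q = deque((y, x) for y in range(h) for x in range(w) if mask[y][x] == 0)
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist
```

For a 3x3 white square surrounded by black, the edge pixels receive distance 1 and the centre pixel distance 2, matching the description of low values near boundaries and the highest value at the object centre.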
At step 170, the method determines the approximate number of pixels corresponding to the size of epithelial cells within the object. This may be achieved by determining the spatial frequency of cells and/or cell nuclei within the epithelial areas, and is preferably performed on a greyscale version of the image, using the mask to identify the epithelial areas. Any suitable technique may be used, such as autocorrelation, which measures the distance between repeating signals corresponding to repeating features within the image, and thus provides the spatial frequency of the cell nuclei, which are the distinct, repeating features within the image. The skilled person will appreciate that other methods for determining the cell dimension in the image are possible and contemplated.
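The autocorrelation idea can be illustrated in one dimension: correlate a mean-centred intensity profile with shifted copies of itself, and the lag at which the correlation peaks gives the repeat period, i.e. the cell spacing in pixels. This is a simplified sketch (the patent's method would operate on the masked 2-D greyscale image; the name `cell_spacing` and the choice of searching lags up to half the signal length are assumptions):

```python
def cell_spacing(profile):
    """Estimate the repeat period (in pixels) of a 1-D intensity profile.

    Computes the autocorrelation of the mean-centred signal at each lag and
    returns the lag with the highest score, which for a periodic signal
    corresponds to the spacing of the repeating features (cell nuclei).
    """
    n = len(profile)
    mean = sum(profile) / n
    centred = [v - mean for v in profile]
    best_lag, best_score = 0, float("-inf")
    for lag in range(1, n // 2):
        score = sum(centred[i] * centred[i + lag] for i in range(n - lag))
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag
```

For a synthetic profile with a bright peak every 4 pixels, the estimator recovers a spacing of 4.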
At step 180, the method uses the results of steps 160 and 170 to determine the upper boundary of the basal layer of the epidermis. In particular, since the basal layer is one cell in thickness, step 180 finds all white, object pixels that have a distance value determined by step 160 corresponding to the pixel distance from the boundary, equivalent to the value determined at step 170. Thus, for example, if step 170 determines that the spatial frequency of cells (i.e. cell size) equates to 30 pixels, step 180 finds all pixels within the object that are 30 pixels from the object boundary, using the values determined at step 160.
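Given the step-160 distance map and the step-170 cell size, step 180 reduces to a per-pixel equality test. A minimal sketch, assuming `dist` is a distance map of integers and `cell_px` is the cell width in pixels (e.g. 30 in the example above):

```python
def basal_layer_boundary(dist, cell_px):
    """Mark object pixels whose distance from the object boundary equals
    the cell width in pixels, i.e. the line lying one cell in from the
    boundary (the upper boundary of the basal layer in the example)."""
    return [[1 if d == cell_px else 0 for d in row] for row in dist]
```

For a one-row distance map rising from the boundary and falling again, only the pixel at the requested distance is selected.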
This may result in a portion of a mask similar to that of Figure 4, in which the object boundary is reduced by the number of pixels corresponding to the cell size, as shown in Figure 5. This mask portion is then saved in memory as a second mask, and the program proceeds to step 190. The mask of Figure 5 (which represents an image mask for the complete specimen of Figure 1), when superimposed on the mask of Figure 4, can be used to delineate the three relevant boundaries of the specimen image of Figure 1, namely the upper and lower basal layer boundaries and the upper layer of the epithelium of the epidermis. For this purpose, the upper boundary of the mask of Figure 5 is ignored.
Alternatively, step 180 may simply identify and save the object pixels that are the relevant distance from the lower, basal layer boundary, thus identifying a line of pixels corresponding to the upper boundary of the basal layer. This line of pixels may then be combined with the boundaries of the mask of Figure 4 to provide the three boundaries to be superimposed on the specimen image for delineation of regions thereof. At step 190, the program considers whether there is another image portion to be processed, and if so returns to step 110. Otherwise the program ends at step 200, and computer analysis may subsequently be performed in accordance with step 40 of Figure 2.
Figure 6 shows the specimen image of Figure 1 on which automatic delineation has been performed in accordance with the preferred embodiment of the present invention. As can be seen by comparing Figure 6 with Figure 1, the method of the present invention identifies areas of epithelium, and specifically layers of stratified epithelial tissue, more precisely and more quickly than manual delineation.
Computer analysis may be performed on the delineated portions of the complete specimen, or large part thereof, received at step 10 of the method of Figure 2, using conventional techniques as described above in relation to step 40 of Figure 2.
Various modifications and changes may be made to the described embodiments. For example, whilst the described embodiment of the present invention identifies a layer of stratified epithelium that is known to be approximately one cell in thickness, it may equally be used to identify layers having a known, greater thickness. It is intended to include all such variations, modifications and equivalents which fall within the scope of the present invention.
The present invention encompasses one or more aspects or embodiments of the present invention in all various combinations, whether or not specifically claimed or described in the present specification in that combination.

Claims (21)

  1. A method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
  2. A method according to claim 1, comprising using autocorrelation techniques on a greyscale image of the biological specimen to determine the spatial frequency of cell nuclei and thereby determine the width dimension of one or more cell layers.
  3. A method according to claim 1, comprising determining the distance value of an object pixel from one or more object borders to determine the position of the object pixel within the object.
  4. A method according to claim 1, comprising using a distance transform algorithm to determine the distance of one or more pixels from one or more object borders to determine the position of the upper boundary of the basal layer of the epithelial tissue.
  5. A method according to claim 1, wherein the method is used to distinguish a basal layer of the epithelial tissue from other areas of the biological specimen.
  6. A method according to claim 1, wherein the cells in a particular layer have similar cell widths.
  7. A method according to claim 1, wherein the cells in different layers have different cell widths.
  8. A method according to claim 1, comprising delineating layers of a stratified epithelial tissue using the corresponding image masks for the one or more layers.
  9. A method according to claim 1, wherein the step of identifying and adapting isolated areas of the image comprises blurring regions of the image mask representing stained areas of the biological specimen by an amount to increase interconnection of clumped areas of the image mask.
  10. A method according to claim 1, wherein the step of identifying and adapting isolated areas of the image comprises measuring a two-dimensional area of regions of the image mask representing stained areas of the biological specimen, and adapting the image mask such that areas of the image mask below a threshold size are represented as non-stained areas of the biological specimen.
  11. A method according to claim 1, comprising adapting the image mask to remove noise from the image mask prior to the identifying and adapting process.
  12. A method according to claim 1, comprising adapting the image mask to remove areas of the image mask corresponding to isolated cells below a predetermined size prior to the identifying and adapting process.
  13. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process.
  14. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove noise.
  15. A method according to claim 1, comprising using a morphological opening operator prior to the identifying and adapting process to adapt regions of the image mask representing stained areas of the biological specimen to remove regions of the image mask representing cells of a size below a predetermined threshold.
  16. A method according to claim 1, comprising smoothing the boundaries of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
  17. A method according to claim 1, comprising using a closing morphological operator on regions of the image mask representing stained areas of the biological specimen subsequent to the identifying and adapting process.
  18. A method according to claim 1, wherein the image mask is a binary image mask.
  19. An image mask produced by a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
  20. Computer code arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
  21. Apparatus arranged to perform a method for automatically distinguishing layers of stratified epithelial tissue from other regions of a biological specimen using computer image analysis, stratified epithelial tissue being characterised by layers of clumped cells, each layer having a thickness corresponding to one or more cell widths, the method comprising: thresholding a stained biological specimen to provide an image mask of the biological specimen, the image mask distinguishing between stained and non-stained areas of the biological specimen, the stained areas comprising stratified epithelial tissue; identifying isolated areas of the image mask representing stained areas of the biological specimen in the image mask and adapting the image mask such that the isolated areas represent non-stained areas of the biological specimen, the adapted image mask comprising objects formed from pixels, the objects representing stratified epithelial tissue regions, and the pixels each having a width dimension; determining the width dimension of one or more cell layers and using the pixel width dimension to determine the number of pixels corresponding to the cell width dimension; using the determined number of pixels corresponding to the cell width dimension to distinguish between one or more layers of the stratified epithelial tissue; and providing corresponding image masks for the one or more layers.
GB0602875A 2005-02-15 2006-02-14 Distinguishing layers of epithelial tissue Withdrawn GB2423150A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB0503087.9A GB0503087D0 (en) 2005-02-15 2005-02-15 Apparatus and method for processing of specimen images for use in computer analysis thereof

Publications (2)

Publication Number Publication Date
GB0602875D0 GB0602875D0 (en) 2006-03-22
GB2423150A true GB2423150A (en) 2006-08-16

Family

ID=34385481

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB0503087.9A Ceased GB0503087D0 (en) 2005-02-15 2005-02-15 Apparatus and method for processing of specimen images for use in computer analysis thereof
GB0602875A Withdrawn GB2423150A (en) 2005-02-15 2006-02-14 Distinguishing layers of epithelial tissue

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB0503087.9A Ceased GB0503087D0 (en) 2005-02-15 2005-02-15 Apparatus and method for processing of specimen images for use in computer analysis thereof

Country Status (2)

Country Link
GB (2) GB0503087D0 (en)
WO (1) WO2006087526A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7907271B2 (en) 2007-06-15 2011-03-15 Historx, Inc. Method and system for standardizing microscope instruments
US7978258B2 (en) 2007-08-31 2011-07-12 Historx, Inc. Automatic exposure time selection for imaging tissue
US8121365B2 (en) 2007-08-07 2012-02-21 Historx, Inc. Method and system for determining an optimal dilution of a reagent
US8160348B2 (en) 2007-08-06 2012-04-17 Historx, Inc. Methods and system for validating sample images for quantitative immunoassays
US8335360B2 (en) 2007-05-14 2012-12-18 Historx, Inc. Compartment segregation by pixel characterization using image data clustering
US9240043B2 (en) 2008-09-16 2016-01-19 Novartis Ag Reproducible quantification of biomarker expression
WO2017158560A1 (en) * 2016-03-18 2017-09-21 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5786110B2 (en) 2009-12-11 2015-09-30 ライカ バイオシステムズ イメージング インコーポレイテッドAperio Technologies, Inc. Improvement of signal-to-noise ratio in digital pathological image analysis
DE102011084286A1 (en) * 2011-10-11 2013-04-11 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. METHOD AND DEVICE FOR QUANTIFYING INJURY OF A SKIN TISSUE CUTTING
US9690975B2 (en) * 2015-05-28 2017-06-27 Axogen Corporation Quantitative structural assay of a nerve graft

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004044845A2 (en) * 2002-11-12 2004-05-27 Qinetiq Limited Image analysis

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796862A (en) * 1996-08-16 1998-08-18 Eastman Kodak Company Apparatus and method for identification of tissue regions in digital mammographic images
US6081612A (en) * 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004044845A2 (en) * 2002-11-12 2004-05-27 Qinetiq Limited Image analysis

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8335360B2 (en) 2007-05-14 2012-12-18 Historx, Inc. Compartment segregation by pixel characterization using image data clustering
US8655037B2 (en) 2007-05-14 2014-02-18 Historx, Inc. Compartment segregation by pixel characterization using image data clustering
US7907271B2 (en) 2007-06-15 2011-03-15 Historx, Inc. Method and system for standardizing microscope instruments
US8027030B2 (en) 2007-06-15 2011-09-27 Historx, Inc. Method and system for standardizing microscope instruments
US8120768B2 (en) 2007-06-15 2012-02-21 Historx, Inc. Method and system for standardizing microscope instruments
US8160348B2 (en) 2007-08-06 2012-04-17 Historx, Inc. Methods and system for validating sample images for quantitative immunoassays
US8417015B2 (en) 2007-08-06 2013-04-09 Historx, Inc. Methods and system for validating sample images for quantitative immunoassays
US8121365B2 (en) 2007-08-07 2012-02-21 Historx, Inc. Method and system for determining an optimal dilution of a reagent
US7978258B2 (en) 2007-08-31 2011-07-12 Historx, Inc. Automatic exposure time selection for imaging tissue
US9240043B2 (en) 2008-09-16 2016-01-19 Novartis Ag Reproducible quantification of biomarker expression
WO2017158560A1 (en) * 2016-03-18 2017-09-21 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image
US11599738B2 (en) 2016-03-18 2023-03-07 Leibniz-Institut Für Photonische Technologien E.V. Method for examining distributed objects by segmenting an overview image

Also Published As

Publication number Publication date
GB0503087D0 (en) 2005-03-23
GB0602875D0 (en) 2006-03-22
WO2006087526A1 (en) 2006-08-24

Similar Documents

Publication Publication Date Title
GB2423150A (en) Distinguishing layers of epithelial tissue
JP7197584B2 (en) Methods for storing and retrieving digital pathology analysis results
Bjornsson et al. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue
US4175860A (en) Dual resolution method and apparatus for use in automated classification of pap smear and other samples
CN111417958A (en) Deep learning system and method for joint cell and region classification in biological images
US20070135999A1 (en) Method, apparatus and system for characterizing pathological specimen
EP1978485A1 (en) Methods of chromogen separation-based image analysis
US20150186755A1 (en) Systems and Methods for Object Identification
WO2003105675A2 (en) Computerized image capture of structures of interest within a tissue sample
CN114945941A (en) Non-tumor segmentation for supporting tumor detection and analysis
EP3640837A1 (en) System for co-registration of medical images using a classifier
CN112215217B (en) Digital image recognition method and device for simulating doctor to read film
CN115546605A (en) Training method and device based on image labeling and segmentation model
Lo et al. Glomerulus detection on light microscopic images of renal pathology with the faster R-CNN
CN113393454A (en) Method and device for segmenting pathological target examples in biopsy tissues
US10943350B2 (en) Automated segmentation of histological sections for vasculature quantification
Palokangas et al. Segmentation of folds in tissue section images
Feng et al. An advanced automated image analysis model for scoring of ER, PR, HER-2 and Ki-67 in breast carcinoma
Khan et al. Segmentation of single and overlapping leaves by extracting appropriate contours
Silva et al. Development of a quantitative semi-automated system for intestinal morphology assessment in Atlantic salmon, using image analysis
EP3757872A1 (en) Scanning/pre-scanning quality control of slides
RU2659217C1 (en) Method for recognition of the structure of blood blast nuclei and bone marrow using light microscopy in combination with computer data processing for the diagnosis of b- and t-linear acute lymphoblastic leukemia
RU2785607C1 (en) Method for recognising the structure of blood and bone marrow blast nuclei
US20220375604A1 (en) System and method for automation of surgical pathology processes using artificial intelligence
WO2006085068A1 (en) Apparatus and method for image processing of specimen images for use in computer analysis thereof

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)