GB2329014A - Automated identification of tubercle bacilli - Google Patents
Automated identification of tubercle bacilli
- Publication number
- GB2329014A GB2329014A GB9719017A GB9719017A GB2329014A GB 2329014 A GB2329014 A GB 2329014A GB 9719017 A GB9719017 A GB 9719017A GB 9719017 A GB9719017 A GB 9719017A GB 2329014 A GB2329014 A GB 2329014A
- Authority
- GB
- United Kingdom
- Prior art keywords
- bacilli
- features
- image
- images
- thresholding
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
- G01N21/6456—Spatial resolved fluorescence measurements; Imaging
- G01N21/6458—Fluorescence microscopy
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/6428—Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
Landscapes
- Health & Medical Sciences (AREA)
- Immunology (AREA)
- Chemical & Material Sciences (AREA)
- Physics & Mathematics (AREA)
- Biochemistry (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Life Sciences & Earth Sciences (AREA)
- Analytical Chemistry (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Chemical Kinetics & Catalysis (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
In a method for the automated identification of tubercle bacilli from clinical specimens, an automated optical or fluorescent digital microscope is used to capture digital images from clinical specimens (principally sputum samples). These images are manipulated using image processing techniques such as edge detection, thresholding, feature detection and labelling, and the construction of shape descriptors for these features. Neural network techniques, statistical classifiers or methods from machine learning are then used to recognise image features corresponding to TB bacilli.
Description
1 Background to the Invention
Tuberculosis is now the world's leading cause of death from a single infectious disease.
In developing countries it is responsible for about 25% of adult deaths from infectious diseases, which is greater than that for diarrhea, malaria and AIDS combined. Current annual mortality from the disease is about 3 million and it is expected to rise to 4 million by 2004. Due to this increasing rate of infection the disease was classified as a global emergency by the WHO in 1993 [1,2,3]. The increased infection rate has resulted from a number of factors: the development of multi-drug resistance by the tubercle bacillus, inadequate programmes for controlling the disease, increased use of immunosuppressive drugs, co-infection with HIV, increased migration and the increasing number of young adults in the world (the group with the highest mortality rate from the disease). Tuberculosis remains a serious health problem in developing countries but it is also on the increase in the developed countries. For example, there has been a 33% increase in Switzerland during 1990-1995 and a 12% increase in the USA over the same period, where 1 in 5500 individuals is currently diagnosed as infected with the disease.
Infection with tuberculosis can be verified by viewing sputum samples using a microscope.
The identification of the tubercle bacilli is by no means simple and infection can be missed.
The screening of a large number of patients may require the viewing of hundreds of slides by qualified personnel. This is time-consuming and expensive and a potential drain on resources for medical facilities in developing countries. Consequently the aim of this patent is to give a cheap, quick and simple procedure for the automated recognition of tubercle bacilli from clinical specimens. By introducing this procedure we also expect that an increased number of patients can be handled and a larger volume of follow-up sputum samples can be taken from patients receiving DOT (Directly Observed Therapy) thereby facilitating identification of patients who are responding and those who may be resistant to therapy.
Though the method outlined here could be applicable to the detection of TB bacilli from other sites (e.g. lymph nodes, pleura, lung, etc.) the principal medium considered will be sputum specimens. Sputum specimens are generally stained to highlight the TB bacilli and there are two approaches. An acid-fast stain, such as the Ziehl-Neelsen (ZN) stain, can be used. In this case the bacilli can be viewed with an optical microscope and appear as thin pink-red rod-like organisms. Alternatively the organisms can be detected using fluorescent Auramine/Rhodamine and Papanicolaou stains. This approach is more sensitive than an acid-fast stain but requires fluorescence microscopy. Our method is applicable to both these modalities.
The proposed method makes use of recently developed techniques from image processing and image recognition. Our approach is similar to methods which have been used in the automated recognition of malignant cells from cervical smears. The latter methods have been patented [4,5,6,7,8] and commercially applied.
[1] "TB. A Global Emergency", WHO Report on the TB Epidemic, WHO Publications 1993.
[2] "Stop TB. at the Source", WHO Report on the Tuberculosis Epidemic, WHO Publications 1995.
[3] "Groups at risk", WHO Report on the Tuberculosis Epidemic, WHO Publications 1995.
[4] "Neural network based automated cytological specimen classification system and method", inventor: M.R. Rutenberg, US Patent 4,965,725 (Oct. 23, 1990).
[5] "Automated cytological specimen classification system and method", inventor: M.R. Rutenberg et al., US Patent 5,287,272 (Feb. 15, 1994).
[6] "Morphological classification system and method", inventor: R.L. Luck et al., US Patent 5,257,182 (Oct. 26, 1993).
[7] "Inspection apparatus and method with inspection auditing for images presented on a display", inventors: M.R. Rutenberg, M.Y. Monsey, US Patent 5,333,207 (July 26, 1994).
[8] "Automated specimen classification system and method", inventor: M.E. Boon et al., US Patent 5,544,650 (Aug. 13, 1996).
2 Figures
Figure 1: The principal processes involved in the method. An automated slide loader presents slides (e.g. of sputum samples) to a digital microscope. The digital microscope captures images from the slides. Features on the slide are enhanced and labelled by image processing techniques. The labelled features are presented to a recognition system comprising a neural network, a statistical discriminant analysis package or a decision-theoretic algorithm. This classifies the features as TB bacilli or non-bacilli features.
Figure 2: The principal steps used in the image processing and feature recognition steps used by the computer.
Figure 3: The direction conventions used by the boundary tracer outlined in step 4.
Figure 4: An illustration of a multi-layered neural network. Input data is presented to the input layer and processed by a hidden layer and the output node. The output classifies the input data as corresponding to a tubercle bacillus feature or a non-bacillus feature.
3 Description of the Preferred and Alternative Embodiments
1. Slide loading. First the slides are loaded using a robotic slide loader. After the scanning procedure outlined below they can be sorted into two groups in the slide tray depending on the outcome (see Fig. 1).
2. Image capture. Each slide is scanned and a number of separate digital images are captured using a charge-coupled device (CCD) camera. Either an optical or a fluorescence microscope could be used depending on the method of staining. A variety of microscopes could be used at this stage, for example, those manufactured by Olympus, Carl Zeiss Inc., Applied Precision Inc., Intracellular Imaging Inc., etc. Next the digital images are processed using a computer; the principal steps involved are laid out below and illustrated in Figure 2.
3. Image segmentation and Enhancement.
Following capture of the image, the features are enhanced using edge detection followed by thresholding.
3.1 Edge detection. A number of edge detection algorithms could be used [40]. The preferred embodiment is to use a Laplacian operator [4, 8] but Sobel [40], Canny [7] and Marr-Hildreth [22] operators can be effectively used instead.
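By way of illustration (not part of the original disclosure), the Laplacian option can be sketched in Python/NumPy using the common 3x3 discrete Laplacian kernel; the choice of kernel and the use of the absolute response as an edge-strength map are assumptions here:

```python
import numpy as np

def laplacian_edges(img):
    """Apply the standard 3x3 Laplacian kernel
    [[0, 1, 0], [1, -4, 1], [0, 1, 0]] to the image interior and
    return the absolute response as an edge-strength map."""
    img = img.astype(float)
    out = np.zeros_like(img)
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1]
                       + img[1:-1, :-2] + img[1:-1, 2:]
                       - 4.0 * img[1:-1, 1:-1])
    return np.abs(out)

# A vertical step edge gives a strong response on both sides of the step,
# while flat regions give zero response.
step = np.zeros((5, 6))
step[:, 3:] = 10.0
edges = laplacian_edges(step)
```

The zero response in flat regions is what makes the subsequent thresholding step (3.2) effective at isolating edge pixels.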
3.2 Thresholding. The second enhancement operation is thresholding. There are a number of different approaches to thresholding and the reviews of Gonzalez [11], Niblack [27] and Weszka [45] give a good background.
For the colour images (e.g. from ZN staining) the RGB components are thresholded. For monochrome images (e.g. from fluorescence microscopy) the grey levels are thresholded.
For example, for a monochrome image with 256 grey levels (0 representing black and 255 representing white) and f(x, y) (with 0 <= f(x, y) <= 255) representing the grey level of the image pixel at (x, y), the binary image after thresholding is given by:

g(x, y) = 0 if f(x, y) < T, and g(x, y) = 1 otherwise.

Two general types of thresholding are possible. For local thresholding the image is divided into windows with different thresholds in each window. Local thresholding is used if the background changes appreciably across the image. For global thresholding the threshold is the same across the image. We have found that global thresholding is sufficient for the types of image considered.
The other issue is a suitable value for the threshold. The threshold can be found by inspection of the histogram giving the frequency of occurrence of pixel intensities. This gives a good indication of a suitable threshold for the separation of background and object.
The staining technique highlights the bacilli and consequently this histogram is generally bimodal in shape. Consequently the preferred technique is to use the mode method: find the two highest maxima and use as threshold the minimum between these peaks. To avoid two local maxima belonging to the same peak a minimum separation in grey levels can be imposed or the histogram can be smoothed. This is the preferred embodiment but alternate embodiments are possible using more sophisticated thresholding techniques. For example, it is possible to use the thresholding technique of [9, 36] in which the histogram is approximated by two normal distributions, or the iterative threshold selection scheme of Ridler and Calvard [32], etc. The review by Sahoo [38] covers a number of further alternatives [15, 21, 26].
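The mode method and the global-threshold rule above can be sketched as follows (a minimal illustration, not the patent's implementation: the `min_sep` peak-separation value and the synthetic bimodal test image are assumptions):

```python
import numpy as np

def mode_threshold(img, min_sep=32):
    """Mode method: find the two highest histogram peaks at least
    min_sep grey levels apart, and threshold at the minimum (valley)
    between them."""
    hist = np.bincount(img.ravel(), minlength=256)
    p1 = int(np.argmax(hist))                        # highest peak
    mask = np.abs(np.arange(256) - p1) >= min_sep    # exclude bins near p1
    p2 = int(np.arange(256)[mask][np.argmax(hist[mask])])
    lo, hi = sorted((p1, p2))
    return lo + int(np.argmin(hist[lo:hi + 1]))      # valley between peaks

def global_threshold(img, T):
    """g(x, y) = 0 if f(x, y) < T, 1 otherwise."""
    return (img >= T).astype(np.uint8)

# Synthetic bimodal image: dark background near 30, a bright object near 200.
rng = np.random.default_rng(0)
img = np.clip(rng.normal(30, 5, (64, 64)), 0, 255).astype(np.uint8)
img[20:30, 20:30] = np.clip(rng.normal(200, 5, (10, 10)), 0, 255).astype(np.uint8)
T = mode_threshold(img)
binary = global_threshold(img, T)
```

The valley threshold lands between the background and object peaks, so the bright object survives thresholding while the background maps to 0.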
After edge detection followed by thresholding the images typically have several types of features present. The bacilli usually appear as slender features, generally straight and rod-like or slightly curved. They are also approximately of the same length. Other features which may appear in the image are longer (frequently curved) filamentary structures which correspond to the capture of the boundaries of other objects such as lymphocytes and other cellular tissue. In addition there are further small point-like features which are not associated with the bacilli. We will call the latter two categories of features artefacts since they are not associated with the bacilli and distract from the recognition. The word feature will be used generically to include both these artefacts and the bacilli. We have also tried morphological operations such as erosion and dilation to see if these improved performance but we did not find any positive benefit.
4. Boundary tracing and feature labelling. In the steps below we will extract relevant features from the image, derive a number of parameters describing each of these features (step 5) and then present these to the recognition system (step 6). Thus our next step is to detect important features (or sets of connected pixels, which we will call regions), labelling each such feature in turn. After labelling a number of regions or features in the image we then find the boundary trace for these regions. Boundary traces are required for some of the shape description routines outlined below. Straightforward 2-pass algorithms for region identification are described in Ch. 6 of Sonka et al. [40]. For boundary tracing we have used the following routine.
Search through the image from the top left corner until a pixel belonging to a new region is found. This pixel is the starting point of the boundary trace for the new region. A variable d stores the direction of the previous move along the boundary from the previous location to the present one. For 8-connectivity we initially assign d = 7. We then search the 8 pixels surrounding the current pixel in an anti-clockwise direction, with the start direction and subsequent search directions determined according to the convention given in Figure 3. If the current value of d is an even number then the start direction for the search is (d + 7) mod 8, while the start direction is (d + 6) mod 8 if the current value of d is an odd number. The anti-clockwise search through the pixels proceeds through increasing values of d, resetting to 0 after d = 7. The first pixel with value 1 is the next pixel on the boundary and the direction taken is assigned to the variable d. This procedure is repeated until we return to the starting pixel.
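The routine above can be sketched in Python. One caveat: Figure 3's direction convention is not reproduced here, so the neighbour offsets below (d = 0 pointing east, increasing anti-clockwise) are an assumed stand-in for it:

```python
import numpy as np

# 8-neighbour offsets (row, col) indexed by direction d, anti-clockwise,
# d = 0 pointing east. This convention stands in for the patent's Figure 3.
OFFSETS = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
           (0, -1), (1, -1), (1, 0), (1, 1)]

def trace_boundary(img):
    """Trace the boundary of the first region found when scanning from
    the top-left, following the step-4 rules: start with d = 7 and begin
    each neighbour search at (d+7) mod 8 (d even) or (d+6) mod 8 (d odd)."""
    rows, cols = img.shape
    start = next((r, c) for r in range(rows) for c in range(cols) if img[r, c])
    boundary, cur, d = [start], start, 7
    while True:
        first = (d + 7) % 8 if d % 2 == 0 else (d + 6) % 8
        for k in range(8):                           # anti-clockwise search
            nd = (first + k) % 8
            r, c = cur[0] + OFFSETS[nd][0], cur[1] + OFFSETS[nd][1]
            if 0 <= r < rows and 0 <= c < cols and img[r, c]:
                cur, d = (r, c), nd
                break
        else:                                        # isolated single pixel
            return boundary
        if cur == start:
            return boundary
        boundary.append(cur)

img = np.zeros((5, 5), dtype=np.uint8)
img[1:4, 1:4] = 1                                    # a 3x3 square region
trace = trace_boundary(img)
```

For the 3x3 square the trace visits the eight perimeter pixels and skips the interior pixel, which is what the shape descriptors in step 5 require.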
5. Shape description. Having found the features or regions and their boundaries, the next stage in the procedure is to develop shape descriptors for these features. The parameters from these shape descriptors will be fed into the classification systems mentioned in step 6 below. A number of shape descriptors are possible and we will outline three descriptors for purposes of illustration.
5.1 Fourier descriptors. Fourier descriptors classify the shape using Fourier coefficients [44, 12, 29, 31, 37]. First the boundary is found (step 4) and the boundary trace is stored in a counter-clockwise direction. In order to use a Fast Fourier Transform (FFT) algorithm this boundary is then divided into N = 2^n parts. Of course the number of boundary points may not be a power of 2; consequently the sequence is "padded out" using 0's to obtain the requisite power of 2 for the FFT. Each point on the boundary with location (x, y) is then written as a complex number z = x + jy and the resulting sequence (z(0), z(1), ..., z(N − 1)) can be written z(k). The FFT algorithm is a means for finding the Fourier coefficients c_n in the one-dimensional Fourier series:

z(k) = Σ_{n=0}^{N−1} c_n exp(j2πnk/N).
The c_n are complex in general (c_n = a_n + jb_n) and though the a_n and b_n are not invariants, the magnitudes |c_n| are translationally and rotationally invariant shape descriptors for the selected feature. The calculation of the FFT can be speeded up using dedicated circuitry. A description of suitable FFT algorithms can be found in Press et al. [30].
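As an illustrative sketch (using NumPy's FFT in place of dedicated circuitry; the number of retained coefficients `n_keep` is an arbitrary choice, not from the patent):

```python
import numpy as np

def fourier_descriptors(boundary, n_keep=8):
    """Write each boundary point (x, y) as z = x + jy, zero-pad the
    sequence to the next power of two, take the FFT and return the
    magnitudes |c_n| of the first n_keep coefficients."""
    z = np.array([complex(x, y) for x, y in boundary])
    n = 1 << int(np.ceil(np.log2(len(z))))   # requisite power of 2 for the FFT
    z = np.concatenate([z, np.zeros(n - len(z))])
    return np.abs(np.fft.fft(z)[:n_keep])

# Rotating the boundary about the origin multiplies every z by a unit
# complex number, so the coefficient magnitudes are unchanged.
square = [(1, 1), (2, 1), (3, 1), (3, 2), (3, 3), (2, 3), (1, 3), (1, 2)]
mags = fourier_descriptors(square)
mags_rot = fourier_descriptors([(-y, x) for x, y in square])  # 90-degree turn
```

The rotation check works because z → jz for a 90-degree turn and |j| = 1, which is exactly the invariance property claimed for the |c_n| in the text.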
5.2 Moment invariants. Another method for describing an object is by means of moments [11, 13, 20, 41]. For images with discrete components these moments have the form:

m_pq = Σ_x Σ_y x^p y^q f(x, y).

In step 4 the labelled regions have been found; the boundaries and interiors of these regions have pixel value 1 and the exteriors have pixel value 0. Consequently we find the moments for each of these regions using:

m_pq = Σ x^p y^q,

where the sum runs over the coordinates (x, y) of labelled points within that region, and the centre of each region is:

x̄ = m_10/m_00,  ȳ = m_01/m_00.

The central moments are then defined by:

μ_pq = Σ (x − x̄)^p (y − ȳ)^q

and are invariant with respect to location. For shape descriptors invariant with respect to rotation and scale changes we use the following parameters, called the moment invariants [13]:

φ_1 = η_20 + η_02
φ_2 = (η_20 − η_02)² + 4η_11²
φ_3 = (η_30 − 3η_12)² + (3η_21 − η_03)²
φ_4 = (η_30 + η_12)² + (η_21 + η_03)²
φ_5 = (η_30 − 3η_12)(η_30 + η_12)[(η_30 + η_12)² − 3(η_21 + η_03)²] + (3η_21 − η_03)(η_21 + η_03)[3(η_30 + η_12)² − (η_21 + η_03)²]
φ_6 = (η_20 − η_02)[(η_30 + η_12)² − (η_21 + η_03)²] + 4η_11(η_30 + η_12)(η_21 + η_03)
φ_7 = (3η_21 − η_03)(η_30 + η_12)[(η_30 + η_12)² − 3(η_21 + η_03)²] + (3η_12 − η_30)(η_21 + η_03)[3(η_30 + η_12)² − (η_21 + η_03)²]

where η_pq = μ_pq/μ_00^γ and γ = 1 + (p + q)/2, provided p + q = 2, 3, . . .
Because these invariants are sometimes small in value it is generally advisable to renormalise them. One scheme is to take the log of each φ_i (i.e. ln |φ_i|), though in the preferred embodiment we have used fractional roots of the higher-order invariants (e.g. of φ_5, φ_6 and φ_7) so that all the invariants have comparable magnitude. Instead of region moments an alternative embodiment is to use moments based on the boundary trace [40].
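The seven invariants above (Hu's moment invariants) can be computed directly from a binary region; the sketch below is illustrative only, and the test shape (an L-shaped region) is an assumption:

```python
import numpy as np

def hu_invariants(region):
    """Compute the seven moment invariants phi_1..phi_7 of a binary region
    from its normalised central moments eta_pq = mu_pq / mu_00**(1+(p+q)/2)."""
    ys, xs = np.nonzero(region)
    m00 = len(xs)                                # mu_00 = region area
    xbar, ybar = xs.mean(), ys.mean()            # centroid m10/m00, m01/m00
    def eta(p, q):
        mu = np.sum((xs - xbar) ** p * (ys - ybar) ** q)
        return mu / m00 ** (1 + (p + q) / 2)
    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    e30, e03, e21, e12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        e20 + e02,
        (e20 - e02) ** 2 + 4 * e11 ** 2,
        (e30 - 3 * e12) ** 2 + (3 * e21 - e03) ** 2,
        (e30 + e12) ** 2 + (e21 + e03) ** 2,
        (e30 - 3 * e12) * (e30 + e12)
            * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
            + (3 * e21 - e03) * (e21 + e03)
            * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2),
        (e20 - e02) * ((e30 + e12) ** 2 - (e21 + e03) ** 2)
            + 4 * e11 * (e30 + e12) * (e21 + e03),
        (3 * e21 - e03) * (e30 + e12)
            * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
            + (3 * e12 - e30) * (e21 + e03)
            * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2),
    ])

# The invariants should be (numerically) unchanged under a 90-degree rotation.
shape = np.zeros((20, 20))
shape[5:15, 8:10] = 1
shape[13:15, 8:14] = 1                           # an L-shaped region
phis = hu_invariants(shape)
phis_rot = hu_invariants(np.rot90(shape))
```

Comparing `phis` with `phis_rot` demonstrates the rotation invariance that motivates using the φ_i as shape descriptors.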
5.3 Circularity descriptor. In the images we are considering the bacilli are thin rod-like features with a typical length/breadth ratio. Consequently it is a good idea to use a circularity measure as a further distinguishing descriptor:

C = 4π × area / (boundary length)².

Thus for a circle C ≈ 1 whereas a thin rod-like structure gives a circularity measure C near to 0.
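A short numerical check of this descriptor (the rod dimensions below are illustrative, not from the patent):

```python
import numpy as np

def circularity(area, boundary_length):
    """C = 4*pi*area / (boundary length)^2: near 1 for a circle,
    near 0 for a thin rod-like structure."""
    return 4.0 * np.pi * area / boundary_length ** 2

# An ideal circle of radius r: area = pi*r^2, perimeter = 2*pi*r, so C = 1.
r = 5.0
c_circle = circularity(np.pi * r ** 2, 2 * np.pi * r)
# A thin 1 x 50 rod: area = 50, perimeter = 102, so C is well below 1.
c_rod = circularity(50.0, 102.0)
```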
5.4 Other descriptors. Other shape descriptors are useful such as those based on the
Hough transform [10, 2, 25, 14, 19, 28] or curvatures [35, 36, 42].
6. Classification of the features in each image. By itself each of these descriptors can have drawbacks. For example, the circularity descriptor mentioned in section 5.3 would seem very appropriate for distinguishing thin rod-like bacilli. However, circularity can be inaccurate for binary-valued images such as these, where the resolution of the objects can be small. Thus, in order to optimise performance, the preferred embodiment is to use all the shape descriptors for presentation to the classification systems outlined below.
The remaining task is to classify each feature into bacillus or non-bacillus based on this spectrum of parameters from the shape descriptors. There are a number of classification systems that could be used at this point and they broadly fall into three classes: those based on machine learning, statistical approaches and algorithms derived from neural computing. The preferred embodiment is to use neural computing algorithms (6.1) but alternate embodiments are machine learning algorithms (6.2) and statistical approaches (6.3). A review and comparison of all these different approaches to classification is given in [24, 33, 43].
6.1 Classification using a neural network. In the preferred embodiment a multi-layered perceptron (MLP) neural network is used for the classification. This type of neural network architecture is illustrated in Figure 4. The shape description parameters are presented at the input layer (these are the square nodes in Figure 4). A layer of hidden nodes and the output node process this data and the output node classifies the input data set accordingly. The main advantage of using a neural network is that it is trained from examples, not from a rule, and consequently it will learn the categorisation through presentation of a number of preclassified datasets. Neural network algorithms have been extensively reviewed elsewhere [5, 34].
For this type of neural network architecture the learning process consists in iteratively adapting the "weights", or values of the connection strengths (these connections are the links between the processing nodes shown in Figure 4). Thus if T^s is the target (desired) classification for presentation of the s-th input dataset I^s, and O^s is the output of the network on presentation of I^s, then the objective of the learning process is to minimise the following error E over all the datasets:

E = Σ_s (T^s − O^s)².

This sets up the desired mapping I^s → T^s between the shape descriptor parameter sets and the classifications (e.g. T^s = 1 indicates a TB bacillus and T^s = 0 indicates the feature is not a bacillus). This minimisation can be achieved by a gradient descent procedure (called Back Propagation) which systematically adjusts the parameters in the network (the weights) until E is very small and below a tolerance. This gradient descent procedure is generally slow but reliable. An alternative procedure is to use second-order learning methods (e.g. Quickprop, Scaled Conjugate Gradient, Bold Driver, etc.) to improve the speed of the learning process [39]. The number of hidden nodes in the neural network is found using cross-validation [5] or Bayesian model selection [23].
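The learning loop above can be sketched as a small NumPy back-propagation routine. This is illustrative only: the two-feature toy dataset, the 5 hidden nodes and the learning rate are assumptions, not values from the patent, and constant factors in the gradient of E are absorbed into the learning rate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the shape-descriptor vectors: two features per pattern,
# target T = 1 when x + y > 1 (a surrogate for "bacillus vs artefact").
X = rng.uniform(0.0, 1.0, (200, 2))
T = (X.sum(axis=1) > 1.0).astype(float).reshape(-1, 1)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# One hidden layer (5 nodes, an arbitrary choice) and one output node,
# trained by gradient descent on E = sum_s (T^s - O^s)^2.
W1 = rng.normal(0.0, 1.0, (2, 5)); b1 = np.zeros(5)
W2 = rng.normal(0.0, 1.0, (5, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(3000):
    H = sigmoid(X @ W1 + b1)            # hidden-layer activations
    O = sigmoid(H @ W2 + b2)            # network output
    dO = (O - T) * O * (1.0 - O)        # error signal at the output node
    dH = (dO @ W2.T) * H * (1.0 - H)    # error back-propagated to hidden layer
    W2 -= lr * H.T @ dO / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

O = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
accuracy = float(np.mean((O > 0.5) == (T > 0.5)))
```

Full-batch gradient descent as shown is the "slow but reliable" option mentioned in the text; the second-order methods it cites change only the weight-update step.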
Apart from a multi-layer perceptron a large number of other neural computing architectures and algorithms are alternate embodiments, including the use of radial basis function (RBF) networks [5], or constructive algorithms [6], or feature maps [16]. Alternate embodiments also include the use of Bayesian techniques to estimate the confidence in the classification made by the neural network [23].
6.2 Machine learning rules and trees. Decision tree approaches also provide alternative embodiments for classification (e.g. C4.5, ID3, NewID, Bayes tree, etc. [24]). Also included are neural decision trees [6], combining decision-theoretic approaches with neural computing.
6.3 Statistical approaches. Statistical approaches also provide alternative embodiments for the classification of the parameters from step 5. This includes classical statistical methods such as discriminant analysis [17, 18] and Bayes rule, as well as modern statistical techniques such as k-nearest neighbours, projection pursuit classification, ACE, MARS, etc. [24].
7. Final categorisation. For each image the classifiers in step 6 categorise the labelled features into tubercle bacilli or non-bacilli. The final step is to average the performance over a number of images. Tuberculosis is identified if the average number of bacilli features (per image) exceeds a threshold amount for those images with features present.
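The final decision rule can be sketched as follows (the threshold value and the per-image counts are illustrative assumptions; the patent does not specify a numerical threshold):

```python
import numpy as np

def diagnose(bacilli_counts, threshold=2.0):
    """Flag a specimen as positive when the average number of bacilli
    features per image, taken over images with features present,
    exceeds a threshold (the value 2.0 is illustrative only)."""
    counts = np.asarray(bacilli_counts, dtype=float)
    with_features = counts[counts > 0]
    if with_features.size == 0:
        return False
    return bool(with_features.mean() > threshold)

positive = diagnose([0, 3, 5, 0, 4])   # mean over non-empty images: 4.0
negative = diagnose([0, 1, 0, 0, 2])   # mean over non-empty images: 1.5
```

Averaging only over images with features present keeps empty fields of view from diluting the per-image bacilli count.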
Claims (1)
- 4 Claims
What is claimed is:
1. A method for detecting tuberculosis bacilli in clinical specimens (e.g. from sputum smears), comprising the steps of: (a) using an automated optical or fluorescent digital microscope to obtain digital images of at least part of such clinical specimens; (b) using image processing techniques to label and select significant features from said digital images; and (c) classifying said features as indicating the presence of tubercle bacilli using neural network, machine learning or statistical classification methods.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9719017A GB2329014A (en) | 1997-09-05 | 1997-09-05 | Automated identification of tubercle bacilli |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9719017A GB2329014A (en) | 1997-09-05 | 1997-09-05 | Automated identification of tubercle bacilli |
Publications (2)
Publication Number | Publication Date |
---|---|
GB9719017D0 GB9719017D0 (en) | 1997-11-12 |
GB2329014A true GB2329014A (en) | 1999-03-10 |
Family
ID=10818711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9719017A Withdrawn GB2329014A (en) | 1997-09-05 | 1997-09-05 | Automated identification of tubercle bacilli |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2329014A (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101852694A (en) * | 2010-04-09 | 2010-10-06 | 上海国际旅行卫生保健中心 | Sputum smear used for acid-fast bacillus microscope examination in proficiency testing activity and preparation method and application thereof |
CN102855641A (en) * | 2012-08-10 | 2013-01-02 | 上海电机学院 | Fruit level classification system based on external quality |
US9522396B2 (en) | 2010-12-29 | 2016-12-20 | S.D. Sight Diagnostics Ltd. | Apparatus and method for automatic detection of pathogens |
US10488644B2 (en) | 2015-09-17 | 2019-11-26 | S.D. Sight Diagnostics Ltd. | Methods and apparatus for detecting an entity in a bodily sample |
EP3608701A1 (en) * | 2018-08-09 | 2020-02-12 | Olympus Soft Imaging Solutions GmbH | Method for providing at least one evaluation method for samples |
US10640807B2 (en) | 2011-12-29 | 2020-05-05 | S.D. Sight Diagnostics Ltd | Methods and systems for detecting a pathogen in a biological sample |
US11099175B2 (en) | 2016-05-11 | 2021-08-24 | S.D. Sight Diagnostics Ltd. | Performing optical measurements on a sample |
US11307196B2 (en) | 2016-05-11 | 2022-04-19 | S.D. Sight Diagnostics Ltd. | Sample carrier for optical measurements |
US11434515B2 (en) | 2013-07-01 | 2022-09-06 | S.D. Sight Diagnostics Ltd. | Method and system for imaging a blood sample |
US11609413B2 (en) | 2017-11-14 | 2023-03-21 | S.D. Sight Diagnostics Ltd. | Sample carrier for microscopy and optical density measurements |
US11733150B2 (en) | 2016-03-30 | 2023-08-22 | S.D. Sight Diagnostics Ltd. | Distinguishing between blood sample components |
US12005443B2 (en) | 2020-10-06 | 2024-06-11 | S.D. Sight Diagnostics Ltd. | Apparatus and method for analyzing a bodily sample |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108285866A (en) * | 2018-01-18 | 2018-07-17 | 华南农业大学 | A kind of incubator with image analysis function |
CN113191670B (en) * | 2021-05-19 | 2023-08-25 | 贵州省气象灾害防御技术中心 | Fine lightning disaster risk evaluation and division method |
WO2023044690A1 (en) * | 2021-09-24 | 2023-03-30 | 中国科学院深圳先进技术研究院 | Bacterial colony classification method |
Non-Patent Citations (1)
Title |
---|
Proc. 8th Annual Conference of IEEE Engineering in Medicine & Biology Soc., vol. 2, 1986, pages 1020-22 * |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101852694A (en) * | 2010-04-09 | 2010-10-06 | 上海国际旅行卫生保健中心 | Sputum smear used for acid-fast bacillus microscope examination in proficiency testing activity and preparation method and application thereof |
US10843190B2 (en) | 2010-12-29 | 2020-11-24 | S.D. Sight Diagnostics Ltd. | Apparatus and method for analyzing a bodily sample |
US9522396B2 (en) | 2010-12-29 | 2016-12-20 | S.D. Sight Diagnostics Ltd. | Apparatus and method for automatic detection of pathogens |
US11584950B2 (en) | 2011-12-29 | 2023-02-21 | S.D. Sight Diagnostics Ltd. | Methods and systems for detecting entities in a biological sample |
US10640807B2 (en) | 2011-12-29 | 2020-05-05 | S.D. Sight Diagnostics Ltd | Methods and systems for detecting a pathogen in a biological sample |
CN102855641A (en) * | 2012-08-10 | 2013-01-02 | 上海电机学院 | Fruit level classification system based on external quality |
US11434515B2 (en) | 2013-07-01 | 2022-09-06 | S.D. Sight Diagnostics Ltd. | Method and system for imaging a blood sample |
US11262571B2 (en) | 2015-09-17 | 2022-03-01 | S.D. Sight Diagnostics Ltd. | Determining a staining-quality parameter of a blood sample |
US11199690B2 (en) | 2015-09-17 | 2021-12-14 | S.D. Sight Diagnostics Ltd. | Determining a degree of red blood cell deformity within a blood sample |
US10663712B2 (en) | 2015-09-17 | 2020-05-26 | S.D. Sight Diagnostics Ltd. | Methods and apparatus for detecting an entity in a bodily sample |
US11914133B2 (en) | 2015-09-17 | 2024-02-27 | S.D. Sight Diagnostics Ltd. | Methods and apparatus for analyzing a bodily sample |
US11796788B2 (en) | 2015-09-17 | 2023-10-24 | S.D. Sight Diagnostics Ltd. | Detecting a defect within a bodily sample |
US10488644B2 (en) | 2015-09-17 | 2019-11-26 | S.D. Sight Diagnostics Ltd. | Methods and apparatus for detecting an entity in a bodily sample |
US11733150B2 (en) | 2016-03-30 | 2023-08-22 | S.D. Sight Diagnostics Ltd. | Distinguishing between blood sample components |
US11307196B2 (en) | 2016-05-11 | 2022-04-19 | S.D. Sight Diagnostics Ltd. | Sample carrier for optical measurements |
US11808758B2 (en) | 2016-05-11 | 2023-11-07 | S.D. Sight Diagnostics Ltd. | Sample carrier for optical measurements |
US11099175B2 (en) | 2016-05-11 | 2021-08-24 | S.D. Sight Diagnostics Ltd. | Performing optical measurements on a sample |
US11614609B2 (en) | 2017-11-14 | 2023-03-28 | S.D. Sight Diagnostics Ltd. | Sample carrier for microscopy measurements |
US11609413B2 (en) | 2017-11-14 | 2023-03-21 | S.D. Sight Diagnostics Ltd. | Sample carrier for microscopy and optical density measurements |
US11921272B2 (en) | 2017-11-14 | 2024-03-05 | S.D. Sight Diagnostics Ltd. | Sample carrier for optical measurements |
EP3608701A1 (en) * | 2018-08-09 | 2020-02-12 | Olympus Soft Imaging Solutions GmbH | Method for providing at least one evaluation method for samples |
US12005443B2 (en) | 2020-10-06 | 2024-06-11 | S.D. Sight Diagnostics Ltd. | Apparatus and method for analyzing a bodily sample |
Also Published As
Publication number | Publication date |
---|---|
GB9719017D0 (en) | 1997-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Li et al. | Segmentation of white blood cell from acute lymphoblastic leukemia images using dual-threshold method | |
Moshavash et al. | An automatic and robust decision support system for accurate acute leukemia diagnosis from blood microscopic images | |
US5933519A (en) | Cytological slide scoring apparatus | |
JP6710135B2 (en) | Cell image automatic analysis method and system | |
US5566249A (en) | Apparatus for detecting bubbles in coverslip adhesive | |
US5978497A (en) | Apparatus for the identification of free-lying cells | |
Shafique et al. | Computer-assisted acute lymphoblastic leukemia detection and diagnosis | |
CA2228062C (en) | Robustness of classification measurement | |
US9196036B2 (en) | Device and method for determining objects in a color recording | |
JP5469070B2 (en) | Method and system using multiple wavelengths for processing biological specimens | |
WO1996009606A1 (en) | Field prioritization apparatus and method | |
Memeu et al. | Detection of plasmodium parasites from images of thin blood smears | |
GB2329014A (en) | Automated identification of tubercle bacilli | |
Percannella et al. | A classification-based approach to segment HEp-2 cells | |
JP6733983B2 (en) | Image analysis device | |
Şengür et al. | White blood cell classification based on shape and deep features | |
US8064679B2 (en) | Targeted edge detection method and apparatus for cytological image processing applications | |
Savkare et al. | Automatic blood cell segmentation using K-Mean clustering from microscopic thin blood images | |
WO2021152089A1 (en) | Systematic characterization of objects in a biological sample | |
CN108921172A (en) | Image processing apparatus and method based on support vector machines | |
Gamarra et al. | A study of image analysis algorithms for segmentation, feature extraction and classification of cells | |
Jewani et al. | Detection of diseases via blood analysis using Image processing Techniques | |
Mahdy et al. | Automatic counting of infected white blood cells using multi-level thresholding | |
Bengtsson et al. | High resolution segmentation of cervical cells. | |
Mustare et al. | Development of automatic identification and classification system for malaria parasite in thin blood smears based on morphological techniques |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |