WO2018224852A2 - Combined bright-field and phase-contrast microscopy system and image processing apparatus equipped therewith - Google Patents


Info

Publication number
WO2018224852A2
WO2018224852A2 (PCT/HU2018/000025)
Authority
WO
WIPO (PCT)
Prior art keywords
phase
contrast
probability
images
bright
Prior art date
Application number
PCT/HU2018/000025
Other languages
French (fr)
Other versions
WO2018224852A3 (en)
Inventor
Gábor BAYER
Original Assignee
77 Elektronika Műszeripari Kft.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to HU1700246 priority Critical
Priority to HUP1700246 priority
Application filed by 77 Elektronika Műszeripari Kft. filed Critical 77 Elektronika Műszeripari Kft.
Publication of WO2018224852A2 publication Critical patent/WO2018224852A2/en
Publication of WO2018224852A3 publication Critical patent/WO2018224852A3/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • G02B21/08Condensers
    • G02B21/14Condensers affording illumination for phase-contrast observation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • G02B21/08Condensers
    • G02B21/12Condensers affording bright-field illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00127Acquiring and recognising microscopic objects, e.g. biological cells and cellular parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/20Image acquisition
    • G06K9/2027Illumination control
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/62Methods or arrangements for recognition using electronic means
    • G06K9/6267Classification techniques

Abstract

The invention is, on the one hand, a combined bright-field and phase-contrast microscopy system, comprising a bright-field illumination unit (10), a phase-contrast illumination unit (20), and a control unit (40) adapted for controlling a switching of the bright-field illumination unit (10) and a switching of phase-contrast illumination unit (20). The bright-field illumination unit (10) and the phase-contrast illumination unit (20) have optical axes (11, 21) different from each other, and the microscopy system further comprises - an optical element (30) providing a common optical axis (31) of illumination both for an illumination arriving from the bright-field illumination unit (10) and for an illumination arriving from the phase-contrast illumination unit (20), and, arranged one after the other along the common optical axis (31) of illumination, a - sample holder (33), - a phase-contrast objective (35) and - an image recorder (50) adapted for taking an image before and an image after each respective switchover between the two illumination types. On the other hand, the invention is an image processing apparatus applying the above microscopy system.

Description

COMBINED BRIGHT-FIELD AND PHASE-CONTRAST MICROSCOPY SYSTEM AND IMAGE PROCESSING APPARATUS EQUIPPED THEREWITH

TECHNICAL FIELD The invention relates to a combined bright-field and phase-contrast microscopy system, and to an image processing apparatus equipped therewith.

BACKGROUND ART

Prior art microscopy systems - by way of example, in the field of biological sample analysis, particularly the analysis of body fluid samples - apply measurement technology solutions based on so-called bright-field microscopy. Such a solution is disclosed e.g. in WO 2013/104937. In a system according to the document an image that, by default, preferably has 256 grayscale values (bit depth: 8) and a resolution of 1280x960 pixels, is taken of a sample previously filled into a measurement tube, stirred and centrifuged. In the case of each sample, approximately 2.2 µl of the native urine sample is analyzed. Before recording the images, focus is separately adjusted in all positions in order to obtain a sharp image. The image of the urine sediment is evaluated utilizing a complex algorithm based on neural networks that receives the recorded image as its single input. The images are of the so-called HPF (high power field) type, i.e. they give a visual impression of looking into a microscope.

With image processing algorithms based on neural networks it is advantageous if further information is also fed to the processing algorithm in addition to the recorded image. According to the above referenced publication, for example, not only the image, but also, in addition to or instead of the image, one or more transformed images generated by various functions from the image, or one or more variants of the image in different resolutions, can be used as an input. However, these only carry information based on the originally recorded image, i.e. they do not provide relevant additional information for image processing. Solutions wherein images of biological samples are recorded applying the principle of phase-contrast microscopy are also disclosed in the prior art. Phase-contrast apparatuses differ from "ordinary" microscopes in that they comprise a special condenser and a so-called phase-contrast objective. The images that can be obtained applying the phase-contrast method have a very high contrast, and thus they are especially well suited for examining thinned cells that are stuck to a base in a tissue culture, as well as unstained sections with very low thickness (0.1-1 µm). A disadvantageous feature of this method is that a halo is produced especially around high-contrast structures, which deteriorates imaging quality.

In phase-contrast microscopy, a non-transparent plate, on which the blackout layer comprises an annular-shaped transparent region ("annular diaphragm"), is placed in front of the condenser lens. Light leaving the condenser in the direction of the specimen therefore passes through the specimen along a conical surface and is then imaged at the focal point of the objective. The phase-contrast objective comprises a plate ("phase plate") having an annular-shaped layer that is adapted for shifting the phase of the (reference) light beam incoming along the conical surface by one-fourth of the wavelength in the positive or negative direction. The refracted and phase-delayed light beam, coming from the optically denser regions of the specimen to be examined, does not pass through the phase annulus, and comes together with the reference beam at the image plane, interfering with it. As a result, the beams cancel out (or, in the case of thicker structures, reinforce) each other, and thus the regions of the object having different refractive index and thickness appear with different darkness (or occasionally, brightness). The purpose of the phase plate is to provide a reasonably large phase difference, and therefore contrast, in the case of beams with a relatively small phase delay.
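The interference described above can be illustrated with a deliberately simplified two-beam model (this is our own sketch for intuition, not the patent's mathematics): the phase plate shifts the undiffracted reference beam by a quarter wavelength, and the weaker beam diffracted by the specimen carries the specimen's phase delay, so a small delay translates into a visible intensity change.

```python
import numpy as np

# Positive phase contrast: the phase plate shifts the reference beam by a
# quarter wavelength (here -pi/2), so specimen regions with a larger optical
# path length come out darker. Simplified illustrative model; the amplitude
# a_diff of the diffracted beam is an invented example value.
def phase_contrast_intensity(phi, a_diff=0.3, plate_shift=-np.pi / 2):
    reference = np.exp(1j * plate_shift)      # beam passing through the phase annulus
    diffracted = a_diff * np.exp(1j * phi)    # weaker beam delayed by the specimen
    return float(np.abs(reference + diffracted) ** 2)

# A small specimen phase delay already changes the intensity noticeably,
# which is what makes thin, transparent structures visible.
print(phase_contrast_intensity(0.0))   # no specimen delay
print(phase_contrast_intensity(0.2))   # small delay -> darker in positive contrast
```

With `plate_shift=+np.pi/2` the same model reproduces negative phase contrast, where denser regions appear brighter instead.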

Phase-contrast images in themselves are less suited for the above described image processing method because it is primarily the phase transitions that appear in the image (providing extra information), but due to the darkening (cancelling out) of regions free of phase transitions a large amount of the information provided by bright-field microscopy is lost.

The prior art contains solutions wherein mechanical switching between bright-field and phase-contrast imaging is provided, and thus, at least in principle, both types of images could be provided for the purposes of image processing. Such solutions are disclosed, by way of example, in GB 866,437, US 6,479,807 B1, US 5,731,894 and US 4,407,569. However, the known solutions are not suited for the purpose according to the invention, i.e. for examining biological samples, because during the relatively long switching period the image components (e.g. bacteria) change location. This is because the switching involves moving as many as two optical elements: the annular diaphragm has to be switched to the bright-field diaphragm, and the phase plate also has to be removed from the objective.

DESCRIPTION OF THE INVENTION

According to the invention we have recognized that if a phase-contrast microscopy image is recorded in addition to a bright-field image, in the same position therewith, then additional information can be obtained on the sample, allowing the image processing/evaluation algorithm to process two, instead of one, input images. More input information yields more accurate output information, which in turn results in a significant increase in the reliability of image recognition, and can also allow for appropriately selecting between decision branches for the further course of image processing.

According to a further recognition of the invention, the same phase-contrast objective and camera are applied for recording both types of images. The bright-field and phase-contrast images are recorded applying separate illumination systems that are coupled into the same light path. The type of the recorded image depends on which light source - more accurately, the light source applying which diaphragm - is switched on. By switching between the light sources according to a given sequence, the two different-type images can be recorded at all positions in a few tenths of a second.

The invention is therefore related to a combined microscope that comprises a phase-contrast and a compromise bright-field microscopy solution, the two illuminations being merged in a Y-like manner, while the part of the light path reaching the sample and passing through the microscope is the same, and the objective is the same, i.e. a phase annulus is included also in the bright-field case. Thus, the bright-field image is not a bright-field one in the classical sense, but rather a "compromise" bright-field image; however, our experiments have proven that such a bright-field image is completely appropriate for the applications according to the invention. Therefore, the recognition that has proven to be especially advantageous is that for switching between the two types of microscopy, switching between the diaphragms is performed by switching between the illumination units, the phase plate not being removed from the objective but - as a compromise solution - being left therein. According to our experiments, the small deterioration of quality of the bright-field image resulting from this is by far compensated for by the additional information obtainable from the phase-contrast image.
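The switching scheme above - only the illumination units are toggled, with no mechanical motion - can be sketched as a control loop. The device handles (`bright_field_led`, `phase_contrast_led`, `camera`) and the method names are hypothetical placeholders, not an API from the patent; the 100 ms limit reflects the timing requirement discussed below.

```python
import time

def acquire_image_pair(bright_field_led, phase_contrast_led, camera,
                       max_interval_s=0.1):
    """Record a bright-field/phase-contrast image pair of one field of view.

    The phase plate stays in the objective the whole time; switching between
    the two imaging modes is done purely by toggling the two LEDs, so the
    interval between the exposures is only limited by the camera.
    Device handles here are placeholders for illustration.
    """
    # Bright-field ("compromise") exposure first; the order could be reversed.
    bright_field_led.on()
    bf_image = camera.capture()
    bright_field_led.off()
    t_switch = time.monotonic()

    # Phase-contrast exposure immediately after the switchover.
    phase_contrast_led.on()
    pc_image = camera.capture()
    phase_contrast_led.off()

    elapsed = time.monotonic() - t_switch
    if elapsed > max_interval_s:
        raise RuntimeError(
            f"switchover took {elapsed:.3f} s; sample elements may have moved")
    return bf_image, pc_image
```

Because no diaphragm or phase plate is moved, the pair can be recorded within the few-tenths-of-a-second budget the text mentions.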

By way of example, the image size can remain at the original 1280x960 pixels. The objective of the invention is therefore to provide a solution wherein a phase-contrast microscopy image is recorded in addition to a bright-field image, in the same position therewith, achieving a more effective operation of the image evaluation algorithm by means of the additional information provided. A further objective of the invention is to provide an image processing solution wherein, thanks to the above described additional information, selection between processing branches is allowed after the image recognition step, and thereby processing can be directed to decision modules trained in a targeted manner. A further objective of the invention is to reduce or eliminate, to the greatest possible extent, the above mentioned drawbacks of prior art systems.

The objectives according to the invention have been achieved by the microscopy system according to claim 1 and the image processing apparatus according to claim 11. Preferred embodiments of the invention are defined in the dependent claims.

BRIEF DESCRIPTION OF THE DRAWINGS

Preferred embodiments of the invention are described below by way of example with reference to the following drawings, where

Fig. 1 shows a schematic diagram of a microscopy system according to the invention,

Fig. 2 shows, side by side, bright-field and phase-contrast images of the same sample,

Fig. 3 shows a digital bright-field image of a urine sample processed by the method according to the invention,

Fig. 4 is the magnification of a marked section of the image shown in Fig. 3,

Fig. 5 is a schematic view illustrating the generation of probability maps from bright-field and phase-contrast images,

Fig. 6 is a schematic view of groups representing presumably present elements located in a probability map, and the examination areas associated with the groups,

Fig. 7 is a schematic view illustrating exemplary input information of the decision module of the apparatus according to the invention, and

Fig. 8 is a schematic view of the image processing apparatus according to the invention.

MODES FOR CARRYING OUT THE INVENTION

The present invention is partially based on the solution according to WO 2013/104937 that is incorporated herein by reference in its entirety and its teaching is considered as part of this application, with a special regard to the image processing examples and modes of application disclosed therein. The exemplary combined bright-field and phase-contrast microscopy system that can be seen in Fig. 1 comprises a bright-field illumination unit 10, a phase-contrast illumination unit 20, and a control unit 40 adapted for controlling the switching of the bright-field illumination unit 10 and of the phase-contrast illumination unit 20. The bright-field illumination unit 10 and the phase-contrast illumination unit 20 have different respective optical axes 11 and 21. The microscopy system further comprises an optical element 30 providing a common optical axis 31 for illumination arriving from the bright-field illumination unit 10 and for illumination arriving from the phase-contrast illumination unit 20, and, arranged one after the other along the common optical axis 31 of illumination, a sample holder 33, a phase-contrast objective 35, and an image recorder 50 adapted for taking images before and after each respective switchover between the two illumination types. The latter control function is preferably also implemented by the control unit 40. The control unit 40 can be implemented either as hardware or as software, or as any combination thereof. The sample 34 to be examined can be placed on the sample holder 33. In a manner known per se the phase-contrast objective 35 comprises lenses 36, 38 and a phase plate 37 arranged between the lenses 36, 38.


According to the invention it is preferable to apply an image recorder 50, e.g. a camera, controlled to take the images with an interval of at most 200 ms, preferably of 100 ms, at each switchover. According to our experience, 200 ms is an upper limit above which - due to the displacement of certain elements detectable in the biological sample - the quality of image processing based on dual images may be lower compared to the case where image processing is based on bright-field images. Our experiments indicated that in the case of intervals of 100 ms or lower, the quality of image processing according to the invention increases unexpectedly; this time limit ensures that the location of elements in biological samples remains substantially identical.

The invention is therefore related to a combined microscope that comprises a phase-contrast and a (compromise) bright-field microscopy solution, the two illuminations being merged in a Y-like manner, while the part of the light path reaching the sample and passing through the microscope is the same, and the objective is the same, i.e. a phase annulus is included also in the bright-field case. As a result of that, the "bright-field" image is not a bright-field image in the classic sense of the term, but a "compromise" image which has proven to be well-suited for the given application. According to the invention, no mechanical step is required for switching between the two measurement types; in that the invention differs from prior art switching solutions.

In a manner illustrated in Fig. 1, the optical element 30 providing the common optical axis 31 of illumination is preferably a prism, but of course a semi-transparent mirror or a different optical element with a similar functionality can also be applied.

The two illuminations preferably emit light of the same color, wherein a phase-contrast objective 35 tuned to this color is applied. The color of the light emitted by the illuminations is preferably green, and is implemented by green LED(s) 13, 23. This has the advantage that commercially available phase-contrast objectives 35 are usually also optimized for/tuned to green light (i.e. the middle of the visible spectrum).

The phase-contrast illumination unit 20 is preferably capable of also emitting light of a further color that can be applied for gathering more information for neural network based image processing. This further color is preferably blue, implemented by blue LED(s) 23; blue light, having a shorter wavelength, can be utilized for producing images with better lateral resolution, even if the phase-contrast objective 35 is optimized for green light.

The recorded images are preferably greyscale images, i.e. only the intensity values are recorded by the given pixels, independently of the color of illumination.

The bright-field illumination unit 10 preferably comprises a light source 12, a light concentration lens 15 and a diaphragm 14 arranged between them and adjusted to bright field illumination. The phase-contrast illumination unit 20 preferably also comprises a light source 22, a light concentration lens 25 and an annular diaphragm 24 arranged between them.

In a manner illustrated in Fig. 1, a condenser lens or condenser lens system 32 is arranged between the optical element 30 adapted for providing the common optical axis 31 of illumination and the sample holder 33. Between the phase-contrast objective 35 and the image recorder 50 there is arranged an imaging lens 39 or an imaging lens system; the function thereof is adjusting the image size to the size of the given chip, by way of example when the camera is replaced. The phase-contrast objective 35 is preferably a positive phase-contrast objective, but a negative phase-contrast objective can of course also be utilized for the purposes of the invention.

The most important features of the novel measurement technology solution implemented applying the above described system are the following: the combined bright-field and phase-contrast microscopy technology provides two images of a given field of view, i.e. additional information, which yields more accurate results in neural network-based image processing systems. The two images can be applied for generating a so-called "composite image" utilizing a number of different technologies (e.g. by applying several differently colored illuminations, or by combining/unifying the two images recorded by different technologies), the generated composite image containing both phase and intensity information. These two types of information can by way of example be recorded in an RGB image containing 3x1 bytes of data per pixel such that the intensity values (bright-field image) are assigned to one of the RGB colors, while phase information (phase-contrast image) is assigned to another color. Assigning can be performed utilizing any suitable mathematical operation; by way of example the phase value can even be inversely proportional to the intensity of the given color. The user can thus be provided with an expressive visual experience that is characteristic of the technology. According to the invention this composite image (containing extra information) can also be fed to the input of the image processing apparatus; in the context of the present application the two separate images and the composite image generated therefrom are considered equivalent and interchangeable.
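The composite-image encoding described above can be sketched in a few lines: the bright-field intensities go to one RGB channel and the (here inverted, as one example of a suitable mathematical operation) phase-contrast values to another. The choice of channels is our own illustrative assumption; the patent leaves the assignment open.

```python
import numpy as np

def make_composite(bright_field, phase_contrast):
    """Pack a bright-field and a phase-contrast grayscale image (uint8,
    equal shape) into one RGB image of 3x1 bytes per pixel.

    Channel assignment and the inversion of the phase values are
    illustrative choices, not prescribed by the source text."""
    if bright_field.shape != phase_contrast.shape:
        raise ValueError("the two images must cover the same field of view")
    composite = np.zeros(bright_field.shape + (3,), dtype=np.uint8)
    composite[..., 1] = bright_field           # green channel: intensity information
    composite[..., 2] = 255 - phase_contrast   # blue channel: inverted phase information
    return composite

# Example at the preferred 1280x960 resolution mentioned in the text.
bf = np.full((960, 1280), 128, dtype=np.uint8)
pc = np.full((960, 1280), 40, dtype=np.uint8)
rgb = make_composite(bf, pc)   # shape (960, 1280, 3)
```

Feeding this single three-channel image to the recognition module is then interchangeable with feeding the two separate images, as the text states.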

Fig. 2 shows a series of image pairs, each pair of images being taken of the same sample, and consisting of a bright-field image 100 and a phase-contrast image 101. Switching between the measurement types can be performed applying either manual or automatic (software) control, by simply turning the illuminations on and off one after the other, and taking an image in both cases (in any order, although the order of taking the images may be fixed across all measurements). An identical arrangement of the elements in the sample can thereby be ensured for both images.

The two images of different types taken of the same field of view are especially advantageous for evaluation as they contain additional information compared to the cases wherein only bright-field or phase-contrast images are taken. This has been proven by training the neural network structure and evaluating the results. During this test a neural network having two image inputs was trained in three different ways: with two bright-field images, with two phase-contrast images, and also with one bright-field and one phase-contrast image. In the latter case training was faster, with a better end result (higher rate of recognition with a lower error rate). The structure of the neural network was identical in all three cases, i.e. the network and the number of samples were the same. This has proven the applicability and advantageous nature of the basic idea of the invention.
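One simple way to realize the two-image input described above is to stack the bright-field and phase-contrast images as two channels of a single input tensor; this channel layout is our own illustrative choice, as the patent does not fix how the two inputs are arranged.

```python
import numpy as np

# Stack a bright-field / phase-contrast pair into one two-channel input
# tensor for a dual-input network. Normalization to [0, 1] is a common
# convention, assumed here for the example.
rng = np.random.default_rng(0)
bf = rng.integers(0, 256, (960, 1280), dtype=np.uint8)   # bright-field image 100
pc = rng.integers(0, 256, (960, 1280), dtype=np.uint8)   # phase-contrast image 101

x = np.stack([bf, pc]).astype(np.float32) / 255.0        # shape (2, 960, 1280)
```

The three training configurations of the experiment then differ only in which pair is stacked (bf+bf, pc+pc, or bf+pc), with the network itself unchanged.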

Preferably, the time elapsed between taking the images of the same field of view is not more than 200 ms, more preferably not more than 100 ms. This means that it is imperative for this technique that the switchover between the two illumination types is fast, as it is very important that there is no (relevant) change in the field of view during the interval (time period) between taking the images. It is important to be as fast as possible because, for example, some components of urine sediment can move on their own, especially bacteria (which are also small in size), but Trichomonas and other components can also be capable of displacement. Fig. 3 shows an exemplary image 100 of a urine sample in association with urine analysis. Fig. 4 shows a magnified version of a marked part of the image 100. In Fig. 4 the visual representation of a number of elements 116 to be categorized, i.e. of the various particles and image elements, is shown. According to the invention, an element - shown in image 100 - means a visual appearance of any object which can be recognized and categorized. The preferred embodiment of the invention is described below for urine analysis, i.e. for the processing of a digital image of a urine sample. In the photo of the urine sample, by way of example, the following objects or elements may be subjected to categorization:

- bacterium (BAC);

- squamous epithelial cell (EPI);

- non-squamous epithelial cell (NEC);

- red blood cell (RBC);

- white blood cell (WBC);

- background (BKGND), and

- an edge of a particle.

Of course, in addition to the items above, further elements and objects to be categorized may be discovered in the photos of the urine samples. Including the background, altogether typically 10 to 30 element categories can be set up.

Based on the description above, it is especially beneficial if the background, generally representing the largest surface of a digital photo, i.e. the areas in which other elements do not appear, is featured as a separate categorized image element in the analysis. Hence, in this way, by means of a comprehensive analysis according to the invention, the background can be separated from other elements to be categorized, more efficiently than by the known solutions.

In the bright-field image 100 and in the phase-contrast image 101 shown in Fig. 5, the elements to be categorized carry various visual information. On the basis of the visual information detectable in the images 100, 101, probability maps 111i-n associated with particular predetermined element categories are generated. The images 100, 101 can also be input separately; in that case the number of required inputs corresponds to the combined number of pixels in the two images. If the images 100, 101 are input as a combined composite image, the number of required inputs may equal the number of pixels contained in a single image. Each probability map 111 shows the presence probability distribution of the element of the given category.

The probability maps 111i-n can be generated at a pixel size identical with the images 100, 101, but the probability maps 111i-n are preferably smaller. In a preferred embodiment, images 100, 101 (or a composite image) of 1280x960 resolution and accordingly probability maps 111i-n of 160x120 resolution are applied. For generating the probability maps 111i-n, preferably a neural network is applied. The teaching of the neural network can take place in any suitable way, for example according to the disclosure of the document WO 2013/104938. For generating the probability maps 111i-n, optionally not only the images 100, 101 (or a composite image), but also, in addition to them, one or more transformed images generated by various functions from the images, or one or more variants of the images 100, 101 in different resolutions, are used.
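With the preferred sizes given above (1280x960 image, 160x120 map), each probability-map cell corresponds to an 8x8 pixel block of the input image. The coordinate bookkeeping can be sketched as follows; the block mapping itself is our illustration of the size relation, not a procedure from the patent.

```python
# Relation between the preferred 1280x960 image resolution and the
# 160x120 probability-map resolution: one map cell per 8x8 pixel block.
IMAGE_W, IMAGE_H = 1280, 960
MAP_W, MAP_H = 160, 120
SCALE = IMAGE_W // MAP_W          # = 8; the vertical factor 960 // 120 is the same

def map_cell_for_pixel(px, py):
    """Return the (column, row) of the probability-map cell covering
    image pixel (px, py)."""
    return px // SCALE, py // SCALE
```

For example, the bottom-right image pixel (1279, 959) falls into the last map cell (159, 119).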

The module generating the probability maps 111i-n is called a recognition module (RM) according to the invention; this recognition module comprises inputs receiving the pixels of the images 100, 101 (or the composite image), and outputs providing the probability values of the probability maps 111i-n.

As a next step, it is examined for each category whether there are presumably present elements in the images 100, 101. The examination for the presumably present elements is performed preferably for each probability map 111 as shown in Fig. 6. During the examination, contiguous groups 112 of probability values above a predetermined threshold level are looked up, and then it is determined on the basis of the size of the groups 112 and/or of the magnitude of probability values in the group 112 whether the group 112 corresponds to a presumably present element.
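The group search described above is essentially connected-component detection on a thresholded probability map. A minimal sketch, assuming 4-connectivity and invented example thresholds (the patent does not fix these values):

```python
import numpy as np

def find_groups(prob_map, threshold=50, min_size=3):
    """Find contiguous groups 112 of probability values above `threshold`
    in one probability map 111 (values 0-100), keeping only groups of at
    least `min_size` cells as presumably present elements.

    4-connected flood fill; the threshold and size cutoff are illustrative."""
    above = prob_map > threshold
    seen = np.zeros_like(above, dtype=bool)
    groups = []
    h, w = prob_map.shape
    for y in range(h):
        for x in range(w):
            if above[y, x] and not seen[y, x]:
                stack, cells = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    cells.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and above[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(cells) >= min_size:
                    groups.append(cells)
    return groups
```

A group's size and/or its probability magnitudes can then decide, as in the text, whether it counts as a presumably present element; here only the size criterion is shown.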

After determining the presumably present elements, an examination area 113 is determined, which is preferably positioned in the center of the group 112 corresponding to the presumably present element. The examination area 113 preferably consists of 5x5 probability values in the probability map 111; in the center or center of gravity of each group 112, one examination area 113 is determined as shown in the figure. According to the invention, in relation to each examination area 113, regarding the presence of the element associated with them, at least one further probability map 111 is taken into consideration for making the decision, preferably its probability values being in its examination area 113 positioned identically with the examination area 113 mentioned above. In a way shown in Fig. 7, preferably the examination areas 113i-n identically positioned in all probability maps 111i-n are involved in making the decision. In these examination areas 113i-n, in the depicted preferred embodiment, there are 5x5 probability values 114, which represent in the given area the presence probability of the elements falling into various categories, projected to a 0 to 100 value range.
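Cutting the identically positioned 5x5 examination areas 113i-n out of every probability map can be sketched as below. Centering on the group's center of gravity follows the text; clamping the window at the map border is our own assumption, since the patent does not specify border handling.

```python
import numpy as np

def examination_areas(prob_maps, group_cells, size=5):
    """Return one identically positioned size x size window per probability
    map, centered on the centroid of the detected group's cells (y, x).

    Border handling by clamping the window inside the map is an
    illustrative choice."""
    cy = round(sum(y for y, _ in group_cells) / len(group_cells))
    cx = round(sum(x for _, x in group_cells) / len(group_cells))
    half = size // 2
    h, w = prob_maps[0].shape
    y0 = min(max(cy - half, 0), h - size)
    x0 = min(max(cx - half, 0), w - size)
    # The same window position is used in every map, so the decision module
    # can compare the probabilities of all categories at one location.
    return [m[y0:y0 + size, x0:x0 + size] for m in prob_maps]
```

The resulting n windows of 5x5 probability values 114 are exactly the joint input the decision module receives in Fig. 7.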

The analysis of the examination areas 113i-n is implemented with a device called decision module (DM) according to the invention, comprising preferably a neural network having inputs assigned to the pixels in the examination areas 113i-n, i.e. to the probability values 114. The outputs of the decision module preferably provide the relevant examination area related probability of the elements of each category.

By the joint analysis of the examination areas 113i-n, object recognition and categorization can be performed with a higher reliability. This is because in this way the probability values of mutually excluding categories may also be taken into consideration and the contiguous pieces of information in the images 100, 101 can be used in making the correct decisions.

In an especially preferred embodiment shown in Fig. 7, not only the probability values 114 of the examination areas 113i-n are used in the analysis of the decision module, but also statistical data 115 associated with the elements are taken into account. The data can be statistical data 115 of the type disclosed in the document WO 2013/104937.

Fig. 8 shows a schematic view of the image processing apparatus according to the invention. The image processing apparatus is adapted for the automatic categorization of elements in images 100, 101 of a biological sample, and comprises - a microscopy system according to the invention, wherein the images 100, 101 of the biological sample are bright-field and phase-contrast images 100, 101, or composite images generated from pairs of bright-field and phase-contrast images taken utilizing the microscopy system,

- a recognition module RM for generating a respective probability map 111i-n for each category on the basis of visual information appearing in the images 100, 101 (or in the composite image), the probability map 111 showing the presence probability distribution of the element 116 of the given category,

- a branching module BM adapted for selecting, based on a combined processing of the probability maps 111i-n, one of the so-called decision branches, wherein one decision module DM is associated with each of said decision branches, and wherein each decision module DM comprises a neural network trained with patterns characteristic of the given decision branch,

- a calling module CM for locating presumably present elements in the images 100, 101 (or in the composite image) and calling a decision process regarding each presumably present element 116, and

- a decision module DM for each branch, adapted for performing the given branch of the decision process and for providing information about the presence of elements 116 of the categories on the basis of an analysis of the probability maps 111i-n, the decision module DM being adapted for taking into account, in examining the presence of the element 116, at least one further probability map 111 other than the probability map 111 associated with the category of the element 116.

Preferably, a respective calling module CM is associated with each of the decision branches, each calling module being adapted for calling the decision module DM located on the same decision branch, but an alternative solution can also be conceived in which the output of the branching module BM is connected to a selector input of a common calling module CM applied for more than one decision module DM.

The combined processing of the probability maps 111 i-n can comprise, by way of example, generating the sum or the average of the probability maps 111 i-n, determining the local peaks thereof, or evaluating the magnitudes of the local peaks. U2018/000025


According to the idea of the invention, therefore, a decision point is included after the recognition module RM, from where processing can go on in multiple directions; alternatively, "flag"-type, i.e. one-bit yes/no decisions can also be made based on the image recognition results. The appropriate decision branch is selected by a further combined processing of the probability maps 111i-n based on any suitable condition; the skilled person can devise this combined processing procedure taking into account considerations specific to the given samples or applications, or experimentally. Depending on the image recognition results, routing the evaluation process into different directions presents high potential for the development of evaluation. For example, it makes quite a difference whether the various particles (elements 116) have to be recognized against a clear, homogeneous background (in which case the decision modules DM based on neural networks can be trained utilizing samples with a homogeneous background) or against a background full of bacteria, mucus, crystalline debris, or other items. The size of squamous epithelial cells typically ranges from 10 μm to 100-150 μm, with the internal texture of the cells highly resembling a background densely filled with bacteria. Therefore, prior art image processing methods give a relatively large number of falsely identified squamous epithelial cells. If, however, images with a clear background can be distinguished from images with a background full of bacteria at the image recognition level, and a branching module BM is applied for directing image processing to the corresponding branches, the number of falsely identified squamous epithelial cells can be reduced by as much as 50-60% with the same or better image recognition quality. More than one decision module DM can be trained utilizing various training image samples that, by way of example, differ in their background.
Branching can also be applied for other reasons, by way of example if a sample with a characteristic profile having a characteristic particle content is detected, etc.
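The invention leaves the concrete branching condition to the skilled person; purely as one conceivable, hypothetical condition for the clear-versus-cluttered-background example above, the branching module could measure what fraction of the combined probability map carries non-negligible probability mass:

```python
import numpy as np

def select_branch(combined_map, activity_level=0.1, clutter_threshold=0.25):
    """Hypothetical branching condition (both thresholds are illustrative
    assumptions): if a large fraction of the combined probability map
    carries non-negligible probability mass, the background is assumed to
    be cluttered (e.g. full of bacteria) and evaluation is routed to the
    decision branch trained on such samples; otherwise to the branch
    trained on clear-background samples."""
    clutter = float((combined_map > activity_level).mean())
    return "dense_background" if clutter > clutter_threshold else "clear_background"
```

In practice such a condition would be tuned experimentally on representative samples, as the description itself suggests.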

If the recognition algorithm can be separately prepared for some special cases, and it is not necessary to handle all cases with a module globally trained for recognizing all required elements, a better end result can be expected. This, however, requires more training. What is important, therefore, is that a decision point is included after the first, general phase of the evaluation process, at which point individual flag-type decisions can also be made, and it is also possible to select from among the alternative paths devised for various typical cases the path to be applied for the upcoming phase of evaluation. For each path there is a corresponding decision module DM, and even a separate final decision module FDM (see below) can be assigned to each decision branch. The final decision module FDM can also be a common one for all of the decision modules DM if that is expedient; there is no restriction on this feature. Of course, the decision modules DM corresponding to the different decision branches have to be trained with training and test samples characteristic of the given branch.

An example of a flag-type decision is whether a reliable result can be automatically obtained for a given image, or a post-measurement operator review is suggested ("Review flag"). Decisions can also be made on flags of other types (e.g. if there is an artifact, a bubble etc. in the image). Flags can be displayed to the user by the software as a warning or error message, but alternatively they can be recorded in a log file (without user display), depending on the type of information.
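The "Review flag" described above admits a minimal sketch; the cutoff value and the flag wording are illustrative assumptions, and whether a raised flag is displayed as a warning or only written to a log file remains, as stated, a decision of the calling software:

```python
def review_flags(peak_probabilities, review_cutoff=0.6):
    """One-bit ("flag"-type) decisions derived from recognition results.
    Here a review is suggested when no recognition in the image reaches
    the (assumed) confidence cutoff; further flags (artifact, bubble,
    etc.) would be added analogously."""
    flags = {}
    if peak_probabilities and max(peak_probabilities) < review_cutoff:
        flags["Review"] = "only low-confidence recognitions; operator review suggested"
    return flags
```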

The recognition module RM preferably also comprises a neural network, which neural network comprises inputs for receiving the pixels of the images 100, 101 (or the composite image), and outputs for outputting the probability values 114 of the probability maps 111i-n, which probability maps 111i-n have a lower resolution than the images 100, 101.
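The patent does not fix the network architecture; as one common way such a network head can emit the probability values 114, a fully convolutional recognition network may end in a per-position softmax over category logits, yielding one lower-resolution probability map per category. The shapes and the softmax head below are therefore assumptions:

```python
import numpy as np

def probability_maps(logits):
    """Per-position softmax over category logits of shape
    (n_categories, h, w): returns one probability map per category,
    with the category probabilities at each position summing to 1."""
    z = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=0, keepdims=True)
```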

A calling module CM is provided, which, regarding each presumably present element 116, calls the decision module DM for the examination area 113 positioned to the presumably present element 116 in the probability map 111 associated therewith, and the decision module DM is adapted for taking into account the identically positioned examination area 113 of at least one further probability map 111.

The decision module DM preferably comprises a neural network having inputs receiving the probability values 114 present in the examination areas 113i-n according to the call, and outputs providing the presence probability of the elements 116 falling into each category in the examination areas 113i-n according to the call.

A respective final decision making module FDM is connected to each of the decision modules DM, the final decision making modules being preferably adapted for accepting the elements 116 as being present only above a threshold probability, which threshold probability is preferably determined separately for each category.

The images 100 and 101 (or the composite image), or one or more transformed variants thereof, or one or more different resolution versions thereof, are therefore fed into the recognition module RM. The recognition module RM generates the probability maps 111i-n, on the basis of which the identification of presumably present elements and the assigning of the examination areas 113 to them take place in each category, i.e. for each probability map 111. A calling module CM of the apparatus according to the invention calls the decision module DM for each pin point in each probability map 111. The inputs of the decision module DM are given by the examination areas 113i-n in the probability maps 111i-n assigned to the pin point. For determining the presence of the element associated with the given examination area 113, the decision module DM takes into consideration also the probability values in the other examination areas 113, and on this basis it provides information about the presence of elements associated with each category.
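The calling mechanism described above can be sketched as follows. The window half-size and the stand-in `decision_fn` are illustrative assumptions; in the invention the decision function is the neural network of the decision module DM:

```python
import numpy as np

def examination_area(prob_map, point, half=2):
    """Crop the window (examination area) centred on a pin point,
    clipped at the borders of the probability map."""
    r, c = point
    return prob_map[max(r - half, 0):r + half + 1,
                    max(c - half, 0):c + half + 1]

def call_decisions(maps, pin_points, decision_fn):
    """Calling-module sketch: for every pin point, gather the identically
    positioned examination areas from ALL probability maps and pass the
    stack to the decision module, so that the decision about one category
    can take the other categories' probabilities into account."""
    results = {}
    for p in pin_points:
        areas = [examination_area(m, p) for m in maps]
        results[p] = decision_fn(np.stack(areas))
    return results
```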

The neural network of the decision module DM preferably yields the presence probability of the elements falling into each of the categories. On the basis of these probabilities, the final decision can be made by a further final decision making module FDM.

The final decision making module FDM preferably functions in such a way that it examines the presence probability of the elements, and the elements are only accepted as being present above a predetermined threshold probability. It has been proven during our experiments that it is advisable to define this threshold probability separately for each category, because the elements to be categorized and recognized in the images 100, 101 typically have different probability values.

The essence of the plausibility check introduced by the final decision making module FDM is that although the number of hits decreases as the threshold probability value increases, the number of false hits also decreases monotonically along a different curve. For each category, in this case for each particle type, it is advisable to set a threshold value at which the error rate is sufficiently low, yet the rate of correct recognitions is sufficiently high. In the case of particles which can be recognized with high reliability and where false recognitions are important to avoid, it is worth setting a high threshold value.
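The per-category thresholding performed by the final decision making module FDM reduces to a simple filter; the category names and numeric thresholds below are illustrative assumptions only:

```python
def final_decision(candidates, thresholds, default_threshold=0.5):
    """Plausibility check of the final decision making module FDM:
    a candidate element is accepted as present only when its presence
    probability reaches the threshold set for its own category
    (falling back to a default for categories without a tuned value)."""
    return [(category, p) for category, p in candidates
            if p >= thresholds.get(category, default_threshold)]
```

With per-category thresholds, a reliably recognizable particle type can be given a high threshold to avoid false recognitions, while a harder category keeps a lower one.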


Furthermore, it may be advisable to determine the threshold probabilities of the final decision making module FDM not only for each category, but also for each application, or within the applications separately for each sample. This is because it may happen that one sample, i.e. the series of images consisting of the images 100, 101, has similar visual characteristics throughout, and therefore the probabilities of recognitions and categorizations show a similar trend.

The first step of image processing - and the step that requires the most calculations - is running the recognition module RM. This is followed by running the decision module DM, and finally the final decision making module FDM decides to which of the already identified locations of the image a tag is or is not applied. Consequently, according to the invention, the recognition module RM and the decision module DM are preferably implemented as neural networks. According to the invention, each module of the apparatus can preferably be implemented in the form of a computer program module, but parts or the whole of these modules may even be realized in hardware form.

In the description above the invention was presented for the purpose of urine analysis, regarding the images 100, 101 (composite images) prepared from a urine sample, but of course this does not restrict the applicability of the invention to this technical field. The element recognition and categorization according to the invention can also be applied advantageously in the further applications mentioned in the introduction, which applications necessitate image recognition and categorization.

The invention is not limited to the preferred embodiments described in detail above; further variants and modifications are possible within the scope of protection defined by the claims. For example, the invention is not only suitable for processing two-dimensional images, but can also be used for the analysis of images generated by a three-dimensional imaging process. In this case the probability maps are preferably also three-dimensional maps, and the examination areas are three-dimensional spatial parts of these maps.

Claims

1. A combined bright-field and phase-contrast microscopy system, comprising
- a bright-field illumination unit (10),
- a phase-contrast illumination unit (20), and
- a control unit (40) adapted for controlling a switching of the bright-field illumination unit (10) and a switching of the phase-contrast illumination unit (20), characterized in that the bright-field illumination unit (10) and the phase-contrast illumination unit (20) have optical axes (11, 21) different from each other, and the microscopy system further comprises
- an optical element (30) providing a common optical axis (31) of illumination both for an illumination arriving from the bright-field illumination unit (10) and for an illumination arriving from the phase-contrast illumination unit (20), and, arranged one after the other along the common optical axis (31) of illumination, a
- sample holder (33),
- a phase-contrast objective (35) and
- an image recorder (50) adapted for taking an image before and an image after each respective switchover between the two illumination types.
2. The system according to claim 1, characterized by comprising an image recorder (50) controlled to take the images with an interval of maximum 200 ms, preferably of maximum 100 ms, at each switchover.
3. The system according to claim 1 or claim 2, characterized in that the optical element (30) adapted for providing the common optical axis (31) of illumination is a prism or a semi-transparent mirror.
4. The system according to any of claims 1 to 3, characterized in that the illuminations have the same color of light, and the system comprises a phase-contrast objective (35) tuned to this color.
5. The system according to claim 4, characterized in that the color of light emitted by the illuminations is green, being implemented by green LED(s) (13, 23).
6. The system according to claim 4, characterized in that the phase-contrast illumination unit (20) is adapted for also emitting light of an additional color.
7. The system according to claim 6, characterized in that the additional color of emitted light is blue, being implemented by blue LED(s) (23).
8. The system according to any of claims 1 to 7, characterized in that
- the bright-field illumination unit (10) comprises a light source (12), a light concentration lens (15) and a bright-field-adjusted diaphragm (14) arranged between them,
- the phase-contrast illumination unit (20) comprises a light source (22), a light concentration lens (25) and an annular diaphragm (24) arranged between them.
9. The system according to any of claims 1 to 8, characterized in that a condenser lens or a condenser lens system (32) is arranged between the optical element (30) adapted for providing the common extended optical axis (31) of illumination and the sample holder (33).
10. The system according to any of claims 1 to 9, characterized in that an imaging lens (39) or an imaging lens system is arranged between the phase-contrast objective (35) and the image recorder (50).
11. An image processing apparatus for automatic categorization of elements in images (100, 101) of a biological sample, the apparatus comprising
- a recognition module (RM) for generating a respective probability map (111i-n) for each category on the basis of visual information appearing in the images (100, 101), the probability map (111) showing a presence probability distribution of the element (116) of the given category,
- a calling module (CM) for locating presumably present elements (116) in the images (100, 101) and calling a decision process regarding each presumably present element (116),
- a decision module (DM) adapted for performing the decision process and for providing information about the presence of elements (116) of the categories on the basis of an analysis of the probability maps (111i-n), the decision module (DM) being adapted for taking into account, in examining the presence of the element (116), at least one further probability map (111) other than the probability map (111) associated with the category of the element (116),
characterized by further comprising
- a microscopy system according to any of claims 1 to 10, wherein the images (100, 101) of the biological sample are bright-field and phase-contrast images (100, 101) taken by means of the microscopy system, or composite images generated from pairs of these images (100, 101), and
- a branching module (BM) adapted for selecting, based on a combined processing of the probability maps (111i-n), one of decision branches, wherein a respective decision module (DM) is associated with each of said decision branches, and wherein each decision module (DM) comprises a neural network trained with patterns characteristic of the given decision branch.
12. The apparatus according to claim 11, characterized in that the combined processing of the probability maps (111i-n) comprises
- generating a sum of the probability maps (111i-n),
- generating an average of the probability maps (111i-n),
- determining the number of local peaks in the probability maps (111i-n), or
- evaluating magnitudes of local peaks in the probability maps (111i-n).
13. The apparatus according to claim 11 or claim 12, characterized in that a respective calling module (CM) is associated with each of the decision branches, each calling module (CM) being adapted for calling the decision module (DM) located on the same decision branch.
14. The apparatus according to claim 11 or claim 12, characterized in that the output of the branching module (BM) constitutes an input of a common calling module (CM) associated with multiple decision modules (DM).
15. The apparatus according to claim 11, characterized in that the recognition module (RM) comprises a neural network, said neural network having
- inputs receiving the pixels of the images (100, 101) or the composite images generated therefrom, and
- outputs providing probability values (114) of the probability maps (111i-n), the probability maps (111i-n) having a lower resolution than the images (100, 101) or the composite images.
16. The apparatus according to claim 11, characterized by
- comprising a calling module (CM), which, regarding each presumably present element (116), calls the decision module (DM) for an examination area (113) positioned to the presumably present element (116) in the probability map (111) associated therewith, and
- the decision module (DM) is adapted for taking into account the identically positioned examination area (113) of at least one further probability map (111).
17. The apparatus according to claim 16, characterized in that the decision module (DM) comprises a neural network which has
- inputs receiving the probability values (114) being present in the examination areas (113i-n) according to the call, and
- outputs providing presence probability of the elements (116) falling into each category in the examination areas (113i-n) according to the call.
18. The apparatus according to claim 17, characterized by comprising final decision making modules (FDM), each of which is associated with a respective decision module (DM), and is adapted for accepting the elements (116) as being present only above a threshold probability, which threshold probability is preferably determined separately for each category.
PCT/HU2018/000025 2017-06-09 2018-06-08 Combined bright-field and phase-contrast microscopy system and image processing apparatus equipped therewith WO2018224852A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
HU1700246 2017-06-09
HUP1700246 2017-06-09

Publications (2)

Publication Number Publication Date
WO2018224852A2 true WO2018224852A2 (en) 2018-12-13
WO2018224852A3 WO2018224852A3 (en) 2019-04-25

Family

ID=63254742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/HU2018/000025 WO2018224852A2 (en) 2017-06-09 2018-06-08 Combined bright-field and phase-contrast microscopy system and image processing apparatus equipped therewith

Country Status (1)

Country Link
WO (1) WO2018224852A2 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB866437A (en) 1956-10-19 1961-04-26 Cooke Troughton & Simms Ltd Improvements in or relating to microscopes
US4407569A (en) 1981-07-07 1983-10-04 Carl Zeiss-Stiftung Device for selectively available phase-contrast and relief observation in microscopes
US5731894A (en) 1995-12-19 1998-03-24 Gross; Leo Multi-purpose microscope objective
US6479807B1 (en) 1999-10-26 2002-11-12 Nikon Corporation Microscope
WO2013104937A2 (en) 2012-01-11 2013-07-18 77 Elektronika Mũszeripari Kft. Image processing method and apparatus
WO2013104938A2 (en) 2012-01-11 2013-07-18 77 Elektronika Muszeripari Kft. Neural network and a method for teaching thereof

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE1123134B (en) * 1955-11-29 1962-02-01 Dr Erich Neugebauer Interference microscope
JP4020714B2 (en) * 2001-08-09 2007-12-12 オリンパス株式会社 Microscope
WO2006055413A2 (en) * 2004-11-11 2006-05-26 The Trustees Of Columbia University In The City Of New York Methods and systems for identifying and localizing objects based on features of the objects that are mapped to a vector
US8264694B2 (en) * 2009-03-16 2012-09-11 Ut-Battelle, Llc Quantitative phase-contrast and excitation-emission systems
US8300938B2 (en) * 2010-04-09 2012-10-30 General Electric Company Methods for segmenting objects in images
DE102012005911A1 (en) * 2012-03-26 2013-09-26 Jörg Piper Method for producing high-contrast phase contrast/bright field image of object in microscope, involves creating variable phase-contrast bright-field overlay image by interference of overlapping sub-images in intermediate image plane
US9453995B2 (en) * 2013-12-03 2016-09-27 Lloyd Douglas Clark Electronically variable illumination filter for microscopy
AU2015261891A1 (en) * 2014-05-23 2016-10-13 Ventana Medical Systems, Inc. Systems and methods for detection of biological structures and/or patterns in images



Legal Events

Date Code Title Description
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)