WO1991020048A1 - Cellular analysis using video processing and a neural network - Google Patents

Cellular analysis using video processing and a neural network

Info

Publication number
WO1991020048A1
Authority
WO
WIPO (PCT)
Prior art keywords
cell
color
cells
vectors
nucleus
Prior art date
Application number
PCT/US1991/004410
Other languages
English (en)
Inventor
Eric T. Espenhahn
Jamie Pereira
Original Assignee
Applied Electronic Vision, Inc.
Priority date
Filing date
Publication date
Application filed by Applied Electronic Vision, Inc. filed Critical Applied Electronic Vision, Inc.
Publication of WO1991020048A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 — Computing arrangements based on biological models
    • G06N3/02 — Neural networks
    • G06N3/04 — Architecture, e.g. interconnection topology

Definitions

  • the present invention relates to a system for analyzing and classifying stained cells, viewed under a microscope, that incorporates video processing techniques and a neural network.
  • Microscopic cellular analysis is customarily done by medical technicians. These technicians study the various types of cell structures and analyze the stained cells under a microscope. In blood analysis, error rates sometimes approach as high as twenty-five percent (25%). These error rates are due to the technician viewing the cells eight hours or more per day, five days a week, and the difficulty of identifying subtle variations in color, shape, size, density and texture of the stained cells, among other things.
  • the Hematrak 590, which was the most successful of these image-based systems, was last produced in 1987.
  • the failure of all of these image-based systems was related to the inability of the systems to meet price and performance expectations for obtaining blood differentials. Accordingly, there was general disillusionment with image-based systems by the mid-1980s.
  • the Hematrak 590 was manufactured by Geometric Data, a division of Smith-Kline Corporation of Wayne, Pennsylvania.
  • the Coulter electronic machine takes whole blood samples, separates the white cells from the red cells and platelets and then processes the white cells using a flow system.
  • the machine passes the white cells in single file through an impedance measurement sensor to produce a three part differential count.
  • the cytochemical systems utilize the same flow mechanism but instead of using impedance flow the machines utilize laser light scattering and absorption patterns of the white blood cells.
  • the Technicon H-1 and Coulter VCS produce a six-part differential count utilizing these types of systems.
  • the cytochemical and impedance flow systems base their results on classifying a large number of cells, typically on the order of thousands, based upon the known distribution of normal white cell populations. This approach, however, does not provide enough information to accurately classify small distributions of abnormal cells.
  • Most of the instruments provide flags that alert the user to the need to proceed with a vision differential count based upon the presence of abnormal cells. None of them attempts to provide a quantitative measurement of these abnormal cells.
  • the system for microscopic analysis and classification utilizes an automated microscope that can be positioned both laterally and longitudinally as well as focused under the control of positioning commands.
  • the color image signals of the microscopic image of stained cells on a slide are fed to one of a plurality of vision systems.
  • Each vision system includes a frame buffer
  • a host computer assigns tasks to each vision system based upon the vision system's activity level.
  • the vision system identifies a single cell among the plurality of stained cells on the slide by locating color image signals falling within a predetermined color band about a predetermined stained cell color.
  • the vision system calculates vectors representing at least one set of color characteristics of the cell from the identified cell signals. For example, these vectors represent hue, saturation, and intensity histograms of a single cell. In a more comprehensive system, these vectors also include hue, saturation, and intensity histograms of the nucleus of the cell. Other cell features such as whole cell area, area of the nucleus and area of the cell cytoplasm are also used as input data.
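The histogram vectors described above can be sketched as fixed-length frequency counts over a cell's hue, saturation, or intensity pixel values. The Python fragment below is a minimal illustration, not the patent's implementation; the bin count of 16 is an assumption.

```python
def histogram_vector(values, bins=16, top=255):
    """Fixed-length frequency histogram of pixel values (0..top),
    usable as one input vector to the classifier."""
    vec = [0] * bins
    for v in values:
        # Map a 0..top value into one of `bins` equal-width bins.
        vec[min(v * bins // (top + 1), bins - 1)] += 1
    return vec
```

A single cell would contribute three such vectors (hue, saturation, intensity), with three more for the nucleus and scalar features such as the cell, nucleus, and cytoplasm areas in the more comprehensive system.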
  • the neural network has three paths and is a feed forward network of neurons.
  • One path handles vectors representing color characteristics of the cell
  • another path handles vectors representing color characteristics of the nucleus
  • the third path handles the miscellaneous cell features such as area and nucleus color texture information.
  • These three paths converge towards an output layer in the network.
  • the output layer classifies the single cell based upon the vectors and miscellaneous information.
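As a rough sketch of this three-path topology, a feed-forward pass with separate input paths converging on one output layer might look as follows in Python. All layer sizes and parameter values here are illustrative assumptions, not the patent's actual network.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dense(inputs, weights, biases):
    """One fully connected layer: out_j = sigmoid(sum_i w[j][i]*x[i] + b[j])."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def three_path_forward(cell_vec, nucleus_vec, misc_vec, params):
    # Each path is processed by its own hidden layer.
    h_cell = dense(cell_vec, *params["cell"])
    h_nuc = dense(nucleus_vec, *params["nucleus"])
    h_misc = dense(misc_vec, *params["misc"])
    # The three paths converge: their hidden outputs are concatenated
    # and fed to a common output layer, one neuron per cell class.
    merged = h_cell + h_nuc + h_misc
    scores = dense(merged, *params["output"])
    return scores.index(max(scores)), scores
```

The returned index would select one of the cell classes (e.g., one of the thirteen white-cell types mentioned later).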
  • FIG. 1 illustrates, in block diagram form, the general system in accordance with the principles of the present invention
  • Fig. 2 illustrates, in block diagram form, the functional characteristics of the system
  • Fig. 3 is a block diagram of one vision system
  • Fig. 4 is a general flow chart of the operation of the system
  • Fig. 5A diagrammatically illustrates a blood smear on a slide and the search pattern of the microscope
  • Fig. 5B diagrammatically illustrates the positioning of a plurality of slides on a tray
  • Figs. 6A and 6B show exemplary intensity histograms utilized in the positioning routine for the microscope
  • Fig. 7 is a marked photograph showing red blood cells, white cells and platelets at 1000X microscopic power
  • Figs. 8A, 8B and 8C represent a flow chart of the principal program which includes a scope positioning routine as well as referring to the extraction routine, calculate features routine, classification routine, and display output routine;
  • Fig. 9 diagrammatically illustrates a color band about a predetermined stained cell color that is purple in the example discussed in conjunction with a current application of the present invention
  • Fig. 10 is a flow chart illustrating the steps involved in the extraction routine
  • Fig. 11 is a flow chart illustrating the calculate features routine
  • Fig. 12 is a flow chart illustrating the classification routine
  • Fig. 13 is a diagrammatical representation of a neuron in a neural network
  • Fig. 14 is a diagrammatical representation of the neural network used in conjunction with one application of the present invention.
  • Fig. 15 is a flow chart illustrating the red blood cell classification routine.

Detailed Description of the Preferred Embodiments
  • This invention relates to a system for microscopic analysis and classification of stained cells.
  • Fig. 1 is a general system diagram for the present invention.
  • a plurality of slides, one of which is slide 10 are placed in certain positions on slide tray 12.
  • An exemplary, diagrammatic illustration of a blood smear slide is shown in Fig. 5A and the positioning of the slides on a slide tray 12 is shown in Fig. 5B, both of which will be discussed in detail later.
  • a microscope 14 is controllably driven so that the microscope can be moved laterally as shown by double headed arrow 16, as well as longitudinally as shown by the X within a circle 18 in Fig. 1. Additionally, microscope 14 can be moved towards or away from slide tray 12 as well as focused as noted by the curved line 20 having double arrows thereon.
  • scope control 24 is a stepper motor controller and the mechanical drive mechanism, represented by block 26, is a plurality of stepper motors connected to microscope 14.
  • the motors drive the microscope laterally and longitudinally with respect to the slide under study, as well as towards and away from the slide. Also, the motors focus the scope with respect to the stained cells on the slide.
  • Color image signals representative of the microscopic image of the stained cells are obtained by a video camera 30.
  • the output of the video camera is applied to bus line 32 which, in the preferred embodiment, carries video image frame signals, one frame signal representing the red color image, another representing the blue color image and a third representing the green color image.
  • These video frame signals include timing signals such as vertical and horizontal blanking signals.
  • the color image signals are applied to one of the vision systems, VS_1, VS_2, VS_3, ... VS_n, all under the control of host computer 40.
  • the vision systems are coupled to host computer 40 via VME Bus 42 that permits extremely fast transfer of data as well as control and command signals between the various devices.
  • Host computer or CPU 40 also utilizes memory 42, and input/output (I/O) device 44 that are similarly coupled to VME Bus 42.
  • a display monitor 46 is connected to the system via I/O 44 as is keyboard 48.
  • Fig. 2 generally illustrates the functional aspects of the system wherein each of the major components in Fig. 2 operates relatively independently of the others, generally under the control of host computer or CPU 40.
  • one major function of the system is to position and focus microscope 14.
  • a major functional block is the positioner and focus operation 50.
  • a video frame image or color image signals for that video frame are grabbed or stored by one of the frame grabbers in the vision system, for example, VS_1.
  • the positioning and focus operations also utilize a vision system. This vision system is then placed in the VS-Used Queue 52.
  • function block 54 is generally identified as comprising a number of extractors, extractor 1 through extractor n. After the extractors have isolated a particular cell on the slide and quantified certain color characteristics such as hue, saturation, intensity, and also certain miscellaneous cell information such as cell area, nucleus area, etc., that information, collectively called herein a "cell block", is placed in cell block queue 56.
  • Since the vision system has now completed its task of isolating and extracting certain information from the video frame, which would entail isolation and extraction of all the single-type cells on the video frame, that vision system is returned and placed in the VS-Free queue 58. Accordingly, the vision system is then available to the positioner and focus functional block 50 in order to further focus or position the microscope as well as accept another video image.
  • Cell block queue 56 is utilized by a third major functional block 60 that includes a number of classifiers, classifier 1 through classifier n.
  • the classifiers in the present embodiment are configured as software in the host computer and essentially comprise neural networks that will be described in detail later. However, since neural networks can also be configured as very large scale integrated circuits (VLSI), the present invention is not meant to be limited to the software implementation of a neural network but rather encompasses all types of neural networks, whether implemented as software or hardware.
  • the classifier assigned to analyze the cell block, which generally includes vectors representing color characteristics of a single cell as well as quantified cell features, determines what type of cell is represented by the cell block. For example, a working embodiment of the present invention classifies white blood cells represented by the cell block information.
  • the current embodiment of the present invention identifies a thirteen-part differential count, including six normal types of white blood cells and seven abnormal types of white blood cells.
  • the perform differential functional block 64 is the overseer and controller of all the other functional blocks.
  • the perform differential function 64 also displays the identified and classified cells.
  • the perform differential function monitors the number of different cells found. For example, in the working embodiment of the present invention a two-hundred-cell, thirteen-part differential must be obtained in order to stop the positioning, extraction, and classification of various cells on the blood smear slide. Accordingly, the perform differential function 64 monitors the number of cells classified and stops the other functional processes after the system has identified that specified number of cells. Further, perform differential function 64 monitors the overall process, keeps track of the vision systems that are being used as well as the vision systems that are free, and monitors the error ratio and failure flags on all the processes.
  • Fig. 3 is a block diagram of a single vision system as used in a working embodiment of the present invention.
  • vision system 66 is a board placed in a computer frame in order to achieve real time imaging on VME Bus 42.
  • Inputs 68 comprise color image signals for frame A, that is, separate red, blue and green frame A signals (FrA_rgb), as well as timing signals.
  • the vision system 66 includes a color frame grabber, thirty-two bit planes, four flexible 512 X 512 X 8 image buffers (with an optional four image buffers available), arithmetic logic units (ALU), statistical processing up to 12.5 million pixels per second, inter-image arithmetic including subtraction, real-time frame averaging, convolutions, morphology, histograms and area profiles, area-of-interest window processing, and a 68000 on-board microprocessor.
  • Inputs 68 are applied to line 70 which are fed to input look-up tables (LUTS) 72 and sync stripper/generator 74.
  • ACRTC 76 is a video controller chip or integrated circuit that produces all the timing signals for video acquisition and processing.
  • Arithmetic logic unit (ALU) 78 further conditions and alters the video frame images and places them into frame buffer 80 which is a 512 X 512 X 32 on board video memory.
  • Frame buffer 80 is connected to VME Bus 42 by an internal VRAM Bus 82 to assist in the very fast input and output of data from the frame buffer.
  • Output look-up table (LUTS) and digital to analog convertors (DACS) 84 provide various outputs 86 from the vision system.
  • the vision system also includes a statistical processor 88, an event counter 90, and an interface to MVP-NP 92. These items are connected to an internal processing bus 94, as are ALU 78, frame buffer 80 and other components.
  • MVP-NP is a co-processor 96 which increases the processing speed for neighborhood operations, such as morphological transforms, binary pattern matching, feature extraction and color classification.
  • Co-processor 96 is coupled to processing bus 94 through digital expansion bus 98.
  • the vision system includes its own independent CPU or processor 110, which in the working embodiment is a Motorola 68000 microprocessor.
  • Memory 112 is available to VS-CPU 110 through CPU bus 114.
  • Control logic 116 assists the VS-CPU 110 in controlling the operations of the other hardware and software functions.
  • the vision system is a MVP-VME video board manufactured by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
  • Co-processor 96 is the machine vision accelerator sold in conjunction with the MVP-VME image board. Further details and functional specifications of the Matrox image processing board are available from the specifications on the board.
  • Fig. 4 is a general system flow chart diagram showing the major steps in the particular application of the invention described hereinafter.
  • system specifications table identifies certain specifications utilized in the present application of the invention. These system specifications are meant to be exemplary only.
  • Camera: Type: CCD color camera; Output: separate Red, Green, Blue; Resolution: 786H X 493V; Manufacturer: Sony RGB camera
  • Image Acquisition System:
  • MVP-VME with vision accelerator (MVP-NP) by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
  • step 150 includes changing the microscopic power of the microscope from 200X to 1000X and fine focusing the microscope at the first identified cell location on the slide. These steps are described in detail hereinafter.
  • Step 154 isolates a single cell of a predetermined type, e.g. a white cell, in the video frame image, and step 156 calculates the cell features. Steps 154 and 156 are the extraction steps done by one of the extractors in function block 54 of Fig. 2.
  • step 158 cell features are analyzed by one of the classifiers in Fig. 2.
  • Step 160 calculates and analyzes other cell types, e.g., red blood cells, on the slide under study.
  • Decision step 162 determines whether the differential count has been exceeded.
  • the differential count threshold is set at 100 white cells spanning at least thirteen different classes of white cells. These cells must be identified in a particular blood smear before the system will stop. The differential count must also exceed a certain number of red blood cells and platelets in the blood smear. If the differential count threshold is not exceeded, the no branch is taken from decision step 162 and, in step 164, the scope is moved, and the program returns to the isolate single cell step 154.
  • the system displays results in step 166 to the operator.
  • the operator sees every cell.
  • the cell images are compiled in a special display frame and the operator approves or confirms the system's identification of those cells. This confirmation is noted in step 168.
  • the system displays only the non-classifiable white cells, that is, the white cells having a low probability of classification, and the operator classifies those cells as appropriate.
  • the operator sees every cell classified by the system and approves each cell on an individual basis. Since the type of display and operator confirmation is dependent upon certain commercial aspects of the invention, the display and confirmation steps may be selectable by the operator of the system.
  • the present invention is described in detail in conjunction with analyzing a blood smear.
  • the smear on a slide is diagrammatically illustrated in Fig. 5A.
  • Human blood was stained with Wright stain, which is a standard stain technique for human blood cell microscopic analysis and classification.
  • the predetermined stained cell color from the Wright stain results in red blood cells (R) (see Fig. 7, a 1000X magnification) being colored light red and the white blood cells (W) ranging in color from light blue and purple to orange.
  • the nucleus of the white blood cells (W_N) is dark purple and the white cell cytoplasm (W_c) ranges from light blue to purple dots to light orange.
  • Slide 200 is placed on slide tray 12 at certain locations; in the present embodiment, tray 12 holds 10 slides at predetermined locations.
  • the blood smear slides are normally oblong or oval in shape and extend longitudinally with respect to the slide.
  • Fig. 5A diagrammatically illustrates intensity contour lines A through E wherein A intensity contour line is extremely dark or black and E contour line illustrates the outer most feathered edge of the blood smear that is slightly colored.
  • the slides on slide tray 12 (Fig. 5B) are not only positioned at certain spots on the tray but also aligned such that the feathered edge of each blood smear points towards, for example, fore edge 210 of tray 12.
  • the feathered edge E of each blood smear could point towards the rear edge of slide tray 12 and the positioning routine (step 150 of Fig. 4) could be altered to sense the feathered edge in that direction.
  • Figs. 8A through 8C illustrate the principal or primary software routine encompassing steps 150, 152, 162, 164, 166 and 168 in Fig. 4;
  • Fig. 10 is the extraction or isolation routine corresponding to step 154 in Fig. 4;
  • Fig. 11 is the calculate features routine corresponding to step 156 in Fig. 4;
  • Fig. 12 is the classify white cell routine corresponding to step 158 in Fig. 4;
  • Fig. 15 is the red blood cell classification routine (including platelet identification) corresponding to step 160 in Fig. 4.
  • the principal program in Fig. 8A begins with the step of positioning the scope above the center point (CTPT) of the slide in step 240. Since all the slides are placed at certain locations on slide tray 12 (Fig. 5B) the system can generally identify and move the scope relative to the slide to position the scope above the center point of the slide. For example, with respect to slide 200, the center point 242 is identified and the scope is positioned thereat. The microscope is focused at 200X in step 244 in this particular application regarding the analysis and classification of cells in a blood smear. In step 246, the scope is moved longitudinally with respect to slide 200 until a bimodal intensity peak ratio is found. Ideally, the intensity of a complementary color band about a stained cell color is utilized.
  • the stained cell color in the present example is a Wright stain which essentially colors the red blood cells (cells R in Fig. 7) red.
  • the blood smear is very dark and almost black due to the great amount of red blood cells in the smear.
  • the complementary color band for the Wright stained cell color is green.
  • the video image signals are essentially three video frames, one frame having the red color signals FrA_r, another frame having the green color signals FrA_g and a third frame having the blue color signals FrA_b. These frames are kept in frame buffers. Accordingly, there is a red frame buffer, a green frame buffer and a blue frame buffer for frame A (FrA_rgb).
  • the green buffer is selected because there is little or no green in the stained red blood cells. Accordingly, the stained red blood cells appear as dark or black spots on the green video frame image (FrA_g).
  • a histogram of the intensity of the green buffer frame is obtained. Fig. 6A is an example of such a histogram when the scope is positioned at approximately intensity contour line B in Fig. 5A.
  • a histogram is a frequency distribution of the number of pixels having a certain intensity value. In Fig. 6A, the number of pixels in FrA_g that are darker exceeds the number of pixels that are lighter.
  • Peak 211 is the number of pixels having an intensity or brightness level of approximately 80, whereas peak 213 represents a smaller number of pixels that are brighter (approximately 200 intensity level). Accordingly, a peak ratio is obtained by comparing the peak number of background pixels to the peak number of pixels under study.
  • the background in the green frame buffer is represented by pixels having a bright or high intensity, whereas the cells under study are dark because of the absence of any green in the stained red blood cells and stained white blood cells. Accordingly, the bimodal peak ratio in Fig. 6A is approximately 1.8.
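The peak-ratio computation can be sketched as below. The fixed split point between the dark (cell) mode and the bright (background) mode is an assumption for illustration; the actual routine presumably locates the two histogram peaks rather than splitting at a fixed intensity.

```python
def bimodal_peak_ratio(intensities, split=128, bins=256):
    """Ratio of the dark-mode (cell) histogram peak to the bright-mode
    (background) peak in a green-buffer intensity histogram."""
    hist = [0] * bins
    for v in intensities:
        hist[v] += 1
    dark_peak = max(hist[:split])    # cells: little green, so low intensity
    bright_peak = max(hist[split:])  # background: bright in the green frame
    return dark_peak / bright_peak
```

With a dense smear (many dark pixels) the ratio is well above 1; toward the feathered edge it drops, and the scan proceeds while the ratio stays in the prescribed range.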
  • Step 246 moves the microscope longitudinally down the slide 200, as shown by the dashed lines 243, until the bimodal peak ratio reaches approximately 0.5.
  • Fig. 6B graphically illustrates this peak ratio.
  • At intensity contour line C in Fig. 5A there are relatively few cells within FrA_g as compared with the background.
  • the microscope is moved towards the feathered edge until the program detects the proper bimodal peak ratio in the intensity of a complementary color band about a stained cell color.
  • the stained cell color in the present invention is the red Wright stain color for the red cells and the white cells.
  • the complementary color band is found in the green buffer.
  • When the system reaches the appropriate bimodal peak ratio, the system causes the microscope to begin a scan pattern as noted in step 468 in Fig. 8A.
  • the scan pattern in the present application is shown as dashed lines in Fig. 5A and includes essentially first lateral movement across the slide, slight longitudinal movement, further lateral movement (opposite the first lateral direction) across the entire feathered edge of the blood smear, additional longitudinal movement, and a repetition of these movements until the bimodal peak ratio passes beyond the pre-established bimodal peak ratio range.
  • the particular scan pattern for the scope can be changed.
  • the scan pattern diagrammatically illustrated in Fig. 5A is not drawn to scale because the microscope moves very small distances in relation to the slide, both laterally and longitudinally, throughout the scan pattern.
  • the scan pattern is simply tracking the feathered edge of the blood smear and various scanning patterns could be utilized to track this feathered edge.
  • the photograph depicted in Fig. 7 is a 1000X photograph but is generally illustrative of the cell distribution at 200X, that is, the cells are substantially in a single layer along the feathered edge generally between intensity contour line C and contour line E in Fig. 5A.
  • Step 248 obtains the video frame or color image signals FrA_rgb. Specifically, there are three video frame images, one for the red buffer, one for the blue buffer and one for the green buffer. In step 250, the program calculates the hue frame FrA_H from all three color images.
  • the hue frame FrA H is passed through a threshold filter to identify, in the present application, white cell pixel groups.
  • the thresholding is simply screening or filtering the entire hue frame to identify pixels in a predetermined color band about a predetermined stained cell color.
  • the stained cell color in the present application is the bluish-purple Wright stain color.
  • the predetermined color band is the purple band (shown in Fig. 9) .
  • Fig. 9 illustrates that if a zero hue value is the midpoint between the pure blue hue and the pure green hue, the purple band lies approximately at values 30 to 70 (counterclockwise from 0 to 255).
  • Hue value 255 is immediately to the right of zero value. Accordingly, the hue frame is sent through a color signal filter, thereby passing color image signals falling within the purple band which extends about the predetermined purple stain cell color.
  • The purpose of steps 250 and 252 is to identify the gross position of cells within the entire frame which, in this application, is the red buffer, green buffer and blue buffer.
  • the calculation and creation of the hue frame is mathematical in nature and simply combines the three colors into a single frame in order to reduce processing time. By thresholding that hue frame and mapping pixels in the frame, that is pixels falling within a predetermined color band (purple) , a gross cell location is obtained.
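A hedged sketch of steps 250 and 252: reduce the three color buffers to a single hue value per pixel, then keep only pixels inside a hue band. The standard HSV hue scale used here differs from the patent's wheel (which places zero midway between blue and green), so the numeric band in the example is chosen for the standard scale rather than the patent's 30-to-70 purple band.

```python
import colorsys

def hue_frame(rgb_frame):
    """Map each (r, g, b) pixel (components 0-255) to a hue on a 0-255 scale."""
    return [[int(colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 255)
             for (r, g, b) in row] for row in rgb_frame]

def threshold_band(hue, lo, hi):
    """Binary mask: 1 where the hue falls inside the predetermined color band."""
    return [[1 if lo <= h <= hi else 0 for h in row] for row in hue]
```

Connected groups of 1s in the mask mark the gross white-cell locations, whose center points are then found by the shape filters described next.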
  • mathematical morphology or shape filters are utilized to obtain the center points of the purple pixel groups. The center points are the gross locations of the white blood cells in this particular application.
  • in step 256 the center points are stored as ctpt_n with respect to a slide reference point and all the center point positions of the grossly identified white blood cells are stored in a list in memory.
  • in step 258 the red, green and blue frame buffers are cleared or discarded, as well as the hue frame created in step 250.
  • in step 260 the microscope is moved a very small amount such that a new video frame is obtained showing different cells.
  • in step 262 the program checks the bimodal peak ratio for the new frame FrB_g. Particularly, a bimodal peak ratio of between 0.5 and 0.2 is acceptable.
  • step 264 determines whether the bimodal peak ratio is acceptable, and if it is, step 266 uses frame B, the new frame, as frame A and returns to step 250.
  • step 268 determines whether the scan pattern has been completed based principally on the failure to obtain a bimodal peak ratio within the prescribed range for the green frame buffer for the next frame. If the scan is not finished, the program jumps from jump point A-2 in Fig. 8B to the same jump point in Fig. 8A immediately preceding the move scope step 260.
  • in step 272, the microscope is moved to the location of the first center point ctpt_1 on the list generated by step 256. This scanning and gross identification of the white cell positions decreases processing time in the overall system.
  • Fig. 7 is a picture illustrating white blood cells W, red cells R and platelets P. The picture is taken at 1000X. At 1000X, the white blood cells occupy a pixel block of approximately 70 X 70 pixels, assuming a video frame of 512 X 512 pixels. At 200X the same white blood cells occupy a pixel block of approximately 13 X 13 pixels. If high definition television video signals were used, at 200X the white blood cells would occupy more pixels due to the larger number of pixels in the video frame.
  • the Sobel edge detection filter generates a focus factor or a detection count or value based upon the clarity and sharpness of the edges in the video frame. Accordingly, in step 278, the program compares the previous focus factor for frame FrA_g to a new focus factor for the frame when the scope's focus has changed. The routine maximizes the focus factor, or the Sobel edge detection count or value, by changing the focus of the scope.
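A minimal Sobel-based focus factor consistent with this description: convolve the frame with the two 3x3 Sobel kernels and accumulate the gradient magnitude, so a sharply focused frame scores higher than a defocused one. This is an illustrative sketch, not the patent's exact count.

```python
def sobel_focus_factor(img):
    """Sum of Sobel gradient magnitudes over the frame interior;
    sharper focus produces stronger edges and a larger value."""
    h, w = len(img), len(img[0])
    total = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient (Sobel x kernel).
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            # Vertical gradient (Sobel y kernel).
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            total += abs(gx) + abs(gy)
    return total
```

The autofocus loop would step the scope's focus and keep the position where this value is maximized.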
  • Steps 280, 282 and 284 are described in detail in conjunction with Figs. 10, 11 and 12.
  • step 280 extracts a white cell from the video frame images and the white cell nucleus from the video frame images FrA_rgb and obtains a cell block consisting of a red, green and blue pixel matrix about the white blood cell as well as the nucleus.
  • step 282 calculates the features of the white blood cell from the cell block.
  • Step 284 classifies the white blood cell from the features obtained in step 282.
  • After the white blood cells have been classified, in step 286, both the white blood cell video image block or image matrix (the red, green and blue buffer portions for that particular white cell) and the classification of that white cell are stored.
  • In step 288, the program repeats the extraction step 280, the calculate features step 282, the classify step 284 and the store step 286 for all white cell center points ctpt_n in the high powered window of frame FrA.
  • the scope is moved to the center point of the grossly identified location of the first white blood cell found during the pattern scan.
  • the system operates on that video frame clarifying the focus.
  • the program operates on a single cell located within a high powered window frame A.
  • the high powered window may be smaller than the actual video frame in order to eliminate any white blood cells that are split by the frame border.
  • Each white blood cell is isolated, and its location and color image signals are extracted for the entire cell as well as the nucleus.
  • Features of that extracted white blood cell are obtained and then the white blood cell itself is classified.
  • the cell video information or data and classification data are stored, and then the program repeats the extraction, feature calculation, classification and storage of data for each white blood cell within the high power window of frame A.
  • Step 300 calculates and obtains a hue frame for all of high power window frame A (FrA_H).
  • the hue frame is equalized and expanded to full digital scale.
  • the hue frame is a mathematical composite of the red, green and blue buffers for the video frame. Since the hue may not extend the full digital scale of the histogram for the hue values, step 302 expands the histogram wave form to the full dynamic range of the values. This is a normalizing technique.
  • Step 304 passes the hue frame, as modified by step 302, through a threshold filter and creates a binary image. Essentially, every pixel in the frame having a value less than a pre-set value, e.g., 20, is set to 1, and every pixel in the frame with a value greater than or equal to 20 is set to 0. The threshold level for this hue frame could be changed as necessary depending upon the particular cell under study.
  • a morphological open operation is conducted on the binary hue frame.
  • the morphological open operation is a mathematical operation that first erodes the binary image and then dilates the binary image. This morphological open operation removes noise in the hue frame, trims edges and smooths contours.
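The thresholding and morphological open described above can be sketched with NumPy and SciPy as modern stand-ins for the original vision hardware; the threshold of 20 and the 3x3 structuring element are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def hue_to_binary(hue: np.ndarray, threshold: int = 20) -> np.ndarray:
    """Pixels below the threshold become 1 (candidate white cell), the rest 0."""
    return (hue < threshold).astype(np.uint8)

def morphological_open(binary: np.ndarray) -> np.ndarray:
    """Erosion followed by dilation: removes specks, trims edges, smooths contours."""
    return ndimage.binary_opening(binary, structure=np.ones((3, 3))).astype(np.uint8)

hue = np.full((7, 7), 200, dtype=np.uint8)
hue[2:5, 2:5] = 5          # a 3x3 low-hue blob (a "cell")
hue[0, 0] = 5              # a single noise pixel
binary = hue_to_binary(hue)
opened = morphological_open(binary)
# the lone noise pixel survives the threshold but not the opening
print(binary.sum(), opened.sum())
```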
  • the program identifies a white cell block about all of the white cells in the hue frame image as modified and creates a white cell mask for the hue frame.
  • the white cell mask is identified for cell_1.
  • the cell block corresponding to the red, green and blue buffer portions representing each white cell and the mask defining the outer boundaries of each cell are stored.
  • the location of each identified cell is stored along with the RGB color image signals.
  • the color image signals fall within a predetermined color band about a predetermined stain cell color.
  • Hue is simply a mathematical representation of color space derived from the RGB color image signals. Since the hue frame is screened by a color signal filter, all color image signals passing through that hue filter represent the color image of individual white cells in that video frame.
  • the cell block discussed herein is essentially the RGB portion of the frame buffer and the mask layer showing the location of the white cell in the video frame, as well as an identification of which video frame this particular cell was extracted from.
  • the program identifies the nucleus of the white blood cell.
  • the first white cell block color image is transformed into a saturation frame.
  • Saturation is a measure of whiteness of an image and is also mathematically related to the RGB buffer portions for the white blood cell.
  • the saturation frame portion is identified as cell_1.
  • the saturation frame portion cell_1 is averaged to smooth contour edges.
  • the saturation frame portion is passed through a threshold filter such that everything above, for example, saturation value 75 is set to "1" and everything below that saturation threshold value is set to "0". Accordingly, a binary image of the cell block is created based upon a predetermined chromaticness threshold. Chromaticness is the hue and saturation of a color.
  • In step 320 a morphological close operation is conducted on the binary version of cell_1. This morphological close operation first dilates the image and then erodes it in order to fill holes within closed shape figures. Since a nucleus is generally solid (Fig. 7, nucleus W_N), the morphological close operation closes the solid spaces defined by the nucleus.
  • the cell nucleus mask is defined as NUC_1 for cell_1.
  • In step 324 the cytoplasm of the cell, that is W_C in Fig. 7, is set to "1" and the nucleus of the cell NUC_1 is set to "2".
  • the cell block for cell_1 is now defined by the cytoplasm identified by a "1" and the nucleus defined by a "2", as well as the red, green and blue frame portions for the cell.
  • the cell block therefore defines the outer contour of the white cell as well as the contour of the nucleus within the cell and color data for the cell and nucleus.
  • the cytoplasm is simply the difference between the whole cell image mask obtained in step 310 and the nucleus image mask obtained in step 322.
  • the cell block is then the RGB color image signals as well as the mask defining the cell cytoplasm and the cell nucleus.
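The labeling of steps 322 and 324 (cytoplasm as the whole-cell mask minus the nucleus mask) can be sketched as follows; the helper name is hypothetical and the masks are assumed to be boolean arrays:

```python
import numpy as np

def build_cell_block_mask(cell_mask: np.ndarray, nucleus_mask: np.ndarray) -> np.ndarray:
    """Cytoplasm (cell minus nucleus) -> 1, nucleus -> 2, background -> 0."""
    cytoplasm = cell_mask & ~nucleus_mask
    return (cytoplasm * 1 + nucleus_mask * 2).astype(np.uint8)

cell = np.array([[0, 1, 1, 1, 0],
                 [1, 1, 1, 1, 1],
                 [0, 1, 1, 1, 0]], dtype=bool)
nucleus = np.array([[0, 0, 0, 0, 0],
                    [0, 0, 1, 1, 0],
                    [0, 0, 0, 0, 0]], dtype=bool)
mask = build_cell_block_mask(cell, nucleus)
print(mask)   # background 0, cytoplasm ring 1, nucleus pixels 2
```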
  • the extract white cell program then ends and the program returns to the next step shown in Fig. 8B, the calculate features of white cell from the cell block in step 282.
  • Fig. 11 is the calculate features routine and begins with step 340, which calculates the hue, saturation and intensity histograms of cell_1, that is C_H, C_S, and C_I for the particular cell block.
  • Step 342 calculates the hue, saturation and intensity histograms of the nucleus NUC_1 (N_H, N_S, N_I).
  • These steps constitute a means for obtaining vectors representing at least one set of color characteristics of the single cell from the identified cell signals.
  • the RGB frame portions could be part of the color characteristics.
  • these vectors are a series of numbers representing histogram values.
  • the histograms may be represented by a series of numbers ranging from 0 to 255 in value. However, as noted later in conjunction with Fig. 12, the histogram vector is reduced by taking every fourth value, reducing it to approximately a 64-value vector.
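The reduction from a 256-bin histogram to a 64-value vector can be sketched as below; the function name is hypothetical:

```python
import numpy as np

def reduced_histogram(channel: np.ndarray) -> np.ndarray:
    """256-bin histogram of an 8-bit channel, reduced to 64 values
    by keeping every fourth bin."""
    hist, _ = np.histogram(channel, bins=256, range=(0, 256))
    return hist[::4]          # 256 values -> 64 values

channel = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
vec = reduced_histogram(channel)
print(vec.shape)   # (64,)
```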
  • In step 344 miscellaneous cell features and cell nucleus features are identified. Also, the area of the cell C_A, the area of the nucleus N_A, and the area of the cell cytoplasm C_C are calculated.
  • In step 346 a texture matrix is made from the color signals associated with the cell nucleus. Texture matrices for visual images are discussed in detail in "Digital Image Processing" by R. Gonzalez and P. Wintz, copyright 1987. Essentially, the texture matrix is utilized to determine whether the image is smooth or rough. A smooth image will generate a texture matrix having values falling within a relatively narrow range in a diagonal band from the upper left to the lower right of the texture matrix.
  • a rough look is defined by values of a similar range in a diagonal extending from the lower left to the upper right of the texture matrix.
  • the maximum probability N_P, element difference moment N_M and uniformity factor N_U for the nucleus are also obtained from the texture matrix. If the maximum probability is high, the image is smooth. If the maximum probability is low, then the image is coarse. If the image is coarse, it has a random look.
  • the element difference moment determines whether the texture has some order. If the texture does have order, then the moment value is relatively low. If the texture is random, then the moment is a high value. Uniformity is the opposite of entropy and is discussed in detail in "Digital Image Processing" by Gonzalez and Wintz.
  • the cell nucleus features give an indication of the degree of randomness, smoothness, and order within the white cell.
  • Fig. 7 illustrates that various white cell nuclei have different shapes, sizes, positions and textures. These miscellaneous nucleus cell features provide factors which are helpful in classifying the cell.
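A minimal sketch of a texture (co-occurrence) matrix and the three nucleus features named above, assuming horizontally adjacent pixel pairs, a quantized gray-level image, and an order-2 difference moment; this illustrates the general technique from Gonzalez and Wintz, not the patent's exact formulation:

```python
import numpy as np

def cooccurrence_features(image, levels=8):
    """Co-occurrence (texture) matrix over horizontally adjacent pixel pairs,
    plus the three nucleus features: maximum probability, element difference
    moment (order 2), and uniformity (sum of squared probabilities)."""
    glcm = np.zeros((levels, levels), dtype=np.float64)
    for a, b in zip(image[:, :-1].ravel(), image[:, 1:].ravel()):
        glcm[a, b] += 1.0
    glcm /= glcm.sum()

    i, j = np.indices(glcm.shape)
    max_probability = glcm.max()                             # high -> smooth image
    element_difference_moment = ((i - j) ** 2 * glcm).sum()  # low -> ordered texture
    uniformity = (glcm ** 2).sum()                           # opposite of entropy
    return max_probability, element_difference_moment, uniformity

smooth = np.zeros((8, 8), dtype=int)   # a perfectly smooth (constant) nucleus patch
p, m, u = cooccurrence_features(smooth)
print(p, m, u)   # 1.0 0.0 1.0
```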
  • the cell features are stored for white cell_1.
  • the program then enters the classify white cell program step 284 in Fig. 8B.
  • the classify white cell subroutine is shown in flow chart form in Fig. 12.
  • an input vector is created from the features of cell_1. These input vectors are the hue, saturation and intensity histogram values from cell_1, the hue and saturation histogram values from NUC_1, the area of the cell, area of the nucleus, and area of the cell cytoplasm, and the miscellaneous factors such as the maximum probability, element difference moment, and uniformity factors from the nucleus.
  • these vectors are propagated through a feed forward neural network. The neural network will be described in detail hereinafter.
  • In step 354 the program collects the results of the neural network output, and in step 356, the results are analyzed and the white blood cells are classified by type or category or identified as non-classifiable.
  • the white cell class table that follows is exemplary of the classifications that may be assigned to a particular white blood cell.
  • Fig. 14 diagrammatically illustrates the neural network used in this present application of the invention.
  • Fig. 13 diagrammatically illustrates a single neuron in a neural network for explanatory purposes. Neural networks are sometimes called parallel distributed processors and such devices are meant to be encompassed within the scope of the present invention.
  • a neural network is a series of neurons associated in layers, an input layer, one or more hidden layers, and an output layer. Each layer has a plurality of neurons that are connected in a certain fashion to the following layer in a feed forward neural network.
  • Fig. 13 illustrates neuron 410 that has inputs I_1, I_2, I_3 ... I_n. Each input has assigned thereto a weight that changes the value of the input.
  • a weight function 412 may double the value of the input. So upon the appearance of a 1 at I_1, a value of 2 would be assigned thereto by weight function 412.
  • function 414 is a sigmoidal threshold function which essentially smooths the straight-edge threshold function, rising as a step from 0 up to 1, into a smooth curve.
  • the weights for the inputs to the neural network are defined when the neural network is trained to recognize and identify certain known cells. The knowledge of the neural network is stored in the weights. The teaching process or operation sets the weights for each neuron in the network.
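The neuron of Fig. 13 can be sketched as a weighted sum passed through a sigmoid; the doubling weight on the first input mirrors the example above, and all names are illustrative:

```python
import math

def neuron(inputs, weights, bias=0.0):
    """Weighted sum of inputs passed through a sigmoidal threshold function."""
    activation = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

# A weight of 2 doubles the contribution of the first input, as in Fig. 13.
print(round(neuron([1.0, 0.0, 0.0], [2.0, 1.0, 1.0]), 3))   # 0.881
```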
  • Fig. 14 diagrammatically illustrates the architecture of the neural network in the present application.
  • This neural network has three paths: path 510, for the input vectors related to the entire cell color characteristics; path 512, handling the miscellaneous parameters such as the cell area, nucleus area, cytoplasm area, and nucleus features such as texture values; and a third path 514 handling color characteristics of the cell nucleus.
  • Each of these paths is relatively independent, and they converge towards an output layer.
  • Neural network also includes an input layer 516, a first hidden layer, a second hidden layer, a third hidden layer and an output layer 518.
  • the hue vector for the entire cell is represented by 64 values
  • the saturation vector for the entire cell is also represented by 64 values
  • Neuron group 520 is an input layer composed of 192 inputs, 64 inputs each for the hue, saturation, and intensity histograms of the cell.
  • Neuron group or cluster group 522 is fully connected (FC) as shown by the three dots, to neuron group 520.
  • Group 522 of hidden layer one consists of 30 neurons.
  • Hidden layer 2 for path 510 consists of ten neurons in group or cluster 524.
  • neuron cluster 522 is fully connected to neuron group 524.
  • Fully connected means that each input is connected to each neuron in the layer immediately below.
  • Neuron cluster 524 is fully connected to hidden layer 3 consisting of neuron cluster 526 comprising 20 neurons.
  • the second path 512 for the miscellaneous parameters of cell area, nucleus area, cytoplasm area, maximum probability from the texture matrix, element difference moment, and uniformity factor (7 inputs) is fed to input neuron cluster 528.
  • Neuron cluster 528 consists of 7 neurons that hold the inputs for that path.
  • Neuron cluster 528 is fully connected to neuron cluster 530 of the first hidden layer, which consists of 10 neurons.
  • Neuron cluster 530 is fully connected to hidden layer 3, neuron cluster 526.
  • input layer 516 includes 128 neurons as neuron cluster 532.
  • Cluster 532 is fully connected to the first hidden layer consisting of 20 neurons in cluster 534.
  • Cluster 534 is fully connected to neuron cluster 536 of the second hidden layer for path 514.
  • Neuron cluster 536 is fully connected to hidden layer three, neuron cluster 526.
  • Neuron cluster 526 of hidden layer three is fully connected to output layer 518 which consists, in this application, of 13 neurons in cluster 538.
  • Each neuron in cluster 538 holds an output from the neural network.
  • Each output neuron corresponds to a different type of white cell class and these classes are shown in the white cell class table.
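The layer sizes described above can be collected into an untrained forward pass. This is a sketch only: the weights are random placeholders, and the width of neuron cluster 536, which the text does not give, is assumed here to be 10:

```python
import numpy as np

rng = np.random.default_rng(0)

def fc(n_in, n_out):
    """A fully connected layer: random placeholder weights and a zero bias."""
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

def act(x, layer_params):
    """Sigmoid neurons: x @ W + b passed through 1/(1+e^-z)."""
    w, b = layer_params
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Path 510: whole-cell hue/saturation/intensity histograms, 192 -> 30 -> 10
p1_h1, p1_h2 = fc(192, 30), fc(30, 10)
# Path 512: 7 miscellaneous features, 7 -> 10 (this path skips hidden layer 2)
p2_h1 = fc(7, 10)
# Path 514: nucleus hue/saturation histograms, 128 -> 20 -> 10 (cluster 536 width assumed)
p3_h1, p3_h2 = fc(128, 20), fc(20, 10)
# Hidden layer 3 (cluster 526, 20 neurons) and the 13-class output layer (cluster 538)
h3, out = fc(10 + 10 + 10, 20), fc(20, 13)

def classify(cell_hist, misc, nuc_hist):
    a = act(act(cell_hist, p1_h1), p1_h2)       # path 510
    b = act(misc, p2_h1)                        # path 512
    c = act(act(nuc_hist, p3_h1), p3_h2)        # path 514
    merged = np.concatenate([a, b, c])          # the three paths converge here
    return act(act(merged, h3), out)

scores = classify(np.zeros(192), np.zeros(7), np.zeros(128))
print(scores.shape)   # (13,)
```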
  • the output layer may hold: SEG 0.0; LYMPH 0.61; MONO 0.30; EOSIN 0.1, etc. Step 356 in the classify white cell routine (Fig. 12) analyzes these results and either picks the maximum, which in the exemplary case is LYMPH, or may determine whether one or more of the output neuron values exceeds an output threshold value.
  • the output threshold value may be 0.60.
  • the LYMPH white cell classification would be selected as the classification for that particular white cell. However if two output neurons have a value of more than 0.60, the program would classify this cell as being "non-classifiable". Similarly, if all the output neurons had a value less than 0.60, the white cell would be declared non-classifiable.
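The decision rule of step 356 can be sketched as follows; the class list is abbreviated for illustration, and the 0.60 threshold is the example value from the text:

```python
CLASSES = ["SEG", "LYMPH", "MONO", "EOSIN"]   # abbreviated class list for illustration

def decide(outputs, threshold=0.60):
    """Pick the single class above threshold; anything else is non-classifiable."""
    above = [c for c, v in zip(CLASSES, outputs) if v > threshold]
    return above[0] if len(above) == 1 else "non-classifiable"

print(decide([0.0, 0.61, 0.30, 0.10]))   # LYMPH
print(decide([0.70, 0.65, 0.10, 0.10]))  # non-classifiable (two above threshold)
print(decide([0.20, 0.40, 0.10, 0.10]))  # non-classifiable (none above threshold)
```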
  • a three path neural network was chosen because of the complexity and the number of different vectors involved in the cellular analysis. It is possible that a neural network having a single path would be appropriate depending upon the cell under study. Whole cell color vectors are preferably input into a single path network. Also, the number of hidden layers was expanded from 1 hidden layer to 3 hidden layers in order to increase the speed of processing as well as to increase the speed of teaching the neural network to recognize traits of white blood cells.
  • the inputs to the neural network are rotationally and positionally invariant because the color characteristics of the entire cell, the color characteristics of the cell nucleus, and the miscellaneous cell features are extracted by the vision system and the vectors or values representing these characteristics are fed to the neural network.
  • any change in the rotation or the position of the cell does not affect the teaching speed of the network or the speed of processing once the network has been properly taught. Accordingly, the system is insensitive to rotational and positional aspects of the white cells.
  • the backward error propagation formula is as follows: the change in weight is equal to the learning rate times the error times the output, plus the momentum times the previous change in weight. This is found in "Parallel Distributed Processing" by D.E. Rumelhart, et al., 1986.
  • the present network was taught using back propagation with the above-listed training set. The learning rate was 0.2.
  • the momentum was 0.6, the pattern error was 0.01 and the convergence time was 10.
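The weight update formula quoted above, with the stated learning rate and momentum, can be sketched as below; the error and output values in the example are arbitrary:

```python
def weight_update(learning_rate, error, output, momentum, prev_delta):
    """delta_w = learning_rate * error * output + momentum * prev_delta."""
    return learning_rate * error * output + momentum * prev_delta

# Parameters from the text: learning rate 0.2, momentum 0.6.
delta = weight_update(0.2, error=0.5, output=1.0, momentum=0.6, prev_delta=0.05)
print(round(delta, 2))   # 0.2*0.5*1.0 + 0.6*0.05 = 0.13
```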
  • a 100 white cell differential can be found in about ten minutes.
  • the neural network is taught by feeding the network the teaching set and checking the output neurons. The programmer then reviews the cells classified and determines whether the output obtained is correct for that cell. If the output is erroneous, then the weights in the interneural connections are changed based upon the formula.
  • Step 288 repeats the extract cell routine, calculate features routine, classify white cells and store routines for all the cells within the frame window.
  • Step 610 masks out the white cells from the red, green and blue frame buffers for frame FrA. It should be noted that these next few steps could be conducted in parallel after the extract white cell and nucleus step 280 in Fig. 8B. Accordingly, it is not necessary that all the steps shown herein be done sequentially, since there are opportunities for parallel processing and multitasking.
  • The hue, in step 612, is calculated for the modified frame A. In step 614, the purple pixel blocks less than pixel block size X are counted.
  • step 616 the small size purple pixel count is discounted or reduced for error and the new value is assigned as the platelet count. According to standard blood smear analysis, a rough count of platelets is made to determine whether sufficient platelets are present in the blood. Accordingly, the platelet count is only an estimate. The discount for error in step 616 is necessary to eliminate noise in the video frame.
  • In step 618 the red blood cells are classified and counted. Details of step 618 are found in Fig. 15, which is a flow chart of the red blood cell classification routine.
  • the red blood cell classification in Fig. 15 begins with step 710 which utilizes the frame A red, green and blue buffers, modified by masking out the white blood cells.
  • the following red cell class table identifies and describes the 7 different types of red cells classified by the program:
  • a. crenated erythrocytes: star or spiked circles.
  • b. leptocytes: target cells (look like a bullseye).
  • c. drepanocytes: sickle cells (look like string beans).
  • d. poikilocytes: pointed and blunt features (look pear shaped, sometimes with double bulbs protruding from an irregular circle, blob-like appendages).
  • e. hypochromic erythrocytes: circular with red borders and white centers.
  • f. diffusely basophilic or polychromatophilic: look bluish compared to other red cells.
  • g. normal red cells: round and all red.
  • the red cell class table includes the medical name of the cell, the informal name of the cell and a brief description of what the cell looks like. All the cells except for diffusely basophilic (class f) are red due to the Wright stain in the blood smear.
  • Step 712 calculates a hue frame based upon the modified frame A, that is, the frame wherein the white cells are masked out.
  • Step 714 counts the purple pixel blocks greater than Y size. A narrow color band threshold or filter would be utilized in order to identify the diffusely basophilic or class f cells since those cells are blue rather than red.
  • the class f cells are counted.
  • pixel blocks larger than Y size are chosen to eliminate any recognition of the smaller purple platelets in the video frame.
  • the green buffer is utilized and a threshold is taken to create a binary image of the cells.
  • the green buffer is used because red cells have the smallest amount of green values. Accordingly, the red cells in the green buffer appear dark.
  • the thresholding operates on the intensity histogram of the green buffer since the morphology or the shape of the remaining red blood cells is determined by the balance of the red blood cell classification sub-routine which is not sensitive to color differences.
  • the program identifies the red cell blocks by identifying pixel blocks larger than Y size and stores the binary image of the cell blocks in a compact red cell-shape (RC-shape) frame.
  • a special RC-shape frame is utilized in order to conduct morphological open and morphological close operations on the entire frame. These operations use a relatively large amount of processing time. Since the red blood cell shapes do not occupy the entire video frame under study, these shapes can be placed in a special RC-shaped frame and the entire shape frame can be processed thereby reducing intermediate or background noise and shortening the processing time.
  • Decision step 724 determines whether the RC-shape frame is full. If it is not full, the no branch is taken and the program jumps from jump point A-5 to jump point A-5 in Fig. 8C immediately after the classify and count red blood cells step 618. As is described in detail later, the general program then cycles through its processes by obtaining a new video frame. Eventually the RC-shape frame becomes full and the yes branch is taken from decision step 724 (Fig. 15). The next step conducts a morphological open operation on the RC-shape frame.
  • the morphological open operation is configured as a shape filter that filters out or separates the round shaped cells from those cells that are not round.
  • the cells that are not round in the red cell classification table are class a, star or spiked circle; class c, sickle cells (which look like string beans) and class d which are pear-shaped or have double bulbs protruding from an irregular circle or are blob-like with appendages.
  • the other red blood cells are round.
  • the non-round cells or the cells that drop out after the morphological open operation are identified on the yes branch and a further decision step 730 is made as to whether the cells are shaped as linear strings.
  • step 732 determines whether the cells have spiked edges by a spike edge detection routine.
  • the spike edge detection routine in decision step 736 determines whether the cells are class a, that is, shaped as star or spiked circles.
  • the program counts the spiked circles in step 738. If the cells are class d, pointed with blunt features, these cells are counted in step 740.
  • step 760 conducts a morphological close operation on the remaining red cell shapes.
  • a morphological close operation determines whether the red cell shapes are solid in color, indicating a normal red cell, or whether the cells have a white center or look like a bullseye, that is, with a plurality of inner rings.
  • Decision step 762 separates the open red cell shapes from the closed red cell shapes. The no branch from decision step 762 identifies the number of normal red blood cells, class g, in step 764.
  • Step 766 determines whether the centers of the cells under study are white. If the center of a cell is white, as determined by an appropriate shape identification routine, the yes branch is taken and in step 768, a count is taken of class e cells. If the no branch is taken, target cells are identified as class b cells in step 770. All of these counting steps lead to step 772, which stores the non-classified cell count by comparing any remaining pixel blocks in the RC-frame that have not dropped out through the decision chain. After the red blood cell classification routine ends, the main program picks up in Fig. 8C.
  • step 620 discards the center points identified in the gross location identification for the white blood cells within the high power target window of frame A.
  • step 622 the microscope is moved to the next center point on the list beyond the previously identified center points. This location identification was made in step 256 in Fig. 8A.
  • Decision step 624 determines whether the list is done. If the list ctpt_n is not done, the program jumps to jump point A-4 in Fig. 8B immediately preceding the extract white cell and nucleus step 280. Alternatively, jump point A-4 may be advanced to a point immediately preceding the conduct focus routine 276 if it is found that, due to the movement of the microscope, a new focus must be obtained after the scope moves.
  • If step 624 determines that the list is done, the program takes the yes branch and step 626 displays all non-classifiable white cell blocks to the operator for assistance in classifying those cells.
  • step 628 the operator inputs data regarding the non-classified white blood cells.
  • step 630 the system displays data regarding all the white cells classified, the red blood cell count, the abnormal red blood cell count and the platelet count.
  • In step 632 the data is stored for the slide: the white cell block color data, classification data, red cell count data, both normal and abnormal, the name of the operator and the results of the entire test.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

A system for microscopic analysis and classification comprising a plurality of vision systems (VS1 - VSn) and a central computer (40). The vision system assigns vectors representing a color characteristic of a stained cell on a slide (10). These vectors represent the hue, saturation and intensity histograms of a cell and/or of the cell nucleus. Other cell characteristics, such as the cell area, the nucleus area and the cytoplasm area, may also be used as vectors. The vectors are fed into a neural network (Fig. 14) which performs the classification.
PCT/US1991/004410 1990-06-21 1991-06-21 Analyse cellulaire utilisant un traitement video et un reseau neuronal WO1991020048A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US54170590A 1990-06-21 1990-06-21
US541,705 1990-06-21

Publications (1)

Publication Number Publication Date
WO1991020048A1 true WO1991020048A1 (fr) 1991-12-26

Family

ID=24160696

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1991/004410 WO1991020048A1 (fr) 1990-06-21 1991-06-21 Analyse cellulaire utilisant un traitement video et un reseau neuronal

Country Status (2)

Country Link
AU (1) AU8229591A (fr)
WO (1) WO1991020048A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997011350A2 (fr) * 1995-09-19 1997-03-27 Morphometrix Technologies Inc. Systeme de fragmentation multispectre assistee par reseau neuronal
WO1997041416A1 (fr) * 1996-04-27 1997-11-06 Boehringer Mannheim Gmbh Procede d'analyse automatisee assistee par ordinateur d'echantillons tissulaires ou d'echantillons de liquides organiques
WO2013037119A1 (fr) * 2011-09-16 2013-03-21 长沙高新技术产业开发区爱威科技实业有限公司 Dispositif et procédé d'analyse de la morphologie érythrocytaire
EP3598195A1 (fr) 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Procédé d'évaluation microscopique
EP3608701A1 (fr) * 2018-08-09 2020-02-12 Olympus Soft Imaging Solutions GmbH Procédé de fourniture d'au moins un procédé d'évaluation pour échantillons

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4129854A (en) * 1976-10-25 1978-12-12 Hitachi, Ltd. Cell classification method
US4965725A (en) * 1988-04-08 1990-10-23 Nueromedical Systems, Inc. Neural network based automated cytological specimen classification system and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4129854A (en) * 1976-10-25 1978-12-12 Hitachi, Ltd. Cell classification method
US4965725A (en) * 1988-04-08 1990-10-23 Nueromedical Systems, Inc. Neural network based automated cytological specimen classification system and method
US4965725B1 (en) * 1988-04-08 1996-05-07 Neuromedical Systems Inc Neural network based automated cytological specimen classification system and method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
IEEE ASSP MAGAZINE, April 1987, LIPPMAN, "An Introduction to Computing with Neural Nets", pages 4-22. *
NEURAL NETWORKS, Vol. 1, No. 1, 1988, DAYHOFF et al., "Segmentation of True Color Microscopic Images Using a Back Propagating Neural Network", page 169. *
Report No. CONF-871175L: RADIOLOGICAL SOCIETY OF NORTH AMERICA INC., 29 November 1987, OLDHAM et al., "Neural Recognition of Mammographic Lesions", page 318. *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997011350A2 (fr) * 1995-09-19 1997-03-27 Morphometrix Technologies Inc. Systeme de fragmentation multispectre assistee par reseau neuronal
WO1997011350A3 (fr) * 1995-09-19 1997-05-22 Morphometrix Techn Inc Systeme de fragmentation multispectre assistee par reseau neuronal
AU726049B2 (en) * 1995-09-19 2000-10-26 Veracel Inc. A neural network assisted multi-spectral segmentation system
US6463425B2 (en) 1995-09-19 2002-10-08 Morphometrix Technologies Inc. Neural network assisted multi-spectral segmentation system
WO1997041416A1 (fr) * 1996-04-27 1997-11-06 Boehringer Mannheim Gmbh Procede d'analyse automatisee assistee par ordinateur d'echantillons tissulaires ou d'echantillons de liquides organiques
US6246785B1 (en) 1996-04-27 2001-06-12 Roche Diagnostics Gmbh Automated, microscope-assisted examination process of tissue or bodily fluid samples
WO2013037119A1 (fr) * 2011-09-16 2013-03-21 长沙高新技术产业开发区爱威科技实业有限公司 Dispositif et procédé d'analyse de la morphologie érythrocytaire
US9170256B2 (en) 2011-09-16 2015-10-27 Ave Science & Technology Co., Ltd Device and method for erythrocyte morphology analysis
EP3598195A1 (fr) 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Procédé d'évaluation microscopique
EP3598194A1 (fr) * 2018-07-20 2020-01-22 Olympus Soft Imaging Solutions GmbH Procédé d'évaluation microscopique
CN110806636A (zh) * 2018-07-20 2020-02-18 奥林巴斯软成像解决方案公司 显微镜分析方法
US11199689B2 (en) 2018-07-20 2021-12-14 Olympus Soft Imaging Solutions Gmbh Method for microscopic analysis
CN110806636B (zh) * 2018-07-20 2024-01-23 奥林巴斯软成像解决方案公司 显微镜分析方法
EP3608701A1 (fr) * 2018-08-09 2020-02-12 Olympus Soft Imaging Solutions GmbH Procédé de fourniture d'au moins un procédé d'évaluation pour échantillons

Also Published As

Publication number Publication date
AU8229591A (en) 1992-01-07

Similar Documents

Publication Publication Date Title
US5933519A (en) Cytological slide scoring apparatus
CA2228062C (fr) Appareil et procede de mesure de robustesse de classements
Singhal et al. Local binary pattern for automatic detection of acute lymphoblastic leukemia
EP1977371B1 (fr) Procede et systeme pour identifier des champs d'eclairage sur une image
US5764792A (en) Method and apparatus for processing images
EP0336608B1 (fr) Système et méthode de classification automatique des cellules basés sur un réseau neuronal
Shirazi et al. Efficient leukocyte segmentation and recognition in peripheral blood image
US6330350B1 (en) Method and apparatus for automatically recognizing blood cells
Gautam et al. Classification of white blood cells based on morphological features
WO1996009606A1 (fr) Procede et appareil de classement de champs par ordre de priorite
WO1992013308A1 (fr) Systeme et procede de classification morphologique
Sobrevilla et al. White blood cell detection in bone marrow images
Khosrosereshki et al. A fuzzy based classifier for diagnosis of acute lymphoblastic leukemia using blood smear image processing
Ghane et al. Classification of chronic myeloid leukemia cell subtypes based on microscopic image analysis
Kumari et al. Performance analysis of support vector machine in defective and non defective mangoes classification
WO1991020048A1 (fr) Analyse cellulaire utilisant un traitement video et un reseau neuronal
Sabino et al. Toward leukocyte recognition using morphometry, texture and color
Francis et al. Screening of bone marrow slide images for leukemia using multilayer perceptron (MLP)
Mui et al. Automated classification of blood cell neutrophils.
Gunasinghe et al. Domain generalisation for glaucoma detection in retinal images from unseen fundus cameras
GB2305723A (en) Cytological specimen analysis system
EP3895060A1 (fr) Classification de noyaux cellulaires
Vejjanugraha et al. An automatic screening method for primary open-angle glaucoma assessment using binary and multi-class support vector machines
Mitsuyama et al. Automatic classification of urinary sediment images by using a hierarchical modular neural network
Bauer et al. Neural Network for Analyzing Prostate Cancer Tissue Microarrays: Problems and Possibilities

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BB BG BR CA FI HU JP KP KR LK MC MG MW NO RO SD SU

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU ML MR NL SE SN TD TG

NENP Non-entry into the national phase

Ref country code: CA

122 Ep: pct application non-entry in european phase