MXPA98004915A - System and method of boundary mapping - Google Patents

System and method of boundary mapping

Info

Publication number
MXPA98004915A
MXPA98004915A MXPA/A/1998/004915A MX9804915A
Authority
MX
Mexico
Prior art keywords
image
pixels
slide
light
pixel
Prior art date
Application number
MXPA/A/1998/004915A
Other languages
Spanish (es)
Inventor
Eran Kaplan
Opher Shapira
Yuval Harary
Daniel Hachnochi
Richard Sf Scott
Original Assignee
Neuromedical Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neuromedical Systems Inc
Publication of MXPA98004915A

Links

Abstract

The present invention relates to a method for mapping zones of a slide. The method includes the steps of selectively illuminating the slide from a first light source oriented generally obliquely to the surface of the slide, obtaining a first image of the slide illuminated by the first light source, selectively illuminating the slide from a second light source providing generally diffuse light, obtaining a second image of the slide illuminated by the second light source, and generating a map of significant zones based on the first and second images.

Description

SYSTEM AND METHOD OF BOUNDARY MAPPING

FIELD OF THE INVENTION

The present invention relates generally to a system and method for obtaining images and detecting boundaries in images, in particular to a system and method for mapping the boundaries of a specimen, and more specifically to a system and method for mapping zones of interest on a slide, such as the zones that lie inside a coverslip and contain specimen material.

BACKGROUND OF THE INVENTION

In the medical industry, a specimen is often fixed to a slide to perform various tests and classification functions on the specimen using microscopy. In pathological analyses, for example, specimens, such as sections of body fluids and smears from different places on the body, are typically deposited on a slide with a transparent coverslip or glass cover that possesses the optical properties necessary for microscopy. The cover can be used to fix the specimen on the slide, serve as a protective layer over the specimen, or both. Unfortunately, it is difficult to place the coverslip accurately in an exact position on the slide.
In addition, air may be trapped between the slide or the specimen and the coverslip, forming unwanted inclusions or bubbles that interfere with the view of the specimen. One pathological analysis that uses a slide is the Pap smear test. In the Pap smear test, a sample of cellular material is smeared onto a slide, stained, and then covered with a glass or plastic coverslip. The Pap smear is then analyzed using manual or automated microscopy to detect the presence of particular cells in the specimen, such as premalignant or malignant cells.
In particular, when performing an automated or semi-automated classification of a specimen on a slide, such as a Pap smear specimen, it is desirable to identify or develop a map of the zones of interest on the specimen slide on which classification is to be performed. For example, it is convenient to inform the classification system of the coverslip boundaries so that the classification functions are confined to the areas of the slide containing material to be classified. It is also desirable to inform the system of the position of air bubbles or similar inclusions so that those zones can be excluded from the analysis. This can reduce the processing time required by the system to analyze the specimen as well as improve the accuracy of some tests.
Nowadays, technicians manually map a slide by digitizing the areas of the slide that are occupied by undesirable air bubbles and the edges of the coverslip so that those areas are not considered for purposes of evaluation by a processor or cytologist. The operator uses a digitizing pen to mark the undesirable areas of the slide (i.e., air bubbles, air inclusions, streaks and edges of the coverslip). This method of manually mapping the specimen has proven to be an effective way of preparing for an automated analysis. However, the current manual method is slow and expensive. It would be desirable to be able to map the boundaries of a specimen automatically.

SUMMARY OF THE INVENTION

The present invention provides an automated boundary mapping system and method. The system uses a pair of light banks that direct diffuse or obliquely incident light onto the slide to highlight air bubbles trapped under the coverslip and to detect the edges of the coverslip. A camera captures images of the specimen and slide, and a processing system then generates a boundary map of the specimen zones that lie within the edges of the coverslip and are not obscured by air bubbles and the like. The system can also present the results of the mapping so that the user can edit the mapping process.

According to one aspect of the present invention, a method of mapping zones of a slide includes the steps of selectively illuminating the slide from a first light source oriented generally obliquely to the surface of the slide, obtaining a first image of the slide illuminated by the first light source, selectively illuminating the slide from a second light source that provides generally diffuse light, obtaining a second image of the slide illuminated by the second light source, and generating a map of the zones of significance based on the first and second images.

According to another aspect of the invention, a slide mapping system includes a first light source oriented generally obliquely to the surface of the slide to create a first image, a second light source that provides generally diffuse light to the surface of the slide to create a second image, a camera to obtain the first and second images, and a processor to generate a map of the zones of significance based on the first and second images.

According to another aspect of the invention, a method of verifying mapping information relating to a specimen includes the steps of generating a pixel intensity map of the specimen, determining positions of interest in the specimen, assigning to the pixels within the positions of interest one of odd or even numbers representative of their intensity, assigning to the other pixels the other of odd or even numbers representative of their intensity, presenting the pixels, the pixels assigned odd numbers being presented with a characteristic color different from the pixels assigned even numbers, and allowing an operator to change the pixel intensity map.

According to another aspect of the invention, a method of detecting the position of bubbles on a slide includes the steps of obtaining a first image of the slide illuminated under a first lighting condition, obtaining a second image of the slide illuminated under a second lighting condition, finding edges in the first and second images and combining the edges to form a third image, finding bounded zones defined by the edges in the third image, calculating a mean gray-scale intensity for each zone in the second image corresponding to a bounded zone in the third image, and comparing the calculated mean for each zone with a threshold based on the gray-scale intensity of a corresponding zone in the first image.
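The bubble-detection aspect just described can be sketched in a few lines. The following is a minimal Python illustration, not the patent's actual implementation: it assumes the bounded zones have already been extracted as lists of (row, column) pixel positions, and it uses a hypothetical scale `factor` to derive the threshold from the first image, since the patent does not specify that relationship.

```python
def mean_intensity(image, zone):
    """Average gray-scale intensity over a zone given as (row, col) pixels."""
    return sum(image[r][c] for r, c in zone) / len(zone)

def detect_bubbles(first_image, second_image, zones, factor=1.5):
    """Return the zones of the diffusely lit second image whose mean
    intensity exceeds a threshold based on the corresponding zone in the
    first image (here simply `factor` times its mean; hypothetical)."""
    bubbles = []
    for zone in zones:
        threshold = factor * mean_intensity(first_image, zone)
        if mean_intensity(second_image, zone) > threshold:
            bubbles.append(zone)
    return bubbles
```

A zone that appears much brighter under diffuse light than under the first lighting condition is flagged as a bubble, which mirrors the comparison step of the claim.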
According to another aspect of the invention, a method of finding a line in an image formed by a plurality of rows and columns of pixels includes the steps of adding the intensity values for multiple pixels in a row to the intensity values of the preceding pixels in the row and storing the sum for each of said multiple pixels, comparing the stored sums for a plurality of said multiple pixels in the same column with a threshold value, and estimating a point on the line as a function of the pixels having stored sums that exceed the threshold.

According to another aspect of the invention, a method of finding a line in an image formed by a plurality of rows and columns of pixels includes the steps of adding the intensity values for multiple pixels in a row to the intensity values of the preceding pixels in the row and storing the sum for each of said multiple pixels, comparing the stored sums for a plurality of said multiple pixels in the same column with a threshold value, estimating a first point on the line as a function of the pixels having stored sums that exceed the threshold, bisecting the image into multiple sub-images, obtaining the sums of the intensity values for multiple pixels in a row within the sub-image for a plurality of rows adjacent to the estimated point, comparing the obtained sums with the threshold value, and estimating the position of additional points on the line as a function of the pixels having obtained sums that exceed the threshold.

According to another embodiment of the invention, a method of verifying mapping information for an image includes the steps of presenting a first map of an image having zones of significance distinguished from other zones of the image, allowing an operator to change the zones of significance on the screen, and generating a second map according to the first map and the changes made by the operator.
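The verification aspect above, in which pixels at positions of interest are assigned odd intensity values and all other pixels even values so that a display can color the two groups differently, can be sketched as follows. This is a minimal Python sketch under the assumption of 8-bit gray-scale values; the function names are illustrative, not from the patent.

```python
def encode_verification_map(intensities, interest):
    """Force pixels at positions of interest to odd intensity values and all
    other pixels to even values, roughly preserving brightness, so a display
    routine can color the two groups differently."""
    return [[(v | 1) if (r, c) in interest else (v & ~1)
             for c, v in enumerate(row)]
            for r, row in enumerate(intensities)]

def pixel_color_group(value):
    """A display routine only needs the parity to pick the overlay color."""
    return "interest" if value % 2 else "background"
```

Because the parity encodes the classification while the high bits keep the intensity, the operator sees a recognizable image with the zones of interest tinted, and edits to the map amount to flipping parities.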
The foregoing and other features of the invention are fully described below and particularly pointed out in the claims, the following description and the accompanying drawings setting forth a certain illustrative embodiment of the invention, this being indicative, however, of but one of the various ways in which the principles of the invention may be employed.

DESCRIPTION OF THE DRAWINGS

In the attached drawings:

Figure 1 is a schematic illustration of a boundary mapping system according to the present invention and an automated classification system that uses the boundary map information generated by the boundary mapping system.

Figure 2 is an illustration of the optical components of the boundary mapping system in position to generate an image in which the edge information of the image is highlighted.

Figure 3 is an illustration of the optical components of the boundary mapping system in position to generate an image in which information about bubbles or inclusions in the image is highlighted.

Figure 4 is a schematic diagram of representative optical paths illustrating the formation of an image with highlighted edge information.

Figure 5 is a schematic diagram of representative optical paths illustrating the formation of an image with highlighted information on bubbles and inclusions.

Figure 6 is a schematic illustration of a bisection operation of the invention for finding the edges of the coverslip in an image.

DETAILED DESCRIPTION OF THE INVENTION

With reference to the several figures, and initially to Figure 1, there is shown an automated boundary mapping system 10 of the invention for obtaining boundary or mapping information for a system, such as an exemplary cell classification system 12. The mapping system 10 includes a stage 14 on which a slide 16 to be mapped is placed, a camera 18, light banks 20 and 22, a diffuser 24 and a processing system 26 for developing the boundary map.
The mapping system 10 may also include a robotic slide handler 28 for transferring slides between a storage container 30 and the stage 14, a bar code reader 32 for reading bar-coded information from the slide 16, and a screen 34 to facilitate operator interaction and allow verification and editing of the mapping process.

As indicated above, the slide mapping system 10 is especially useful for providing information relating to the positions of the specimen material on a slide to an automated or semi-automated specimen classifier. In such a context, the slide mapping information can be used by the specimen classifier to confine its classification functions to the areas of the slide where biological material is likely to occur, thus possibly reducing classification times. In addition, by obtaining a map of the specimen placed on the slide, no accuracy is required when placing the specimen and coverslip on an exact area of the slide, and the specimen classifier can be used with specimen slides that have covers of various shapes and sizes. Various exemplary specimen classification systems, for which the mapping system 10 would provide mapping information, are described in commonly owned United States Patents Nos. 5,287,272, 5,257,182 and 4,965,725 and U.S. Patent Applications Serial Nos. 07/425,665, 07/502,611 and 08/195,982, the disclosures of which are incorporated in their entirety by this reference. Neuromedical Systems, Inc., of Suffern, New York, produces a commercial specimen classification system under the trademark PAPNET®. However, it is noted that the mapping device of the present invention has a wide range of potential applications and is not limited to use with specimen classifiers or with slides having coverslips, these being exemplary only for purposes of describing the mapping system of the invention.
The light banks 20 and 22 and the diffuser 24 cooperate to produce different lighting conditions incident on the slide, each condition being adapted to highlight certain features and optical characteristics of the slide and therefore to improve the detection of features such as inclusions, bubbles and the edges of the coverslip. Preferably, the light banks 20 and 22 and the selective use of the diffuser 24 produce separate images of the slide to be seen by the camera 18, such as a first image (here an edge image), in which the edge information is highlighted, and a second image (here a bubble image), in which the bubbles and similar inclusions in the specimen are highlighted.

An edge image is obtained by illuminating the slide 16 with the oblique light bank 20, which directs light at the slide at an angle parallel or nearly parallel to the upper surface of the slide. The diffuser 24 is not used in obtaining an edge image and is therefore rotated or otherwise taken out of the field of view of the camera 18. The light from the oblique light bank 20 incident on the edges of the coverslip 35 tends to be scattered at the edges and directed towards the camera 18 more than the light incident on the upper surface of the slide 16 or the cover, as explained in more detail below. The light captured by the camera 18 forms an edge image, the edges of the coverslip appearing somewhat brighter than the rest of the image. The edge image is transferred to the processing system 26, which finds the edges in the edge image.

A bubble image is obtained by introducing the diffuser 24 into the field of view of the camera 18 next to the slide 16 and illuminating the diffuser with light from the upper light bank 22 arranged above the slide and the diffuser. The diffuser 24 scatters the light so that it is incident on the slide 16 from various angles.
Due to the differences in refraction between the slide 16, the cover 35, the specimen material and an air bubble, the light incident on an air bubble will tend to be reflected towards the camera 18 more than the light incident on the specimen, as explained in more detail below. Consequently, the bubbles will appear brighter in the resulting bubble image than the rest of the image information. The bubble image is transferred to the processing system 26, where the bubble boundaries are found in the image. The bubble image may also include information about the position of streaks on the coverslip or inclusions in the specimen, which are referred to collectively as bubbles.

Based on the detected edges of the coverslip and the bubbles, the processing system generates a boundary map that indicates the zones of the specimen within the confines of the coverslip, excluding the bubbles. The boundary map is correlated with the identification information of the slide 16 read by the bar code reader 32 and recorded for use by the automated classification system 12. An array of boundary maps can be stored on recording media, such as a magnetic or optical disk, for each slide 16 in the storage cassette 30, and a boundary map can be transferred electronically to the automated classification system 12, such as over a communication network. The classification system 12 can then use the boundary map to facilitate the classification of the specimen, for example by confining its analysis to the zones of the specimen that lie within a coverslip and are not obstructed by bubbles, inclusions, streaks and the like.

Turning now to Figure 2, the optical components of the boundary mapping system 10 which produce the edge and bubble images are shown in more detail. The boundary mapping system 10 includes a base 36 and a back plate 38 on which the various optical components of the mapping system are mounted. In the center of the base 36 is the stage 14 on which a slide 16 to be mapped is fixed.
The stage 14 can be formed directly on the base 36 and adapted to facilitate automated positioning and removal of the slide on the stage, such as by a cut-out section 40, or the stage may be a separate element mounted on the base. Preferably, the stage 14 includes a positioning apparatus (not shown) for firmly holding the slide 16 on the stage at a known position that is consistent with the position at which the slide is held in the system to which the boundary mapping system provides mapping information, such as the exemplary cell classification system 12. A suitable positioning apparatus is disclosed in co-pending U.S. Patent Application Serial No. 08/498,321, which is incorporated herein by this reference.

The oblique light bank 20 is also mounted on the base 36 and oriented so as to project light onto substantially the entire periphery of the slide 16. The oblique light bank 20 preferably includes four separate light sources 42 placed next to each side of the slide 16 and slightly raised above the slide to direct light towards the slide from an oblique angle almost parallel to the upper surface of the slide. The light sources 42 may include arrays of LEDs 43 or other suitable means for producing light.

On the back plate 38 are mounted the camera 18, the upper light bank 22 and a diffuser assembly 44 that selectively places the diffuser 24 in the field of view of the camera. The camera 18 is placed directly above the slide 16 at a distance, and with optics adequate, to allow a complete view of the relevant areas of the slide, such as the portions of the slide that probably contain the coverslip and the specimen material.
The camera 18 can be any of several conventional cameras, such as a CCD camera, which, alone or in conjunction with other components, such as an analog-to-digital converter, can produce a digital output of sufficient resolution to allow processing of the captured images, for example, an image with a resolution of 640 x 480 pixels.

The upper light bank 22 includes two separate light sources 46 placed between the slide 16 and the camera 18 and spaced apart on either side of the optical path of the camera so as not to obstruct the camera's view of the relevant areas of the slide. The upper light sources 46 are preferably arrays of LEDs 48, although other suitable light sources can be employed. The diffuser assembly 44 is positioned between the slide 16 and the upper light bank 22 and is adapted to selectively place the diffuser 24 in the optical path of the camera 18. As a consequence, the light emitted by the upper light bank 22 is scattered by the diffuser 24 towards the slide 16, and the light reflected back from the slide is scattered again, a portion being scattered towards the camera 18. The diffuser 24 has a light-diffusing element 50 that scatters the incident light, such as a Mylar sheet, and may also include a frame 52 that supports the light-diffusing element. The diffuser assembly 44 includes an actuator (not shown) which selectively places the diffuser 24 either in a position slightly above the slide 16 and in the optical path of the camera 18, as shown in Figure 2, when a bubble image is to be obtained, or out of the optical path of the camera, such as next to the back plate 38, as shown in Figure 3, when an edge image is to be obtained.
The conditions of the light banks 20 and 22, that is, whether the light banks generate light or not, the position of the diffuser 24 in or out of the optical path of the camera 18, and the control of the camera, including the instructions given to the camera to obtain an image of the slide 16, are controlled by the processing system 26 (Figure 1). The processing system 26 is preferably a conventional microcomputer with suitable interfaces for controlling the light banks 20 and 22, the diffuser assembly 44, the camera 18, the robotic slide handler 28 and the bar code reader 32, as well as for receiving image data from the camera and slide identification information from the bar code reader.

In practice, once a slide 16 has been placed on the stage 14 and with the diffuser 24 rotated out of the optical path of the camera 18, the processing system 26 commands the light sources 42 of the oblique light bank 20 to illuminate the slide. This illumination of the slide 16 from the light sources 42, located almost parallel to the slide 16, as shown schematically for one edge of the coverslip in Figure 4, results in predominantly only the light scattered at the edges of the coverslip being redirected, at least partially, to the camera 18. Since the light strikes the upper surface 54 of the coverslip 35 or the upper surface of the slide 16 from very oblique angles, the incident light, represented by the rays designated by the arrow 58, is reflected predominantly at the angle of incidence and thus does not reach the camera.
However, since the edges of the coverslip tend to be relatively rough and therefore to scatter light, the light represented by the rays designated by the arrow 60 will be scattered at the edges of the coverslip, a portion of the light, represented by the rays designated by the arrow 62, being scattered in the direction of the camera 18. Once the slide is illuminated by the oblique light bank 20, the processing system 26 commands the camera 18 to capture an image of the illuminated slide 16. Since predominantly only the light scattered at the edges of the coverslip 35 or the slide 16 strikes the camera, the edges will appear brighter in the resulting edge image than the other areas of the slide and the specimen. In some cases, when there are oxidized cells in an air bubble located under the coverslip 35, the obliquely incident light can also be scattered and reflected towards the camera, facilitating the detection of such occurrences in the edge image.

The processing system 26 then deactivates the oblique light bank 20 and commands the diffuser assembly 44 to rotate the diffuser 24 into the optical path of the camera 18. The processing system 26 then activates the light sources 46 of the upper light bank 22. The light generated by the upper light bank 22 thus illuminates the slide 16 and the coverslip 35 through the diffuser 24, as shown in Figure 5. (It is noted that, in Figure 5, the diffuser 24 is depicted at an exaggerated distance from the slide 16 for illustrative purposes. The diffuser would preferably be placed sufficiently close to the slide that the lateral displacement of the image of the slide, viewed through the diffuser, is not considerable.) The diffuser 24 scatters the light so that it impinges on the slide 16 and the coverslip 35 from a variety of angles.
Considering an exemplary light ray 64, the ray from the upper light bank 22 is scattered by the diffuser 24 in many directions, including those designated by rays 66 and 68. The exemplary rays 66 and 68 are partially transmitted into the coverslip 35, and a portion 70 is reflected at the air-coverslip interface 72. The portions of the rays 66 and 68 transmitted into the coverslip 35 are either reflected at the interface 74 between the coverslip and the zone 76 of specimen material 78 or air bubbles 80 sandwiched between the coverslip and the slide 16, or are transmitted into the zone 76, depending on the angle at which the rays approach the interface 74 and the difference between the refractive indices of the coverslip and the air bubble or the specimen material. With a large difference in the refractive indices between the two media, as between glass and air, more light is reflected at the interface between the media than at an interface between media having similar refractive indices, such as glass and specimen material or other areas that do not contain bubbles (here referred to collectively as specimen material). Consequently, since the differences in the refractive indices of the coverslip 35 and the specimen material 78 are relatively small, a very small portion of the light ray 66 is reflected at the interface, and most of it is transmitted through the interface, where it is substantially absorbed by the specimen material. However, in the areas of the interface 74 between the coverslip 35 and an air bubble 80, since the differences in the refractive indices are large, a large percentage of the light incident on the air bubble 80, such as the ray 68, is reflected back towards the diffuser 24, as indicated by ray 82. In addition, a large part of the light transmitted through the interface 74 into the air bubble 80 is reflected again at the interface between the bubble and the slide 16, enhancing the reflection produced by the bubble.
The light ray 82 reflected by the bubble is scattered again by the diffuser 24, as represented by the rays 84, and a portion of those rays strikes the camera 18, thus making the bubbles appear brighter than the specimen material 78. The processing system 26 commands the camera 18 to capture the bubble image of the illuminated slide, and the image is transferred to the processing system. The processing system 26 then returns the slide 16 to the storage cassette 30.

Once the images have been transferred to the processing system 26, they are analyzed to find the edges of the coverslip and the bubbles that lie inside those edges, as explained in more detail below. A boundary map of the areas of the slide 16 that may contain specimen material is then generated. Preferably, the boundary map is a list of pixel positions, or identifications, in the image corresponding to the unobscured, bubble-free areas within the edges of the coverslip, that is, areas likely to contain specimen material. The boundary map is then correlated with the information identifying the slide to which it relates, such as the information provided by the bar code reader 32, and stored for use by a system, such as the automated classification system 12, which can then limit its analysis to the map pixels that indicate the probable presence of specimen material. Before the boundary map is transferred to the classification system 12, the operator is preferably given the opportunity to review the boundaries of a specimen and confirm that the map appears to be accurate, or to edit or reject the map.

The processing system 26 generates the boundary map by forming a coverslip mask and a bubble mask and then logically combining the masks to find the areas within the boundaries of the coverslip that do not represent bubbles. The coverslip mask is formed from the edges of the coverslip found in the edge image.
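The logical combination of the two masks can be sketched as follows. This is a minimal Python illustration, assuming the two masks are already available as row-major boolean (0/1) grids; the output format, a list of (row, column) positions, follows the patent's description of the boundary map as a list of pixel positions.

```python
def boundary_map(cover_mask, bubble_mask):
    """List the (row, col) pixel positions that lie inside the coverslip
    mask and are not covered by the bubble mask, i.e. the zones likely
    to contain specimen material."""
    return [(r, c)
            for r, row in enumerate(cover_mask)
            for c, inside in enumerate(row)
            if inside and not bubble_mask[r][c]]
```

A classification system receiving this list can then restrict its analysis to exactly these positions.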
The processing system 26 first removes artifacts from the edge image by subtracting a reference image from the original edge image. The reference image is obtained by capturing an image of the stage 14, with the slide removed, under the same lighting conditions used to obtain an edge image. An intensity offset is added to the original edge image (to avoid the possibility of negative intensities after subtracting the reference image), and the reference image is then subtracted from the original image to remove artifacts, such as streaks on the stage, which might otherwise appear in the edge image. Several preprocessing operations can be performed on the resulting edge image to filter out short and curved lines as well as to thin the edges in the image.

Using the filtered image, the remaining long straight edges are found that lie inside certain windows to the left, right, top and bottom of the slide and are closest to the center of the image. By finding the straight line closest to the center of the slide, the edges of the coverslip are distinguished from the edges of the slide. The coverslip edges in each window are found using a repeated bisection projection algorithm that finds an initial point likely to be near the center of the coverslip edge and then uses that point as the starting point for another bisection of the image. The image is bisected repeatedly, using the center point found in the previous bisection, until the algorithm has focused on a series of points on small discrete parts of the coverslip edge that represent the coverslip edge with relative accuracy. To explain this, consider the original and cumulative projection pixel images presented below, as well as Figure 6.
(The original image corresponds to the filtered edge image, except that the gray-scale intensities of each pixel have been represented as 1 and 0, as opposed to real gray-scale intensities, to facilitate the explanation and the addition in the cumulative projection image. In addition, the images presented are only a partial representation of the larger number of pixels in the window image.)

The original image represents the pixel intensities of the pixels of a window that includes the area where a horizontal edge of the coverslip would be. The upper part of the original image, in this example, is toward the outer edge of the edge image, the lower portion of the image being the pixels closest to the center of the edge image. The cumulative projection image is a running sum of the intensity values for all the pixels of the same row, that is, at the same distance from the image edge, of the original image, going from the left side of the image to the right side and projected to a one-dimensional profile at the right-most pixels. For example, all pixels of the first 9 columns of the first row of the original image have zero intensities, and thus the corresponding pixels of the cumulative projection image are all zero. Since each of the pixels in columns 10 and 11 has an intensity of one, the pixel in column 10 of the cumulative projection image would therefore be one, that is, the sum of the previous nine pixels of the row plus the pixel of column 10, and the value for the pixel in column 11 would be the sum of the previous 10 pixels in the same row, that is, one, plus the intensity of the eleventh pixel of the original image, for a sum of 2. Since the intensities of the pixels in columns 12 and 13 of the original image are zero, the corresponding pixels in the cumulative projection image will each have a value of two, the sum of the intensities of all the preceding pixels in the same row of the original image.
The same sum is made for rows 2 to 8 to form the cumulative projection image represented below, the one-dimensional profile being the right column of pixels holding the sums, from bottom to top, of 0, 3, 4, 5, 2, 0, 4, 2.
Original border image

0 0 0 0 0 0 0 0 0 1 1 0 0
1 0 0 0 0 0 0 1 1 1 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 1 1 0 0 0 0 0
0 0 0 0 0 0 0 0 1 1 1 1 1
0 0 0 0 1 1 1 1 0 0 0 0 0
1 1 1 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0

Cumulative projection image

0 0 0 0 0 0 0 0 0 1 2 2 2
1 1 1 1 1 1 1 2 3 4 4 4 4
0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 1 2 2 2 2 2 2
0 0 0 0 0 0 0 0 1 2 3 4 5
0 0 0 0 1 2 3 4 4 4 4 4 4
1 2 3 3 3 3 3 3 3 3 3 3 3
0 0 0 0 0 0 0 0 0 0 0 0 0

To find a coverslip edge (for example, the edge 90 in Figure 6) using the cumulative projection image, the profile of an initial search window 92 that probably contains the edge is examined, from the pixels closest to the center of the edge image going towards the outer edge, for the first pixel that has a sum value exceeding a certain threshold. (The profile pixels are examined beginning at the pixel closest to the center in order to distinguish the edge of the coverslip from the edge of the slide since, as indicated above, both edges are probably represented in the image and it is known that the edge of the coverslip will be closer to the center of the image than the edge of the slide.) The weighted average for this pixel and each pixel that follows, until a pixel is found that has a sum less than the threshold, is calculated based on the sum value for each pixel above the threshold. If the threshold were 2.5, the weighted average would be calculated for the seventh, sixth and fifth pixels from the outer edge of the profile, because these pixels have sum values of 3, 4 and 5, respectively. The weighted average is then taken as an assumed point 94 on the edge of the coverslip 90 in the center of the initial search window 92. The initially detected edge thus produces the initial horizontal edge guess 96.
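The cumulative projection and the weighted-average point estimate above can be sketched in plain Python. This is an illustrative sketch, not the patent's implementation; indices here are 0-based from the outer edge of the window, and the profile is passed with its outer edge first, so the scan toward the outer edge runs from the last list element back to index 0.

```python
def cumulative_projection(image):
    """Running left-to-right sum along each row of a 0/1 (or gray-scale)
    window; the right-most column of the result is the window's
    one-dimensional profile."""
    proj = []
    for row in image:
        total, out = 0, []
        for v in row:
            total += v
            out.append(total)
        proj.append(out)
    return proj

def edge_point(profile, threshold):
    """Scan the profile from the slide-center end (last index) toward the
    outer edge (index 0); return the weighted-average position of the first
    run of sums exceeding the threshold, or None if no run is found."""
    i = len(profile) - 1
    while i >= 0 and profile[i] <= threshold:
        i -= 1                       # skip sums at or below the threshold
    if i < 0:
        return None                  # no edge evidence in this window
    num = den = 0
    while i >= 0 and profile[i] > threshold:
        num += i * profile[i]        # position weighted by its sum
        den += profile[i]
        i -= 1
    return num / den
```

With the worked example above (profile 2, 4, 0, 2, 5, 4, 3, 0 from the outer edge, threshold 2.5), the run with sums 3, 4 and 5 yields a weighted-average position between the fifth and sixth pixels, matching the text.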
The cumulative projection image is then bisected into two search windows 98, 100, or subimages, and the center point 94 found in the last calculation is used as the starting point to find a new weighted average and a new assumed edge point 102, 104 for each of the subimages 98, 100, respectively, created by bisecting the previous window 92. These subimages 98, 100 are bisected, and the bisected images are bisected again, repeatedly, until each image includes a relatively small number of pixels representing a relatively small distance in the original edge image. Thus, the detected edge may follow the actual edge 90 relatively accurately, and may follow the irregularities of the actual edge. It is noted that one advantage of using the cumulative projection image is that the profile of each subimage can be calculated by subtracting the pixel intensity of the cumulative projection image corresponding to the leftmost pixel of the subimage from the pixel intensity of the cumulative projection image corresponding to the rightmost pixel of the subimage, rather than making redundant sums for each subimage. To fill the gaps between the calculated coverslip edge points, a least-squares fit function or a similar curve-fitting function can be used. Preferably, the predicted edge is verified to ensure that it is in fact a coverslip edge, for example, by adding the portions of the profile adjacent to the calculated coverslip edge for each of the last series of subimages and verifying that the sum exceeds an intensity threshold.
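The subtraction shortcut for subimage profiles can be sketched as follows. This is an illustrative sketch: `cum` is a cumulative projection image built as described above, and the function name is an assumption.

```python
def window_profile(cum, lo, hi):
    """Per-row sum over columns [lo, hi) of the original image, computed with
    at most two lookups per row into the cumulative projection image instead
    of re-summing the pixels of the subimage."""
    return [row[hi - 1] - (row[lo - 1] if lo > 0 else 0) for row in cum]
```

Because each subimage profile costs only two lookups per row, the repeated bisection stays cheap no matter how many times the windows are split.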
Although the above-described preprocessing operations for removing short or curved lines from the edge image may include any of several known conventional morphological filtering techniques, preferably a cumulative opening operation is performed on the cumulative projection image, prior to application of the bisecting projection algorithm described above, that produces an effect similar to a one-dimensional morphological opening operation, that is, an erosion followed by a dilation. For a horizontal edge, the operation makes two passes through the rows of the cumulative projection image. In the first pass, the operation scans from right to left, changing the cumulative sum from positive to negative in the pixels corresponding to the edge points to be removed from the image. As an example, consider the rows illustrated below for an original and cumulative projection image. The pixels in the Cumulative Projection Row to be removed are those whose sum of n pixels in the original image is less than a predetermined threshold. As an example, suppose that "n" is four and that the threshold is also four. The sum of n pixels for each pixel in a row is determined by subtracting the value of the nth pixel to the left of the current pixel in the Cumulative Projection Row from the value of the current pixel in the Cumulative Projection Row. Going from right to left, this has the effect of an erosion of n pixels, in this case an erosion of four pixels. A dilation of n pixels, or, in this case, a dilation of four pixels, is achieved by setting a counter equal to "n" whenever the sum of n pixels is greater than or equal to the threshold, and decrementing the counter at each step otherwise, which gives rise to the Counter Row below. Where the values in the Counter Row are less than or equal to zero, the values in the corresponding pixels in the Cumulative Projection Row are changed from positive to negative, resulting in the Right-to-Left Pass Row below.
Original row:            1  1  1  0  1  1  1  1  1  0  0  1
Projection row:          1  2  3  3  4  5  6  7  8  8  8  9
Counter row:            -3 -2 -1  0  1  2  3  4  4 -2 -1  0
Right-to-Left Pass row: -1 -2 -3 -3  4  5  6  7  8 -8 -8 -9

The second pass goes from left to right, calculating the new image by tracking the new cumulative sum and the cascading error from the original cumulative sum. If the corresponding pixel in the Right-to-Left Pass is positive, the new output pixel for the Left-to-Right Pass is equal to the current pixel plus the error. If the corresponding pixel in the Right-to-Left Pass is negative, the new output pixel in the Left-to-Right Pass is equal to the previous output pixel, and the current pixel in the Error Row is updated to be the previous output pixel plus the current pixel. For example, since the leftmost pixel in the Right-to-Left Pass is "-1", the leftmost pixel in the Left-to-Right Pass will be "0" because there is no previous output pixel, and the leftmost pixel in the Error Row is "-1" because there is no previous output pixel and the current pixel in the Right-to-Left Pass is "-1". The second pixel in the Left-to-Right Pass will again be zero because the corresponding pixel in the Right-to-Left Pass is negative, "-2", and the previous output pixel is "0". The second pixel in the Error Row thus turns out to be "-2" because the previous pixel in the Left-to-Right Pass is zero and the current pixel in the Right-to-Left Pass is "-2", and the sum of these values is "-2". The fifth pixel from the left in the Left-to-Right Pass (the first one having a corresponding pixel in the Right-to-Left Pass that is positive) will be "1" because the corresponding pixel value in the Right-to-Left Pass is "4", the current pixel in the Error Row is not updated and therefore remains "-3", and the sum of "4" and "-3" is "1". The rest of the row is calculated according to these examples.
Error row:              -1 -2 -3 -3 -3 -3 -3 -3 -3 -3 -3 -4
Left-to-Right Pass row:  0  0  0  0  1  2  3  4  5  5  5  5

The values of the Left-to-Right Pass Row can then be substituted for those of the corresponding Cumulative Projection Image, and the edges are found in the image as described above. Based on the coverslip edges found above, a coverslip mask is generated that distinguishes the zones inside the coverslip, determined by the four coverslip edges, from the zones outside the coverslip. Next, the processing system 26 forms the bubble mask based on the original bubble image as well as the edges found in both the bubble image and the edge image. First, the edge image is thresholded and the boundary pixels in the thresholded image are found to obtain edges related to the coverslip and to air bubbles, as well as to zones of oxidized cells that also appear in the edge image. Such zones of oxidized cells, also called flakes, can be recorded separately and supplied to the classification system. Before the bubble image is analyzed by the processing system 26, artifacts are removed from the bubble image by subtracting a reference image from the original bubble image. The reference image is obtained by capturing an image of an empty slide without a coverslip using the bubble illumination technique used to obtain the original bubble image, described above, including the use of the diffuser 24. An offset is added to the bubble image before subtracting the reference image to ensure that the resulting image contains all positive pixel intensity values. Edges are found in the resulting bubble image using a conventional morphological edge detector and a thresholding operation. These edges are combined with the edges found in the edge image to obtain a combined edge image.
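The two-pass cumulative opening illustrated in the rows above can be sketched as follows for a single row. This is a sketch under assumptions: the function name and the counter's starting value are not fixed by the text, and the starting value does not affect which pixels are negated in this example.

```python
def cumulative_opening(proj, n, threshold):
    """Two-pass opening on one row of the cumulative projection image.

    Pass 1 (right to left): negate cumulative sums where the trailing
    n-pixel sum stays below the threshold (erosion) and the counter has
    run out (dilation). Pass 2 (left to right): rebuild a consistent
    cumulative sum, carrying a cascading error past the removed runs.
    """
    m = len(proj)
    r2l = list(proj)
    counter = 0  # assumed starting value; see the note above
    for i in range(m - 1, -1, -1):
        n_sum = proj[i] - (proj[i - n] if i >= n else 0)
        counter = n if n_sum >= threshold else counter - 1
        if counter <= 0:
            r2l[i] = -proj[i]
    out, error, prev = [], 0, 0
    for i in range(m):
        if r2l[i] >= 0:
            cur = r2l[i] + error       # positive pixel: apply carried error
        else:
            cur = prev                 # negated pixel: hold previous output
            error = prev + r2l[i]      # update the cascading error
        out.append(cur)
        prev = cur
    return out
```

Applied to the Projection Row of the example with n = 4 and a threshold of 4, this reproduces the Left-to-Right Pass Row shown above.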
Since this image probably contains small gaps in the edges, a dilation operation is performed to expand the appropriate edges in all directions to close the gaps. Since the combined edge image now includes several contiguous or connected zones defined and delimited by the connected edges in the image, these zones can be analyzed to determine whether they represent bubbles or specimen material. To distinguish the connected zones representing bubbles from those representing specimen material, the average gray scale intensity is determined for each of the connected zones using a histogram. Based on whether the average for each connected zone exceeds one of two thresholds, it is determined whether the connected zone is a bubble or contains specimen material. The threshold to be applied to a particular connected zone is determined by the brightness of the same zone in the original edge image. Since bubbles containing oxidized cells appear bright in the original edge image, but do not appear as bright in the bubble image, a relatively low threshold is applied to the connected zones in the bubble image corresponding to bright zones in the original edge image to determine whether the connected zones are bubbles. With respect to the connected zones that appear dark in the original edge image, a relatively higher threshold is applied to distinguish whether the connected zone corresponds to a bubble or to specimen material. The zones that exceed the applied threshold are determined to represent bubbles and thus form the bubble mask. By logically combining the coverslip mask and the bubble mask, a boundary map of the areas of interest of the slide can be obtained, that is, the areas containing specimen material within the limits of the coverslip.
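The dual-threshold classification of connected zones might look like the following sketch. The zone representation (lists of pixel coordinates), the function name and all threshold values are illustrative assumptions, not values from the patent.

```python
def find_bubble_zones(zones, bubble_img, edge_img,
                      bright_cutoff=180, low_thresh=120, high_thresh=200):
    """Classify connected zones as bubbles, choosing between two thresholds
    according to how bright the same zone is in the original edge image.
    Each zone is a list of (row, col) pixel coordinates; images are 2-D
    lists of gray scale intensities."""
    bubbles = []
    for zone in zones:
        mean_bubble = sum(bubble_img[r][c] for r, c in zone) / len(zone)
        mean_edge = sum(edge_img[r][c] for r, c in zone) / len(zone)
        # Bright in the edge image (oxidized cells): apply the lower threshold.
        thresh = low_thresh if mean_edge > bright_cutoff else high_thresh
        if mean_bubble > thresh:
            bubbles.append(zone)
    return bubbles
```

Zones that pass the test form the bubble mask; in a real pipeline the connected zones would come from a connected-component labeling of the gap-closed combined edge image.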
In some cases it may also be desirable for the processing system 26 to develop and provide the operator with an indication of the degree of confidence that the processing system has generated a valid boundary map, so as to help the operator review only certain boundary maps for accuracy. Confidence in the accuracy of the boundary map can be estimated using several measures, including: whether bright bubble zones have been found outside a coverslip edge; the error in the detected position of the slide from the calibrated position; the rotational error of the detected position of the slide from the calibrated position; the error in the parallelism of the detected coverslip edges; whether bright zones not contained within the detected bubbles have been detected in the image; the difference of the slide background with respect to the calibrated background; and the total bubble zone detected. Other confidence measures can also be used. So that the operator can review the boundary map, the processing system 26 generates a mapped image to be displayed on the monitor 34. The image is a full image of the slide 16, formed by combining the bubble and edge images and essentially overlaying the combined image with a transparent overlay having colored zones, for example, green zones, indicating the zones to be excluded from the map sent to the classification system 10. The overlay is generated by assigning the pixels corresponding to specimen material in the combined image a certain identifier, such as making the gray scale intensities of said pixels all an odd or even number, and assigning the pixels corresponding to zones to be excluded from the boundary map a different identifier, such as making the gray scale intensities of those pixels the other of the odd or even number assigned to the pixels corresponding to the specimen material.
For example, the intensity of each pixel in the combined image corresponding to a zone to be excluded from the map could be assigned the nearest even number, and the intensity of each pixel in the combined image corresponding to a zone of specimen material on the map could be assigned the nearest odd number. Preferably, each pixel intensity that must be changed is changed in a way that preserves the integrity of the pixel intensity value and, therefore, the integrity of the overall image. Naturally, pixels whose intensities are already the correct odd or even number are left unchanged. Considering as an example two pixels in the combined image, the first pixel corresponding to an air bubble in the combined image and having a gray scale intensity of 199, and the second pixel corresponding to specimen material and having a gray scale intensity of 150, the gray scale intensity of the first pixel will be changed to an intensity value of 200 to designate that the pixel corresponds to a zone to be excluded from the map, and the gray scale intensity of the second pixel will be changed to 151 to designate that the second pixel corresponds to a zone to be included in the map. A lookup table is then used to determine the red, green and blue intensities for each pixel on the screen. The lookup table is constructed with a red, green and blue intensity for each of the possible gray scale intensities, for example, 256 gray scale intensities. In the lookup table, all odd-numbered gray scale intensities are assigned individual red, green and blue intensities that are equal and correspond to the input gray scale intensity. Even-numbered gray scale intensities are assigned red and blue intensities of zero and a green intensity corresponding to the input gray scale intensity.
Therefore, for a pixel corresponding to specimen material and having an odd-numbered gray scale pixel intensity input of 151, for example, the lookup table will provide red, green and blue outputs of 151. For a pixel corresponding to a zone to be excluded from the boundary map, such as an air bubble, and having an even gray scale pixel intensity input of, for example, 200, the lookup table will provide output intensities of red and blue of zero and a green intensity of 200. Consequently, the zones of the mapped image to be passed to the classification system 10 as part of the boundary map are displayed on the screen 34 in black and white, and the zones to be excluded from the map, such as the zones outside the coverslip and the zones containing a bubble, appear with a greenish tone. Since the gray scale intensities of the pixels need to be changed by only one to convert the intensity to an odd or even number, the relative intensities of the pixels in the mapped image are substantially maintained, allowing the operator to view the image and reliably judge the accuracy of the boundary map, as well as edit the map if necessary. Editing of the boundary map can be done using a stylus, a mouse or another suitable interface that allows the operator to indicate to the processing system 26 the zones to be included in the map or to be excluded from the map.
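The parity tagging and display lookup table can be sketched as follows. The function names and the adjust-upward-by-one rule are assumptions consistent with the 199 to 200 and 150 to 151 examples in the text.

```python
def tag_intensity(intensity, in_map):
    """Nudge an 8-bit gray scale value to the parity that marks its zone:
    odd = specimen material to include, even = zone to exclude."""
    is_odd = intensity % 2 == 1
    if is_odd == in_map:
        return intensity  # already the correct parity; leave unchanged
    # Adjust by one, stepping down only at the top of the 8-bit range.
    return intensity + 1 if intensity < 255 else 254

def build_lookup_table():
    """One RGB triple per gray level: odd levels display as gray (R=G=B),
    even levels display as green (R=B=0)."""
    return [(g, g, g) if g % 2 == 1 else (0, g, 0) for g in range(256)]
```

Because each tagged value moves by at most one gray level, the displayed image keeps essentially the original contrast while excluded zones pick up the green tint.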

Claims (3)

CLAIMS 1. A method of mapping areas of a slide, including the steps of: a) selectively illuminating the slide from a first light source oriented generally obliquely to the surface of the slide; b) obtaining a first image of the slide illuminated by the first light source; c) selectively illuminating the slide from a second light source that provides generally scattered light; d) obtaining a second image of the slide illuminated by the second light source; and e) generating a map of the areas of significance based on the first and second images.
2. The method of claim 1, wherein the step of illuminating the slide from a first light source includes illuminating from multiple sides of the slide. 3. The method of claim 1, wherein the step of illuminating the slide from a second light source includes placing a diffuser between the second light source and the slide when the second light source is illuminating the slide and placing the diffuser in a different position when the first light source is illuminating the slide. 4. A slide mapping system, including: a first light source oriented generally obliquely to the surface of the slide to create a first image; a second light source that provides generally scattered light to the surface of the slide to create a second image; a camera to obtain the first and second images; and a processor to generate a map of the areas of significance based on the first and second images. 5. The system of claim 4, the second light source including a diffuser to scatter light. 6. The system of claim 4, wherein the first light source directs incident light onto the slide from four sides of the slide. 7. The system of claim 4, wherein the light of the second light source is directed through a diffuser to scatter the light. 8. The system of claim 7, wherein the diffuser can be selectively placed between a position in the field of view of the camera and a position outside the field of view of the camera. 9. A slide mapping system, including: a camera; a diffuser that can be selectively placed in a first position in the field of view of the camera and a second position outside the field of view of the camera, where the camera obtains a first image of the slide when the diffuser is in the first position and a second image of the slide when the diffuser is in the second position; and a processor to generate a map of the areas of significance based on the first and second images. 10.
The system of claim 9, including a first light source oriented generally obliquely to the surface of the slide. 11. The system of claim 9, including a second light source oriented to direct light to the slide through the diffuser when the diffuser is in the first position. 12. A mapping system, including: a first light source oriented generally obliquely to a surface to create a first image; a second light source that provides light to the surface; a diffuser for scattering light from the second light source reflected from the surface to create a second image; a camera for obtaining the first and second images; and a processor for generating a map of the areas of significance based on the first and second images. 13. The system of claim 12, wherein the diffuser can be selectively positioned between a position in the field of view of the camera and a position outside the field of view of the camera. 14. A method of presenting mapping information relating to a specimen, including the steps of: generating a pixel intensity map of the specimen; determining positions of interest in the specimen; assigning to the pixels within the positions of interest one of odd or even numbers, the number assigned to each pixel being representative of its intensity; assigning to the other pixels the other of odd or even numbers, the number assigned to each pixel being representative of its intensity; and presenting the pixels, the pixels that have been assigned odd numbers being presented with a characteristic color different from the pixels that have been assigned an even number. 15. The method of claim 14, wherein the pixels within the positions of interest are presented in black and white. 16. The method of claim 14, wherein the pixels outside the zones of interest are presented in a certain color. 17. The method of claim 14, wherein the pixels within the positions of interest are assigned an odd number. 18.
The method of claim 14, wherein the pixels outside the zones of interest are assigned an even number. 19. The method of claim 14, wherein the number assigned to a pixel differs from the gray scale intensity of the pixel by a visibly insignificant amount when the pixels are presented. 20. The method of claim 17, wherein the number assigned to a pixel differs from the gray scale intensity of the pixel by a visibly insignificant amount when the pixels are presented. 21. The method of claim 14, further including the step of allowing the user to edit the pixel intensity map. 22. The method of claim 14, wherein the step of generating a pixel intensity map includes the steps of: a) selectively illuminating a slide that contains a specimen from a first light source oriented generally obliquely to the surface of the slide; b) obtaining a first image of the slide illuminated by the first light source; c) selectively illuminating the slide from a second light source; d) obtaining a second image of the slide illuminated by the second light source; and e) generating a map of significance zones based on the first and second images. 23. A method of verifying mapping information relating to a specimen, including the steps of: generating a pixel intensity map of the specimen; determining positions of interest in the specimen; assigning to the pixels within the positions of interest one of odd or even numbers, the number assigned to each pixel being representative of its intensity; assigning to the other pixels the other of odd or even numbers, the number assigned to each pixel being representative of its intensity; presenting the pixels, the pixels to which odd numbers have been assigned being presented with a characteristic color different from the pixels to which an even number has been assigned; and allowing the operator to change the pixel intensity map. 24. The method of claim 23, wherein the pixels within the positions of interest are presented in black and white. 25.
The method of claim 23, wherein the pixels outside the zones of interest are presented in a certain color. 26. The method of claim 23, wherein an odd number is assigned to the pixels within the positions of interest. 27. The method of claim 23, wherein an even number is assigned to the pixels outside the zones of interest. 28. The method of claim 23, wherein the number assigned to a pixel differs from the gray scale intensity of the pixel by a visibly insignificant amount when the pixels are presented. 29. The method of claim 26, wherein the number assigned to a pixel differs from the gray scale intensity of the pixel by a visibly insignificant amount when the pixels are presented. 30. A method of detecting the position of bubbles on a slide, including the steps of: a) obtaining a first image of the slide illuminated under a first lighting condition; b) obtaining a second image of the slide illuminated under a second lighting condition; c) finding edges in the first and second images and combining the edges to form a third image; d) finding delimited zones defined by edges in the third image; e) calculating the mean gray scale intensity for each zone in the second image corresponding to a delimited zone in the third image; and f) comparing the calculated mean for each zone with a threshold based on the gray scale intensity of a corresponding zone in the first image. 31. The method of claim 30, wherein the first lighting condition includes illuminating the slide with obliquely incident light. 32. The method of claim 30, wherein the second lighting condition includes illuminating the slide with scattered light. 33. The method of claim 30, including the step of connecting gaps in the edges in the third image. 34. The method of claim 30, wherein it is determined that the zones in the second image having calculated means above the relevant threshold represent a bubble. 35.
The method of claim 30, wherein the zones of the second image corresponding to zones of the first image with a relatively high gray scale intensity are compared with a lower threshold than the zones of the second image corresponding to zones of the first image with a lower gray scale intensity. 36. A method of finding a line in an image formed by a plurality of rows and columns of pixels, including the steps of: a) adding the intensity values for multiple pixels in a row with the intensity values for previous pixels in the row and storing the sum for each of said multiple pixels; b) comparing the stored sums for a plurality of said multiple pixels in the same column with a threshold value; and c) estimating a point on the line as a function of the pixels that have stored sums exceeding the threshold. 37. The method of claim 36, wherein the estimating step includes performing a weighted average of the pixels and the stored sums. 38. The method of claim 36, including the step of using the estimated point to estimate the position of another point on the line. 39. The method of claim 36, including the additional step of bisecting the image into multiple subimages and using the estimated point to estimate the position of additional points on the line. 40. The method of claim 39, including the step of obtaining the sums of the intensity values for multiple pixels in a row within each subimage for a plurality of rows adjacent to the estimated point and comparing the sums with a threshold value. 41. The method of claim 40, wherein the step of obtaining the sums involves subtracting the sum stored for the pixel in one distal column of a subimage from the sum stored in the other distal column of said subimage. 42.
A method of finding a line in an image formed by a plurality of rows and columns of pixels, including the steps of: a) adding the intensity values for multiple pixels in a row with the intensity values for previous pixels in the row and storing the sum for each of said multiple pixels; b) comparing the stored sums for a plurality of said multiple pixels in the same column with a threshold value; c) estimating a first point on the line as a function of the pixels that have stored sums that exceed the threshold; d) bisecting the image into multiple subimages; e) obtaining the sums of the intensity values for multiple pixels in a row within each subimage for a plurality of rows adjacent to the estimated point; f) comparing the sums obtained with the threshold value; and g) estimating the position of additional points on the line as a function of the pixels that have obtained sums that exceed the threshold. 43. The method of claim 42, wherein the step of obtaining the sums includes subtracting the stored sum for the pixel in one distal column of a subimage from the sum stored in the other distal column of said subimage. 44. The method of claim 42, wherein the step of estimating the position of additional points includes performing a weighted average of the pixels and the obtained sums. 45. A method of verifying mapping information for an image, including the steps of: presenting a first map of an image that has areas of significance differentiated from other regions of the image; allowing an operator to change the areas of significance on the screen; and generating a second map according to the first map and the changes made by the operator. 46. The method of claim 45, wherein the operator can make changes using a mouse. 47. The method of claim 45, wherein the operator can make changes using a stylus. 48. The system of claim 5, wherein the diffuser is selectively placeable in the field of view of the camera. 49.
The system of the claim, wherein the first light source directs incident light onto the slide from multiple sides of the slide.

SUMMARY OF THE INVENTION

A method for mapping areas of a slide; the method includes the steps of selectively illuminating the slide from a first light source that is oriented generally obliquely toward the surface of the slide, obtaining a first image of the slide illuminated by the first light source, selectively illuminating the slide from a second light source that provides generally scattered light, obtaining a second image of the slide illuminated by the second light source, and generating a map of the significant areas based on the first and second images.
MXPA/A/1998/004915A 1995-12-19 1998-06-18 System and method of mapear limi MXPA98004915A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US08576988 1995-12-19

Publications (1)

Publication Number Publication Date
MXPA98004915A true MXPA98004915A (en) 1999-07-06

Family


Similar Documents

Publication Publication Date Title
US5835620A (en) Boundary mapping system and method
JP3822242B2 (en) Method and apparatus for evaluating slide and sample preparation quality
US5647025A (en) Automatic focusing of biomedical specimens apparatus
CA2779795C (en) Inclusion detection in polished gemstones
CN109523541A (en) A kind of metal surface fine defects detection method of view-based access control model
CN110044405B (en) Automatic automobile instrument detection device and method based on machine vision
EP2596371B1 (en) Method to analyze an immunoassay test strip comb member
KR20010024617A (en) Automatic lens inspection system
US20080258061A1 (en) Method and apparatus for automated image analysis of biological specimens
JPH10506461A (en) Method and apparatus for detecting cover slips on microscope slides
HUT65842A (en) Arrangement for testing ophtalmic lens
US7680316B2 (en) Imaging device and methods to derive an image on a solid phase
Paulsen et al. Illumination for computer vision systems
JP3855360B2 (en) Method and apparatus for inspecting streak defects
MXPA98004915A (en) System and method of mapear limi
JP3874562B2 (en) Glass plate crushing test method, apparatus and glass test imaging method
JPH10506710A (en) Method and apparatus for recognizing a modulation pattern on an image plane
JPH08178855A (en) Method for inspecting light-transmissive object or specular object
JP3523764B2 (en) Foreign object detection device in nonmetallic inclusion measurement
JP2955686B2 (en) Surface defect inspection equipment
JPH05209734A (en) Surface-state inspecting apparatus
Uthaisombut Detecting defects in cherries using machine vision
CN117949470A (en) Multi-station transparent material edging corner defect detection system and method
JPS6189544A (en) Method for inspection of foreign matter mixed in resin