US20110002516A1 - Method and device for dividing area of image of particle in urine - Google Patents
Method and device for dividing area of image of particle in urine
- Publication number
- US20110002516A1 (application US 12/920,424)
- Authority
- US
- United States
- Prior art keywords
- image
- region
- object region
- urine
- density
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1468—Optical investigation techniques, e.g. flow cytometry with spatial resolution of the texture or inner structure of the particle
- G01N15/147—Optical investigation techniques, e.g. flow cytometry with spatial resolution of the texture or inner structure of the particle the analysis being performed on a sample stream
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1429—Signal processing
- G01N15/1433—Signal processing using image recognition
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1468—Optical investigation techniques, e.g. flow cytometry with spatial resolution of the texture or inner structure of the particle
- G01N2015/1472—Optical investigation techniques, e.g. flow cytometry with spatial resolution of the texture or inner structure of the particle with colour
Definitions
- the present invention relates to a method for image region segmentation that uses density information, and particularly to a method and an apparatus for region segmentation favorable for segmentation of a urine particle image.
- Patent documents 1 and 2 describe a method in which a urine specimen is passed through a specially-shaped flow channel (flow cell) so that particles in the specimen pass through a wide imaging region, where an enlarged image of the urine particles is taken as a static image while a flash lamp is fired.
- the image is first divided into a region with a urine particle and a background region, and the urine particle is classified based on image feature parameters obtained for the region with the urine particle.
- In this conventional technique, a method is shown in which an object region is separated from a background region based on a threshold and on the magnitude of change in density value, both obtained from a density histogram.
- Patent document 4
- Patent document 5 Automatic Blood Cell Classification Apparatus
- object particles for which region segmentation is performed vary from one another in their properties as shown below.
- Particles thus stained include sufficiently-stained particles (stained particles) and hard-to-stain particles (insufficiently-stained particles (light-stained particles) and hardly-stained particles (nonstained particles)).
- Even particles of the same type include sufficiently-stained particles and hard-to-stain particles.
- object particles examined by a urine sediment apparatus vary in their properties. This raises the problem that accurate regions might not be extracted if a single region segmentation method is used for all the objects.
- the above-mentioned conventional technique discloses a region segmentation method in which a background density distribution is estimated, with the mode of a density histogram used as the average density of the background, and both a part darker than and a part lighter than the background density distribution are extracted as object regions.
- some pixels of an object region have density values lighter than those in the background due to light refraction and reflection within the object region. Accordingly, this method claims to be able to accurately extract the shape of a hard-to-stain cell by extracting the part lighter than the background, as well.
- An objective of the present invention is to provide a method and an apparatus for region segmentation by which stable region segmentation is performed for each particle in a urine specimen in which urine particles having different sizes and tones coexist.
- a method for region segmentation of urine particle images is characterized by comprising the steps of: extracting a first object region by using one or more of an image with red components (hereinafter referred to as an R image), an image with green components (hereinafter referred to as a G image), and an image of blue components (hereinafter referred to as a B image) of a urine particle image taken by an image input optical system configured to input particle images; calculating a density distribution of one or more of the R image, the G image, and the B image in the first object region, and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and extracting a second object region from a local region including the first object region in the image, by using one or more of the R image, the G image, and the B image, depending on each of the groups.
- R image: an image with red components
- G image: an image with green components
- B image: an image with blue components
- the first object region larger than a particle image is extracted. Then, the first object region is classified into the predetermined number of groups based on the tone and the size thereof. Based on the classification result, the second object region is extracted according to the features of the particle image.
- stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist.
- a method for region segmentation of urine particle images is characterized by comprising: a first step of creating a density histogram of each of an R density, a G density, and a B density by using an R image, a G image, and a B image of a urine particle image taken by an image input optical system configured to input particle images, and obtaining one or more parameters indicating a shape of the density histogram; a second step of extracting a first object region by using the one or more parameters and one or more of the R image, the G image, and the B image; a third step of calculating a density distribution of one or more of the R image, the G image, and the B image in the first object region and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and a fourth step of extracting a second object region from a local region including the first object region in the image, by using the one or more parameters and one or more of the R image, the G image, and the B image, depending on each of the groups.
- an apparatus for region segmentation of a urine particle image is characterized by comprising: a means for extracting a first object region by using one or more of an R image, a G image, and a B image of a urine particle image taken by an image input optical system configured to input particle images; a means for calculating a density distribution of the one or more of the R image, the G image, and the B image in the first object region and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and a means for extracting a second object region from a local region including the first object region in the image by using one or more of the R image, the G image, and the B image, depending on each of the groups.
- region segmentation of urine particle images With the method for region segmentation of urine particle images according to the present invention, stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist and thus a more accurate binary image can be obtained.
- This effect consequently allows accurate calculation of feature parameters of an object region, prevention of erroneous classification of an object particle, and therefore improvement of accuracy in identifying urine particles of various types.
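The three-stage flow described above (coarse first segmentation, grouping by size and tone, group-specific second segmentation) can be sketched as follows. This is a minimal illustration on a single grey "density" channel; the function names, thresholds, and group cut-offs are illustrative assumptions, not values from the patent.

```python
def extract_first_region(image, threshold=200):
    """Stage 1: coarse segmentation -- pixels darker than a loose fixed
    threshold are marked 1 (object), so the region comes out larger than
    the particle itself."""
    return [[1 if v < threshold else 0 for v in row] for row in image]

def group_region(mask, image, size_cut=3, tone_cut=128):
    """Stage 2: classify the first object region by size and tone
    (cut-off values are placeholders)."""
    densities = [image[y][x] for y, row in enumerate(mask)
                 for x, v in enumerate(row) if v]
    size = "big" if len(densities) >= size_cut else "small"
    tone = "stained" if sum(densities) / len(densities) < tone_cut else "nonstained"
    return size, tone

def extract_second_region(image, group, thresholds=None):
    """Stage 3: re-segment the local image with a group-specific threshold."""
    thresholds = thresholds or {"stained": 150, "nonstained": 230}
    t = thresholds[group[1]]
    return [[1 if v < t else 0 for v in row] for row in image]
```

The point of the structure is that stage 3 can pick a different threshold (or a different method entirely) per group, which is what lets stained and hard-to-stain particles both be segmented stably.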
- FIG. 1 is a diagram showing a configuration example of region segmentation by a urine sediment examination apparatus to which the present invention is applied.
- FIG. 2 is a diagram illustrating a configuration example of the urine sediment examination apparatus.
- FIG. 3 is a diagram illustrating a configuration example of an input unit of the urine sediment examination apparatus.
- FIG. 4 is a diagram illustrating an image processing method by the urine sediment examination apparatus.
- FIG. 5 is a diagram showing a detailed configuration example of the region segmentation.
- FIG. 6 is a diagram illustrating a detailed configuration in a case of using a density histogram of each image for the region segmentation.
- FIG. 7 is a diagram showing an example of a urine particle image obtained by the urine sediment examination apparatus.
- FIG. 8 is a diagram illustrating a detailed procedure for obtaining parameters indicating the shape of the density histogram of each image.
- FIG. 9 shows an example of the density histogram.
- FIG. 10 is a diagram showing a detailed procedure for first region segmentation in which a region larger than an object is extracted.
- FIG. 11 is a diagram showing a detailed procedure for grouping the first object region.
- FIG. 12 is a diagram illustrating a flow of an example of the grouping carried out when an area is used as the parameter for size.
- FIG. 13 is a diagram illustrating an example of a distribution chart used when a G density and a B density are selected for density values used for the grouping.
- FIG. 14 is a diagram illustrating a flow of an example of grouping carried out when a G-B distribution chart is used as the density values used for the grouping.
- FIG. 15 is a diagram showing a detailed procedure for group-specific second region segmentation.
- FIG. 16 is a diagram illustrating a flow of an example of processing for determining the need for additional region segmentation for each group.
- FIG. 17 is a diagram showing a detailed procedure for the group-specific second region segmentation.
- FIG. 18 is a diagram illustrating a configuration of an apparatus for region segmentation of urine particle images, according to a second embodiment of the present invention.
- FIG. 2 is a diagram illustrating an apparatus for automatic analysis of urinary sediments, to which the present invention is applied.
- a stainer unit 202 stains a urine specimen with a stain, and after a certain period of time, an input unit 203 takes an enlarged static image of particles in the urine. The image thus taken is transferred to a processing unit 206 , where the urine particles are classified through image pattern recognition. The processing unit 206 then counts the types of the urine particles included in a single specimen as well as the appearance frequency of each type.
- a general-purpose personal computer having a display 204 and a keyboard 205 is used as the processing unit 206 . The operator is informed of the counting results through the display 204 .
- the image taken by the input unit 203 and data obtained by the processing unit 206, such as a measurement result, a classification result for each object region, and image feature parameters obtained during the image pattern recognition, are saved in a storage device in the processing unit 206.
- the processing unit 206 also has a review function which allows the operator to display any selected image and to make a correction to the automatic classification or to perform fine classification visually.
- FIG. 3 is a schematic diagram of the input unit 203 of the apparatus 201 .
- a flow cell 301 is used to form a wide, flat, and thin flow of urine specimen between an objective lens 303 and a pulse lamp 302. While the flow of urine specimen formed by the flow cell 301 is momentarily illuminated by the pulse lamp 302, an image of urine particles enlarged by the objective lens 303 is taken by a camera 304 as a static image.
- as the camera 304, a CCD color camera, a CMOS color camera, or the like is used, for example. The image thus obtained is transferred to the processing unit 206.
- FIG. 4 is a diagram illustrating how the processing unit 206 performs image processing.
- An image of urine particles taken by the camera 304 is transferred to the processing unit 206 as digital data.
- In shading compensation step S 401, density unevenness of the image, resulting from the characteristics of the optical system, is removed.
- In region segmentation step S 402, a binary image is formed by segmenting the image of urine particles into a background region and an object region.
- 0 indicates the background region
- 1 indicates the object region.
- In step S 403, correction and shaping of the binary image are performed, such as compensation for the object region and noise removal for the background region.
- filtering processing such as, for example, dilation and erosion can be used as the means for the correction and shaping.
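The correction and shaping of the binary image can be illustrated with elementary morphological operations. The sketch below implements a 4-neighbour closing (dilation followed by erosion) on a plain Python bit-mask; the choice of structuring element, and of a closing in particular, is an assumption rather than the patent's exact filtering.

```python
NEIGHBOURHOOD = ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))

def dilate(mask):
    """4-neighbour binary dilation: a pixel becomes 1 if it or any of its
    4-neighbours is 1."""
    h, w = len(mask), len(mask[0])
    return [[1 if any(0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx]
                      for dy, dx in NEIGHBOURHOOD)
             else 0 for x in range(w)] for y in range(h)]

def erode(mask):
    """4-neighbour binary erosion: a pixel stays 1 only if it and all of its
    4-neighbours are 1 (out-of-bounds counts as background)."""
    h, w = len(mask), len(mask[0])
    return [[1 if all(0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx]
                      for dy, dx in NEIGHBOURHOOD)
             else 0 for x in range(w)] for y in range(h)]

def close_holes(mask):
    """Closing (dilate then erode) fills pinholes inside the object region
    without growing its outer boundary."""
    return erode(dilate(mask))
```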
- In labeling step S 404, each group of connected components in the binary image is labeled and assigned a number so as to be uniquely identified among multiple objects in the image.
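Labeling of connected components can be sketched with an iterative flood fill; the 4-connectivity and the data layout are assumptions made for the illustration.

```python
def label_components(mask):
    """Assign a unique positive label to each 4-connected component of 1s
    in a binary mask; 0 stays background. Returns (label image, count)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                current += 1              # a new, previously unseen object
                stack = [(y, x)]
                while stack:              # iterative flood fill
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and mask[cy][cx] and not labels[cy][cx]):
                        labels[cy][cx] = current
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, current
```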
- In feature parameter calculation step S 405, feature parameters, such as an area, a perimeter, and an average density value, are calculated for each object region thus numbered.
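The feature parameters named here can be sketched as follows; the perimeter is approximated as the number of exposed pixel edges, which is one simple convention among several, not necessarily the patent's.

```python
def region_features(labels, image, target):
    """Area, 4-neighbour boundary length, and average density for the
    region carrying the given label."""
    h, w = len(labels), len(labels[0])
    area, perim, dens = 0, 0, 0
    for y in range(h):
        for x in range(w):
            if labels[y][x] != target:
                continue
            area += 1
            dens += image[y][x]
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or labels[ny][nx] != target:
                    perim += 1   # edge exposed to background or another region
    return area, perim, dens / area
```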
- In pattern recognition step S 406, each object region is classified by component type by using the image feature parameters obtained for the object region in step S 405.
- a neural network approach, a statistical recognition approach, or the like can be used as the means for the pattern recognition.
- items used for the classification include a red blood cell, a white blood cell, a squamous epithelial cell, other epithelial cells, a cast, a crystal, a bacterium, and the like.
- In counting step S 407, based on the classification results obtained in pattern recognition step S 406, the objects classified into each classification class are counted.
- the counting results are converted into the number of objects or the density per unit volume of the urine specimen, and the conversion results are outputted to the display 204 .
- FIG. 5 is a diagram illustrating in detail a configuration example of region segmentation step S 402 .
- In step S 501, first object regions, in each of which an object particle exists, are extracted from an image of urine particles.
- a region larger than the object particle is extracted by the segmentation here so that the object particle including a hard-to-stain particle can be extracted as one continuous region.
- the region segmentation is carried out using one or more of an R image, a G image, and a B image of the image 101 and using a fixed threshold predetermined by experiment, or the like. As a result, regions larger than the object particles are obtained as the region segmentation results, as shown in 102 of FIG. 1 .
- each first object region is grouped into a predetermined number of groups.
- Feature parameters of the first object region are used for the grouping.
- the feature parameters used here include a feature parameter for size and a feature parameter for tone.
- an area, a perimeter, or the like is used for the feature parameter for size.
- an average density value or the like is used for the feature parameter for tone.
- the first object region is grouped into three groups based on the tone and the size thereof, and 103 of FIG. 1 shows the grouping results.
- second object regions are extracted from respective local regions each including the first object region by carrying out group-specific region segmentation.
- the region segmentation employs a method which uses one or more of an R image, a G image, and a B image of the image of urine particles and a fixed threshold predetermined by experiment, or employs other methods.
- the type and the number of the images selected and the fixed threshold used are different for each group. For example, in a case of urine particles stained with a typical Sternheimer(S) stain, the absorption peak of the particles stained is on the order of 550 nm.
- the particles stained include stained particles that are sufficiently stained and light-stained particles that are insufficiently stained.
- the same image may be used, but the fixed threshold has to be changed according to the tone.
- a contour extraction method such as the active contour model (snake), which is a known conventional technique, may be used.
- 104 of FIG. 1 shows results of object regions accurately extracted as a result of the region segmentation performed differently for each group.
- a first object region larger than a particle image is extracted. Then, the first object region is classified into a predetermined number of groups based on the tone and the size thereof. Based on the classification result, a second object region is extracted according to the features of the particle image.
- FIG. 6 is a diagram illustrating a detailed configuration employed in a case where a density histogram of each of an R density, a G density, and a B density of a urine particle image is used in region segmentation step S 402 .
- In step S 601, a density histogram is created for each of the R density, the G density, and the B density of the urine particle image, and, for each density histogram, parameters indicating the shape thereof are obtained.
- In step S 602, a first object region in which an object particle exists is extracted from the image by using the parameters obtained in step S 601, and the like. A region larger than the object particle is extracted by the segmentation here so that the object particle, including a hard-to-stain particle, can be extracted as one continuous region. For example, when a urine particle image 701 shown in FIG. 7 is obtained, the first object region obtained in step S 602 is a region 703 larger than an object particle 702.
- In grouping step S 603, the first object region is grouped into a predetermined number of groups based on its average density value distribution and size. Note that an area, a perimeter, or the like is used as the size.
- In second region segmentation step S 604, a second object region representing a more accurate shape of the object particle is extracted from a local region including the first object region, using an optimum region segmentation method which is different for each group. The second object region thus extracted is subjected to a modifying process in step S 403.
- FIG. 8 shows a detailed procedure, conducted in step S 601 of FIG. 6 , for obtaining parameters Pd(*), Phl(*), Phh(*), dl(*), and dh(*) indicating the shape of the density histogram of each image. Note that * indicates any one of R, G, and B.
- In step S 801, a density histogram is created for each of the R image, the G image, and the B image.
- In step S 802, as shown in FIG. 9, a density value Pd(*) having the maximum frequency value Pmax(*) is obtained for each density histogram. Note that * indicates any one of R, G, and B.
- In step S 803 of FIG. 8, as shown in FIG. 9, density values Phl(*) and Phh(*), at which the frequency falls to Pmax(*)/2 (half the peak height), are obtained for each density histogram. Note that * indicates any one of R, G, and B.
- In step S 804 of FIG. 8, dl(*) and dh(*) in FIG. 9 are calculated through Formulae (1) and (2), using Pd(*) in FIG. 9 obtained in step S 802 and Phl(*) and Phh(*) in FIG. 9 obtained in step S 803.
- * indicates any one of R, G, and B.
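The procedure of steps S 801 to S 804 can be sketched as below. Since Formulae (1) and (2) are not reproduced in this text, the half-widths dl(*) = Pd(*) - Phl(*) and dh(*) = Phh(*) - Pd(*) used here are an assumption consistent with the description of FIG. 9.

```python
def histogram_shape(values, bins=256):
    """Steps S 801-S 804, sketched for one channel: build a density
    histogram, find the peak density Pd, the densities Phl/Phh at which the
    frequency falls to half the peak, and the widths dl, dh (assumed forms
    of Formulae (1) and (2))."""
    hist = [0] * bins
    for v in values:
        hist[v] += 1
    pd = max(range(bins), key=lambda d: hist[d])   # S 802: mode of the histogram
    half = hist[pd] / 2
    phl = pd                                       # S 803: walk down to half height
    while phl > 0 and hist[phl] > half:
        phl -= 1
    phh = pd                                       # S 803: walk up to half height
    while phh < bins - 1 and hist[phh] > half:
        phh += 1
    dl, dh = pd - phl, phh - pd                    # S 804: assumed Formulae (1), (2)
    return pd, phl, phh, dl, dh
```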
- FIG. 10 shows a detailed procedure for the first region segmentation, conducted in step S 602 of FIG. 6 , in which a region larger than an object is extracted.
- step S 1001 shows a procedure for extracting an object region by using the density histogram.
- thresholds T 1 (*) and T 2 (*) shown in FIG. 9 are calculated through Formulae (3) and (4) shown below using Pd(*) obtained in step S 802 of FIG. 8 and dl(*) and dh(*) obtained in step S 804 of FIG. 8 .
- * indicates any one of R, G, and B.
- k 1 (*) and k 2 (*) are coefficients predetermined by experiment, and optimum coefficients which are different for each of the R image, the G image, and the B image are obtained in advance. Note that one or more of density values that have the highest sensitivity due to the color characteristics of the urine particles and spectral characteristics of the camera are selected and used for the calculation of thresholds.
- the threshold values are calculated as follows.
- T 1 (R) = Pd(R) − dl(R) × k 1 (R)
- T 2 (R) = Pd(R) + dh(R) × k 2 (R)
- T 1 (G) = Pd(G) − dl(G) × k 1 (G)
- T 2 (G) = Pd(G) + dh(G) × k 2 (G)
- k 1 (R), k 2 (R), k 1 (G), and k 2 (G) are coefficients predetermined by experiment, and optimum coefficients, which are different for each image, are obtained in advance.
- In step S 1003, an object region is extracted using the thresholds thus obtained. For example, if the thresholds T 1 (R), T 2 (R), T 1 (G), and T 2 (G) are used, pixels (x, y) satisfying Formula (5) are extracted as the object region.
- R(x, y) indicates an R density value of the pixel (x, y)
- G(x, y) indicates a G density value of the pixel (x, y).
- the coefficient k n (*) is set to be somewhat small. Setting the coefficient k n (*) small increases the area of the object region in FIG. 9 .
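The threshold calculation and the extraction by Formula (5) can be sketched as below. Reading Formula (5) as "a pixel is object if any selected channel falls outside the background band [T1, T2]" is an assumption, consistent with FIG. 9 and with the remark that a smaller kn(*) enlarges the object region.

```python
def thresholds(pd, dl, dh, k1, k2):
    """Formulae (3) and (4) as reconstructed above: T1 below the background
    peak, T2 above it. A smaller k narrows the background band [T1, T2] and
    so enlarges the extracted object region."""
    return pd - dl * k1, pd + dh * k2

def first_segmentation(R, G, tR, tG):
    """Assumed reading of Formula (5): a pixel (x, y) is object if its R or
    G density lies outside the background band of that channel."""
    (t1r, t2r), (t1g, t2g) = tR, tG
    return [[1 if (r < t1r or r > t2r or g < t1g or g > t2g) else 0
             for r, g in zip(rrow, grow)]
            for rrow, grow in zip(R, G)]
```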
- step S 1004 shows a procedure for extracting an object region by using the magnitude of density value change.
- In step S 1005, a value indicating the magnitude of density value change is calculated.
- When the difference between density values in a local small region is used, a density difference value r(*)(x, y) defined by Formula (6) is used.
- *(x, y) and r(*)(x, y) indicate a density value and a density difference value, respectively, at the pixel position (x, y) on the image.
- In step S 1006, using r(*)(x, y), pixels (x, y) satisfying Formula (7) are extracted as an object region.
- | · | indicates that the value therebetween is an absolute value
- s n (*) is a constant predetermined by experiment, and an optimum constant different for each color is obtained in advance.
- * indicates any one of R, G, and B.
- the number of neighboring pixels used for calculating a difference value for a given pixel is represented by 2n+1, and is called a mask size.
- a mask size of 1 means that no difference processing is performed.
- the mask size in each of the x direction and the y direction is set as large as possible within an object region of a nonstained particle.
- pixels (x, y) satisfying Formula (8) are extracted as the object region.
- the index of the magnitude of density value change is not limited to the density value difference.
- Other methods may be used, including a method using a density value distribution in a local small region, a method using filtering processing for emphasizing frequency components specifically included in the object region, and the like.
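The difference-based extraction of steps S 1005 and S 1006 can be sketched as below. Because Formulae (6) and (7) are not reproduced in this text, the symmetric form of the difference and the clamping at the image border are assumptions.

```python
def density_difference(img, n):
    """Assumed form of Formula (6): the x-direction density difference over
    a mask of size 2n+1, r(x, y) = I(x+n, y) - I(x-n, y), clamped at the
    image border."""
    h, w = len(img), len(img[0])
    return [[img[y][min(x + n, w - 1)] - img[y][max(x - n, 0)]
             for x in range(w)] for y in range(h)]

def edges_by_difference(img, n, s):
    """Assumed reading of Formula (7): pixels whose absolute difference
    value exceeds the experimental constant s form the object region."""
    return [[1 if abs(v) > s else 0 for v in row]
            for row in density_difference(img, n)]
```

A mask size of 2n+1 = 1 (n = 0) makes the difference identically zero, matching the statement that a mask size of 1 means no difference processing.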
- In step S 1007, a first object region is extracted by superimposing (i.e., logically ORing) the object regions obtained in step S 1001 and step S 1004, respectively.
- Assume that the object region extraction in step S 1001 uses the R image and the G image with Formula (5), and that the object region extraction in step S 1004 uses the G image, with a combination of a mask size of 5 in the x direction and a mask size of 1 in the y direction, with Formula (8).
- In this case, pixels (x, y) satisfying either Formula (5) or Formula (8) are obtained as the first object region.
- ∥ indicates a logical OR
- In step S 1008, the first object region thus extracted is subjected to a modifying process in a manner similar to step S 403 of FIG. 4.
- In step S 1009, the first object region is subjected to labeling in a manner similar to step S 404 of FIG. 4.
- In step S 1010, feature parameters, such as an area, a perimeter, and an average density value, are calculated for the first object region thus numbered. The feature parameters thus calculated are used in the grouping in step S 603 of FIG. 6.
- FIG. 11 shows a detailed procedure for the grouping performed in step S 603 of FIG. 6 .
- In step S 1101, based on the feature parameter for size of the first object region, the first object region is classified into two groups: a group of big-size particles and a group of small-size particles. An area, a perimeter, or the like is used for the feature parameter for size.
- FIG. 12 is a diagram illustrating a flow of the grouping performed using an area as the feature parameter for size.
- the first object region satisfying M ≧ m is classified as a group of big-size particles, and the first object region not satisfying M ≧ m is classified as a group of small-size particles.
- M is the area of the first object region
- m is an optimum value obtained in advance by experiment.
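The size grouping of FIG. 12 reduces to a single comparison; it is sketched below assuming the comparison is M ≧ m as described, with m supplied by the caller since its experimental value is not given in the text.

```python
def size_group(area_m, m):
    """FIG. 12 flow: a first object region with area M >= m is classified
    into the big-size group, otherwise into the small-size group; m is an
    experimentally fixed constant supplied by the caller."""
    return "big" if area_m >= m else "small"
```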
- In step S 1102 of FIG. 11, using the average density value of the first object region, the first object region is grouped into a predetermined number of groups according to the tone thereof.
- For the average density value, one or more color images having the highest sensitivity, given the color characteristics of the object region and the spectral characteristics of the camera, are selected.
- the first object region is grouped into the predetermined number of groups, depending on where in the density space the average density value of the image thus selected is located.
- discriminating borders used for the grouping in the density space are predetermined by experiment. For example, the discriminating borders can be obtained using a known conventional technique such as discriminant analysis.
- FIG. 13 is a diagram illustrating an example of a distribution chart used in the grouping when, for example, the G density and the B density are used for the density values.
- Symbols 1303 to 1306 indicate coordinates (g, b) of the average density values of the urine particles collected in advance to form a distribution chart. It has been made clear by experiment that an optimum region segmentation method is different for each group of the urine particles collected.
- the urine particles are grouped into four groups, A, B, C and D, according to the tone thereof, and the symbols 1303 to 1306 represent distributions, in a G-B space, of particles belonging to the groups A, B, C, and D, respectively.
- Discriminating borders 1301 and 1302 are calculated using, for example, linear discriminant analysis.
- the linear discriminant analysis uses density value data on the symbols.
- the discriminating borders thus calculated can be represented by the formulae b = j 1 g + k 1 and b = j 2 g + k 2 .
- FIG. 14 is a diagram illustrating a flow of grouping performed when the G-B distribution chart in FIG. 13 is used in the grouping in step S 1102 of FIG. 11 .
- the first object region belongs to the group of light-stained B ( 1304 ) if b ≧ j 1 g + k 1 is satisfied in a determination in step S 1401 and additionally if b ≧ j 2 g + k 2 is satisfied in step S 1402 .
- the first object region belongs to the group of nonstained D ( 1306 ) if b ≧ j 1 g + k 1 is satisfied in the determination in step S 1401 and additionally if b ≧ j 2 g + k 2 is not satisfied in step S 1402 .
- the first object region belongs to the group of stained A ( 1303 ) if b ≧ j 1 g + k 1 is not satisfied in the determination in step S 1401 and additionally if b ≧ j 2 g + k 2 is satisfied in step S 1402 .
- the first object region belongs to the group of light-stained C ( 1305 ) if b ≧ j 1 g + k 1 is not satisfied in the determination in step S 1401 and additionally if b ≧ j 2 g + k 2 is not satisfied in step S 1402 .
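The flow of FIG. 14 amounts to two line tests in the G-B plane. In the sketch below, the direction of the inequalities and the border coefficients in the test are assumptions; only the four-way split into groups A to D follows the text.

```python
def tone_group(g, b, j1, k1, j2, k2):
    """FIG. 14 flow, sketched: two linear borders b = j1*g + k1 and
    b = j2*g + k2 split the G-B plane into the four tone groups.
    The inequality direction is an assumption."""
    c1 = b >= j1 * g + k1   # determination of step S 1401
    c2 = b >= j2 * g + k2   # determination of step S 1402
    if c1:
        return "B" if c2 else "D"   # light-stained B / nonstained D
    return "A" if c2 else "C"       # stained A / light-stained C
```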
- In step S 1103 of FIG. 11, the grouping is performed in a similar manner to step S 1102.
- The number of classification groups and the classification method used in step S 1103 are predetermined by experiment and differ from those used in step S 1102.
- FIG. 15 shows a detailed procedure for the group-specific second region segmentation performed in step S 604 of FIG. 6 .
- In step S 1501, the need for additional region segmentation is determined for each group.
- FIG. 16 shows a detailed flow of step S 1501 .
- Group N is predetermined by experiment as not needing additional region segmentation.
- Group N consists of large particles.
- For such particles, the region segmentation in step S 602 of FIG. 6, which extracts a region larger than the object, makes little difference to the feature parameters calculated in step S 405 of FIG. 4.
- Consequently, the pattern recognition in step S 406 of FIG. 4 proceeds without problems, and the final classification of the urine particles is unlikely to be erroneous.
- Skipping the additional region segmentation saves the time that the region segmentation would take, and therefore offers an effect of improving the image processing speed.
- In step S 1502, a local region including the first object region is extracted.
- Specifically, a local rectangle including the first object region is extracted. For example, if the urine particle image 701 shown in FIG. 7 is obtained, a rectangle 704 including the first object region 703 obtained in step S 602 of FIG. 6 is extracted as the local region.
- In step S 1503, group-specific region segmentation, which differs for each group, is performed to extract a more accurate second object region from the local region including the first object region.
- FIG. 17 shows a detailed procedure for the second region segmentation performed in step S 1503 .
- Step S1701 is basically the same as step S1001 of FIG. 10, but the color images and parameters used in step S1702 are different for each group.
- The coefficient kn(*) used in the threshold calculation is different for each group, so that the thresholds are calculated differently for each group. Note that * indicates any one of R, G, and B.
- Step S1704 is basically the same as step S1004 of FIG. 10, but the color images, the constant sn(*), and the mask sizes used in step S1705 are different for each group. Note that * indicates any one of R, G, and B.
- In step S 1707, a second object region, i.e., the region to be extracted in step S 604 of the flow of FIG. 6, is obtained.
- The second object region is obtained by superimposing (logically ORing) the object regions obtained in step S 1703 and step S 1706, respectively.
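The superimposition in step S 1707 can be sketched as follows, assuming a per-pixel threshold-based region (step S 1703) and a per-pixel density-difference-based region (step S 1706); the thresholds t1, t2 and the limit s are hypothetical stand-ins for the group-specific parameters.

```python
import numpy as np

# Sketch of step S1707: the second object region is the logical OR of the
# threshold-based region and the density-difference-based region. t1/t2
# bound the background density band; s is the minimum density change.

def second_object_region(img, t1, t2, diff_img, s):
    """Combine threshold extraction and difference extraction by logical OR."""
    threshold_region = (img <= t1) | (img > t2)   # outside the background band
    difference_region = np.abs(diff_img) >= s     # strong local density change
    return threshold_region | difference_region   # superimpose the two regions
```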
- Suppose, for example, that the first object regions are grouped by tone into (1) a group of stained particles, (2) a group of light-stained particles, and (3) a group of nonstained particles.
- Suppose that a certain first object region belongs to the group of stained particles (1), that the object region extraction in step S 1701 uses the R image and the G image, and that the object region extraction in step S 1704 uses the G image with a combination of a mask size of 5 in the x direction and a mask size of 1 in the y direction. Then, the second object region is obtained by the following formula.
- Suppose that a certain first object region belongs to the group of light-stained particles (2), that the object region extraction in step S 1701 uses the R image and the G image, and that the object region extraction in step S 1704 uses a mask size of 1 in both the x and y directions. Then, the second object region is obtained by the following formula.
- Suppose that a certain first object region belongs to the group of nonstained particles (3), that the object region extraction in step S 1701 uses the B image, and that the object region extraction in step S 1704 uses a mask size of 1 in both the x and y directions. Then, the second object region is obtained by the following formula.
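As a hedged summary of the three examples above, the group-specific choices could be held in a lookup table; the dictionary layout, key names, and group labels are illustrative, not from the patent. A mask size of 1 means no difference processing, so the difference channel list is empty for those groups.

```python
# Illustrative lookup of group-specific segmentation parameters, following
# the channel and mask-size examples in the text above.
GROUP_PARAMS = {
    "stained":       {"threshold_channels": ["R", "G"], "diff_channels": ["G"], "mask_x": 5, "mask_y": 1},
    "light_stained": {"threshold_channels": ["R", "G"], "diff_channels": [],    "mask_x": 1, "mask_y": 1},
    "nonstained":    {"threshold_channels": ["B"],      "diff_channels": [],    "mask_x": 1, "mask_y": 1},
}

def params_for_group(group):
    """Look up the segmentation parameters for a tone group."""
    return GROUP_PARAMS[group]
```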
- the region segmentation method of the present invention in which threshold processing is performed using density histograms offers such an effect that stable threshold processing can be performed even for a urine specimen in which urine particles having different tones coexist.
- A first object region larger than a particle image is extracted. Then, the first object region is classified into a predetermined number of groups based on the tone and the size thereof. Based on the classification result, a second object region is extracted according to the features of the particle image.
- This configuration of the present invention offers such an effect that stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist. This effect consequently allows accurate calculation of feature parameters of an object region, prevention of erroneous classification of an object particle, and therefore improvement of accuracy in identifying urine particles of various types.
- FIG. 18 is a diagram illustrating the configuration of an apparatus for region segmentation of urine particle images according to a second embodiment of the present invention.
- The original data is then transferred to a first region segmentation device 1802, where first region segmentation is carried out.
- For the first region segmentation, the method described in Embodiment 1 may be used. Images of first object regions obtained through the region segmentation are transferred to the memory 1801.
- The first object regions are transferred to a grouping device 1803, where feature parameters are calculated for each first object region, and each first object region is classified into one of a predetermined number of groups.
- For the grouping, the method described in Embodiment 1 may be used. Results of the grouping are transferred to the memory 1801.
- a group-specific second region segmentation device 1804 performs region segmentation using the original image, the first object regions, and the grouping results that are saved in the memory 1801 , and thereby obtains second object regions each accurately representing the shape of an object particle.
- For the second region segmentation, the method described in Embodiment 1 may be used.
- Results of the second object regions are transferred to the memory 1801 .
- the means for saving the results of the second region segmentation is not limited to the memory 1801 , and may be, for example, an external storage medium such as an HDD or a floppy disk.
- A first object region larger than a particle image is extracted and classified into a predetermined number of groups based on its tone and size, and a second object region is extracted based on the classification result and the features of the particle image.
Description
- The present invention relates to a method for image region segmentation that uses density information, and particularly to a method and an apparatus for region segmentation favorable for segmentation of a urine particle image.
- In a conventional method for morphological examination of particles in urine, urine particles are separated from collected urine by centrifugation, and a laboratory technician then directly observes the urine particles through a microscope. Such microscopic examination has problems in that (1) the result depends on the proficiency of the technician and (2) the examination takes time. Accordingly, a more efficient method is demanded.
- In recent years, automation of an examination of urine sediments (urine particles) has been in progress. For example, “Particle Analysis Apparatus and Particle Analysis Method (Patent documents 1 and 2)” describes a method in which a urine specimen is passed through a specially-shaped flow channel (flow cell) to pass particles in the specimen through a wide imaging region, where an enlarged image of the urine particles is taken as a static image with a flash lamp lighted.
- To automatically classify a urine particle whose image is taken as a static image, the image is first divided into a region with a urine particle and a background region, and the urine particle is classified based on image feature parameters obtained for the region with the urine particle.
- “Region Segmentation Method for Particle Images (Patent document 3),” for example, describes a conventional technique for dividing an image into a region with a urine particle and a background region. In this conventional technique, a method is shown in which an object region is separated from a background region based on a threshold and the magnitude of change in density value, which are obtained from a density histogram.
- In addition, “Patterning Recognition Apparatus (Patent document 4)” and “Automatic Blood Cell Classification Apparatus (Patent document 5),” for example, describe conventional techniques for classifying an object based on image feature parameters. In these conventional techniques, use of a layered network as a recognition logic is described.
- [Patent document 1] JP 5-296915 A
- [Patent document 2] JP 63-94156 A
- [Patent document 3] JP 8-145871 A
- [Patent document 4] JP 10-302067 A
- [Patent document 5] JP 3-131756 A
- In a urine sediment examination apparatus, object particles for which region segmentation is performed vary from one another in their properties as shown below.
- (1) In urine sediment examination, a urine specimen is often stained to facilitate determination on particles. Particles thus stained include sufficiently-stained particles (stained particles) and hard-to-stain particles (insufficiently-stained particles (light-stained particles) and hardly-stained particles (nonstained particles)).
- (2) Even particles of the same type include sufficiently-stained particles and hard-to-stain particles.
- (3) Particles having different tones coexist in a single image.
- As described, object particles examined by a urine sediment examination apparatus vary in their properties. This raises the problem that accurate regions might not be extracted if a single region segmentation method is used for all the objects.
- For example, the above-mentioned conventional region segmentation method (Patent document 3) discloses a method in which a background density distribution is estimated with the mode of a density histogram used as the average density of the background, and both a part darker than and a part lighter than the background density distribution are extracted as object regions. As for hard-to-stain particles, some pixels of an object region have density values lighter than those in the background due to light refraction and reflection within the object region. Accordingly, this method claims to be able to accurately extract the shape of a hard-to-stain cell by extracting the lighter part as well. However, there are cases where light reflection and refraction near a contour of an object cause a background region adjacent to the object to have a part much lighter than the average density of the background. Consequently, the background near the contour might be extracted as an object, which is one of the factors degrading the region segmentation accuracy.
- Moreover, in the case of using the above-mentioned conventional region segmentation method (Patent document 3), an object region and a background region partially overlap in a density histogram of hard-to-stain particles. For this reason, in order to extract a region of such a particle, region segmentation needs to be performed with a smallest possible density range for the background region set in the estimation of a background density distribution. However, when a region with a sufficiently-stained particle is extracted using the same threshold as that used for hard-to-stain particles, a region which is really the background might be extracted as an object, which is another one of the factors of degrading the region segmentation accuracy.
- An objective of the present invention is to provide a method and an apparatus for region segmentation by which stable region segmentation is performed for each particle in a urine specimen in which urine particles having different sizes and tones coexist.
- To solve the above described problems, a method for region segmentation of urine particle images is characterized by comprising the steps of: extracting a first object region by using one or more of an image with red components (hereinafter referred to as an R image), an image with green components (hereinafter referred to as a G image), and an image of blue components (hereinafter referred to as a B image) of a urine particle image taken by an image input optical system configured to input particle images; calculating a density distribution of one or more of the R image, the G image, and the B image in the first object region, and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and extracting a second object region from a local region including the first object region in the image, by using one or more of the R image, the G image, and the B image, depending on each of the groups. With the configuration, the first object region larger than a particle image is extracted. Then, the first object region is classified into the predetermined number of groups based on the tone and the size thereof. Based on the classification result, the second object region is extracted according to the features of the particle image. Thus, stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist.
- Furthermore, a method for region segmentation of urine particle images is characterized by comprising: a first step of creating a density histogram of each of an R density, a G density, and a B density by using an R image, a G image, and a B image of a urine particle image taken by an image input optical system configured to input particle images, and obtaining one or more parameters indicating a shape of the density histogram; a second step of extracting a first object region by using the one or more parameters and one or more of the R image, the G image, and the B image; a third step of calculating a density distribution of one or more of the R image, the G image, and the B image in the first object region and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and a fourth step of extracting a second object region from a local region including the first object region in the image, by using the one or more parameters and one or more of the R image, the G image, and the B image, depending on each of the groups. Performing threshold processing using density histograms offers such an effect that stable threshold processing and stable region segmentation for each particle image can be performed even for a urine specimen in which urine particles having different tones coexist.
- Still furthermore, an apparatus for region segmentation of a urine particle image is characterized by comprising: a means for extracting a first object region by using one or more of an R image, a G image, and a B image of a urine particle image taken by an image input optical system configured to input particle images; a means for calculating a density distribution of the one or more of the R image, the G image, and the B image in the first object region and a size of the first object region, and classifying the first object region into a predetermined number of groups based on the density distribution and the size; and a means for extracting a second object region from a local region including the first object region in the image by using one or more of the R image, the G image, and the B image, depending on each of the groups.
- With the method for region segmentation of urine particle images according to the present invention, stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist and thus a more accurate binary image can be obtained. This effect consequently allows accurate calculation of feature parameters of an object region, prevention of erroneous classification of an object particle, and therefore improvement of accuracy in identifying urine particles of various types.
FIG. 1 is a diagram showing a configuration example of region segmentation by a urine sediment examination apparatus to which the present invention is applied. -
FIG. 2 is a diagram illustrating a configuration example of the urine sediment examination apparatus. -
FIG. 3 is a diagram illustrating a configuration example of an input unit of the urine sediment examination apparatus. -
FIG. 4 is a diagram illustrating an image processing method by the urine sediment examination apparatus. -
FIG. 5 is a detailed configuration example of the region segmentation. -
FIG. 6 is a diagram illustrating a detailed configuration in a case of using a density histogram of each image for the region segmentation. -
FIG. 7 is a diagram showing an example of a urine particle image obtained by the urine sediment examination apparatus. -
FIG. 8 is a diagram illustrating a detailed procedure for obtaining parameters indicating the shape of the density histogram of each image. -
FIG. 9 shows an example of the density histogram. -
FIG. 10 is a diagram showing a detailed procedure for first region segmentation in which a region larger than an object is extracted. -
FIG. 11 is a diagram showing a detailed procedure for grouping the first object region. -
FIG. 12 is a diagram illustrating a flow of an example of the grouping carried out when an area is used as the parameter for size. -
FIG. 13 is a diagram illustrating an example of a distribution chart used when a G density and a B density are selected for density values used for the grouping. -
FIG. 14 is a diagram illustrating a flow of an example of grouping carried out when a G-B distribution chart is used as the density values used for the grouping. -
FIG. 15 is a diagram showing a detailed procedure for group-specific second region segmentation. -
FIG. 16 is a diagram illustrating a flow of an example of processing for determining the need for additional region segmentation for each group. -
FIG. 17 is a diagram showing a detailed procedure for the group-specific second region segmentation. -
FIG. 18 is a diagram illustrating a configuration of an apparatus for region segmentation of urine particle images, according to a second embodiment of the present invention. -
- 101 example of urine particle image
- 102 example of first object region
- 103 example of grouping
- 104 example of second object region
- 201 apparatus
- 202 stainer unit
- 203 input unit
- 204 display
- 205 keyboard
- 206 processing unit
- 301 flow cell
- 302 pulse lamp
- 303 objective lens
- 304 camera
- 701 example of urine particle image
- 702 example of object particle
- 703 example of first object region
- 704 example of local region rectangle including first object region
- 1301 discriminating border
- 1302 discriminating border
- 1303 symbols belonging to Group A
- 1304 symbols belonging to Group B
- 1305 symbols belonging to Group C
- 1306 symbols belonging to Group D
- Embodiments of the present invention will be described below with reference to the drawings.
FIG. 2 is a diagram illustrating an apparatus for automatic analysis of urinary sediments, to which the present invention is applied. In an apparatus 201, a stainer unit 202 stains a urine specimen with a stain, and after a certain period of time, an input unit 203 takes an enlarged static image of particles in the urine. The image thus taken is transferred to a processing unit 206, where the urine particles are classified through image pattern recognition. The processing unit 206 then counts the types of the urine particles included in a single specimen as well as the appearance frequency of each type. For example, a general-purpose personal computer having a display 204 and a keyboard 205 is used as the processing unit 206. The operator is informed of the counting results through the display 204. The image taken by the input unit 203 and data obtained by the processing unit 206, such as a measurement result, a classification result for each object region, and image feature parameters obtained during the image pattern recognition, are saved in a storage device in the processing unit 206. The processing unit 206 also has a review function which allows the operator to display any selected image and to make a correction to the automatic classification or to perform fine classification visually. -
FIG. 3 is a schematic diagram of the input unit 203 of the apparatus 201. In the input unit 203, a flow cell 301 is used to form a wide, flat and thin flow of urine specimen between an objective lens 303 and a pulse lamp 302. While the flow of urine specimen formed by the flow cell 301 is momentarily irradiated by the pulse lamp 302, an image of urine particles enlarged by the objective lens 303 is taken by a camera 304 as a static image. As the camera 304, a CCD color camera, a CMOS color camera, or the like is used, for example. The image thus obtained is transferred to the processing unit 206. -
FIG. 4 is a diagram illustrating how the processing unit 206 performs image processing. An image of urine particles taken by the camera 304 is transferred to the processing unit 206 as digital data. In shading compensation step S401, density unevenness of the image, resulting from the characteristics of the optical system, is removed. - In region segmentation step S402, a binary image is formed by segmenting the image of urine particles into a background region and an object region. In the binary image, 0 indicates the background region, and 1 indicates the object region. In modifying process step S403, correction and shaping of the binary image are performed, such as compensation for the object region and noise removal for the background region. A known conventional technique, including morphological filtering processing such as, for example, dilation and erosion, can be used as the means for the correction and shaping.
- In labeling step S404, each group of connected components in the binary image is labeled and assigned a number to be uniquely identified among multiple objects in the image. In feature parameter calculation step S405, feature parameters, such as an area, a perimeter, and an average density value, are calculated for each object region thus numbered.
- In pattern recognition step S406, each object region is classified on the basis of its component type by using the image feature parameters obtained for the object region in step S405. As the means for the pattern recognition, a neural network approach, a statistical recognition approach, or the like can be used. Examples of items used for the classification include a red blood cell, a white blood cell, a squamous epithelial cell, other epithelial cells, a cast, a crystal, a bacterium, and the like.
- In counting step S407, based on the classification results obtained in pattern recognition step S406, the objects classified into each classification class are counted. The counting results are converted into the number of objects or the density per unit volume of the urine specimen, and the conversion results are outputted to the display 204.
- Note that all or part of the processing shown in FIG. 4 can also be processed by hardware. -
FIG. 5 is a diagram illustrating in detail a configuration example of region segmentation step S402.
- In step S501, first object regions, in each of which an object particle exists, are extracted from an image of urine particles. A region larger than the object particle is extracted by the segmentation here so that any object particle, including a hard-to-stain particle, can be extracted as one continuous region. For example, suppose that an image 101 of urine particles shown in FIG. 1 is taken by an input optical system configured to input particle images. Then, the region segmentation is carried out using one or more of an R image, a G image, and a B image of the image 101 and using, for example, a fixed threshold predetermined by experiment. As a result, regions larger than the object particles are obtained as the region segmentation results, as shown in 102 of FIG. 1.
- In step S502, each first object region is classified into one of a predetermined number of groups. Feature parameters of the first object region are used for the grouping. The feature parameters used here include a feature parameter for size and a feature parameter for tone. For example, an area, a perimeter, or the like is used for the feature parameter for size, and an average density value or the like is used for the feature parameter for tone. As an example, in FIG. 1, the first object region is grouped into three groups based on the tone and the size thereof, and 103 of FIG. 1 shows the grouping results.
- In step S503, second object regions are extracted from respective local regions, each including a first object region, by carrying out group-specific region segmentation. For example, the region segmentation here employs a method which uses one or more of an R image, a G image, and a B image of the image of urine particles together with a fixed threshold predetermined by experiment, or employs other methods. The type and the number of the images selected and the fixed threshold used are different for each group. For example, in the case of urine particles stained with a typical Sternheimer (S) stain, the absorption peak of the stained particles is on the order of 550 nm. Accordingly, higher sensitivity can be obtained by using a G image or an R image covering a 500 to 700 nm wavelength range than by using a B image covering a 400 to 500 nm range. Further, the stained particles include stained particles that are sufficiently stained and light-stained particles that are insufficiently stained. In such a case, the same image may be used, but the fixed threshold has to be changed according to the tone. Moreover, for the region segmentation, a contour extraction method such as active contours (snakes), which is a known conventional technique, may be used. For example, 104 of FIG. 1 shows object regions accurately extracted as a result of the region segmentation performed differently for each group.
- As described above, a first object region larger than a particle image is extracted. Then, the first object region is classified into a predetermined number of groups based on the tone and the size thereof. Based on the classification result, a second object region is extracted according to the features of the particle image. This configuration offers such an effect that stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist.
FIG. 6 is a diagram illustrating a detailed configuration employed in a case where a density histogram of each of an R density, a G density, and a B density of a urine particle image is used in region segmentation step S402. - In step S601, a density histogram is created for each of the R density, the G density, and the B density of the urine particle image, and for each density histogram, parameters indicating the shape thereof are obtained. In step S602, a first object region in which an object particle exists is extracted from the image by using the parameters obtained in step S601 and the like. A region larger than the object particle is extracted by the segmentation here so that the object particle including a hard-to-stain particle can be extracted as one continuous region. For example, when a
urine particle image 701 shown in FIG. 7 is obtained, the first object region obtained in step S602 is a region 703 that is larger than an object particle 702.
- In grouping step S603 of FIG. 6, the first object region is grouped into a predetermined number of groups based on its average density value distribution and size. Note that an area, a perimeter, or the like is used as the size. In second region segmentation step S604, a second object region representing a more accurate shape of the object particle is extracted from a local region including the first object region, using an optimum region segmentation method which is different for each group. The second object region thus extracted is subjected to a modifying process in step S403. -
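The flow of steps S601 to S604 described above can be sketched as a skeleton; the function names and signatures are illustrative placeholders, not from the patent, with each stage supplied as a callable.

```python
# Skeleton of the FIG. 6 flow: histogram parameters (S601), first region
# segmentation (S602), grouping (S603), group-specific second segmentation
# (S604). The stages are injected as functions so the pipeline stays generic.

def segment_urine_particle_image(rgb_image,
                                 histogram_params_fn,
                                 first_segmentation_fn,
                                 grouping_fn,
                                 second_segmentation_fn):
    params = histogram_params_fn(rgb_image)                  # S601
    first_region = first_segmentation_fn(rgb_image, params)  # S602: larger than object
    group = grouping_fn(rgb_image, first_region)             # S603: tone/size grouping
    # S604: optimum method differs per group
    return second_segmentation_fn(rgb_image, first_region, group, params)
```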
FIG. 8 shows a detailed procedure, conducted in step S601 of FIG. 6, for obtaining parameters Pd(*), Phl(*), Phh(*), dl(*), and dh(*) indicating the shape of the density histogram of each image. Note that * indicates any one of R, G, and B. -
FIG. 9 , a density value Pd(*) having a maximum frequency value Pmax(*) is obtained for each density histogram. Note that * indicates any one of R, G, and B. In step S803 ofFIG. 8 , as shown inFIG. 9 , density values Phl(*) and Phh(*) having a value Pmax/2(*) which is a half width of the peak are obtained for each density histogram. Note that * indicates any one of R, G, and B. Although density values having the half value of the peak are used here, the density values are not limited to such values, and may be ones having a quarter, a tenth, or the like of the peak, instead. Optimum density values should be decided from the shape of the histogram. In step S804 ofFIG. 8 , dl(*) and dh(*) inFIG. 9 are calculated through Formulae (1) and (2) shown below using Pd(*) inFIG. 9 obtained in step S802 and Phl(*) and Phh(*) inFIG. 9 obtained in step S803. Note that * indicates any one of R, G, and B. -
dl(*)=Pd(*)−Phl(*) (1) -
dh(*)=Phh(*)−Pd(*) (2) -
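A minimal sketch of steps S801 to S804 for one channel, assuming 8-bit densities; walking outward from the mode until the frequency drops to half the peak is one plausible way of locating Phl(*) and Phh(*), and the function name is illustrative.

```python
import numpy as np

# Sketch of S801-S804: find the mode Pd, the half-peak densities Phl (left)
# and Phh (right), and return the widths dl = Pd - Phl and dh = Phh - Pd
# per Formulae (1) and (2).

def histogram_shape_params(channel):
    """channel: array of 8-bit density values for one of the R, G, B images."""
    hist = np.bincount(channel.ravel(), minlength=256)  # S801: density histogram
    pd = int(np.argmax(hist))                           # S802: density with max frequency
    half = hist[pd] / 2.0                               # half the peak height
    phl = pd
    while phl > 0 and hist[phl] > half:                 # S803: walk left to half-peak
        phl -= 1
    phh = pd
    while phh < 255 and hist[phh] > half:               # S803: walk right to half-peak
        phh += 1
    return pd, pd - phl, phh - pd                       # S804: Pd, dl, dh
```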
FIG. 10 shows a detailed procedure for the first region segmentation, conducted in step S602 of FIG. 6, in which a region larger than an object is extracted. -
FIG. 9 are calculated through Formulae (3) and (4) shown below using Pd(*) obtained in step S802 ofFIG. 8 and dl(*) and dh(*) obtained in step S804 ofFIG. 8 . Note that * indicates any one of R, G, and B. -
T1(*)=Pd(*)−dl(*)×k1(*) (3) -
T2(*)=Pd(*)+dh(*)×k2(*) (4)
- For example, if the R image and the G image are selected, the threshold values are calculated as follows.
-
T1(R)=Pd(R)−dl(R)×k1(R) -
T2(R)=Pd(R)+dh(R)×k2(R) -
T1(G)=Pd(G)−dl(G)×k1(G) -
T2(G)=Pd(G)+dh(G)×k2(G)
- In step S1003, an object region is extracted using the thresholds thus obtained. For example, if the thresholds T1(R), T2(R), T1(G), and T2(G) are used, pixels (x, y) satisfying Formula (5) shown below are extracted as the object region.
-
{T1(R)≧R(x,y)} ∥ {T2(R)<R(x,y)} ∥ {T1(G)≧G(x,y)} ∥ {T2(G)<G(x,y)} (5)
- Here, ∥ indicates a logical OR. R(x, y) indicates the R density value of the pixel (x, y), and G(x, y) indicates the G density value of the pixel (x, y).
FIG. 6 , the coefficient kn(*) is set to be somewhat small. Setting the coefficient kn(*) small increases the area of the object region in FIG. 9 . - Next, step S1004 shows a procedure for extracting an object region by using the magnitude of density value change. In step S1005, a value indicating the magnitude of density value change is calculated.
- Here, as the index of the magnitude of density value change, a difference between density values in a local small region is used. In step S1005, a density difference value defined by Formula (6) shown below is used. r(*)(x, y) is obtained by:
-
r(*)(x,y) = Σi=1..n {*(x+i,y) − *(x−i,y)}  (6)
- where *(x, y) and r(*)(x, y) indicate a density value and a density difference value, respectively, at the pixel position (x, y) on the image, and the difference in the y direction is defined analogously.
- Note that * indicates any one of R, G, and B.
- In step S1006, using r(*)(x, y), pixels (x, y) satisfying Formula (7) shown below are extracted as an object region.
-
sn(*) ≦ |r(*)(x,y)|  (7) - Here, | | indicates that the value therebetween is an absolute value, and sn(*) is a constant predetermined by experiment; an optimum constant, different for each color, is obtained in advance. Note that * indicates any one of R, G, and B.
- The number of neighboring pixels used for calculating a difference value for a given pixel is represented by 2n+1, and is called a mask size. When n=2 for example, two pixels before and after the given pixel in an x direction or in a y direction are needed for the calculation of a difference value for the given pixel, and therefore the mask size=5. A mask size of 1 means that no difference processing is performed.
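A sketch of the density difference value and the test of Formula (7), under the assumption that a shifted-difference sum implements the 2n+1 mask; image borders are handled by wrap-around here purely for brevity, whereas a real implementation would pad or skip the margin:

```python
import numpy as np

def density_difference(img, n, axis=1):
    """Density difference value with mask size 2n+1 (n=0: no differencing).
    axis=1 differentiates along x (columns), axis=0 along y (rows)."""
    img = img.astype(np.int64)
    r = np.zeros_like(img)
    for i in range(1, n + 1):
        # np.roll(img, -i) brings pixel (x+i) to position x; roll(img, i) brings (x-i)
        r += np.roll(img, -i, axis=axis) - np.roll(img, i, axis=axis)
    return r

def edge_mask(img, n, s, axis=1):
    """Formula (7): pixels whose |r| reaches the constant s join the object region."""
    return np.abs(density_difference(img, n, axis)) >= s
```

With n=2 (mask size 5) along x, the computed value matches the expansion shown later in Formula (8): G(x+2,y)+G(x+1,y)−G(x−2,y)−G(x−1,y).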
- Note that one or more images having the highest sensitivity due to the color characteristics of the urine particles and spectral characteristics of the camera are used for the region segmentation. In addition, the mask size in each of the x direction and the y direction is set as large as possible within an object region of a nonstained particle.
- Suppose, for example, that the G image is selected and that the mask size in the x direction is 5, and the mask size in the y direction is 1. In this case, pixels (x, y) satisfying Formula (8) shown below are extracted as the object region.
-
r(G)(x,y) = G(x+2,y) + G(x+1,y) − G(x−2,y) − G(x−1,y), s1(G) ≦ |r(G)(x,y)|  (8) - | | indicates that the value therebetween is an absolute value, and s1(G) is a constant whose optimal value is predetermined by experiment.
- Note that the index of the magnitude of density value change is not limited to the density value difference. Other methods may be used, including a method using a density value distribution in a local small region, a method using filtering processing for emphasizing frequency components specifically included in the object region, and the like.
- In step S1007, a first object region is extracted by superimposing (i.e., logically ORing) the object regions obtained in step S1001 and step S1004, respectively. Suppose, for example, that the object region extraction in step S1001 uses the R image and the G image with Formula (5), and that the object region extraction in step S1004 uses the G image, with a mask size of 5 in the x direction and a mask size of 1 in the y direction, with Formula (8). Then, pixels (x, y) satisfying the following formula are obtained as the first object region. Here, II indicates a logical OR.
-
{T1(R)≧R(x,y)}II{T2(R)<R(x,y)}II{T1(G)≧G(x,y)}II{T2(G)<G(x,y)}IIs1(G)≦|r(G)(x,y)| - In step S1008, the extracted first object region is subjected to a modifying process in a manner similar to step S403 of
FIG. 4 . In step S1009, the first object region is subjected to labeling in a manner similar to step S404 of FIG. 4 . In step S1010, feature parameters, such as an area, a perimeter, and an average density value, are calculated for the first object region thus numbered. The feature parameters thus calculated are used in the grouping in step S603 of FIG. 6 . -
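The labeling of step S1009 and the feature parameters of step S1010 can be sketched as follows. The 4-connectivity, and the choice of area and average density as the computed features, are illustrative assumptions (the patent also mentions a perimeter):

```python
import numpy as np
from collections import deque

def label_regions(mask):
    """4-connected labeling of a binary mask (cf. step S1009); labels start at 1."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue                      # pixel already belongs to a numbered region
        current += 1
        queue = deque([(sy, sx)])
        labels[sy, sx] = current
        while queue:                      # flood-fill the connected component
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    queue.append((ny, nx))
    return labels, current

def region_features(labels, count, density):
    """Per-region feature parameters: area and average density (cf. step S1010)."""
    feats = {}
    for lab in range(1, count + 1):
        sel = labels == lab
        feats[lab] = {"area": int(sel.sum()),
                      "mean_density": float(density[sel].mean())}
    return feats
```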
FIG. 11 shows a detailed procedure for the grouping performed in step S603 of FIG. 6 . In step S1101, based on the feature parameter for size of the first object region, the first object region is classified into two groups: a group of big size particles and a group of small size particles. An area, a perimeter, or the like is used for the feature parameter for size. -
FIG. 12 is a diagram illustrating a flow of the grouping performed using an area as the feature parameter for size. In a determination in step S1201, the first object region satisfying M≧m is classified as a group of big size particles, and the first object region not satisfying M≧m is classified as a group of small size particles. Here, M is the area of the first object region, and m is an optimum value obtained in advance by experiment. - In step S1102 of
FIG. 11 , using the average density value of the first object region, the first object region is grouped into a predetermined number of groups according to its tone. For the average density value, one or more color images having the highest sensitivity, given the color characteristics of the object region and the spectral characteristics of the camera, are selected. The first object region is assigned to one of the predetermined number of groups depending on where in the density space the average density value of the selected image is located. Here, the discriminating borders used for the grouping in the density space are predetermined by experiment. For example, the discriminating borders can be obtained using a known conventional technique such as discriminant analysis. -
FIG. 13 is a diagram illustrating an example of a distribution chart used in the grouping when, for example, the G density and the B density are used for the density values. Symbols 1303 to 1306 indicate coordinates (g, b) of the average density values of the urine particles collected in advance to form the distribution chart. It has been made clear by experiment that the optimum region segmentation method differs for each group of the urine particles collected. Here, the urine particles are grouped into four groups, A, B, C, and D, according to their tone, and the symbols 1303 to 1306 represent distributions, in a G-B space, of particles belonging to the groups A, B, C, and D, respectively. Discriminating borders 1301 and 1302 are expressed as follows. -
b = j1g + k1  (discriminating border 1301) -
b = j2g + k2  (discriminating border 1302) -
FIG. 14 is a diagram illustrating a flow of the grouping performed when the G-B distribution chart in FIG. 13 is used in the grouping in step S1102 of FIG. 11 . - When the average density values of the G image and the B image of a certain first object region are represented as (g, b), the first object region belongs to the light-stained group B (1304) if b≧j1g+k1 is satisfied in the determination in step S1401 and additionally b≧j2g+k2 is satisfied in step S1402. The first object region belongs to the nonstained group D (1306) if b≧j1g+k1 is satisfied in the determination in step S1401 and b≧j2g+k2 is not satisfied in step S1402.
- The first object region belongs to the stained group A (1303) if b≧j1g+k1 is not satisfied in the determination in step S1401 and additionally b≧j2g+k2 is satisfied in step S1402. The first object region belongs to the light-stained group C (1305) if b≧j1g+k1 is not satisfied in the determination in step S1401 and b≧j2g+k2 is not satisfied in step S1402.
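The decision flow of FIG. 14 amounts to testing the point (g, b) against the two linear borders. A sketch, with j1, k1, j2, k2 standing in for the experimentally determined (here hypothetical) border parameters:

```python
def tone_group(g, b, j1, k1, j2, k2):
    """Assign group A-D from the average density point (g, b), following the
    decision flow of FIG. 14. Border parameters are hypothetical placeholders."""
    above1 = b >= j1 * g + k1   # test against discriminating border 1301 (step S1401)
    above2 = b >= j2 * g + k2   # test against discriminating border 1302 (step S1402)
    if above1:
        return "B" if above2 else "D"
    return "A" if above2 else "C"
```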
- In step S1103 of
FIG. 11 , the grouping is performed in a manner similar to step S1102. Note that the number of classification groups and the classification method used in step S1103 are predetermined by experiment and differ from those used in step S1102. -
FIG. 15 shows a detailed procedure for the group-specific second region segmentation performed in step S604 of FIG. 6 . - In step S1501, the need for additional region segmentation is determined for each group.
FIG. 16 shows a detailed flow of step S1501. For example, assume that Group N is predetermined by experiment as not needing additional region segmentation. If Group X, to which a certain first object region x belongs, satisfies X=N in step S1601, the first object region x does not need additional region segmentation and is determined as the final result of region segmentation. If X=N is not satisfied in step S1601, the first object region x needs additional region segmentation, and the flow proceeds to step S1502. - For example, Group N is the group of big size particles. In the case of big size particles, because the object is big in the first place, extracting a somewhat larger region in the region segmentation in step S602 of
FIG. 6 makes little difference to the feature parameters calculated in step S405 of FIG. 4 . Accordingly, the pattern recognition in step S406 of FIG. 4 is performed with no problem, making final classification of urine particles unlikely to be erroneous. In addition, skipping the additional region segmentation saves the time for the region segmentation, and therefore offers the effect of improving the processing speed of the image processing. - In step S1502 of
FIG. 15 , a local region including the first object region is extracted. As the local region, a local rectangle including the first object region is extracted. For example, if the urine particle image 701 shown in FIG. 7 is obtained, a rectangle 704 including the first object region 703 obtained in step S602 of FIG. 6 is extracted as the local region. - In step S1503, group-specific region segmentation, which is different for each group, is performed to extract a more accurate second object region from the local region including the first object region.
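The local-rectangle extraction of step S1502 is essentially a bounding box of the first object region. A sketch follows; the optional margin parameter is an added assumption, not taken from the patent:

```python
import numpy as np

def local_rectangle(mask, margin=0):
    """Bounding rectangle of a first object region (cf. rectangle 704 in FIG. 7).
    Returns (y0, y1, x0, x1) half-open slice bounds, clamped to the image."""
    ys, xs = np.nonzero(mask)
    y0 = max(int(ys.min()) - margin, 0)
    y1 = min(int(ys.max()) + 1 + margin, mask.shape[0])
    x0 = max(int(xs.min()) - margin, 0)
    x1 = min(int(xs.max()) + 1 + margin, mask.shape[1])
    return y0, y1, x0, x1
```

The second, more accurate segmentation would then operate only on the slice `image[y0:y1, x0:x1]`.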
-
FIG. 17 shows a detailed procedure for the second region segmentation performed in step S1503. Step S1701 is basically the same as step S1001 of FIG. 10 , but the color images and parameters used in step S1702 are different for each group. In addition, the coefficient kn(*) used in the threshold calculation is different for each group, so that the thresholds are calculated differently for each group. Note that * indicates any one of R, G, and B. - Step S1704 is basically the same as step S1004 of
FIG. 10 , but the color images, the constant sn(*), and the mask sizes used in step S1705 are different for each group. Note that * indicates any one of R, G, and B. - In step S1707, a second object region, which is extracted in step S604 in the flow of
FIG. 6 , is extracted. The second object region is obtained by superimposing (logically ORing) the object regions obtained in step S1703 and step S1706, respectively. For example, assume that there are three groups obtained in advance by experiment: a group of stained particles (1), a group of light-stained particles (2), and a group of nonstained particles (3). - Suppose that a certain first object region belongs to the group of stained particles (1), and that the object region extraction in step S1701 uses the R image and the G image, and the object region extraction in step S1704 uses the G image with a combination of a mask size of 5 in the x direction and a mask size of 1 in the y direction. Then, the second object region is obtained by the following formula.
-
{T3(R)≧R(x,y)}II{T4(R)<R(x,y)}II{T3(G)≧G(x,y)}II{T4(G)<G(x,y)}IIs2(G)≦|r(G)(x,y)| - Suppose that a certain first object region belongs to the group of light-stained particles (2), and that the object region extraction in step S1701 uses the R image and the G image, and the object region extraction in step S1704 uses a mask size of 1 in both the x and y directions. Then, the second object region is obtained by the following formula.
-
{T5(R)≧R(x,y)}II{T6(R)<R(x,y)}II{T5(G)≧G(x,y)}II{T6(G)<G(x,y)} - Suppose that a certain first object region belongs to the group of nonstained particles (3), and that the object region extraction in step S1701 uses the B image, and the object region extraction in step S1704 uses a mask size of 1 in both the x and y directions. Then, the second object region is obtained by the following formula.
-
{T1(B)≧B(x,y)}II{T2(B)<B(x,y)} - The region segmentation method of the present invention, in which threshold processing is performed using density histograms, offers the effect that stable threshold processing can be performed even for a urine specimen in which urine particles having different tones coexist.
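The group-specific recipes above (which color images get histogram thresholding, and which image, mask sizes, and constant drive the density-difference test) can be organized as a lookup table. All concrete values below are placeholders, since the patent determines them by experiment:

```python
# Hypothetical parameter table for the second, group-specific segmentation.
# "diff" is None when the mask size is 1, i.e. no difference processing is done.
GROUP_PARAMS = {
    "stained":       {"hist_images": ("R", "G"), "diff": ("G", 5, 1)},
    "light_stained": {"hist_images": ("R", "G"), "diff": None},
    "nonstained":    {"hist_images": ("B",),     "diff": None},
}

def segmentation_plan(group):
    """Look up the segmentation recipe for a group (cf. step S1503)."""
    return GROUP_PARAMS[group]
```

Keeping the recipes in one table mirrors the patent's structure: the same two extraction procedures run for every group, only their parameters change.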
- In the present invention, a first object region larger than a particle image is extracted. Then, the first object region is classified into a predetermined number of groups based on the tone and the size thereof. Based on the classification result, a second object region is extracted according to the features of the particle image. This configuration of the present invention offers such an effect that stable region segmentation can be performed for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist. This effect consequently allows accurate calculation of feature parameters of an object region, prevention of erroneous classification of an object particle, and therefore improvement of accuracy in identifying urine particles of various types.
-
FIG. 18 is a diagram illustrating the configuration of an apparatus for region segmentation of urine particle images according to a second embodiment of the present invention. - An original image taken by an input device, such as a camera, is transferred to a
memory 1801. The original image data is then transferred to a first region segmentation device 1802, where the first region segmentation is carried out. For the first region segmentation, the method described in Embodiment 1 may be used. Images of the first object regions obtained through the region segmentation are transferred to the memory 1801. - Next, the first object regions are transferred to a
grouping device 1803, where feature parameters are calculated for each first object region, and each first object region is classified into one of a predetermined number of groups. For the grouping, the method described in Embodiment 1 may be used. Results of the grouping are transferred to the memory 1801. - A group-specific second
region segmentation device 1804 performs region segmentation using the original image, the first object regions, and the grouping results saved in the memory 1801, and thereby obtains second object regions, each accurately representing the shape of an object particle. For the group-specific region segmentation, the method described in Embodiment 1 may be used. The resulting second object regions are transferred to the memory 1801. Note that the means for saving the results of the second region segmentation is not limited to the memory 1801, and may be, for example, an external storage medium such as an HDD or a floppy disk. - In the configuration of the apparatus of the present invention, a first object region larger than a particle image is extracted and is classified into one of a predetermined number of groups based on its tone and size, and a second object region is extracted based on the classification result and the features of the particle image. This configuration offers the effect that stable region segmentation can be carried out for each particle image even for a urine specimen in which urine particles having different sizes and tones coexist.
Claims (10)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008098866 | 2008-04-07 | ||
JP2008-098866 | 2008-04-07 | ||
PCT/JP2009/056135 WO2009125678A1 (en) | 2008-04-07 | 2009-03-26 | Method and device for dividing area of image of particle in urine |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110002516A1 (en) | 2011-01-06 |
US9239281B2 | 2016-01-19 |
Family
ID=41161812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/920,424 Active 2031-01-03 US9239281B2 (en) | 2008-04-07 | 2009-03-26 | Method and device for dividing area of image of particle in urine |
Country Status (3)
Country | Link |
---|---|
US (1) | US9239281B2 (en) |
JP (1) | JP4948647B2 (en) |
WO (1) | WO2009125678A1 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120134559A1 (en) * | 2009-07-29 | 2012-05-31 | Hitachi High-Technologies Corporation | Particle image analysis apparatus |
US20140112562A1 (en) * | 2012-10-24 | 2014-04-24 | Nidek Co., Ltd. | Ophthalmic analysis apparatus and ophthalmic analysis program |
WO2014066218A2 (en) * | 2012-10-26 | 2014-05-01 | Siemens Healthcare Diagnostics Inc. | Cast recognition method and device, and urine analyzer |
CN104751431A (en) * | 2013-12-31 | 2015-07-01 | 西门子医疗保健诊断公司 | Method and device based on image processing |
CN104751440A (en) * | 2013-12-31 | 2015-07-01 | 西门子医疗保健诊断公司 | Method and device based on image processing |
WO2015168365A1 (en) * | 2014-04-30 | 2015-11-05 | Siemens Healthcare Diagnostics Inc. | Method and apparatus for processing block to be processed of urine sediment image |
US20180329299A1 (en) * | 2015-10-01 | 2018-11-15 | Promerus, Llc | Fluorine free photopatternable phenol functional group containing polymer compositions |
CN109387517A (en) * | 2017-08-10 | 2019-02-26 | 爱科来株式会社 | Analytical equipment and analysis method |
CN110007068A (en) * | 2019-03-25 | 2019-07-12 | 桂林优利特医疗电子有限公司 | A kind of urine drip detection method |
US10437036B2 (en) | 2017-10-02 | 2019-10-08 | Arkray, Inc. | Analysis apparatus |
JP2020169994A (en) * | 2019-04-03 | 2020-10-15 | メクウィンズ, エセ.アー.Mecwins, S.A. | Method for optically detecting biomarkers |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102707425B (en) * | 2012-06-21 | 2014-04-16 | 爱威科技股份有限公司 | Image processing method and device |
US10304188B1 (en) * | 2015-03-27 | 2019-05-28 | Caleb J. Kumar | Apparatus and method for automated cell analysis |
JPWO2018207361A1 (en) * | 2017-05-12 | 2020-03-12 | オリンパス株式会社 | Cell image acquisition device |
JP2019066461A (en) * | 2017-10-02 | 2019-04-25 | アークレイ株式会社 | Analyzer |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5068906A (en) * | 1988-04-22 | 1991-11-26 | Toa Medical Electronics Co., Ltd. | Processor for extracting and memorizing cell images |
US5768412A (en) * | 1994-09-19 | 1998-06-16 | Hitachi, Ltd. | Region segmentation method for particle images and apparatus thereof |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH073419B2 (en) | 1986-10-07 | 1995-01-18 | 東亜医用電子株式会社 | Method and device for analyzing cells in fluid |
US5121436A (en) * | 1987-08-14 | 1992-06-09 | International Remote Imaging Systems, Inc. | Method and apparatus for generating a plurality of parameters of an object in a field of view |
JPH01119765A (en) * | 1987-11-04 | 1989-05-11 | Hitachi Ltd | Dividing method of region |
JPH03131756A (en) | 1989-10-18 | 1991-06-05 | Hitachi Ltd | Automatic classifying device for blood cell |
JP3111706B2 (en) | 1992-02-18 | 2000-11-27 | 株式会社日立製作所 | Particle analyzer and particle analysis method |
DE69327182T2 (en) | 1992-02-18 | 2000-06-15 | Hitachi Ltd | Device and method for examining particles in a fluid |
JP3653804B2 (en) | 1994-09-19 | 2005-06-02 | 株式会社日立製作所 | Particle image region segmentation method and apparatus |
JP3127111B2 (en) * | 1996-02-22 | 2001-01-22 | 株式会社日立製作所 | Flow type particle image analysis method and apparatus |
JPH10302067A (en) | 1997-04-23 | 1998-11-13 | Hitachi Ltd | Pattern recognition device |
JPH1119765A (en) | 1997-06-30 | 1999-01-26 | Mitsubishi Electric Corp | Heat exchanger and its manufacture |
JP4061760B2 (en) * | 1999-01-14 | 2008-03-19 | 株式会社日立製作所 | Particle image segmentation method |
-
2009
- 2009-03-26 WO PCT/JP2009/056135 patent/WO2009125678A1/en active Application Filing
- 2009-03-26 JP JP2010507211A patent/JP4948647B2/en active Active
- 2009-03-26 US US12/920,424 patent/US9239281B2/en active Active
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120134559A1 (en) * | 2009-07-29 | 2012-05-31 | Hitachi High-Technologies Corporation | Particle image analysis apparatus |
US20140112562A1 (en) * | 2012-10-24 | 2014-04-24 | Nidek Co., Ltd. | Ophthalmic analysis apparatus and ophthalmic analysis program |
US10064546B2 (en) * | 2012-10-24 | 2018-09-04 | Nidek Co., Ltd. | Ophthalmic analysis apparatus and ophthalmic analysis program |
WO2014066218A2 (en) * | 2012-10-26 | 2014-05-01 | Siemens Healthcare Diagnostics Inc. | Cast recognition method and device, and urine analyzer |
WO2014066218A3 (en) * | 2012-10-26 | 2014-07-10 | Siemens Healthcare Diagnostics Inc. | Cast recognition method and device, and urine analyzer |
CN104751431A (en) * | 2013-12-31 | 2015-07-01 | 西门子医疗保健诊断公司 | Method and device based on image processing |
CN104751440A (en) * | 2013-12-31 | 2015-07-01 | 西门子医疗保健诊断公司 | Method and device based on image processing |
WO2015102945A1 (en) * | 2013-12-31 | 2015-07-09 | Siemens Healthcare Diagnostics Inc. | Image processing-based method and apparatus |
US20170053400A1 (en) * | 2014-04-30 | 2017-02-23 | Siemens Healthcare Diagnostics Inc. | Method and apparatus for processing block to be processed of urine sediment image |
US9715729B2 (en) * | 2014-04-30 | 2017-07-25 | Siemens Healthcare Diagnostics Inc. | Method and apparatus for processing block to be processed of urine sediment image |
WO2015168365A1 (en) * | 2014-04-30 | 2015-11-05 | Siemens Healthcare Diagnostics Inc. | Method and apparatus for processing block to be processed of urine sediment image |
US20180329299A1 (en) * | 2015-10-01 | 2018-11-15 | Promerus, Llc | Fluorine free photopatternable phenol functional group containing polymer compositions |
CN109387517A (en) * | 2017-08-10 | 2019-02-26 | 爱科来株式会社 | Analytical equipment and analysis method |
US10437036B2 (en) | 2017-10-02 | 2019-10-08 | Arkray, Inc. | Analysis apparatus |
CN110007068A (en) * | 2019-03-25 | 2019-07-12 | 桂林优利特医疗电子有限公司 | A kind of urine drip detection method |
JP2020169994A (en) * | 2019-04-03 | 2020-10-15 | メクウィンズ, エセ.アー.Mecwins, S.A. | Method for optically detecting biomarkers |
US11519856B2 (en) * | 2019-04-03 | 2022-12-06 | Mecwins, S.A. | Method for optically detecting biomarkers |
JP7467205B2 (en) | 2019-04-03 | 2024-04-15 | メクウィンズ,エセ.アー. | Method for optically detecting biomarkers - Patents.com |
Also Published As
Publication number | Publication date |
---|---|
JPWO2009125678A1 (en) | 2011-08-04 |
JP4948647B2 (en) | 2012-06-06 |
WO2009125678A1 (en) | 2009-10-15 |
US9239281B2 (en) | 2016-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9239281B2 (en) | Method and device for dividing area of image of particle in urine | |
US10565479B1 (en) | Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring | |
CN113989279B (en) | Plastic film quality detection method based on artificial intelligence and image processing | |
CN106651872B (en) | Pavement crack identification method and system based on Prewitt operator | |
Jiang et al. | A novel white blood cell segmentation scheme using scale-space filtering and watershed clustering | |
CN107977682B (en) | Lymphocyte classification method and device based on polar coordinate transformation data enhancement | |
GB2395263A (en) | Image analysis | |
KR20080016847A (en) | Methods of chromogen separation-based image analysis | |
CN111402267B (en) | Segmentation method, device and terminal of epithelial cell nuclei in prostate cancer pathological image | |
US20100040276A1 (en) | Method and apparatus for determining a cell contour of a cell | |
US11538261B2 (en) | Systems and methods for automated cell segmentation and labeling in immunofluorescence microscopy | |
CN115082451B (en) | Stainless steel soup ladle defect detection method based on image processing | |
CN109975196B (en) | Reticulocyte detection method and system | |
CN112215790A (en) | KI67 index analysis method based on deep learning | |
US8582861B2 (en) | Method and apparatus for segmenting biological cells in a picture | |
WO2006087526A1 (en) | Apparatus and method for processing of specimen images for use in computer analysis thereof | |
Habibzadeh et al. | Application of pattern recognition techniques for the analysis of thin blood smear images | |
CN113393454A (en) | Method and device for segmenting pathological target examples in biopsy tissues | |
Lal et al. | A robust method for nuclei segmentation of H&E stained histopathology images | |
CN113657335A (en) | Mineral phase identification method based on HSV color space | |
Nguyen et al. | A new method for splitting clumped cells in red blood images | |
JP3653804B2 (en) | Particle image region segmentation method and apparatus | |
CN116596899A (en) | Method, device, terminal and medium for identifying circulating tumor cells based on fluorescence image | |
Jian et al. | Hyperchromatic nucleus segmentation on breast histopathological images for mitosis detection | |
US20240005682A1 (en) | Object classifying apparatus, object classification system, and object classification method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HITACHI HIGH-TECHNOLOGIES CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MANRI, CHIHIRO;BAN, HIDEYUKI;MITSUYAMA, SATOSHI;AND OTHERS;REEL/FRAME:024919/0324 Effective date: 20100824 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: HITACHI HIGH-TECH CORPORATION, JAPAN Free format text: CHANGE OF NAME AND ADDRESS;ASSIGNOR:HITACHI HIGH-TECHNOLOGIES CORPORATION;REEL/FRAME:052259/0227 Effective date: 20200212 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |