CN104751167A - Method and device for classifying urine visible components - Google Patents


Info

Publication number
CN104751167A
CN104751167A (application CN201310752863.XA)
Authority
CN
China
Prior art keywords
green
red
blue
pixel
visible component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310752863.XA
Other languages
Chinese (zh)
Inventor
迟颖
彭廷莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Healthcare Diagnostics GmbH Germany
Siemens Healthcare Diagnostics Inc
Original Assignee
Siemens Healthcare Diagnostics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Healthcare Diagnostics Inc filed Critical Siemens Healthcare Diagnostics Inc
Priority: CN201310752863.XA
PCT application: PCT/US2014/071500 (published as WO2015102948A1)
Publication: CN104751167A
Legal status: Pending

Classifications

    • G: Physics; G06: Computing; G06T: Image data processing or generation, in general
        • G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection; G06T 7/0012 Biomedical image inspection
    • G06V: Image or video recognition or understanding
        • G06V 10/40 Extraction of image or video features; G06V 10/42 Global feature extraction; G06V 10/435 Computation of moments
        • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts; G06V 20/695 Preprocessing, e.g. image segmentation; G06V 20/698 Matching; classification
    • H04N: Pictorial communication, e.g. television
        • H04N 23/80 Camera processing pipelines; H04N 23/81 Suppressing or minimising disturbance in the image signal generation; H04N 23/84 Processing colour signals
        • H04N 25/13 Arrangement of colour filter arrays; H04N 25/134 Based on three different wavelength filter elements
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
        • G06T 2207/10056 Microscopic image (image acquisition modality)
        • G06T 2207/30004 Biomedical image processing (subject of image)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Image Analysis (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention provides a method and a device for classifying urine visible components. The method comprises the following steps: using the difference between the average pixel value of the peripheral region of a visible component within a visible-component block and the average pixel value of its central region as a feature in the feature set of a classifier; and classifying the urine visible components with the classifier. An embodiment of the invention aims to increase the accuracy of urine visible component classification.

Description

Method and device for classifying urine visible components
Technical field
The present invention relates to biological detection, and in particular to a method and device for classifying urine visible components.
Background art
In common urine sediment analysis, a microscopy system first captures an image of a urine sample. Edge-detection techniques are then used to segment particle blocks from the image. Contaminant particle blocks are removed, leaving the blocks of visible components (e.g., red blood cells, white blood cells, crystals). These visible components are then classified, for example into red blood cells, white blood cells, crystals, and so on.
The classification process usually employs a classifier built on a trained model. Features that help distinguish the various classes of visible components (such as area, circularity, extent, and gradient) are assembled into a feature set for training the classifier, e.g., a neural network. The classifier is trained on a large number of existing visible-component block samples by computing these features for each sample. When a new visible-component block is input, the trained classifier computes the features of the block and classifies it accordingly.
Common classification methods use feature sets composed of shape features, texture features, and the like. When the features in these traditional feature sets are not effective at distinguishing different visible components, the classification accuracy is not high enough.
Summary of the invention
An embodiment of the present invention aims to improve the accuracy of classifying urine visible components.
According to one embodiment of the present invention, a method for classifying urine visible components is provided, comprising: using the difference between the average pixel value of the peripheral region of the visible component within a visible-component block and the average pixel value of its central region as a feature in the feature set of a classifier; and classifying the urine visible components with said classifier.
In one specific implementation, the step of using the difference between the average pixel value of the peripheral region of the visible component and the average pixel value of the central region as a feature in the feature set of the classifier comprises: removing the influence of ambient light and out-of-focus noise on the pixel values within the visible component; computing the difference between the average pixel value of the peripheral region and the average pixel value of the central region after that influence has been removed; and using said difference as a feature in said feature set.
In one specific implementation, the step of removing the influence of ambient light and out-of-focus noise on the pixel values within the visible component comprises: for each pixel in the visible component, computing

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, and T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively, and

NT_green = T_green / max(max(T_red), max(T_green), max(T_blue))

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green and blue chromaticity coefficients over the pixels of the visible component, and NT_green is the normalized green chromaticity coefficient of the pixel.
The step of computing the difference between the average pixel value of the peripheral region and that of the central region after removing the influence of ambient light and out-of-focus noise then comprises: computing the difference between the mean NT_green of the pixels of the peripheral region of the visible component and the mean NT_green of the pixels of the central region.
In one specific implementation, the step of removing the influence of ambient light and out-of-focus noise on the pixel values within the visible component comprises: for each pixel in the visible component, computing

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively, and T_gray is the gray coefficient of the pixel, and

NT_gray = T_gray / max(T_gray)

where max(T_gray) is the maximum gray coefficient over the pixels of the visible component, and NT_gray is the normalized gray coefficient of the pixel.
The step of computing the difference between the average pixel value of the peripheral region and that of the central region after removing the influence of ambient light and out-of-focus noise then comprises: computing the difference between the mean NT_gray of the pixels of the peripheral region of the visible component and the mean NT_gray of the pixels of the central region.
In one specific implementation, the method further comprises: obtaining the grey-level co-occurrence matrix of the visible component, and using one or more of the contrast, homogeneity and energy of that matrix as features in the feature set of the classifier, where the contrast, homogeneity and energy of the grey-level co-occurrence matrix are computed as

contrast = Σ_{i,j} (i - j)^2 · P(i, j)
homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)
energy = Σ_{i,j} P(i, j)^2

where P(i, j) is the probability that the grey levels i and j occur together.
In one specific implementation, the method further comprises: using the difference between the average pixel brightness of the foreground part of the visible-component block and the average pixel brightness of the background part of the block as a feature in the feature set of the classifier.
In one specific implementation, the method further comprises using one or more of the seven Hu moments I_1, I_2, I_3, I_4, I_5, I_6, I_7, computed by the following procedure, as features in the feature set of the classifier:
The visible component is converted to a grayscale image; f(x, y) denotes the grey level of pixel (x, y) in the grayscale image, f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, I_blue are the values of pixel (x, y) in the red, green and blue layers respectively. The (p+q)-th order moments are defined as

m_pq = ∫∫ x^p y^q f(x, y) dx dy,  p, q = 0, 1, 2, …

The (p+q)-th order central moments are defined as

μ_pq = ∫∫ (x - x_0)^p (y - y_0)^q f(x, y) dx dy

where x_0 = m_10/m_00 and y_0 = m_01/m_00.
The normalized central moments are

η_pq = μ_pq / μ_00^r,  where r = (p + q + 2)/2.

The seven Hu moments are then defined as

I_1 = η_20 + η_02
I_2 = (η_20 - η_02)^2 + 4·η_11^2
I_3 = (η_30 - 3η_12)^2 + (3η_21 - η_03)^2
I_4 = (η_30 + η_12)^2 + (η_21 + η_03)^2
I_5 = (η_30 - 3η_12)(η_30 + η_12)[(η_30 + η_12)^2 - 3(η_21 + η_03)^2] + (3η_21 - η_03)(η_21 + η_03)[3(η_30 + η_12)^2 - (η_21 + η_03)^2]
I_6 = (η_20 - η_02)[(η_30 + η_12)^2 - (η_21 + η_03)^2] + 4·η_11(η_30 + η_12)(η_21 + η_03)
I_7 = (3η_21 - η_03)(η_30 + η_12)[(η_30 + η_12)^2 - 3(η_21 + η_03)^2] - (η_30 - 3η_12)(η_21 + η_03)[3(η_30 + η_12)^2 - (η_21 + η_03)^2]
In one specific implementation, the method further comprises using one or more of the mean grey level ḡ of the visible component, the standard deviation σ_g of the grey levels, the skewness κ, the entropy and the energy, computed according to the following formulas, as features in the feature set of the classifier:

ḡ = Σ_{g=0}^{L-1} P(g) · g

where the probability density function is P(g) = h(g)/M, the grey scale is divided into L grey levels, h(g) is the number of pixels at grey level g in the visible component, and M is the total number of pixels in the visible component;

σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )

κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)
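The histogram measures above can be sketched as follows; this is a hypothetical numpy illustration, in which the boolean component mask, the function name and the choice L = 256 are assumptions rather than anything prescribed by the text:

```python
import numpy as np

def histogram_features(gray, mask, L=256):
    """Mean grey level, standard deviation and skewness computed from the
    grey-level histogram P(g) = h(g)/M over the pixels of a component."""
    g = np.arange(L)
    h = np.bincount(gray[mask].ravel(), minlength=L)[:L]  # h(g)
    P = h / h.sum()                                       # M = h.sum()
    mean = (P * g).sum()
    std = np.sqrt((P * (g - mean) ** 2).sum())
    skew = (P * (g - mean) ** 3).sum() / std ** 3
    return mean, std, skew
```

The entropy and energy named in the claim would follow the same pattern, as sums over P(g).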
In one specific implementation, the visible components include red blood cells, white blood cells, and crystals.
According to one embodiment of the present invention, a device for classifying urine visible components is provided, comprising: a characterization unit configured to use the difference between the average pixel value of the peripheral region of the visible component within a visible-component block and the average pixel value of its central region as a feature in the feature set of a classifier; and a classification unit configured to classify the urine visible components with said classifier.
In one specific implementation, the characterization unit is further configured to remove the influence of ambient light and out-of-focus noise on the pixel values within the visible component, compute the difference between the average pixel value of the peripheral region and the average pixel value of the central region after that influence has been removed, and use said difference as a feature in said feature set.
In one specific implementation, the characterization unit is further configured to:
for each pixel in the visible component, compute

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, and T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively, and

NT_green = T_green / max(max(T_red), max(T_green), max(T_blue))

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green and blue chromaticity coefficients over the pixels of the visible component, and NT_green is the normalized green chromaticity coefficient of the pixel; and
compute the difference between the mean NT_green of the pixels of the peripheral region of the visible component and the mean NT_green of the pixels of the central region.
In one specific implementation, the characterization unit is further configured to:
for each pixel in the visible component, compute

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively, and T_gray is the gray coefficient of the pixel, and

NT_gray = T_gray / max(T_gray)

where max(T_gray) is the maximum gray coefficient over the pixels of said visible component, and NT_gray is the normalized gray coefficient of the pixel; and
compute the difference between the mean NT_gray of the pixels of the peripheral region of the visible component and the mean NT_gray of the pixels of the central region.
In one specific implementation, the characterization unit is further configured to: obtain the grey-level co-occurrence matrix of the visible component, and use one or more of the contrast, homogeneity and energy of that matrix as features in the feature set of the classifier, where the contrast, homogeneity and energy of the grey-level co-occurrence matrix are computed as

contrast = Σ_{i,j} (i - j)^2 · P(i, j)
homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)
energy = Σ_{i,j} P(i, j)^2

where P(i, j) is the probability that the grey levels i and j occur together.
In one specific implementation, the characterization unit is further configured to:
use the difference between the average pixel brightness of the foreground part of the visible-component block and the average pixel brightness of the background part of the block as a feature in the feature set of the classifier.
In one specific implementation, the characterization unit is further configured to use one or more of the seven Hu moments I_1, I_2, I_3, I_4, I_5, I_6, I_7, computed by the following procedure, as features in the feature set of the classifier:
The visible component is converted to a grayscale image; f(x, y) denotes the grey level of pixel (x, y) in the grayscale image, f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, I_blue are the values of pixel (x, y) in the red, green and blue layers respectively. The (p+q)-th order moments are defined as

m_pq = ∫∫ x^p y^q f(x, y) dx dy,  p, q = 0, 1, 2, …

The (p+q)-th order central moments are defined as

μ_pq = ∫∫ (x - x_0)^p (y - y_0)^q f(x, y) dx dy

where x_0 = m_10/m_00 and y_0 = m_01/m_00.
The normalized central moments are

η_pq = μ_pq / μ_00^r,  where r = (p + q + 2)/2.

The seven Hu moments are then defined as

I_1 = η_20 + η_02
I_2 = (η_20 - η_02)^2 + 4·η_11^2
I_3 = (η_30 - 3η_12)^2 + (3η_21 - η_03)^2
I_4 = (η_30 + η_12)^2 + (η_21 + η_03)^2
I_5 = (η_30 - 3η_12)(η_30 + η_12)[(η_30 + η_12)^2 - 3(η_21 + η_03)^2] + (3η_21 - η_03)(η_21 + η_03)[3(η_30 + η_12)^2 - (η_21 + η_03)^2]
I_6 = (η_20 - η_02)[(η_30 + η_12)^2 - (η_21 + η_03)^2] + 4·η_11(η_30 + η_12)(η_21 + η_03)
I_7 = (3η_21 - η_03)(η_30 + η_12)[(η_30 + η_12)^2 - 3(η_21 + η_03)^2] - (η_30 - 3η_12)(η_21 + η_03)[3(η_30 + η_12)^2 - (η_21 + η_03)^2]
In one specific implementation, the characterization unit is further configured to use one or more of the mean grey level ḡ of the visible component, the standard deviation σ_g of the grey levels, the skewness κ, the entropy and the energy, computed according to the following formulas, as features in the feature set of the classifier:

ḡ = Σ_{g=0}^{L-1} P(g) · g

where the probability density function is P(g) = h(g)/M, the grey scale is divided into L grey levels, h(g) is the number of pixels at grey level g in the visible component, and M is the total number of pixels in the visible component;

σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )

κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)
In one specific implementation, the visible components include red blood cells, white blood cells, and crystals.
The inventors observed that, under the white light of the microscope, some visible components (such as red blood cells) have, besides a dark central zone, a fairly wide ring of green light around the periphery. The difference between the average pixel value of the peripheral region of the visible component within the visible-component block and the average pixel value of the central region therefore distinguishes some visible components effectively, in particular red blood cells, white blood cells, and crystals. By using this difference as a feature in the feature set of the classifier, an embodiment of the present invention distinguishes some visible components effectively and improves the accuracy of urine visible component classification.
The inventors also found other features, not previously used for urine visible component classification in the prior art, such as features based on the grey-level co-occurrence matrix, features based on the brightness contrast of the visible component, features based on Hu moments, and features based on histogram measures. Using one or more of these features in the feature set of the classifier distinguishes visible components more effectively and thus improves the accuracy of urine visible component classification.
Brief description of the drawings
These and other features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.
Fig. 1 shows a flowchart of a method for classifying urine visible components according to an embodiment of the present invention.
Fig. 2 shows a detailed flowchart of the step of using the difference between the average pixel value of the peripheral region of the visible component and the average pixel value of the central region as a feature in the feature set of the classifier, according to an embodiment of the present invention.
Fig. 3 shows a block diagram of a device for classifying urine visible components according to an embodiment of the present invention.
Fig. 4 shows a structural diagram of an apparatus for classifying urine visible components according to an embodiment of the present invention.
Detailed description
Embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Using a feature based on the green-ring characteristic as a feature in the feature set
As shown in Fig. 1, a method 1 for classifying urine visible components according to an embodiment of the present invention comprises: in step S1, using the difference between the average pixel value of the peripheral region of the visible component within a visible-component block and the average pixel value of its central region as a feature in the feature set of a classifier; and in step S2, classifying the urine visible components with said classifier.
The peripheral region and the central region are determined by a distance transform. The contour of the visible component can be obtained, for example, during edge detection. The contour is shrunk inward one grid cell at a time, each cell being roughly the size of one pixel. After the contour has been shrunk by n cells, it can be shrunk no further. The criterion for "cannot be shrunk further" is, for example: if, for any point on the contour after shrinking by n cells, the distance along the inward shrinking direction to the opposite point on that contour is less than or equal to 2 cells, the contour is deemed unable to shrink further. When the contour can shrink no further, the part enclosed by the contour of the visible component shrunk inward by n × 2/3 cells is defined as the central region, and the part of the visible component outside the central region is defined as the peripheral region.
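The contour-shrinking procedure described above can be mimicked with iterative binary erosion, one cell per step. Below is a minimal numpy sketch; the erosion-based implementation and the function names are illustrative assumptions, with only the n × 2/3 ratio taken from the text:

```python
import numpy as np

def erode(mask):
    """One step of 4-neighbour binary erosion: shrink the contour by one cell."""
    m = np.pad(mask, 1, constant_values=False)
    return (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
            & m[1:-1, :-2] & m[1:-1, 2:])

def split_regions(mask, ratio=2 / 3):
    """Split a binary component mask into central and peripheral regions.

    n counts how many one-cell inward shrinks the contour allows before it
    can shrink no further; the central region is the mask shrunk inward by
    round(n * ratio) cells, and the peripheral region is the rest.
    """
    n, nxt = 0, erode(mask)
    while nxt.any():
        n, nxt = n + 1, erode(nxt)
    central = mask.copy()
    for _ in range(round(n * ratio)):
        central = erode(central)
    peripheral = mask & ~central
    return central, peripheral
```

Under these assumptions, a disc-shaped component yields a smaller concentric central disc and a surrounding peripheral ring.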
In another embodiment, the part enclosed by the contour of the visible component shrunk inward by n × 1/2 cells is defined as the central region, and the part of the visible component outside the central region is defined as the peripheral region.
As mentioned above, under the white light of the microscope, some visible components (such as red blood cells) have, besides a dark central zone, a fairly wide green ring around the periphery. The difference between the average pixel value of the peripheral region and that of the central region captures exactly this characteristic, so using it as a feature in the feature set helps classify the visible components.
As shown in Fig. 2, in one embodiment step S1 can be subdivided into the following steps: in step S11, removing the influence of ambient light and out-of-focus noise on the pixel values within the visible component; in step S12, computing the difference between the average pixel value of the peripheral region and the average pixel value of the central region after that influence has been removed; in step S13, using said difference as a feature in said feature set.
Those skilled in the art will appreciate that step S11 can also be omitted. Removing the influence of ambient light and out-of-focus noise merely makes the computation of the difference more accurate; the difference between the average pixel values of the peripheral and central regions can still be computed without removing that influence, and the result still reflects, to some extent, the green-ring characteristic described above.
Two embodiments of computing the difference between the average pixel value of the peripheral region and that of the central region are presented below.
In one embodiment, for each pixel in the visible component, compute

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)    (Formula 1)
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, and T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively.

NT_green = T_green / max(max(T_red), max(T_green), max(T_blue))    (Formula 2)

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green and blue chromaticity coefficients over the pixels of the visible component, and NT_green is the normalized green chromaticity coefficient of the pixel.
With Formulas 1 and 2, the normalized green chromaticity coefficient NT_green can be computed for each pixel in the visible component. Then the difference between the mean NT_green of the pixels of the peripheral region and the mean NT_green of the pixels of the central region is computed. This difference, computed from the normalized green chromaticity coefficients, reflects the green-ring characteristic particularly well.
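Formulas 1 and 2 and the resulting feature can be sketched as follows; this is a hypothetical numpy illustration (the array layout and function name are assumptions, the patent prescribes only the formulas):

```python
import numpy as np

def green_ring_feature(rgb, central, peripheral):
    """Peripheral-minus-central mean of the normalized green chromaticity
    coefficient NT_green (Formulas 1 and 2)."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0                 # guard against all-zero pixels
    T = rgb / total[..., None]              # T_red, T_green, T_blue
    inside = central | peripheral
    NT_green = T[..., 1] / T[inside].max()  # max over all three coefficients
    return NT_green[peripheral].mean() - NT_green[central].mean()
```

For a component with a greenish ring around a neutral centre, the feature comes out positive.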
In another embodiment, the inventors found that the difference between the average pixel values of the peripheral and central regions computed from a grayscale-based normalized gray coefficient also reflects the green-ring characteristic to some extent.
In this embodiment, for each pixel in the visible component, compute

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)    (Formula 3)
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the values of the pixel in the red, green and blue layers respectively, T_red, T_green, T_blue are the red, green and blue chromaticity coefficients of the pixel respectively, and T_gray is the gray coefficient of the pixel.

NT_gray = T_gray / max(T_gray)    (Formula 4)

where max(T_gray) is the maximum gray coefficient over the pixels of said visible component, and NT_gray is the normalized gray coefficient of the pixel.
With Formulas 3 and 4, the normalized gray coefficient NT_gray can be computed for each pixel in the visible component. Then the difference between the mean NT_gray of the pixels of the peripheral region and the mean NT_gray of the pixels of the central region is computed.
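The gray-coefficient variant of Formulas 3 and 4 differs from the green variant only in the weighting step; a hypothetical numpy sketch (array layout and function name are assumptions):

```python
import numpy as np

def gray_ring_feature(rgb, central, peripheral):
    """Peripheral-minus-central mean of the normalized gray coefficient
    NT_gray (Formulas 3 and 4)."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0
    T = rgb / total[..., None]
    T_gray = 0.2989 * T[..., 0] + 0.5870 * T[..., 1] + 0.1140 * T[..., 2]
    inside = central | peripheral
    NT_gray = T_gray / T_gray[inside].max()
    return NT_gray[peripheral].mean() - NT_gray[central].mean()
```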
Those skilled in the art should understand that, besides the differences computed from the normalized green chromaticity coefficient and from the normalized gray coefficient, differences between the average pixel values of the peripheral and central regions computed from normalized red or blue coefficients are also of some help in distinguishing visible components. The computation of the difference is therefore not limited to the two embodiments above.
Those skilled in the art should also understand that, although the central and peripheral regions above are defined as the parts inside and outside the contour of the visible component after it is shrunk inward by n × 1/2 or n × 2/3 cells, ratios other than 2/3 and 1/2 may also be specified. For example, the central and peripheral regions may be defined by the parts inside and outside the contour after it is shrunk inward by n × 3/4 cells.
Those skilled in the art should further understand that the central and peripheral regions can first be defined by the contour shrunk inward by n × 2/3 cells, with a first difference of the average pixel values of the peripheral and central regions computed accordingly, and then defined again by the contour shrunk inward by n × 1/2 cells, with a second difference computed accordingly; the first and second differences are then both used as two features in the feature set. Moreover, when the difference is computed both from the normalized green chromaticity coefficient and from the normalized gray coefficient, four features are added to the feature set.
feature based on gray level co-occurrence matrixes is used as the feature in feature set
The present inventor finds, utilizes some features based on gray level co-occurrence matrixes, also can effectively distinguish some visible components, especially distinguish red blood cell, leucocyte, crystallization.
According to one embodiment of present invention, the gray level co-occurrence matrixes of visible component is obtained.For specific visible component, its gray level co-occurrence matrixes can be measured by prior art.Then, by the feature in the one or more feature sets being used as sorter in the contrast of the gray level co-occurrence matrixes of visible component, homogeney, energy.The contrast of gray level co-occurrence matrixes, homogeney, energy calculate as follows:
Contrast = Σ_{i,j} (i - j)^2 · P(i, j)   Formula 5

Homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)   Formula 6

Energy = Σ_{i,j} P(i, j)^2   Formula 7

where P(i, j) denotes the probability that a pair of neighboring pixels with gray levels i and j occurs.
Using one or more of the contrast, homogeneity, and energy of the gray level co-occurrence matrix of the visible component as features in the feature set of the classifier improves the ability to distinguish visible components and improves classification accuracy.
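A minimal sketch of these three measures (an illustration, not the patent's implementation): the co-occurrence matrix is built over horizontally adjacent pixel pairs, one of several common offset choices; the names `glcm` and `glcm_features` are assumptions:

```python
import numpy as np

def glcm(gray, levels):
    """Co-occurrence matrix over horizontally adjacent pixel pairs,
    normalized so that entries are joint probabilities P(i, j)."""
    P = np.zeros((levels, levels))
    for a, b in zip(gray[:, :-1].ravel(), gray[:, 1:].ravel()):
        P[a, b] += 1
    return P / P.sum()

def glcm_features(P):
    """Contrast, homogeneity, and energy of a normalized GLCM."""
    i, j = np.indices(P.shape)
    contrast = ((i - j) ** 2 * P).sum()
    homogeneity = (P / (1 + np.abs(i - j))).sum()
    energy = (P ** 2).sum()
    return contrast, homogeneity, energy
```

For a perfectly uniform block the contrast is 0 and the homogeneity and energy are both 1, their maximal values.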
Features based on the contrast of the visible component block as features in the feature set
The inventors have found that for some visible components, the foreground part of the block under the microscope (the visible component itself) and the background part differ greatly in brightness, which also helps to distinguish some visible components. Therefore, according to one embodiment of the present invention, the difference between the average pixel brightness of the foreground part of the visible component block and that of the background part is used as a feature in the feature set of the classifier. This also improves the ability to distinguish visible components (especially red blood cells, white blood cells, and crystals) and improves classification accuracy.
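This block-contrast feature can be sketched as follows; a minimal illustration assuming a grayscale block plus a boolean foreground mask obtained from segmentation (the name `fg_bg_contrast` is illustrative):

```python
import numpy as np

def fg_bg_contrast(block, fg_mask):
    """Mean brightness of the foreground (the visible component itself)
    minus the mean brightness of the background of the block."""
    return block[fg_mask].mean() - block[~fg_mask].mean()
```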
Features based on Hu moments as features in the feature set
The inventors have found that for some visible components, the Hu moments also differ greatly from those of other visible components. Therefore, using one or more of the seven Hu moments I1, I2, I3, I4, I5, I6, I7, computed by the following procedure, as features in the feature set of the classifier also improves the ability to distinguish visible components (especially red blood cells, white blood cells, and crystals) and improves classification accuracy.
First, the visible component is converted to a grayscale image. Let f(x, y) denote the gray level of pixel (x, y) in the grayscale image, with f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, and I_blue denote the pixel values of pixel (x, y) in the red, green, and blue layers, respectively. The (p+q)-order moment of f is defined as:
m_pq = ∫∫ x^p y^q f(x, y) dx dy,   p, q = 0, 1, 2, …   Formula 8

The (p+q)-order central moment is defined as:

μ_pq = ∫∫ (x - x0)^p (y - y0)^q f(x, y) dx dy   Formula 9

where x0 = m10/m00 and y0 = m01/m00.

The normalized central moment is η_pq = μ_pq / μ00^r   Formula 10

where r = (p + q + 2)/2.
The seven Hu moments are then defined as:
I1 = η20 + η02   Formula 11

I2 = (η20 - η02)^2 + 4·η11^2   Formula 12

I3 = (η30 - 3η12)^2 + (3η21 - η03)^2   Formula 13

I4 = (η30 + η12)^2 + (η21 + η03)^2   Formula 14

I5 = (η30 - 3η12)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] + (3η21 - η03)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]   Formula 15

I6 = (η20 - η02)[(η30 + η12)^2 - (η21 + η03)^2] + 4·η11(η30 + η12)(η21 + η03)   Formula 16

I7 = (3η21 - η03)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]   Formula 17
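The moment computation of Formulas 8–17 can be sketched with discrete sums replacing the integrals; a minimal numpy version (the helper names `m`, `mu`, `eta` are illustrative):

```python
import numpy as np

def hu_moments(f):
    """Seven Hu moment invariants of a grayscale image f(x, y),
    with discrete sums in place of the integrals of Formulas 8-10."""
    y, x = np.indices(f.shape).astype(float)
    m = lambda p, q: (x**p * y**q * f).sum()                  # raw moments
    x0, y0 = m(1, 0) / m(0, 0), m(0, 1) / m(0, 0)             # centroid
    mu = lambda p, q: ((x - x0)**p * (y - y0)**q * f).sum()   # central moments
    eta = lambda p, q: mu(p, q) / m(0, 0)**((p + q + 2) / 2)  # normalized
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    a, b = n30 + n12, n21 + n03
    I1 = n20 + n02
    I2 = (n20 - n02)**2 + 4 * n11**2
    I3 = (n30 - 3*n12)**2 + (3*n21 - n03)**2
    I4 = a**2 + b**2
    I5 = (n30 - 3*n12)*a*(a**2 - 3*b**2) + (3*n21 - n03)*b*(3*a**2 - b**2)
    I6 = (n20 - n02)*(a**2 - b**2) + 4*n11*a*b
    I7 = (3*n21 - n03)*a*(a**2 - 3*b**2) - (n30 - 3*n12)*b*(3*a**2 - b**2)
    return I1, I2, I3, I4, I5, I6, I7
```

On a uniform square image all invariants except I1 vanish, since every odd-order central moment and η11 are zero by symmetry.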
Features based on histogram measures as features in the feature set
The inventors have found that for some visible components, certain histogram measures also differ greatly from those of other visible components. Therefore, using one or more of the histogram measures of the visible component calculated by the following formulas (including the average gray level ḡ, the standard deviation σ_g of the gray level, the skewness κ, the entropy, and the energy) as features in the feature set of the classifier also improves the ability to distinguish visible components (especially red blood cells, white blood cells, and crystals) and improves classification accuracy.
ḡ = Σ_{g=0}^{L-1} P(g)·g   Formula 18
Here the probability density function P(g) = h(g)/M assigns the gray values to L gray levels; for example, all gray values within a certain range are assigned to one level. L can be determined as required, from empirical values or through a limited number of experiments and simulations; usually L should not be too small. h(g) is the number of pixels of the visible component at gray level g, and M is the total number of pixels in the visible component.
σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )   Formula 19

κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)   Formula 20

Entropy = -Σ_{g=0}^{L-1} P(g) · log2 P(g)   Formula 21

Energy = Σ_{g=0}^{L-1} P(g)^2   Formula 22
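The five histogram measures can be sketched as follows; a minimal illustration assuming 8-bit gray values binned into L levels (the default L=8 and the name `histogram_measures` are illustrative choices, not from the patent):

```python
import numpy as np

def histogram_measures(gray, L=8):
    """Mean, standard deviation, skewness, entropy, and energy of the
    gray-level histogram, with gray values binned into L levels."""
    h, _ = np.histogram(gray, bins=L, range=(0, 256))
    P = h / gray.size                          # P(g) = h(g) / M
    g = np.arange(L)
    mean = (P * g).sum()
    std = np.sqrt((P * (g - mean)**2).sum())
    skew = (P * (g - mean)**3).sum() / std**3
    nz = P[P > 0]                              # 0 * log(0) taken as 0
    entropy = -(nz * np.log2(nz)).sum()
    energy = (P**2).sum()
    return mean, std, skew, entropy, energy
```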
A device according to an embodiment of the invention
As shown in Fig. 3, a urine visible component classification device 2 according to an embodiment of the invention comprises a characterization unit 201 and a classification unit 202. The characterization unit 201 is configured to use the difference between the average pixel value of the outer region of the visible component in the visible component block and the average pixel value of the central area as a feature in the feature set of the classifier. The classification unit 202 is configured to classify urine visible components with said classifier. The device shown in Fig. 3 can be implemented in software, in hardware (such as an integrated circuit or FPGA), or in a combination of software and hardware.
In one embodiment, the characterization unit 201 may be further configured to remove the influence of ambient light and out-of-focus noise on the pixel values in the visible component, calculate the difference between the average pixel value of the outer region and that of the central area after this influence is removed, and use said difference as a feature in said feature set.
In one embodiment, said characterization unit 201 may be further configured to:
for each pixel in the visible component, calculate

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)   Formula 1
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, and T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, respectively,

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )   Formula 2

where max(T_red), max(T_green), and max(T_blue) denote the maxima of the red, green, and blue trichromatic coefficients over the pixels in said visible component, and NT_green denotes the normalized green trichromatic coefficient of the pixel;
calculate the difference between the mean NT_green of the pixels in the outer region of the visible component and the mean NT_green of the pixels in the central area.
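The normalization of Formulas 1 and 2 can be sketched as follows; a minimal illustration assuming the component block is an (H, W, 3) RGB numpy array (the name `normalized_green` is illustrative):

```python
import numpy as np

def normalized_green(rgb):
    """Per-pixel normalized green trichromatic coefficient NT_green:
    channel ratios suppress the common gain contributed by ambient
    light and out-of-focus blur."""
    rgb = rgb.astype(float)
    T = rgb / rgb.sum(axis=2, keepdims=True)  # T_red, T_green, T_blue
    return T[:, :, 1] / T.max()  # divide by the max over all three coefficients
```

The feature is then the mean of NT_green over the outer region minus its mean over the central area; the analogous NT_gray feature replaces T_green with the weighted gray coefficient T_gray.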
In one embodiment, said characterization unit 201 may be further configured to:
for each pixel in the visible component, calculate

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)   Formula 3
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, and T_gray denotes the gray coefficient of said pixel,

NT_gray = T_gray / max(T_gray)   Formula 4

where max(T_gray) denotes the maximum gray coefficient over the pixels in said visible component, and NT_gray denotes the normalized gray coefficient of the pixel;
calculate the difference between the mean NT_gray of the pixels in the outer region of the visible component and the mean NT_gray of the pixels in the central area.
In one embodiment, said characterization unit 201 is also configured to:
obtain the gray level co-occurrence matrix of the visible component, and use one or more of the contrast, homogeneity, and energy of the gray level co-occurrence matrix as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray level co-occurrence matrix are calculated as follows:

Contrast = Σ_{i,j} (i - j)^2 · P(i, j)   Formula 5

Homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)   Formula 6

Energy = Σ_{i,j} P(i, j)^2   Formula 7

where P(i, j) denotes the probability that a pair of neighboring pixels with gray levels i and j occurs.
In one embodiment, said characterization unit 201 may also be configured to use the difference between the average pixel brightness of the foreground part of the visible component block and that of the background part as a feature in the feature set of the classifier.
In one embodiment, said characterization unit 201 may also be configured to use one or more of the seven Hu moments I1, I2, I3, I4, I5, I6, I7, computed by the following procedure, as features in the feature set of the classifier:
convert the visible component to a grayscale image, with f(x, y) denoting the gray level of pixel (x, y) in the grayscale image, f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, and I_blue denote the pixel values of pixel (x, y) in the red, green, and blue layers, respectively; the (p+q)-order moment is defined as:

m_pq = ∫∫ x^p y^q f(x, y) dx dy,   p, q = 0, 1, 2, …   Formula 8

the (p+q)-order central moment is defined as:

μ_pq = ∫∫ (x - x0)^p (y - y0)^q f(x, y) dx dy   Formula 9

where x0 = m10/m00 and y0 = m01/m00;

the normalized central moment is η_pq = μ_pq / μ00^r   Formula 10

where r = (p + q + 2)/2;

the seven Hu moments are then defined as:

I1 = η20 + η02   Formula 11

I2 = (η20 - η02)^2 + 4·η11^2   Formula 12

I3 = (η30 - 3η12)^2 + (3η21 - η03)^2   Formula 13

I4 = (η30 + η12)^2 + (η21 + η03)^2   Formula 14

I5 = (η30 - 3η12)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] + (3η21 - η03)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]   Formula 15

I6 = (η20 - η02)[(η30 + η12)^2 - (η21 + η03)^2] + 4·η11(η30 + η12)(η21 + η03)   Formula 16

I7 = (3η21 - η03)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]   Formula 17
In one embodiment, said characterization unit 201 may also be configured to use one or more of the average gray level ḡ, the standard deviation σ_g of the gray level, the skewness κ, the entropy, and the energy of the visible component, calculated by the following formulas, as features in the feature set of the classifier:

ḡ = Σ_{g=0}^{L-1} P(g)·g   Formula 18

where the probability density function P(g) = h(g)/M assigns the gray values to L gray levels; for example, all gray values within a certain range are assigned to one level; L can be determined as required, but should not be too small; h(g) is the number of pixels of the visible component at gray level g, and M is the total number of pixels in the visible component;

σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )   Formula 19

κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)   Formula 20

Entropy = -Σ_{g=0}^{L-1} P(g) · log2 P(g)   Formula 21

Energy = Σ_{g=0}^{L-1} P(g)^2   Formula 22
In one embodiment, said visible components include red blood cells, white blood cells, and crystals.
Fig. 4 shows a urine visible component classification apparatus 3 according to an embodiment of the invention. The apparatus may comprise a memory 301 and a processor 302. The memory 301 stores executable instructions. The processor 302 performs, according to the executable instructions stored in said memory, the operations performed by the units of device 2.
In addition, an embodiment of the present invention also provides a machine-readable medium storing executable instructions which, when executed, cause a machine to perform the operations performed by processor 302.
It will be appreciated by those skilled in the art that various changes and modifications can be made to the above embodiments without departing from the essence of the invention; the scope of protection of the present invention is therefore defined by the appended claims.

Claims (20)

1. A urine visible component classification method (1), comprising:
using the difference between the average pixel value of the outer region of a visible component in a visible component block and the average pixel value of the central area as a feature in the feature set of a classifier (S1);
classifying urine visible components with said classifier (S2).
2. The urine visible component classification method (1) according to claim 1, wherein the step (S1) of using the difference between the average pixel value of the outer region of the visible component and the average pixel value of the central area as a feature in the feature set of the classifier comprises:
removing the influence of ambient light and out-of-focus noise on the pixel values in the visible component (S11);
calculating the difference between the average pixel value of the outer region and the average pixel value of the central area after the influence of ambient light and out-of-focus noise is removed (S12);
using said difference as a feature in said feature set (S13).
3. The urine visible component classification method (1) according to claim 2, wherein
the step (S11) of removing the influence of ambient light and out-of-focus noise on the pixel values in the visible component comprises: for each pixel in the visible component, calculating

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, and T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, respectively,

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )

where max(T_red), max(T_green), and max(T_blue) denote the maxima of the red, green, and blue trichromatic coefficients over the pixels in the visible component, and NT_green denotes the normalized green trichromatic coefficient of said pixel; and
the step (S12) of calculating the difference between the average pixel value of the outer region and that of the central area after the influence of ambient light and out-of-focus noise is removed comprises: calculating the difference between the mean NT_green of the pixels in the outer region of the visible component and the mean NT_green of the pixels in the central area.
4. The urine visible component classification method (1) according to claim 2, wherein
the step (S11) of removing the influence of ambient light and out-of-focus noise on the pixel values in the visible component comprises: for each pixel in the visible component, calculating

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, and T_gray denotes the gray coefficient of said pixel,

NT_gray = T_gray / max(T_gray)

where max(T_gray) denotes the maximum gray coefficient over the pixels in the visible component, and NT_gray denotes the normalized gray coefficient of said pixel; and
the step (S12) of calculating the difference between the average pixel value of the outer region and that of the central area after the influence of ambient light and out-of-focus noise is removed comprises: calculating the difference between the mean NT_gray of the pixels in the outer region of the visible component and the mean NT_gray of the pixels in the central area.
5. The urine visible component classification method (1) according to claim 1, further comprising:
obtaining the gray level co-occurrence matrix of the visible component, and using one or more of the contrast, homogeneity, and energy of the gray level co-occurrence matrix as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray level co-occurrence matrix are calculated as follows:

Contrast = Σ_{i,j} (i - j)^2 · P(i, j)
Homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)
Energy = Σ_{i,j} P(i, j)^2

where P(i, j) denotes the probability that a pair of neighboring pixels with gray levels i and j occurs.
6. The urine visible component classification method (1) according to claim 1, further comprising:
using the difference between the average pixel brightness of the foreground part of the visible component block and the average pixel brightness of the background part as a feature in the feature set of the classifier.
7. The urine visible component classification method (1) according to claim 1, further comprising using one or more of the seven Hu moments I1, I2, I3, I4, I5, I6, I7, computed by the following procedure, as features in the feature set of the classifier:
converting the visible component to a grayscale image, with f(x, y) denoting the gray level of pixel (x, y) in the grayscale image, f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, and I_blue denote the pixel values of pixel (x, y) in the red, green, and blue layers, respectively; the (p+q)-order moment being defined as

m_pq = ∫∫ x^p y^q f(x, y) dx dy,   p, q = 0, 1, 2, …

the (p+q)-order central moment being defined as

μ_pq = ∫∫ (x - x0)^p (y - y0)^q f(x, y) dx dy

where x0 = m10/m00 and y0 = m01/m00;
the normalized central moment being η_pq = μ_pq / μ00^r, where r = (p + q + 2)/2;
the seven Hu moments then being defined as:

I1 = η20 + η02
I2 = (η20 - η02)^2 + 4·η11^2
I3 = (η30 - 3η12)^2 + (3η21 - η03)^2
I4 = (η30 + η12)^2 + (η21 + η03)^2
I5 = (η30 - 3η12)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] + (3η21 - η03)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]
I6 = (η20 - η02)[(η30 + η12)^2 - (η21 + η03)^2] + 4·η11(η30 + η12)(η21 + η03)
I7 = (3η21 - η03)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2].
8. The urine visible component classification method (1) according to claim 1, further comprising using one or more of the average gray level ḡ, the standard deviation σ_g of the gray level, the skewness κ, the entropy, and the energy of the visible component, calculated by the following formulas, as features in the feature set of the classifier:

ḡ = Σ_{g=0}^{L-1} P(g)·g

where the probability density function P(g) = h(g)/M assigns the gray values to L gray levels, h(g) is the number of pixels of the visible component at gray level g, and M is the total number of pixels in the visible component,

σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )
κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)
Entropy = -Σ_{g=0}^{L-1} P(g) · log2 P(g)
Energy = Σ_{g=0}^{L-1} P(g)^2.
9. The urine visible component classification method (1) according to claim 1, wherein said visible components comprise red blood cells, white blood cells, and crystals.
10. A urine visible component classification device (2), comprising:
a characterization unit (201), configured to use the difference between the average pixel value of the outer region of a visible component in a visible component block and the average pixel value of the central area as a feature in the feature set of a classifier;
a classification unit (202), configured to classify urine visible components with said classifier.
11. The urine visible component classification device (2) according to claim 10, wherein said characterization unit (201) is further configured to remove the influence of ambient light and out-of-focus noise on the pixel values in the visible component, calculate the difference between the average pixel value of the outer region and that of the central area after this influence is removed, and use said difference as a feature in said feature set.
12. The urine visible component classification device (2) according to claim 11, wherein said characterization unit (201) is further configured to:
for each pixel in the visible component, calculate

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, and T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, respectively,

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )

where max(T_red), max(T_green), and max(T_blue) denote the maxima of the red, green, and blue trichromatic coefficients over the pixels in the visible component, and NT_green denotes the normalized green trichromatic coefficient of said pixel;
calculate the difference between the mean NT_green of the pixels in the outer region of the visible component and the mean NT_green of the pixels in the central area.
13. The urine visible component classification device (2) according to claim 11, wherein said characterization unit (201) is further configured to:
for each pixel in the visible component, calculate

T_red = I_red / (I_red + I_green + I_blue)
T_green = I_green / (I_red + I_green + I_blue)
T_blue = I_blue / (I_red + I_green + I_blue)
T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, and I_blue denote the pixel values of said pixel in the red, green, and blue layers, respectively, T_red, T_green, and T_blue denote the red, green, and blue trichromatic coefficients of said pixel, and T_gray denotes the gray coefficient of said pixel,

NT_gray = T_gray / max(T_gray)

where max(T_gray) denotes the maximum gray coefficient over the pixels in the visible component, and NT_gray denotes the normalized gray coefficient of said pixel;
calculate the difference between the mean NT_gray of the pixels in the outer region of the visible component and the mean NT_gray of the pixels in the central area.
14. The urine visible component classification device (2) according to claim 10, wherein said characterization unit (201) is also configured to:
obtain the gray level co-occurrence matrix of the visible component, and use one or more of the contrast, homogeneity, and energy of the gray level co-occurrence matrix as features in the feature set of said classifier, where the contrast, homogeneity, and energy of the gray level co-occurrence matrix are calculated as follows:

Contrast = Σ_{i,j} (i - j)^2 · P(i, j)
Homogeneity = Σ_{i,j} P(i, j) / (1 + |i - j|)
Energy = Σ_{i,j} P(i, j)^2

where P(i, j) denotes the probability that a pair of neighboring pixels with gray levels i and j occurs.
15. The urine visible component classification device (2) according to claim 10, wherein said characterization unit (201) is also configured to:
use the difference between the average pixel brightness of the foreground part of the visible component block and the average pixel brightness of the background part as a feature in the feature set of said classifier.
16. The urine visible component classification device (2) according to claim 10, wherein said characterization unit (201) is also configured to use one or more of the seven Hu moments I1, I2, I3, I4, I5, I6, I7, computed by the following procedure, as features in the feature set of said classifier:
convert the visible component to a grayscale image, with f(x, y) denoting the gray level of pixel (x, y) in the grayscale image, f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, and I_blue denote the pixel values of pixel (x, y) in the red, green, and blue layers, respectively; the (p+q)-order moment being defined as

m_pq = ∫∫ x^p y^q f(x, y) dx dy,   p, q = 0, 1, 2, …

the (p+q)-order central moment being defined as

μ_pq = ∫∫ (x - x0)^p (y - y0)^q f(x, y) dx dy

where x0 = m10/m00 and y0 = m01/m00;
the normalized central moment being η_pq = μ_pq / μ00^r, where r = (p + q + 2)/2;
the seven Hu moments then being defined as:

I1 = η20 + η02
I2 = (η20 - η02)^2 + 4·η11^2
I3 = (η30 - 3η12)^2 + (3η21 - η03)^2
I4 = (η30 + η12)^2 + (η21 + η03)^2
I5 = (η30 - 3η12)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] + (3η21 - η03)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2]
I6 = (η20 - η02)[(η30 + η12)^2 - (η21 + η03)^2] + 4·η11(η30 + η12)(η21 + η03)
I7 = (3η21 - η03)(η30 + η12)[(η30 + η12)^2 - 3(η21 + η03)^2] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)^2 - (η21 + η03)^2].
17. The urine visible component classification device (2) according to claim 10, wherein said characterization unit (201) is also configured to use one or more of the average gray level ḡ, the standard deviation σ_g of the gray level, the skewness κ, the entropy, and the energy of the visible component, calculated by the following formulas, as features in the feature set of said classifier:

ḡ = Σ_{g=0}^{L-1} P(g)·g

where the probability density function P(g) = h(g)/M assigns the gray values to L gray levels, h(g) is the number of pixels of the visible component at gray level g, and M is the total number of pixels in the visible component,

σ_g = sqrt( Σ_{g=0}^{L-1} (g - ḡ)^2 · P(g) )
κ = (1/σ_g^3) · Σ_{g=0}^{L-1} (g - ḡ)^3 · P(g)
Entropy = -Σ_{g=0}^{L-1} P(g) · log2 P(g)
Energy = Σ_{g=0}^{L-1} P(g)^2.
18. The urine visible component classification device (2) according to claim 10, wherein said visible components comprise red blood cells, white blood cells, and crystals.
19. A urine visible component classification apparatus (3), comprising:
a memory (301) for storing executable instructions;
a processor (302) for performing, according to the executable instructions stored in said memory, the operations recited in any one of claims 1-9.
20. A machine-readable medium storing executable instructions which, when executed, cause a machine to perform the operations recited in any one of claims 1-9.
CN201310752863.XA 2013-12-31 2013-12-31 Method and device for classifying urine visible components Pending CN104751167A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310752863.XA CN104751167A (en) 2013-12-31 2013-12-31 Method and device for classifying urine visible components
PCT/US2014/071500 WO2015102948A1 (en) 2013-12-31 2014-12-19 Urine formed element classification method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310752863.XA CN104751167A (en) 2013-12-31 2013-12-31 Method and device for classifying urine visible components

Publications (1)

Publication Number Publication Date
CN104751167A true CN104751167A (en) 2015-07-01

Family

ID=53493904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310752863.XA Pending CN104751167A (en) 2013-12-31 2013-12-31 Method and device for classifying urine visible components

Country Status (2)

Country Link
CN (1) CN104751167A (en)
WO (1) WO2015102948A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109344852A (en) * 2018-08-01 2019-02-15 迈克医疗电子有限公司 Image-recognizing method and device, analysis instrument and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021152089A1 (en) 2020-01-30 2021-08-05 Vitadx International Systematic characterization of objects in a biological sample

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1525401A (en) * 2003-02-28 2004-09-01 ��˹���´﹫˾ Method and system for enhancing portrait images that are processed in a batch mode
CN101873411A (en) * 2009-04-24 2010-10-27 瑞萨电子株式会社 Image processing equipment and image processing method
CN102354388A (en) * 2011-09-22 2012-02-15 北京航空航天大学 Method for carrying out adaptive computing on importance weights of low-level features of image

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6718053B1 (en) * 1996-11-27 2004-04-06 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US7522762B2 (en) * 2003-04-16 2009-04-21 Inverness Medical-Biostar, Inc. Detection, resolution, and identification of arrayed elements



Also Published As

Publication number Publication date
WO2015102948A1 (en) 2015-07-09

Similar Documents

Publication Publication Date Title
CN101398894B (en) Automobile license plate automatic recognition method and implementing device thereof
CN101334835B (en) Color recognition method
US9239281B2 (en) Method and device for dividing area of image of particle in urine
EP3343440A1 (en) Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring
CN103593670B (en) A kind of copper plate/strip detection method of surface flaw based on online limit of sequence learning machine
CN108052980B (en) Image-based air quality grade detection method
CN115249246B (en) Optical glass surface defect detection method
CN114926463B (en) Production quality detection method suitable for chip circuit board
Quan et al. The method of the road surface crack detection by the improved Otsu threshold
CN113205051B (en) Oil storage tank extraction method based on high spatial resolution remote sensing image
CN104732227A (en) Rapid license-plate positioning method based on definition and luminance evaluation
CN103984939B (en) A kind of sample visible component sorting technique and system
CN101702197A (en) Method for detecting road traffic signs
CN105974120B (en) Automatic detection device and method for C-reactive protein chromaticity
CN107730499A (en) A kind of leucocyte classification method based on nu SVMs
CN106295491A (en) Track line detection method and device
KR20170127269A (en) Method and apparatus for detecting and classifying surface defect of image
CN102637301A (en) Method for automatically evaluating color quality of image during aerial photography in real time
WO2020119624A1 (en) Class-sensitive edge detection method based on deep learning
CN103903009A (en) Industrial product detection method based on machine vision
CN104751167A (en) Method and device for classifying urine visible components
CN104050678A (en) Underwater monitoring color image quality measurement method
CN111368625B (en) Pedestrian target detection method based on cascade optimization
CN107545565A (en) A kind of solar energy half tone detection method
TW201419168A (en) A method and system for license plate recognition under non-uniform illumination

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150701

WD01 Invention patent application deemed withdrawn after publication