CN104751167A - Method and device for classifying urine visible components - Google Patents

Method and device for classifying urine visible components

Info

Publication number
CN104751167A
CN104751167A
Authority
CN
China
Prior art keywords
green
red
blue
pixel
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310752863.XA
Other languages
Chinese (zh)
Inventor
迟颖
彭廷莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Healthcare Diagnostics Inc
Original Assignee
Siemens Healthcare Diagnostics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Healthcare Diagnostics Inc filed Critical Siemens Healthcare Diagnostics Inc
Priority to CN201310752863.XA priority Critical patent/CN104751167A/en
Priority to PCT/US2014/071500 priority patent/WO2015102948A1/en
Publication of CN104751167A publication Critical patent/CN104751167A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/42 Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
    • G06V 10/435 Computation of moments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/60 Type of objects
    • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/695 Preprocessing, e.g. image segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/60 Type of objects
    • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/698 Matching; Classification
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10056 Microscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Image Analysis (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention provides a method and a device for classifying urine visible components. The method comprises the following steps: using the difference between the average pixel value in the peripheral region of a visible component inside a visible-component block and the average pixel value in its central region as a feature in the feature set of a classifier; and using the classifier to classify the urine visible components. An embodiment of the invention aims at increasing the precision of classifying urine visible components.

Description

Method and device for classifying urine formed components

Technical Field

The present invention relates to biological detection, and in particular to a method and device for classifying the formed components of urine.

Background

In a typical urine sediment analysis workflow, a microscope system first captures images of a urine sample. Edge detection is then used to segment particle blocks in the sample images. Impurity blocks are removed from these particle blocks, leaving blocks containing formed components (e.g., red blood cells, white blood cells, crystals). These formed components are then classified, for example into red blood cells, white blood cells, crystals, and so on.

The classification step commonly uses a classifier based on a trained model. Features that help distinguish the classes of formed components, such as area, circularity, elongation, and gradient, are combined into a feature set used to train a classifier such as a neural network. The classifier is trained on a large number of existing formed-component block samples by measuring their features (e.g., area, circularity, elongation, gradient). When a new formed-component block is input, the trained classifier then assigns it a class according to the measured features of that block.
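The train-then-classify loop described above can be sketched with a deliberately simple stand-in classifier (a nearest-centroid rule). The feature names and numbers below are illustrative assumptions, not values from the patent, and a real system would use a stronger model such as the neural network mentioned in the text.

```python
# Minimal sketch of feature-based classification. The classifier here is a
# nearest-centroid rule, a simple stand-in for the trained model in the text.
# Feature vectors: (area, circularity, elongation) -- example features only.

def centroid(samples):
    """Component-wise mean of a list of feature vectors."""
    n = len(samples)
    return tuple(sum(v[i] for v in samples) / n for i in range(len(samples[0])))

def train(labelled):
    """labelled: dict label -> list of feature vectors. Returns one centroid per label."""
    return {label: centroid(vecs) for label, vecs in labelled.items()}

def classify(model, vec):
    """Assign vec to the label whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], vec))

# Toy training data (made-up numbers for illustration).
training = {
    "rbc":     [(50.0, 0.95, 1.05), (55.0, 0.93, 1.10)],
    "wbc":     [(90.0, 0.90, 1.20), (95.0, 0.88, 1.15)],
    "crystal": [(70.0, 0.55, 2.50), (65.0, 0.50, 2.80)],
}
model = train(training)
print(classify(model, (52.0, 0.94, 1.08)))  # a new block near the rbc samples
```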

Typical classification techniques build the feature set from shape features and some texture features. The features in such traditional feature sets are not effective enough at distinguishing the classes of the different formed components, and the classification accuracy is therefore not high enough.

Summary of the Invention

One embodiment of the present invention aims to improve the accuracy of classifying the formed components of urine.

According to one embodiment of the present invention, a method for classifying urine formed components is provided, comprising: using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of a formed component within a formed-component block as a feature in the feature set of a classifier; and classifying the urine formed components with the classifier.

In one implementation, the step of using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of the formed component as a feature in the feature set of the classifier comprises: removing the influence of ambient light and out-of-focus noise on the pixel values in the formed component; calculating the difference between the average pixel value of the peripheral region and the average pixel value of the central region after that influence has been removed; and using the difference as a feature in the feature set.

In one implementation, the step of removing the influence of ambient light and out-of-focus noise on the pixel values in the formed component comprises: for each pixel in the formed component, calculating

$$T_{red} = \frac{I_{red}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{green} = \frac{I_{green}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{blue} = \frac{I_{blue}}{I_{red} + I_{green} + I_{blue}}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of the pixel in the red, green, and blue layers, respectively, and $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green, and blue tristimulus coefficients of the pixel,

$$NT_{green} = \frac{T_{green}}{\max(\max(T_{red}),\,\max(T_{green}),\,\max(T_{blue}))}$$

where $\max(T_{red})$, $\max(T_{green})$, $\max(T_{blue})$ denote the maxima, over the pixels of the formed component, of the red, green, and blue tristimulus coefficients, and $NT_{green}$ denotes the normalized green tristimulus coefficient of the pixel;

and the step of calculating the difference between the average pixel value of the peripheral region and the average pixel value of the central region after removing the influence of ambient light and out-of-focus noise comprises: calculating the difference between the mean of $NT_{green}$ over the pixels of the peripheral region of the formed component and the mean of $NT_{green}$ over the pixels of the central region.
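A minimal sketch of this feature, assuming the peripheral and central pixel sets have already been determined (e.g. by the distance-transform step described later); the helper names and toy pixel values are illustrative:

```python
# Sketch of the normalized-green-tristimulus feature: difference between the
# mean NT_green of the peripheral region and that of the central region.
# Pixels are (I_red, I_green, I_blue) tuples; region masks are assumed given.

def tristimulus_green(pixel):
    r, g, b = pixel
    return g / (r + g + b)

def nt_green_feature(peripheral, central):
    """Mean NT_green of the peripheral pixels minus that of the central pixels."""
    all_pixels = peripheral + central
    t_r = [r / (r + g + b) for r, g, b in all_pixels]
    t_g = [tristimulus_green(p) for p in all_pixels]
    t_b = [b / (r + g + b) for r, g, b in all_pixels]
    norm = max(max(t_r), max(t_g), max(t_b))  # max over the whole component

    def nt(p):
        return tristimulus_green(p) / norm

    def mean_nt(pixels):
        return sum(nt(p) for p in pixels) / len(pixels)

    return mean_nt(peripheral) - mean_nt(central)

# Toy example: a greenish rim around a dark centre (made-up values).
rim    = [(40, 120, 40), (35, 110, 45)]
centre = [(30, 35, 30), (25, 30, 28)]
print(nt_green_feature(rim, centre))  # positive: the rim is greener
```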

In one implementation, the step of removing the influence of ambient light and out-of-focus noise on the pixel values in the formed component comprises: for each pixel in the formed component, calculating

$$T_{red} = \frac{I_{red}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{green} = \frac{I_{green}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{blue} = \frac{I_{blue}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{gray} = 0.2989\,T_{red} + 0.5870\,T_{green} + 0.1140\,T_{blue}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of the pixel in the red, green, and blue layers, respectively, $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green, and blue tristimulus coefficients of the pixel, and $T_{gray}$ denotes the gray coefficient of the pixel,

$$NT_{gray} = \frac{T_{gray}}{\max(T_{gray})}$$

where $\max(T_{gray})$ denotes the maximum gray coefficient over the pixels of the formed component, and $NT_{gray}$ denotes the normalized gray coefficient of the pixel;

and the step of calculating the difference between the average pixel value of the peripheral region and the average pixel value of the central region after removing the influence of ambient light and out-of-focus noise comprises: calculating the difference between the mean of $NT_{gray}$ over the pixels of the peripheral region of the formed component and the mean of $NT_{gray}$ over the pixels of the central region.

In one implementation, the method further comprises: obtaining the gray-level co-occurrence matrix of the formed component, and using one or more of the contrast, homogeneity, and energy of the gray-level co-occurrence matrix of the formed component as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray-level co-occurrence matrix are calculated as follows:

$$Contrast = \sum_{i}\sum_{j} (i - j)^2\, P(i,j)$$

$$Homogeneity = \sum_{i}\sum_{j} \frac{P(i,j)}{1 + |i - j|}$$

$$Energy = \sum_{i}\sum_{j} P(i,j)^2$$

where $P(i,j)$ denotes the probability that a value of gray level $i$ and a value of gray level $j$ occur together.
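The gray-level co-occurrence matrix and these three statistics can be sketched in pure Python. The horizontal neighbour offset and the tiny test image are assumptions for illustration (the text does not fix an offset here), and the statistic definitions used are the standard ones:

```python
# Sketch: gray-level co-occurrence matrix (GLCM) over a horizontal
# neighbour offset, plus the standard contrast/homogeneity/energy
# statistics. The offset choice is an assumption, not from the text.

def glcm(image, levels):
    """P(i, j): probability that gray levels i and j co-occur horizontally."""
    counts = [[0] * levels for _ in range(levels)]
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
            total += 1
    return [[c / total for c in row] for row in counts]

def glcm_stats(p):
    """Contrast, homogeneity, and energy of a GLCM p."""
    levels = len(p)
    pairs = [(i, j) for i in range(levels) for j in range(levels)]
    contrast = sum(p[i][j] * (i - j) ** 2 for i, j in pairs)
    homogeneity = sum(p[i][j] / (1 + abs(i - j)) for i, j in pairs)
    energy = sum(p[i][j] ** 2 for i, j in pairs)
    return contrast, homogeneity, energy

# Toy 4-level image block.
img = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
print(glcm_stats(glcm(img, 4)))
```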

In one implementation, the method further comprises: using the difference between the average brightness of the pixels in the foreground part of the formed-component block and the average brightness of the pixels in the background part of the block as a feature in the feature set of the classifier.

In one implementation, the method further comprises using one or more of the seven Hu moments $I_1, I_2, I_3, I_4, I_5, I_6, I_7$, calculated by the following procedure, as features in the feature set of the classifier:

The formed component is converted into a grayscale image; $f(x,y)$ denotes the gray value of pixel $(x,y)$, with $f(x,y) = 0.2989\,I_{red} + 0.5870\,I_{green} + 0.1140\,I_{blue}$, where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of pixel $(x,y)$ in the red, green, and blue layers. The $(p+q)$-order moment is defined as:

$$m_{pq} = \iint x^p y^q f(x,y)\,dx\,dy \qquad p, q = 0, 1, 2, \ldots$$

The $(p+q)$-order central moment is defined as:

$$\mu_{pq} = \iint (x - x_0)^p (y - y_0)^q f(x,y)\,dx\,dy$$

where $x_0 = \frac{m_{10}}{m_{00}}$ and $y_0 = \frac{m_{01}}{m_{00}}$.

The normalized central moment is $y_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,r}}$,

where $r = \frac{p + q + 2}{2}$.

The seven Hu moments are then defined as:

$$I_1 = y_{20} + y_{02}$$

$$I_2 = (y_{20} - y_{02})^2 + 4y_{11}^2$$

$$I_3 = (y_{30} - 3y_{12})^2 + (3y_{21} - y_{03})^2$$

$$I_4 = (y_{30} + y_{12})^2 + (y_{21} + y_{03})^2$$

$$I_5 = (y_{30} - 3y_{12})(y_{30} + y_{12})\left[(y_{30} + y_{12})^2 - 3(y_{21} + y_{03})^2\right] + (3y_{21} - y_{03})(y_{21} + y_{03})\left[3(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right]$$

$$I_6 = (y_{20} - y_{02})\left[(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right] + 4y_{11}(y_{30} + y_{12})(y_{21} + y_{03})$$

$$I_7 = (3y_{21} - y_{03})(y_{30} + y_{12})\left[(y_{30} + y_{12})^2 - 3(y_{21} + y_{03})^2\right] - (y_{30} - 3y_{12})(y_{21} + y_{03})\left[3(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right]$$

In one implementation, the method further comprises using one or more of the average gray level $\bar{g}$ of the formed component, the standard deviation $\sigma_g$ of the gray levels, the skewness $\kappa$ of the gray-level distribution, the entropy, and the energy, calculated as follows, as features in the feature set of the classifier:

$$\bar{g} = \sum_{g=0}^{L-1} P(g) \cdot g$$

where the probability density function $P(g) = \frac{h(g)}{M}$ divides the gray values into $L$ gray levels, $h(g)$ is the number of pixels of the formed component at gray level $g$, and $M$ is the total number of pixels in the formed component,

$$\sigma_g = \sqrt{\sum_{g=0}^{L-1} (g - \bar{g})^2 \cdot P(g)}$$

$$\kappa = \frac{1}{\sigma_g^3} \sum_{g=0}^{L-1} (g - \bar{g})^3 \cdot P(g)$$
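A sketch of these histogram features, assuming the standard histogram definitions of entropy and energy (which the text names but does not spell out here); the toy pixel list is illustrative:

```python
import math

# Sketch: histogram-based features of a formed-component block.
# Entropy and energy use the standard histogram definitions, an assumption
# since the text names them without giving formulas at this point.

def histogram_features(pixels, levels):
    m = len(pixels)
    h = [0] * levels
    for g in pixels:
        h[g] += 1
    p = [c / m for c in h]                        # P(g) = h(g) / M
    mean = sum(p[g] * g for g in range(levels))   # average gray level
    var = sum((g - mean) ** 2 * p[g] for g in range(levels))
    std = math.sqrt(var)                          # sigma_g
    skew = sum((g - mean) ** 3 * p[g] for g in range(levels)) / std ** 3
    entropy = -sum(pg * math.log2(pg) for pg in p if pg > 0)
    energy = sum(pg ** 2 for pg in p)
    return mean, std, skew, entropy, energy

# Toy 4-level pixel list.
print(histogram_features([0, 1, 1, 2, 2, 2, 3, 3], 4))
```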

In one implementation, the formed components include red blood cells, white blood cells, and crystals.

According to one embodiment of the present invention, a device for classifying urine formed components is provided, comprising: a characterization unit configured to use the difference between the average pixel value of the peripheral region and the average pixel value of the central region of a formed component within a formed-component block as a feature in the feature set of a classifier; and a classification unit configured to classify the urine formed components with the classifier.

In one implementation, the characterization unit is further configured to remove the influence of ambient light and out-of-focus noise on the pixel values in the formed component, calculate the difference between the average pixel value of the peripheral region and the average pixel value of the central region after that influence has been removed, and use the difference as a feature in the feature set.

In one implementation, the characterization unit is further configured to:

for each pixel in the formed component, calculate

$$T_{red} = \frac{I_{red}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{green} = \frac{I_{green}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{blue} = \frac{I_{blue}}{I_{red} + I_{green} + I_{blue}}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of the pixel in the red, green, and blue layers, respectively, and $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green, and blue tristimulus coefficients of the pixel,

$$NT_{green} = \frac{T_{green}}{\max(\max(T_{red}),\,\max(T_{green}),\,\max(T_{blue}))}$$

where $\max(T_{red})$, $\max(T_{green})$, $\max(T_{blue})$ denote the maxima, over the pixels of the formed component, of the red, green, and blue tristimulus coefficients, and $NT_{green}$ denotes the normalized green tristimulus coefficient of the pixel;

and calculate the difference between the mean of $NT_{green}$ over the pixels of the peripheral region of the formed component and the mean of $NT_{green}$ over the pixels of the central region.

In one implementation, the characterization unit is further configured to:

for each pixel in the formed component, calculate

$$T_{red} = \frac{I_{red}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{green} = \frac{I_{green}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{blue} = \frac{I_{blue}}{I_{red} + I_{green} + I_{blue}}$$

$$T_{gray} = 0.2989\,T_{red} + 0.5870\,T_{green} + 0.1140\,T_{blue}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of the pixel in the red, green, and blue layers, respectively, $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green, and blue tristimulus coefficients of the pixel, and $T_{gray}$ denotes the gray coefficient of the pixel,

$$NT_{gray} = \frac{T_{gray}}{\max(T_{gray})}$$

where $\max(T_{gray})$ denotes the maximum gray coefficient over the pixels of the formed component, and $NT_{gray}$ denotes the normalized gray coefficient of the pixel;

and calculate the difference between the mean of $NT_{gray}$ over the pixels of the peripheral region of the formed component and the mean of $NT_{gray}$ over the pixels of the central region.

In one implementation, the characterization unit is further configured to: obtain the gray-level co-occurrence matrix of the formed component, and use one or more of the contrast, homogeneity, and energy of the gray-level co-occurrence matrix of the formed component as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray-level co-occurrence matrix are calculated as follows:

$$Contrast = \sum_{i}\sum_{j} (i - j)^2\, P(i,j)$$

$$Homogeneity = \sum_{i}\sum_{j} \frac{P(i,j)}{1 + |i - j|}$$

$$Energy = \sum_{i}\sum_{j} P(i,j)^2$$

where $P(i,j)$ denotes the probability that a value of gray level $i$ and a value of gray level $j$ occur together.

In one implementation, the characterization unit is further configured to: use the difference between the average brightness of the pixels in the foreground part of the formed-component block and the average brightness of the pixels in the background part of the block as a feature in the feature set of the classifier.

In one implementation, the characterization unit is further configured to use one or more of the seven Hu moments $I_1, I_2, I_3, I_4, I_5, I_6, I_7$, calculated by the following procedure, as features in the feature set of the classifier:

The formed component is converted into a grayscale image; $f(x,y)$ denotes the gray value of pixel $(x,y)$, with $f(x,y) = 0.2989\,I_{red} + 0.5870\,I_{green} + 0.1140\,I_{blue}$, where $I_{red}$, $I_{green}$, $I_{blue}$ denote the values of pixel $(x,y)$ in the red, green, and blue layers. The $(p+q)$-order moment is defined as:

$$m_{pq} = \iint x^p y^q f(x,y)\,dx\,dy \qquad p, q = 0, 1, 2, \ldots$$

The $(p+q)$-order central moment is defined as:

$$\mu_{pq} = \iint (x - x_0)^p (y - y_0)^q f(x,y)\,dx\,dy$$

where $x_0 = \frac{m_{10}}{m_{00}}$ and $y_0 = \frac{m_{01}}{m_{00}}$.

The normalized central moment is $y_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,r}}$,

where $r = \frac{p + q + 2}{2}$.

The seven Hu moments are then defined as:

$$I_1 = y_{20} + y_{02}$$

$$I_2 = (y_{20} - y_{02})^2 + 4y_{11}^2$$

$$I_3 = (y_{30} - 3y_{12})^2 + (3y_{21} - y_{03})^2$$

$$I_4 = (y_{30} + y_{12})^2 + (y_{21} + y_{03})^2$$

$$I_5 = (y_{30} - 3y_{12})(y_{30} + y_{12})\left[(y_{30} + y_{12})^2 - 3(y_{21} + y_{03})^2\right] + (3y_{21} - y_{03})(y_{21} + y_{03})\left[3(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right]$$

$$I_6 = (y_{20} - y_{02})\left[(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right] + 4y_{11}(y_{30} + y_{12})(y_{21} + y_{03})$$

$$I_7 = (3y_{21} - y_{03})(y_{30} + y_{12})\left[(y_{30} + y_{12})^2 - 3(y_{21} + y_{03})^2\right] - (y_{30} - 3y_{12})(y_{21} + y_{03})\left[3(y_{30} + y_{12})^2 - (y_{21} + y_{03})^2\right]$$

In one implementation, the characterization unit is further configured to use one or more of the average gray level $\bar{g}$ of the formed component, the standard deviation $\sigma_g$ of the gray levels, the skewness $\kappa$ of the gray-level distribution, the entropy, and the energy, calculated as follows, as features in the feature set of the classifier:

$$\bar{g} = \sum_{g=0}^{L-1} P(g) \cdot g$$

where the probability density function $P(g) = \frac{h(g)}{M}$ divides the gray values into $L$ gray levels, $h(g)$ is the number of pixels of the formed component at gray level $g$, and $M$ is the total number of pixels in the formed component,

$$\sigma_g = \sqrt{\sum_{g=0}^{L-1} (g - \bar{g})^2 \cdot P(g)}$$

$$\kappa = \frac{1}{\sigma_g^3} \sum_{g=0}^{L-1} (g - \bar{g})^3 \cdot P(g)$$

In one implementation, the formed components include red blood cells, white blood cells, and crystals.

The inventors observed that, under the white-light illumination of a microscope, some formed components (such as red blood cells) have, besides a dark central region, a wide ring of green light around the periphery. Based on the difference between the average pixel value of the peripheral region and the average pixel value of the central region of the formed component within a formed-component block, some formed components can therefore be effectively distinguished, in particular red blood cells, white blood cells, and crystals. One embodiment of the present invention uses this difference as a feature in the feature set of a classifier, effectively distinguishing some formed components and improving the accuracy of urine formed-component classification.

The inventors also discovered other features, not previously used in the prior art for classifying urine formed components, such as features based on the gray-level co-occurrence matrix, on the brightness difference of the formed component, on Hu moments, and on histogram measures. Using one or more of these features in the feature set of the classifier distinguishes the formed components more effectively and thus improves the accuracy of urine formed-component classification.

Brief Description of the Drawings

These and other features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.

Fig. 1 shows a flowchart of a method for classifying urine formed components according to an embodiment of the present invention.

Fig. 2 shows a detailed flowchart of using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of the formed component as a feature in the feature set of the classifier, according to an embodiment of the present invention.

Fig. 3 shows a block diagram of a device for classifying urine formed components according to an embodiment of the present invention.

Fig. 4 shows a structural diagram of an apparatus for classifying urine formed components according to an embodiment of the present invention.

Detailed Description

Various embodiments of the present invention are described in detail below with reference to the accompanying drawings.

Using a feature based on the green-ring characteristic in the feature set

As shown in Fig. 1, a method 1 for classifying urine formed components according to an embodiment of the present invention comprises: in step S1, using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of a formed component within a formed-component block as a feature in the feature set of a classifier; and in step S2, classifying the urine formed components with the classifier.

The peripheral region and the central region are determined by a distance-transform method. The contour of the formed component can be obtained, for example, from the edge detection processing. The contour is then shrunk inward step by step, each step being the size of one pixel. After n steps, the contour can be shrunk no further. The stopping criterion is, for example: if the distance between any point on the shrunken contour and the opposite point on that contour, measured along the direction of shrinking, is less than or equal to the size of two steps, the contour is considered unable to shrink further. When the contour can be shrunk no further, the part enclosed by the contour of the formed component after shrinking it inward by n × 2/3 steps is defined as the central region, and the part of the formed component outside the central region is defined as the peripheral region.

In another embodiment, the part enclosed by the contour of the formed component after shrinking it inward by n × 1/2 steps is defined as the central region, and the part of the formed component outside the central region is defined as the peripheral region.
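The shrink-until-collapse procedure can be approximated with repeated binary erosion of the element's mask. This erosion-based version is a simplified stand-in for the distance-transform description above, with the 2/3 fraction as the default; the toy mask is illustrative:

```python
# Sketch: approximate the peripheral/central split by repeatedly eroding
# the element's binary mask until it vanishes (a simplification of the
# contour-shrinking procedure described above).

def erode(mask):
    """One-step 4-neighbour binary erosion."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (mask[y][x]
                    and y > 0 and mask[y - 1][x] and y < h - 1 and mask[y + 1][x]
                    and x > 0 and mask[y][x - 1] and x < w - 1 and mask[y][x + 1]):
                out[y][x] = 1
    return out

def split_regions(mask, fraction=2 / 3):
    """Erode until the mask vanishes (n steps); the mask eroded
    round(n * fraction) times is the central region, the rest is peripheral."""
    steps, m = [], mask
    while any(any(row) for row in m):
        steps.append(m)
        m = erode(m)
    n = len(steps)
    central = steps[min(round(n * fraction), n - 1)]
    peripheral = [[mask[y][x] and not central[y][x] for x in range(len(mask[0]))]
                  for y in range(len(mask))]
    return central, peripheral

# Toy mask: a 5x5 square inside a 7x7 grid.
mask = [[1 if 1 <= y <= 5 and 1 <= x <= 5 else 0 for x in range(7)]
        for y in range(7)]
central, peripheral = split_regions(mask)
print(sum(map(sum, central)))  # the central region collapses to 1 pixel here
```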

As noted above, under the white-light illumination of a microscope, some formed components (such as red blood cells) have, besides a dark central region, a wide ring of green light around the periphery. The difference between the average pixel value of the peripheral region and the average pixel value of the central region of the formed component reflects this characteristic. Using this difference as a feature in the feature set therefore helps classify the formed components.

As shown in Figure 2, in one embodiment step S1 can be subdivided as follows: in step S11, remove the influence of ambient light and defocus noise on the pixel values of the formed element; in step S12, compute the difference between the average pixel value of the peripheral region and that of the central region after this influence has been removed; in step S13, use this difference as a feature in the feature set.

Those skilled in the art will appreciate that step S11 may also be omitted. Removing the influence of ambient light and defocus noise merely makes the computation of the difference more accurate; the difference between the average pixel values of the peripheral and central regions can still be computed without this removal, and the result still reflects, to some extent, the green-ring property described above.

Two embodiments of computing the difference between the average pixel value of the peripheral region and that of the central region are given below.

In one embodiment, for each pixel of the formed element, compute

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)        (Formula 1)

T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the pixel values of the pixel in the red, green, and blue layers, and T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of the pixel.

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )        (Formula 2)

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green, and blue trichromatic coefficients over the pixels of the formed element, and NT_green is the normalized green trichromatic coefficient of the pixel.

With Formulas 1 and 2 above, the normalized green trichromatic coefficient NT_green can be computed for every pixel of the formed element. Then the difference between the mean NT_green over the pixels of the peripheral region and the mean NT_green over the pixels of the central region is computed. A difference between the average pixel values of the peripheral and central regions computed from the normalized green trichromatic coefficient reflects the green-ring property particularly well.
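A minimal sketch of Formulas 1-2 and the resulting feature, assuming a floating-point RGB block `img` and boolean masks `central`/`peripheral` for the two regions; the helper name and the small epsilon guarding against division by zero are additions, not from the patent.

```python
import numpy as np

def green_ring_feature(img, central, peripheral):
    """Difference of the mean normalized green trichromatic coefficient
    (Formulas 1-2) between the peripheral and central regions.

    img: float RGB array of shape (H, W, 3); central/peripheral: bool masks.
    """
    mask = central | peripheral
    s = img.sum(axis=2) + 1e-12          # I_red + I_green + I_blue
    T = img / s[..., None]               # trichromatic coefficients, Formula 1
    # normalize by the largest trichromatic coefficient over the element
    NT_green = T[..., 1] / T[mask].max()  # Formula 2
    return NT_green[peripheral].mean() - NT_green[central].mean()
```

For a red-cell-like element with a green periphery and a dark reddish center, the feature comes out positive.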

In another embodiment, the inventors found that the difference between the average pixel values of the peripheral and central regions computed from a normalized gray coefficient, derived from the gray scale, can also reflect the green-ring property to some extent.

In this embodiment, for each pixel of the formed element, compute

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)        (Formula 3)

T_blue = I_blue / (I_red + I_green + I_blue)

T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the pixel values of the pixel in the red, green, and blue layers, I_gray is the grayscale pixel value of the pixel, T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of the pixel, and T_gray is the gray coefficient of the pixel.

NT_gray = T_gray / max(T_gray)        (Formula 4)

where max(T_gray) is the maximum gray coefficient over the pixels of the formed element, and NT_gray is the normalized gray coefficient of the pixel.

With Formulas 3 and 4 above, the normalized gray coefficient NT_gray can be computed for every pixel of the formed element. Then the difference between the mean NT_gray over the pixels of the peripheral region and the mean NT_gray over the pixels of the central region is computed.
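A sketch of the gray-coefficient feature (Formulas 3-4), under the same assumptions: a floating-point RGB block `img` and boolean region masks `central`/`peripheral`; the helper name is a choice made here.

```python
import numpy as np

def gray_ring_feature(img, central, peripheral):
    """Difference of the mean normalized gray coefficient (Formulas 3-4)
    between the peripheral and central regions of a formed element."""
    s = img.sum(axis=2) + 1e-12
    T = img / s[..., None]                                  # Formula 3
    T_gray = 0.2989 * T[..., 0] + 0.5870 * T[..., 1] + 0.1140 * T[..., 2]
    NT_gray = T_gray / T_gray[central | peripheral].max()   # Formula 4
    return NT_gray[peripheral].mean() - NT_gray[central].mean()
```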

Those skilled in the art will appreciate that, besides the peripheral-central difference computed from the normalized green trichromatic coefficient and the one computed from the normalized gray coefficient, differences computed from normalized red or blue coefficients are also of some help in distinguishing formed elements. The computation of the difference between the average pixel values of the peripheral and central regions is therefore not limited to the two embodiments above.

Those skilled in the art will appreciate that, although the central and peripheral regions were defined above as the parts inside and outside the contour after shrinking it inward by n×1/2 or n×2/3 steps, ratios other than 2/3 and 1/2 may also be used. For example, the central and peripheral regions may be defined as the parts inside and outside the contour after shrinking it inward by n×3/4 steps.

Those skilled in the art will appreciate that the central and peripheral regions can first be defined by shrinking the contour of the formed element inward by n×2/3 steps, and a first difference between the average pixel values of the peripheral and central regions computed accordingly; the regions can then be redefined by shrinking the contour inward by n×1/2 steps, and a second difference computed accordingly. Both the first and the second difference can be used as two features in the feature set. Moreover, if the difference is computed both from the normalized green trichromatic coefficient and from the normalized gray coefficient, four features are added to the feature set.

Using features based on the gray-level co-occurrence matrix as features in the feature set

The inventors of the present invention found that some features based on the gray-level co-occurrence matrix can also effectively distinguish some formed elements, in particular red blood cells, white blood cells, and crystals.

According to one embodiment of the present invention, the gray-level co-occurrence matrix of the formed element is obtained; for a given formed element, it can be determined with existing techniques. One or more of the contrast, homogeneity, and energy of the gray-level co-occurrence matrix of the formed element are then used as features in the feature set of the classifier. The contrast, homogeneity, and energy of the gray-level co-occurrence matrix are computed as follows:

Contrast = Σ_i Σ_j (i − j)²·P(i, j)        (Formula 5)

Homogeneity = Σ_i Σ_j P(i, j) / (1 + |i − j|)        (Formula 6)

Energy = Σ_i Σ_j P(i, j)²        (Formula 7)

where P(i, j) is the probability that gray level i and gray level j occur together.

Using one or more of the contrast, homogeneity, and energy of the gray-level co-occurrence matrix of the formed element as features in the feature set of the classifier improves the effectiveness of distinguishing formed elements and raises classification accuracy.
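A sketch of the three co-occurrence statistics, under assumptions the patent leaves open: a single horizontal (0, 1) pixel offset, no symmetrization, and an input already quantized to `levels` gray levels. The function name and these conventions are illustrative choices.

```python
import numpy as np

def glcm_features(gray, levels=8):
    """Contrast, homogeneity and energy (Formulas 5-7) of a simple
    gray-level co-occurrence matrix with a horizontal (0, 1) offset.

    gray: 2-D integer array already quantized to `levels` gray levels.
    """
    P = np.zeros((levels, levels), dtype=float)
    left, right = gray[:, :-1], gray[:, 1:]
    np.add.at(P, (left.ravel(), right.ravel()), 1)  # count co-occurring pairs
    P /= P.sum()                                    # joint probability P(i, j)
    i, j = np.indices(P.shape)
    contrast = ((i - j) ** 2 * P).sum()             # Formula 5
    homogeneity = (P / (1 + np.abs(i - j))).sum()   # Formula 6
    energy = (P ** 2).sum()                         # Formula 7
    return contrast, homogeneity, energy
```

On a perfectly uniform block the mass of P sits on the diagonal, so contrast is 0 while homogeneity and energy are 1.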

Using features based on the contrast of the formed-element block as features in the feature set

The inventors found that for some formed elements, the brightness of the foreground part of the block (the formed element itself) under the microscope differs greatly from that of the background part, which also helps distinguish some formed elements. Therefore, according to one embodiment of the present invention, the difference between the average pixel brightness of the foreground part of the formed-element block and the average pixel brightness of the background part is used as a feature in the feature set of the classifier. This likewise improves the effectiveness of distinguishing formed elements (in particular red blood cells, white blood cells, and crystals) and raises classification accuracy.
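Assuming segmentation yields a foreground mask for the block, the feature is a one-liner; the helper name is an addition, not from the patent.

```python
import numpy as np

def block_contrast(gray_block, fg_mask):
    """Mean foreground brightness minus mean background brightness of a
    formed-element block; fg_mask marks the element itself."""
    return gray_block[fg_mask].mean() - gray_block[~fg_mask].mean()
```

A bright element on a dark background gives a large positive value; a translucent element gives a value near zero.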

Using features based on the Hu moments as features in the feature set

The inventors found that for some formed elements, the Hu moments also differ greatly from those of other formed elements. Therefore, using one or more of the seven Hu moments I_1, I_2, I_3, I_4, I_5, I_6, I_7, computed by the following procedure, as features in the feature set of the classifier likewise improves the effectiveness of distinguishing formed elements (in particular red blood cells, white blood cells, and crystals) and raises classification accuracy.

First, the formed element is converted to a grayscale image. Let f(x, y) be the gray value of pixel (x, y), with f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, where I_red, I_green, I_blue are the pixel values of pixel (x, y) in the red, green, and blue layers. The (p+q)-order moment is defined as:

m_pq = ∫∫ x^p·y^q·f(x, y) dx dy,  p, q = 0, 1, 2, …        (Formula 8)

The (p+q)-order central moment is defined as:

μ_pq = ∫∫ (x − x_0)^p·(y − y_0)^q·f(x, y) dx dy        (Formula 9)

where x_0 = m_10/m_00 and y_0 = m_01/m_00.

The normalized central moment is y_pq = μ_pq / μ_00^r        (Formula 10)

where r = (p + q + 2)/2.

The seven Hu moments are then defined as:

I_1 = y_20 + y_02        (Formula 11)

I_2 = (y_20 − y_02)² + 4·y_11²        (Formula 12)

I_3 = (y_30 − 3y_12)² + (3y_21 − y_03)²        (Formula 13)

I_4 = (y_30 + y_12)² + (y_21 + y_03)²        (Formula 14)

I_5 = (y_30 − 3y_12)(y_30 + y_12)[(y_30 + y_12)² − 3(y_21 + y_03)²]
      + (3y_21 − y_03)(y_21 + y_03)[3(y_30 + y_12)² − (y_21 + y_03)²]        (Formula 15)

I_6 = (y_20 − y_02)[(y_30 + y_12)² − (y_21 + y_03)²]
      + 4y_11(y_30 + y_12)(y_21 + y_03)        (Formula 16)

I_7 = (3y_21 − y_03)(y_30 + y_12)[(y_30 + y_12)² − 3(y_21 + y_03)²]
      − (y_30 − 3y_12)(y_21 + y_03)[3(y_30 + y_12)² − (y_21 + y_03)²]        (Formula 17)

Using features based on histogram measures as features in the feature set

The inventors found that for some formed elements, some histogram measures also differ greatly from those of other formed elements. Therefore, using one or more of the histogram measures of the formed element computed by the formulas below — the mean gray level ḡ, the standard deviation σ_g of the gray levels, the skewness κ, the entropy, and the energy — as features in the feature set of the classifier likewise improves the effectiveness of distinguishing formed elements (in particular red blood cells, white blood cells, and crystals) and raises classification accuracy.

ḡ = Σ_{g=0}^{L−1} P(g)·g        (Formula 18)

Here the probability density function P(g) = h(g)/M divides the gray values into L gray levels; for example, all gray values within a certain range are assigned to one level. L can be chosen as needed from experience or through a limited number of experiments and simulations; it usually should not be too small. h(g) is the number of pixels of the formed element at gray level g, and M is the total number of pixels of the formed element.

σ_g = [ Σ_{g=0}^{L−1} (g − ḡ)²·P(g) ]^{1/2}        (Formula 19)

κ = (1/σ_g³)·Σ_{g=0}^{L−1} (g − ḡ)³·P(g)        (Formula 20)

Entropy: e = −Σ_{g=0}^{L−1} P(g)·log₂ P(g)        (Formula 21)

Energy: E = Σ_{g=0}^{L−1} P(g)²        (Formula 22)
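A sketch of Formulas 18-22, assuming gray values in [0, 1) and a uniform quantization into L equal bins; the patent leaves the binning scheme to the implementer, so the quantization and the function name here are illustrative.

```python
import numpy as np

def histogram_measures(gray, L=16):
    """Mean, standard deviation, skewness, entropy and energy of the
    gray-level histogram (Formulas 18-22) of a formed element."""
    levels = np.minimum((gray * L).astype(int), L - 1)   # quantize [0,1) to L levels
    h = np.bincount(levels.ravel(), minlength=L)
    P = h / h.sum()                                      # P(g) = h(g)/M
    g = np.arange(L)
    mean = (P * g).sum()                                 # Formula 18
    sigma = np.sqrt(((g - mean) ** 2 * P).sum())         # Formula 19
    skew = ((g - mean) ** 3 * P).sum() / sigma ** 3      # Formula 20
    nz = P[P > 0]
    entropy = -(nz * np.log2(nz)).sum()                  # Formula 21
    energy = (P ** 2).sum()                              # Formula 22
    return mean, sigma, skew, entropy, energy
```

A two-valued image splits its mass over two levels, giving a symmetric histogram (skewness 0) with one bit of entropy.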

A device according to one embodiment of the invention

As shown in Figure 3, a urine formed-element classification device 2 according to one embodiment of the present invention comprises a characterization unit 201 and a classification unit 202. The characterization unit 201 is configured to use the difference between the average pixel value of the peripheral region and the average pixel value of the central region of a formed element in a formed-element block as a feature in the feature set of a classifier. The classification unit 202 is configured to classify urine formed elements with the classifier. The device shown in Figure 3 can be implemented in software, in hardware (for example an integrated circuit or FPGA), or in a combination of both.

In one embodiment, the characterization unit 201 may further be configured to remove the influence of ambient light and defocus noise on the pixel values of the formed element, compute the difference between the average pixel value of the peripheral region and that of the central region after this influence has been removed, and use this difference as a feature in the feature set.

In one embodiment, the characterization unit 201 may further be configured to:

for each pixel of the formed element, compute

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)        (Formula 1)

T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the pixel values of the pixel in the red, green, and blue layers, and T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of the pixel,

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )        (Formula 2)

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green, and blue trichromatic coefficients over the pixels of the formed element, and NT_green is the normalized green trichromatic coefficient of the pixel;

and compute the difference between the mean NT_green over the pixels of the peripheral region and the mean NT_green over the pixels of the central region of the formed element.

In one embodiment, the characterization unit 201 may further be configured to:

for each pixel of the formed element, compute

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)        (Formula 3)

T_blue = I_blue / (I_red + I_green + I_blue)

T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the pixel values of the pixel in the red, green, and blue layers, I_gray is the grayscale pixel value of the pixel, T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of the pixel, and T_gray is the gray coefficient of the pixel,

NT_gray = T_gray / max(T_gray)        (Formula 4)

where max(T_gray) is the maximum gray coefficient over the pixels of the formed element, and NT_gray is the normalized gray coefficient of the pixel;

and compute the difference between the mean NT_gray over the pixels of the peripheral region and the mean NT_gray over the pixels of the central region of the formed element.

In one embodiment, the characterization unit 201 is further configured to:

obtain the gray-level co-occurrence matrix of the formed element and use one or more of its contrast, homogeneity, and energy as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray-level co-occurrence matrix are computed as follows:

Contrast = Σ_i Σ_j (i − j)²·P(i, j)        (Formula 5)

Homogeneity = Σ_i Σ_j P(i, j) / (1 + |i − j|)        (Formula 6)

Energy = Σ_i Σ_j P(i, j)²        (Formula 7)

where P(i, j) is the probability that gray level i and gray level j occur together.

In one embodiment, the characterization unit 201 may also be configured to use the difference between the average pixel brightness of the foreground part of the formed-element block and the average pixel brightness of the background part of the formed-element block as a feature in the feature set of the classifier.

In one embodiment, the characterization unit 201 may also be configured to use one or more of the seven Hu moments I_1, I_2, I_3, I_4, I_5, I_6, I_7, computed by the following procedure, as features in the feature set of the classifier:

convert the formed element to a grayscale image, where f(x, y) is the gray value of pixel (x, y), f(x, y) = 0.2989·I_red + 0.5870·I_green + 0.1140·I_blue, and I_red, I_green, I_blue are the pixel values of pixel (x, y) in the red, green, and blue layers; the (p+q)-order moment is defined as:

m_pq = ∫∫ x^p·y^q·f(x, y) dx dy,  p, q = 0, 1, 2, …        (Formula 8)

The (p+q)-order central moment is defined as:

μ_pq = ∫∫ (x − x_0)^p·(y − y_0)^q·f(x, y) dx dy        (Formula 9)

where x_0 = m_10/m_00 and y_0 = m_01/m_00.

The normalized central moment is y_pq = μ_pq / μ_00^r        (Formula 10)

where r = (p + q + 2)/2.

The seven Hu moments are then defined as:

I_1 = y_20 + y_02        (Formula 11)

I_2 = (y_20 − y_02)² + 4·y_11²        (Formula 12)

I_3 = (y_30 − 3y_12)² + (3y_21 − y_03)²        (Formula 13)

I_4 = (y_30 + y_12)² + (y_21 + y_03)²        (Formula 14)

I_5 = (y_30 − 3y_12)(y_30 + y_12)[(y_30 + y_12)² − 3(y_21 + y_03)²]
      + (3y_21 − y_03)(y_21 + y_03)[3(y_30 + y_12)² − (y_21 + y_03)²]        (Formula 15)

I_6 = (y_20 − y_02)[(y_30 + y_12)² − (y_21 + y_03)²]
      + 4y_11(y_30 + y_12)(y_21 + y_03)        (Formula 16)

I_7 = (3y_21 − y_03)(y_30 + y_12)[(y_30 + y_12)² − 3(y_21 + y_03)²]
      − (y_30 − 3y_12)(y_21 + y_03)[3(y_30 + y_12)² − (y_21 + y_03)²]        (Formula 17)

In one embodiment, the characterization unit 201 may also be configured to use one or more of the following histogram measures of the formed element, computed by the formulas below — the mean gray level ḡ, the standard deviation σ_g of the gray levels, the skewness κ, the entropy, and the energy — as features in the feature set of the classifier:

ḡ = Σ_{g=0}^{L−1} P(g)·g        (Formula 18)

Here the probability density function P(g) = h(g)/M divides the gray values into L gray levels; for example, all gray values within a certain range are assigned to one level. L can be chosen as needed, but should not be too small. h(g) is the number of pixels of the formed element at gray level g, and M is the total number of pixels of the formed element.

σ_g = [ Σ_{g=0}^{L−1} (g − ḡ)²·P(g) ]^{1/2}        (Formula 19)

κ = (1/σ_g³)·Σ_{g=0}^{L−1} (g − ḡ)³·P(g)        (Formula 20)

Entropy: e = −Σ_{g=0}^{L−1} P(g)·log₂ P(g)        (Formula 21)

Energy: E = Σ_{g=0}^{L−1} P(g)²        (Formula 22)

In one embodiment, the formed elements include red blood cells, white blood cells, and crystals.

Figure 4 shows a urine formed-element classification apparatus 3 according to one embodiment of the present invention. The apparatus may comprise a memory 301 and a processor 302. The memory 301 stores executable instructions; the processor 302 performs, according to the executable instructions stored in the memory, the operations performed by the units of the device 2.

In addition, an embodiment of the present invention further provides a machine-readable medium storing executable instructions that, when executed, cause a machine to perform the operations performed by the processor 302.

Those skilled in the art will appreciate that various variations and modifications can be made to the above embodiments without departing from the essence of the invention; the protection scope of the present invention is therefore defined by the appended claims.

Claims (20)

1. A urine formed-element classification method (1), comprising:
using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of a formed element in a formed-element block as a feature in the feature set of a classifier (S1); and
classifying urine formed elements with said classifier (S2).
2. The urine formed-element classification method (1) according to claim 1, wherein the step (S1) of using the difference between the average pixel value of the peripheral region and the average pixel value of the central region of the formed element as a feature in the feature set of the classifier comprises:
removing the influence of ambient light and defocus noise on the pixel values of the formed element (S11);
computing the difference between the average pixel value of the peripheral region and the average pixel value of the central region after the influence of ambient light and defocus noise has been removed (S12); and
using said difference as a feature in said feature set (S13).
3. The urine formed-element classification method (1) according to claim 2, wherein
the step (S11) of removing the influence of ambient light and defocus noise on the pixel values of the formed element comprises: for each pixel of the formed element, computing

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)

T_blue = I_blue / (I_red + I_green + I_blue)

where I_red, I_green, I_blue are the pixel values of said pixel in the red, green, and blue layers, and T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of said pixel, and

NT_green = T_green / max( max(T_red), max(T_green), max(T_blue) )

where max(T_red), max(T_green), max(T_blue) are the maxima of the red, green, and blue trichromatic coefficients over the pixels of the formed element, and NT_green is the normalized green trichromatic coefficient of said pixel; and
the step (S12) of computing the difference between the average pixel value of the peripheral region and the average pixel value of the central region after the influence has been removed comprises: computing the difference between the mean NT_green of the pixels of the peripheral region and the mean NT_green of the pixels of the central region.
4. The urine formed-element classification method (1) according to claim 2, wherein
the step (S11) of removing the influence of ambient light and defocus noise on the pixel values of the formed element comprises: for each pixel of the formed element, computing

T_red = I_red / (I_red + I_green + I_blue)

T_green = I_green / (I_red + I_green + I_blue)

T_blue = I_blue / (I_red + I_green + I_blue)

T_gray = 0.2989·T_red + 0.5870·T_green + 0.1140·T_blue

where I_red, I_green, I_blue are the pixel values of said pixel in the red, green, and blue layers, T_red, T_green, T_blue are the red, green, and blue trichromatic coefficients of said pixel, and T_gray is the gray coefficient of said pixel, and

NT_gray = T_gray / max(T_gray)

where max(T_gray) is the maximum gray coefficient over the pixels of the formed element, and NT_gray is the normalized gray coefficient of said pixel; and
the step (S12) of computing the difference between the average pixel value of the peripheral region and the average pixel value of the central region after the influence has been removed comprises: computing the difference between the mean NT_gray of the pixels of the peripheral region and the mean NT_gray of the pixels of the central region.
5. The urine formed-element classification method (1) according to claim 1, further comprising:
obtaining the gray-level co-occurrence matrix of the formed element and using one or more of its contrast, homogeneity, and energy as features in the feature set of the classifier, where the contrast, homogeneity, and energy of the gray-level co-occurrence matrix are computed as follows:

Contrast = Σ_i Σ_j (i − j)²·P(i, j)

Homogeneity = Σ_i Σ_j P(i, j) / (1 + |i − j|)

Energy = Σ_i Σ_j P(i, j)²

where P(i, j) is the probability that gray level i and gray level j occur together.
6. The urine formed-element classification method (1) according to claim 1, further comprising:
using the difference between the average pixel brightness of the foreground part of the formed-element block and the average pixel brightness of the background part of the formed-element block as a feature in the feature set of the classifier.
7. The urine visible component classification method (1) according to claim 1, further comprising using one or more of the seven Hu moments $I_1, I_2, \ldots, I_7$ computed by the following procedure as features in the feature set of the classifier:

Convert the visible component to a grayscale image, where $f(x,y)$ denotes the gray level of pixel $(x,y)$, $f(x,y) = 0.2989\,I_{red} + 0.5870\,I_{green} + 0.1140\,I_{blue}$, and $I_{red}$, $I_{green}$, $I_{blue}$ denote the pixel values of pixel $(x,y)$ in the red, green and blue layers respectively. Its $(p+q)$-th order moments are defined as

$$m_{pq} = \iint x^p y^q f(x,y)\,dx\,dy, \qquad p,q = 0,1,2,\ldots$$

The $(p+q)$-th order central moments are defined as

$$\mu_{pq} = \iint (x-x_0)^p (y-y_0)^q f(x,y)\,dx\,dy$$

where $x_0 = m_{10}/m_{00}$, $y_0 = m_{01}/m_{00}$.

The normalized central moments are

$$\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,r}}, \qquad r = \frac{p+q+2}{2}$$

The seven Hu moments are then defined as:

$$I_1 = \eta_{20} + \eta_{02}$$
$$I_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$$
$$I_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$$
$$I_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$$
$$I_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})\big[(\eta_{30}+\eta_{12})^2 - 3(\eta_{21}+\eta_{03})^2\big] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})\big[3(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big]$$
$$I_6 = (\eta_{20} - \eta_{02})\big[(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big] + 4\eta_{11}(\eta_{30}+\eta_{12})(\eta_{21}+\eta_{03})$$
$$I_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})\big[(\eta_{30}+\eta_{12})^2 - 3(\eta_{21}+\eta_{03})^2\big] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})\big[3(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big]$$
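The Hu-moment computation of claim 7 can be sketched directly from the moment definitions (a plain-numpy illustration; the key property, invariance under rotation, is what makes these usable as shape features):

```python
import numpy as np

def hu_moments(f):
    """Seven Hu moment invariants of a gray image f, computed from normalized
    central moments eta_pq = mu_pq / mu_00**((p+q+2)/2)."""
    h, w = f.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    m00 = f.sum()
    x0, y0 = (x * f).sum() / m00, (y * f).sum() / m00

    def eta(p, q):
        mu = (((x - x0) ** p) * ((y - y0) ** q) * f).sum()  # central moment
        return mu / m00 ** ((p + q + 2) / 2.0)              # normalize

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    i1 = n20 + n02
    i2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    i3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    i4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    i5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    i6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    i7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([i1, i2, i3, i4, i5, i6, i7])
```

Rotating the image by 90 degrees permutes its pixels exactly, so all seven invariants should be unchanged up to floating-point error.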
8. The urine visible component classification method (1) according to claim 1, further comprising using one or more of the average gray level $\bar{g}$ of the visible component, the standard deviation $\sigma_g$ of the gray levels, the skewness $\kappa$ of the gray levels, the entropy and the energy, calculated according to the following formulas, as features in the feature set of the classifier:

$$\bar{g} = \sum_{g=0}^{L-1} P(g)\cdot g$$

where the probability density function $P(g) = h(g)/M$, the gray scale is divided into $L$ gray levels, $h(g)$ is the number of pixels at gray level $g$ in the visible component, and $M$ is the total number of pixels in the visible component;

$$\sigma_g = \sqrt{\sum_{g=0}^{L-1} (g-\bar{g})^2 \cdot P(g)}$$

$$\kappa = \frac{1}{\sigma_g^3} \sum_{g=0}^{L-1} (g-\bar{g})^3 \cdot P(g)$$
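The histogram statistics of claim 8 can be sketched as follows (the claim also names entropy and energy; the common histogram forms $-\sum P\log P$ and $\sum P^2$ would be computed from the same $P(g)$, but only the three formulas spelled out in the claim are shown):

```python
import numpy as np

def gray_histogram_stats(gray, levels=256):
    """Mean gray level, standard deviation, and skewness kappa computed from
    the normalized histogram P(g) = h(g)/M of an integer gray image."""
    h_counts = np.bincount(gray.ravel(), minlength=levels)[:levels]
    p = h_counts / gray.size                       # P(g) = h(g) / M
    g = np.arange(levels)
    mean = (p * g).sum()
    std = np.sqrt((p * (g - mean) ** 2).sum())
    skew = (p * (g - mean) ** 3).sum() / std ** 3
    return mean, std, skew
```

For a symmetric two-level histogram (half the pixels at 0, half at 2), the mean is 1, the standard deviation is 1, and the skewness is 0.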
9. The urine visible component classification method (1) according to claim 1, wherein said visible components include red blood cells, white blood cells and crystals.
10. A urine visible component classification apparatus (2), comprising:

a feature extraction unit (201), configured to use the difference between the average pixel value of the outer region of the visible component in the visible component block and the average pixel value of its central region as a feature in the feature set of a classifier;

a classification unit (202), configured to classify urine visible components with said classifier.
11. The urine visible component classification apparatus (2) according to claim 10, wherein said feature extraction unit (201) is further configured to remove the influence of ambient light and out-of-focus noise on the pixel values in the visible component, to calculate, after that influence has been removed, the difference between the average pixel value of the outer region and the average pixel value of the central region, and to use said difference as a feature in said feature set.
12. The urine visible component classification apparatus (2) according to claim 11, wherein said feature extraction unit (201) is further configured to:

for each pixel in the visible component, calculate

$$T_{red} = \frac{I_{red}}{I_{red}+I_{green}+I_{blue}}, \qquad T_{green} = \frac{I_{green}}{I_{red}+I_{green}+I_{blue}}, \qquad T_{blue} = \frac{I_{blue}}{I_{red}+I_{green}+I_{blue}}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the pixel values of the pixel in the red, green and blue layers respectively, and $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green and blue trichromatic coefficients of the pixel respectively;

$$NT_{green} = \frac{T_{green}}{\max\big(\max(T_{red}),\, \max(T_{green}),\, \max(T_{blue})\big)}$$

where $\max(T_{red})$, $\max(T_{green})$, $\max(T_{blue})$ denote the maximum red, green and blue trichromatic coefficients over the pixels in the visible component, and $NT_{green}$ denotes the normalized green trichromatic coefficient of the pixel; and

calculate the difference between the mean $NT_{green}$ of the pixels in the outer region of the visible component and the mean $NT_{green}$ of the pixels in its central region.
13. The urine visible component classification apparatus (2) according to claim 11, wherein said feature extraction unit (201) is further configured to:

for each pixel in the visible component, calculate

$$T_{red} = \frac{I_{red}}{I_{red}+I_{green}+I_{blue}}, \qquad T_{green} = \frac{I_{green}}{I_{red}+I_{green}+I_{blue}}, \qquad T_{blue} = \frac{I_{blue}}{I_{red}+I_{green}+I_{blue}}$$

$$T_{gray} = 0.2989\,T_{red} + 0.5870\,T_{green} + 0.1140\,T_{blue}$$

where $I_{red}$, $I_{green}$, $I_{blue}$ denote the pixel values of the pixel in the red, green and blue layers respectively, $T_{red}$, $T_{green}$, $T_{blue}$ denote the red, green and blue trichromatic coefficients of the pixel respectively, and $T_{gray}$ denotes the gray coefficient of the pixel;

$$NT_{gray} = \frac{T_{gray}}{\max(T_{gray})}$$

where $\max(T_{gray})$ denotes the maximum gray coefficient over the pixels in the visible component, and $NT_{gray}$ denotes the normalized gray coefficient of the pixel; and

calculate the difference between the mean $NT_{gray}$ of the pixels in the outer region of the visible component and the mean $NT_{gray}$ of the pixels in its central region.
14. The urine visible component classification apparatus (2) according to claim 10, wherein said feature extraction unit (201) is further configured to:

obtain the gray-level co-occurrence matrix of the visible component, and use one or more of the contrast, homogeneity and energy of the gray-level co-occurrence matrix as features in the feature set of said classifier, where the contrast, homogeneity and energy of the gray-level co-occurrence matrix are calculated as follows:

$$\text{contrast} = \sum_{i,j} (i-j)^2\, P(i,j), \qquad \text{homogeneity} = \sum_{i,j} \frac{P(i,j)}{1+|i-j|}, \qquad \text{energy} = \sum_{i,j} P(i,j)^2$$

where $P(i,j)$ denotes the probability that gray level $i$ and gray level $j$ occur jointly.
15. The urine visible component classification apparatus (2) according to claim 10, wherein said feature extraction unit (201) is further configured to:

use the difference between the average pixel brightness of the foreground portion of the visible component block and the average pixel brightness of the background portion of the visible component block as a feature in the feature set of said classifier.
16. The urine visible component classification apparatus (2) according to claim 10, wherein said feature extraction unit (201) is further configured to use one or more of the seven Hu moments $I_1, I_2, \ldots, I_7$ computed by the following procedure as features in the feature set of said classifier:

Convert the visible component to a grayscale image, where $f(x,y)$ denotes the gray level of pixel $(x,y)$, $f(x,y) = 0.2989\,I_{red} + 0.5870\,I_{green} + 0.1140\,I_{blue}$, and $I_{red}$, $I_{green}$, $I_{blue}$ denote the pixel values of pixel $(x,y)$ in the red, green and blue layers respectively. Its $(p+q)$-th order moments are defined as

$$m_{pq} = \iint x^p y^q f(x,y)\,dx\,dy, \qquad p,q = 0,1,2,\ldots$$

The $(p+q)$-th order central moments are defined as

$$\mu_{pq} = \iint (x-x_0)^p (y-y_0)^q f(x,y)\,dx\,dy$$

where $x_0 = m_{10}/m_{00}$, $y_0 = m_{01}/m_{00}$.

The normalized central moments are

$$\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,r}}, \qquad r = \frac{p+q+2}{2}$$

The seven Hu moments are then defined as:

$$I_1 = \eta_{20} + \eta_{02}$$
$$I_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$$
$$I_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$$
$$I_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$$
$$I_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})\big[(\eta_{30}+\eta_{12})^2 - 3(\eta_{21}+\eta_{03})^2\big] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})\big[3(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big]$$
$$I_6 = (\eta_{20} - \eta_{02})\big[(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big] + 4\eta_{11}(\eta_{30}+\eta_{12})(\eta_{21}+\eta_{03})$$
$$I_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})\big[(\eta_{30}+\eta_{12})^2 - 3(\eta_{21}+\eta_{03})^2\big] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})\big[3(\eta_{30}+\eta_{12})^2 - (\eta_{21}+\eta_{03})^2\big]$$
17. The urine visible component classification apparatus (2) according to claim 10, wherein said feature extraction unit (201) is further configured to use one or more of the average gray level $\bar{g}$ of the visible component, the standard deviation $\sigma_g$ of the gray levels, the skewness $\kappa$ of the gray levels, the entropy and the energy, calculated according to the following formulas, as features in the feature set of said classifier:

$$\bar{g} = \sum_{g=0}^{L-1} P(g)\cdot g$$

where the probability density function $P(g) = h(g)/M$, the gray scale is divided into $L$ gray levels, $h(g)$ is the number of pixels at gray level $g$ in the visible component, and $M$ is the total number of pixels in the visible component;

$$\sigma_g = \sqrt{\sum_{g=0}^{L-1} (g-\bar{g})^2 \cdot P(g)}$$

$$\kappa = \frac{1}{\sigma_g^3} \sum_{g=0}^{L-1} (g-\bar{g})^3 \cdot P(g)$$
18. The urine visible component classification apparatus (2) according to claim 10, wherein said visible components include red blood cells, white blood cells and crystals.
19. A urine visible component classification device (3), comprising:

a memory (301) for storing executable instructions;

a processor (302) for performing, according to the executable instructions stored in said memory, the operations recited in any one of claims 1-9.

20. A machine-readable medium storing executable instructions which, when executed, cause a machine to perform the operations recited in any one of claims 1-9.
CN201310752863.XA 2013-12-31 2013-12-31 Method and device for classifying urine visible components Pending CN104751167A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310752863.XA CN104751167A (en) 2013-12-31 2013-12-31 Method and device for classifying urine visible components
PCT/US2014/071500 WO2015102948A1 (en) 2013-12-31 2014-12-19 Urine formed element classification method and apparatus


Publications (1)

Publication Number Publication Date
CN104751167A true CN104751167A (en) 2015-07-01

Family

ID=53493904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310752863.XA Pending CN104751167A (en) 2013-12-31 2013-12-31 Method and device for classifying urine visible components

Country Status (2)

Country Link
CN (1) CN104751167A (en)
WO (1) WO2015102948A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109344852A (en) * 2018-08-01 2019-02-15 迈克医疗电子有限公司 Image-recognizing method and device, analysis instrument and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220133918A (en) 2020-01-30 2022-10-05 비타디엑스 인터내셔널 Systematic characterization of a subject within a biological sample

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1525401A (en) * 2003-02-28 2004-09-01 ��˹���´﹫˾ Method and system for enhancing portrait images that are processed in a batch mode
CN101873411A (en) * 2009-04-24 2010-10-27 瑞萨电子株式会社 Image processing device and image processing method
CN102354388A (en) * 2011-09-22 2012-02-15 北京航空航天大学 Method for carrying out adaptive computing on importance weights of low-level features of image

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6718053B1 (en) * 1996-11-27 2004-04-06 Chromavision Medical Systems, Inc. Method and apparatus for automated image analysis of biological specimens
US7522762B2 (en) * 2003-04-16 2009-04-21 Inverness Medical-Biostar, Inc. Detection, resolution, and identification of arrayed elements



Also Published As

Publication number Publication date
WO2015102948A1 (en) 2015-07-09

Similar Documents

Publication Publication Date Title
CN107507173B (en) A no-reference sharpness assessment method and system for whole slice images
Rahtu et al. Learning a category independent object detection cascade
CN109829914A (en) The method and apparatus of testing product defect
CN107220624A (en) A kind of method for detecting human face based on Adaboost algorithm
CN105809121A (en) Multi-characteristic synergic traffic sign detection and identification method
CN103593670A (en) Copper sheet and strip surface defect detection method based on-line sequential extreme learning machine
CN108492294B (en) Method and device for evaluating harmony degree of image colors
CN104794502A (en) Image processing and mode recognition technology-based rice blast spore microscopic image recognition method
CN111209858B (en) Real-time license plate detection method based on deep convolutional neural network
CN104217438A (en) Image significance detection method based on semi-supervision
CN112926652A (en) Fish fine-grained image identification method based on deep learning
CN104992183B (en) The automatic testing method of well-marked target in natural scene
CN104657714B (en) Illumination symmetry merged with global illumination intensity without refer to human face light evaluation method
CN110532946A (en) A method of the green vehicle spindle-type that is open to traffic is identified based on convolutional neural networks
CN110599453A (en) Panel defect detection method and device based on image fusion and equipment terminal
CN102629386A (en) Region segmentation method for colorful textile texture images
CN108171683B (en) Cell counting method adopting software for automatic identification
CN104574391A (en) Stereoscopic vision matching method based on adaptive feature window
CN106446890A (en) Candidate area extraction method based on window scoring and superpixel segmentation
CN105913451B (en) A kind of natural image superpixel segmentation method based on graph model
CN100565584C (en) A kind of global optimization method with natural image matting of correction property
CN113129390B (en) A color-blind image recoloring method and system based on joint saliency
CN102509299B (en) Image salient area detection method based on visual attention mechanism
WO2020119624A1 (en) Class-sensitive edge detection method based on deep learning
Giusti et al. A comparison of algorithms and humans for mitosis detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150701