CN110544262A - cervical cell image segmentation method based on machine vision - Google Patents
- Publication number
- CN110544262A (application CN201910725869.5A)
- Authority
- CN
- China
- Prior art keywords
- cell
- cervical
- image
- nucleus
- cytoplasm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 66
- 238000003709 image segmentation Methods 0.000 title claims abstract description 29
- 210000004027 cell Anatomy 0.000 claims abstract description 116
- 210000003855 cell nucleus Anatomy 0.000 claims abstract description 61
- 210000000805 cytoplasm Anatomy 0.000 claims abstract description 41
- 238000012545 processing Methods 0.000 claims abstract description 24
- 238000001514 detection method Methods 0.000 claims abstract description 13
- 238000012216 screening Methods 0.000 claims abstract description 7
- 210000004940 nucleus Anatomy 0.000 claims description 37
- 230000008569 process Effects 0.000 claims description 13
- 238000001914 filtration Methods 0.000 claims description 9
- 230000011218 segmentation Effects 0.000 claims description 9
- 230000001413 cellular effect Effects 0.000 claims description 6
- 230000001086 cytosolic effect Effects 0.000 claims description 6
- 238000000926 separation method Methods 0.000 claims description 6
- 230000001629 suppression Effects 0.000 claims description 6
- 238000004364 calculation method Methods 0.000 claims description 5
- 230000009466 transformation Effects 0.000 claims description 5
- 238000011156 evaluation Methods 0.000 claims description 4
- 230000000877 morphologic effect Effects 0.000 claims description 4
- 238000007670 refining Methods 0.000 claims description 4
- 230000002146 bilateral effect Effects 0.000 claims description 3
- 238000002790 cross-validation Methods 0.000 claims description 3
- 239000007788 liquid Substances 0.000 claims description 3
- 238000003672 processing method Methods 0.000 abstract 1
- 206010008342 Cervix carcinoma Diseases 0.000 description 3
- 208000006105 Uterine Cervical Neoplasms Diseases 0.000 description 3
- 201000010881 cervical cancer Diseases 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 239000000975 dye Substances 0.000 description 3
- 239000008188 pellet Substances 0.000 description 3
- 230000000694 effects Effects 0.000 description 2
- 102000004169 proteins and genes Human genes 0.000 description 2
- 108090000623 proteins and genes Proteins 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 239000008280 blood Substances 0.000 description 1
- 210000000601 blood cell Anatomy 0.000 description 1
- 238000003759 clinical diagnosis Methods 0.000 description 1
- 230000002380 cytological effect Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 210000003527 eukaryotic cell Anatomy 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 210000004969 inflammatory cell Anatomy 0.000 description 1
- 230000005764 inhibitory process Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000003097 mucus Anatomy 0.000 description 1
- 230000001575 pathological effect Effects 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/155—Segmentation; Edge detection involving morphological operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention relates to a machine-vision-based cervical cell image segmentation method in the technical field of medical image processing. The method first cuts the original cervical cell image into blocks, then separately detects and screens the cytoplasm masses and the cell nuclei. Nucleus detection uses not only image processing but also cytological prior knowledge to screen the candidate nuclei, which greatly improves screening accuracy; the cells are then located by combining the obtained cytoplasm masses with the screened nuclei.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a cervical cell image segmentation method based on machine vision.
Background
In traditional cervical cancer cell identification, a specialist examines the cervical cell medical image of the person under test and, relying on professional knowledge and experience, judges whether the cervical cells are diseased and whether there is a risk of cervical cancer.
Image segmentation is a technique and process for extracting a specific region of interest from an image; as medical capabilities advance, medical image segmentation has become a key subject of medical image analysis. Because of the high morbidity and mortality of cervical cancer, which seriously threatens the health and lives of women, cervical cell segmentation is also a technology of major interest in clinical diagnosis and treatment. Various classical image segmentation methods have been applied to medical images, but owing to the particularity and complexity of medical images, progress notwithstanding, results in practical applications remain unsatisfactory.
The nucleus contains the main genetic material, so judging whether a cell is diseased inevitably requires examining the nucleus. The cytoplasm contains only a small amount of genetic material, but because of its sensitivity, once a cell becomes diseased the cytoplasm expresses the degree of pathology in various ways. Segmentation of cervical images therefore also focuses mainly on detecting the nucleus and the cytoplasm. A superpixel image can be obtained with a clustering scheme, and a threshold method then separates the foreground (cytoplasm masses) from the background (regions outside the masses), segmenting the cytoplasm masses. The nucleus has a relatively low gray value, uniform texture, and a clear, nearly circular boundary, so it can be detected by a watershed method; however, because of the complexity and uncertainty of cervical cell images, various hard-to-predict kinds of noise are always present.
Disclosure of the Invention
The invention aims to provide a machine-vision-based cervical cell image segmentation method that segments and locates cells in cervical cell images, helping doctors improve the efficiency of cervical cell identification.
The purpose of the invention can be realized by the following technical scheme:
A machine-vision-based cervical cell image segmentation method is applied to medical images of cervical cells, which have three main characteristics:
1) the images are very large, generally occupying more than 30 GB of storage space;
2) the number of cells per image is very large; in some places cells are densely clustered, in others they are sparse, and some areas have no cells at all over a wide range;
3) when cervical cytograms are acquired, lighting, dyes, mucus, blood and inflammatory cells introduce many interference factors that affect processing of the cell images, i.e., objects appear in the images that look somewhat like cells but are not real cells.
The machine-vision-based cervical cell image segmentation method of the invention cuts the oversized medical cytogram, which is too large to process directly, into small, tractable images, denoises the cut images, and then performs cell segmentation. The method comprises the following steps:
Step 1: dice the original cervical cell image according to specific requirements to obtain a plurality of cervical cell subgraphs;
Step 2: denoise each cervical cell subgraph to remove objects that interfere with cell segmentation, including dye spots, dark streaks in the liquid, and burrs caused by uneven staining;
Step 3: for all denoised cervical cell subgraphs, further unify the image data type and format and reduce the image size, obtaining all preprocessed cervical cell subgraphs;
Step 4: compute superpixels of all preprocessed cervical cell subgraphs with a clustering method, separate the foreground from the background with a threshold method based on the superpixels, and apply an opening operation to the cytoplasm masses to obtain cytoplasm regions;
Step 5: detect nuclei in all preprocessed cervical cell subgraphs with an improved maximally stable extremal regions method to obtain the screened nuclei;
Step 6: combine the obtained cytoplasm regions with the screened nuclei to estimate the cell regions, and refine the estimate to obtain the cervical cell image segmentation result.
Further, step 1 specifically comprises: on the basis of guaranteeing image resolution, dicing the original cervical cell image into blocks of different sizes according to different requirements and the processing capabilities of the devices used, to obtain a plurality of cervical cell subgraphs.
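The dicing of step 1 can be sketched as a simple tiling routine. A minimal numpy-only sketch, where the tile size (2048) and the example image dimensions are illustrative assumptions, not values from the patent:

```python
import numpy as np

def tile_image(img, tile_h, tile_w):
    """Split a large image array into (row, col, tile) blocks.
    Edge tiles may be smaller than (tile_h, tile_w)."""
    tiles = []
    h, w = img.shape[:2]
    for r in range(0, h, tile_h):
        for c in range(0, w, tile_w):
            tiles.append((r, c, img[r:r + tile_h, c:c + tile_w]))
    return tiles

# illustrative stand-in for an oversized cytogram
big = np.zeros((5000, 7000, 3), dtype=np.uint8)
tiles = tile_image(big, 2048, 2048)  # 3 rows x 4 cols = 12 tiles
```

In practice the tile size would be chosen per device, as the paragraph above notes, trading memory use against the number of cells split across tile borders.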
Further, the denoising in step 2 uses a bilateral filtering method.
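Bilateral filtering weights each neighbor by both spatial distance and intensity difference, which is why it smooths noise while preserving cell boundaries. A naive numpy-only sketch (in practice one would call an optimized routine such as OpenCV's `cv2.bilateralFilter`); the radius and sigma values here are illustrative assumptions:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Naive bilateral filter on a 2-D grayscale array: the weight of each
    neighbor combines spatial closeness (sigma_s) and intensity similarity
    (sigma_r), so sharp edges contribute almost nothing across the edge."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.float64)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    pad = np.pad(img.astype(np.float64), radius, mode='edge')
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out

# a hard step edge survives the filtering almost untouched
img = np.zeros((8, 8))
img[:, 4:] = 200.0
out = bilateral_filter(img)
```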
Further, step 3 specifically comprises: for all denoised cervical cell subgraphs, further unify the image data type and format, i.e., unify the color space to RGB, unify the image data values to 0-255, and reduce the image size, obtaining all preprocessed cervical cell subgraphs.
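Step 3's format unification and size reduction might look like the following numpy-only sketch for a grayscale tile; the 2x2 mean pooling stands in for the patent's 0.5x resize, and min-max scaling is an assumed normalization:

```python
import numpy as np

def preprocess(img):
    """Normalize a 2-D numeric image to uint8 [0, 255] and halve its size
    by 2x2 mean pooling (a stand-in for the 0.5x resize in the text)."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    img = (img - lo) / (hi - lo + 1e-12) * 255.0
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    pooled = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return pooled.astype(np.uint8)

img = np.arange(24, dtype=float).reshape(4, 6)
small = preprocess(img)
```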
Further, step 4 specifically comprises the following sub-steps:
Step 41: perform quick-shift clustering on all preprocessed cervical cell subgraphs to obtain cluster labels;
Step 42: build a region adjacency graph from the cluster labels using average color intensity;
Step 43: merge similar regions of the region adjacency graph to obtain a superpixel image;
Step 44: obtain the foreground with a threshold method based on the superpixel image, separating the cytoplasm masses from the background;
Step 45: apply a mathematical morphological opening operation to the cytoplasm masses to obtain cytoplasm regions.
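A compressed, numpy-only sketch of steps 41-45: a regular-grid average stands in for quick-shift superpixels (the real algorithm clusters in color-position space), a global mean threshold stands in for the unspecified threshold method, and the opening uses a 3x3 cross. All parameter choices are illustrative assumptions:

```python
import numpy as np

def binary_open(mask, it=1):
    """Morphological opening (erosion then dilation) with a 3x3 cross:
    removes isolated specks and burrs while keeping larger masses."""
    def stack(m):
        p = np.pad(m, 1)
        return np.stack([p[1:-1, 1:-1], p[:-2, 1:-1], p[2:, 1:-1],
                         p[1:-1, :-2], p[1:-1, 2:]])
    for _ in range(it):
        mask = stack(mask).all(axis=0)   # erosion
    for _ in range(it):
        mask = stack(mask).any(axis=0)   # dilation
    return mask

def grid_superpixels(gray, cell=4):
    """Crude superpixel stand-in: each cell x cell block gets its mean
    intensity (the patent uses quick-shift clustering + region merging)."""
    h, w = gray.shape[0] // cell * cell, gray.shape[1] // cell * cell
    sp = gray[:h, :w].reshape(h // cell, cell, w // cell, cell).mean(axis=(1, 3))
    return np.repeat(np.repeat(sp, cell, axis=0), cell, axis=1)

def segment_cytoplasm(gray, cell=4):
    """Steps 44-45: threshold the superpixel image (stained cytoplasm is
    darker than background) and clean the mask with an opening."""
    sp = grid_superpixels(gray, cell)
    fg = sp < sp.mean()
    return binary_open(fg)

# dark square on a bright background is recovered; an isolated speck is not
gray = np.full((16, 16), 200.0)
gray[4:12, 4:12] = 50.0
mask = segment_cytoplasm(gray)
speck = np.zeros((10, 10), bool)
speck[2, 2] = True
speck[5:9, 5:9] = True
opened = binary_open(speck)
```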
Further, step 5 comprises the following sub-steps:
Step 51: convert all preprocessed cervical cell subgraphs to grayscale and detect nuclei with the maximally stable extremal regions algorithm (MSER) to obtain nucleus candidates;
Step 52: filter the nucleus candidates according to cytological prior knowledge and score the candidates that remain after filtering;
Step 53: using the remaining candidates and their scores, remove duplicate candidates with a non-maximum suppression (NMS) algorithm to obtain the screened nuclei.
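Step 53's non-maximum suppression can be sketched over scored candidate bounding boxes; the IoU threshold of 0.3 is an illustrative assumption, since the text does not give one:

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.3):
    """Non-maximum suppression over candidate boxes (x1, y1, x2, y2):
    keep the highest-scoring candidate, drop overlapping duplicates."""
    boxes = np.asarray(boxes, dtype=float)
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        if order.size == 1:
            break
        rest = order[1:]
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_thresh]
    return keep

# two near-duplicate detections of one nucleus collapse to the better one
boxes = [[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]]
kept = nms(boxes, [0.9, 0.8, 0.7])
```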
Further, step 6 comprises the following sub-steps:
Step 61: further screen out non-compliant nuclei by intersecting the obtained cytoplasm masses with the screened nuclei; the corresponding mathematical description is:

NT = { n ∈ N : n ∩ M ≠ ∅ },  NF = N \ NT

wherein N denotes the screened nuclei, M denotes a cytoplasm mass, NT denotes the nuclei retained by the further screening, and NF denotes the nuclei removed by it;
Step 62: evaluate the cell morphology in the image by combining the further-screened nuclei with the foreground to obtain an estimate;
Step 63: refine the estimate to obtain the cervical cell image segmentation result.
Further, step 62 comprises the following sub-steps:
Step 621: associate each point of the foreground boundary with the nucleus at the smallest straight-line distance;
Step 622: infer the cell boundary in each overlap region by interpolating through the clump boundary points associated with each nucleus;
Step 623: apply a distance transform to the cell boundaries to obtain the geometric centroid;
Step 624: compute a single shape prior from the cell boundary combined with the geometric centroid; the obtained shape prior is the evaluation result.
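Step 621's nearest-nucleus association is a plain nearest-neighbour assignment. A numpy-only sketch (the additional constraint in the text, that the connecting line must stay inside the clump, is omitted here for brevity):

```python
import numpy as np

def associate_boundary_points(boundary_pts, nuclei_centers):
    """Assign each clump-boundary point to the index of the nucleus with
    the smallest straight-line distance (step 621)."""
    b = np.asarray(boundary_pts, float)[:, None, :]
    n = np.asarray(nuclei_centers, float)[None, :, :]
    d = np.linalg.norm(b - n, axis=2)
    return d.argmin(axis=1)

idx = associate_boundary_points([(0, 0), (10, 10)], [(1, 1), (9, 9)])
```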
Further, the shape prior in step 624 is calculated as:

h(x) = max_{c ∈ C} h_c(x),  with h_c(x) = exp(−β·t_c(x))

where C denotes the set of cells within the clump, h_c denotes the shape prior of cell c, β denotes a free parameter estimated by cross-validation, and t_c(x) denotes the distance from point x to the geometric centroid of the initial segmentation of cell c.
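One consistent reading of the step-624 shape prior, assuming the exponential form h_c(x) = exp(−β·t_c(x)) — an assumption, since the garbled formula in the text does not confirm it — can be computed as:

```python
import math

def shape_prior(x, centroids, beta=0.5):
    """h(x) = max over cells c of h_c(x), with h_c(x) = exp(-beta * t_c(x))
    and t_c(x) the distance from x to cell c's initial-segmentation
    centroid. The exponential form and beta = 0.5 are assumptions; the
    text only says beta is estimated by cross-validation."""
    return max(math.exp(-beta * math.dist(x, c)) for c in centroids)

p_at_centroid = shape_prior((0, 0), [(0, 0), (3, 4)])  # distance 0 -> 1.0
p_far = shape_prior((10, 10), [(0, 0), (3, 4)])
```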
Compared with the prior art, the invention has the following advantages:
First, reasonable use of superpixels and mathematical morphology: exploiting the distribution of cells in the cervical cytogram, the invention processes cytoplasm masses as wholes by computing superpixels over the image, which improves processing speed, and then applies mathematical morphology to the obtained masses, which improves accuracy.
Second, the improved maximally stable extremal regions algorithm: based on the characteristics of the nucleus, nucleus candidates are obtained conveniently and quickly with the MSER algorithm, which is improved with cytological prior knowledge and an improved non-maximum suppression method, raising the robustness of nucleus detection.
Third, the obtained nucleus candidates are combined with the cytoplasm masses to further screen the nuclei, further improving the robustness of nucleus detection.
Drawings
FIG. 1 is a simplified structural diagram of the machine-vision-based cervical cell image segmentation method of the present invention;
FIG. 2 is a flow chart of cytoplasm mass and nucleus detection in the method;
FIG. 3 is a diagram of the core algorithm structure of the method;
FIG. 4 is a flow chart of the overall method.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. The described embodiments are obviously only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art without inventive effort on the basis of these embodiments fall within the scope of protection of the invention.
Examples
As shown in FIG. 4, the invention discloses a machine-vision-based cervical cell image segmentation method that automatically locates cells in the original medical image of cervical cells, helping doctors identify cervical cell images.
The invention comprises the following procedures:
(1) Dice the medical original image of the cervical cells: cut the cervical cell map into blocks of appropriate size according to specific requirements, obtaining small cervical cell images that are comparatively easy to process;
(2) Denoise each small image to remove objects that would interfere with cell segmentation, including dye spots, dark streaks in the liquid, and burrs caused by uneven staining;
(3) Process the data type and format of the images, unifying the color space to RGB with values 0-255 to simplify subsequent processing; in addition, the image size can be reduced appropriately to increase processing speed;
(4) Compute superpixels of each cervical cell subgraph with a clustering scheme, separate the foreground (cytoplasm masses) from the background with a threshold method based on the superpixels, then apply a mathematical morphological opening operation to the masses to obtain the cytoplasm;
(5) Detect nuclei with an improved maximally stable extremal regions method, where the improvements include cytological prior knowledge of nucleus morphology and the addition of an improved non-maximum suppression method;
(6) Combine the resulting cytoplasm masses with the nuclei for further nucleus screening and cell region estimation.
In process (1), the original cervical cell medical image far exceeds the size that ordinary machine equipment can handle and cannot be processed directly. Because it is a medical image, the required resolution must be preserved to guarantee identification accuracy, so the original image is diced on the basis of preserving that resolution; since specific identification requirements and the processing capabilities of different devices vary, the image is cut into pieces of a size appropriate to those requirements.
In processes (2) and (3), bilateral filtering is used because it preserves edges while denoising and smoothing; the image is then reduced to half size and its data values unified to 0-255, which reduces the amount of computation, lowers the probability of error, and increases processing speed.
In process (4), quick-shift clustering of the preprocessed small cervical cell images yields cluster labels. Computing the average color intensity of the cluster labels yields a region adjacency graph, and merging similar regions of that graph yields a superpixel image of the original. A threshold method applied to the superpixel image then extracts the foreground (the cytoplasm regions), effectively separating the cell masses from the background (everything outside the masses). Finally, a mathematical morphological opening operation removes isolated dots, burrs, bridges and other small noise, improving the robustness of cytoplasm mass detection.
Process (5) of the present invention comprises the following sub-steps:
Step 51: convert all preprocessed cervical cell subgraphs to grayscale and detect nuclei with the maximally stable extremal regions algorithm to obtain nucleus candidates;
Step 52: filter the nucleus candidates according to cytological prior knowledge and score the candidates remaining after filtering;
Step 53: using the remaining candidates and their scores, remove duplicate candidates with a non-maximum suppression algorithm to obtain the screened nuclei.
To detect as many nuclei as possible, the maximally stable extremal regions algorithm is run with its key parameters set as follows: the gradient step `_delta` is 5, the minimum nucleus area `_min_area` is 50, and the maximum nucleus area `_max_area` is 1200.
Preliminary detection then yields the nucleus candidates. Next, the candidates are screened using the appearance and shape attributes of nuclei, and during screening each candidate is assigned a score expressing the confidence that it is a true nucleus: 0 means definitely not a nucleus and 1 means definitely a nucleus. Two key attribute thresholds are used: average intensity 0.4 and eccentricity 0.9. This step screens out the large fraction of candidates that are not true nuclei; on that basis, a non-maximum suppression algorithm removes duplicate candidates, greatly improving the robustness of nucleus detection.
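The intensity and eccentricity screening might be sketched as follows. The thresholds (0.4, 0.9) come from the text, while the exact score formula and the moments-based eccentricity computation are illustrative assumptions:

```python
import numpy as np

def score_candidate(region_mask, gray):
    """Score a nucleus candidate in [0, 1] from the two cues in the text:
    mean intensity (nuclei stain dark; hard filter at 0.4 of full range)
    and eccentricity (nuclei are near-circular; hard filter at 0.9).
    Darker surviving candidates score higher (an assumed formula)."""
    ys, xs = np.nonzero(region_mask)
    mean_int = gray[ys, xs].mean() / 255.0
    # eccentricity from second-order moments of the pixel coordinates
    cov = np.cov(np.stack([xs, ys]).astype(float))
    lam = np.sort(np.linalg.eigvalsh(cov))[::-1]
    ecc = np.sqrt(max(0.0, 1 - lam[1] / lam[0])) if lam[0] > 0 else 0.0
    if mean_int > 0.4 or ecc > 0.9:
        return 0.0          # fails the hard filters: definitely not a nucleus
    return 1.0 - mean_int   # darker candidates are more credible

# a round dark blob passes; a thin dark streak is rejected as too eccentric
gray = np.full((40, 40), 200, dtype=np.uint8)
yy, xx = np.mgrid[:40, :40]
disk = (yy - 20) ** 2 + (xx - 20) ** 2 <= 25
line = np.zeros((40, 40), bool)
line[5, 2:30] = True
gray[disk] = 30
gray[line] = 30
```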
In process (6), the obtained cytoplasm masses and the candidate nuclei are intersected to further screen out non-compliant nuclei: since a real nucleus lies inside the cytoplasm, a detected candidate that does not fall within any cytoplasm mass is judged not to be a nucleus but a false detection. The mathematical expression is:

NT = { n ∈ N : n ∩ M ≠ ∅ },  NF = N \ NT

wherein N denotes the screened nuclei, M denotes a cytoplasm mass, NT denotes the nuclei retained by the further screening, and NF denotes the nuclei removed by it. Note that M here refers to one cytoplasm mass, not to all of the cytoplasm.
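The intersection screening of process (6), over binary masks, might look like this sketch; whether the criterion should be non-empty overlap (used here) or full containment of the nucleus in the mass is not settled by the text:

```python
import numpy as np

def screen_nuclei(nuclei_masks, cytoplasm_mask):
    """Keep a candidate nucleus only if it overlaps a cytoplasm mass;
    candidates outside all masses are treated as false detections."""
    kept, removed = [], []
    for n in nuclei_masks:
        (kept if (n & cytoplasm_mask).any() else removed).append(n)
    return kept, removed

cyto = np.zeros((20, 20), bool)
cyto[2:12, 2:12] = True
inside = np.zeros((20, 20), bool)
inside[5:8, 5:8] = True          # candidate inside the mass: retained
outside = np.zeros((20, 20), bool)
outside[15:18, 15:18] = True     # candidate outside the mass: removed
kept, removed = screen_nuclei([inside, outside], cyto)
```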
The cell region is then estimated as follows:
1) Associate each point of the clump boundary with the nearest nucleus; this expresses the assumption that the cell owns that boundary point. The only constraint applied here is that the line connecting the boundary point to the nucleus must lie entirely within the clump (i.e., the cell region is assumed to form a convex set). In a clump containing many cells, some nuclei may end up with no boundary points associated with them; such a cell is still assumed to lie entirely within the clump and to be circular, with a radius equal to the distance to the nearest nucleus within the clump.
2) Infer the cell boundary in each overlap region by interpolating through the clump boundary points associated with the cell.
3) Apply a distance transform to the boundary obtained in 2) to obtain the geometric centroid.
4) Use the resulting boundary to compute a single shape prior.
The calculation formula of the shape prior in 4) above is:

h(x) = max_{c ∈ C} h_c(x),  with h_c(x) = exp(−β·t_c(x))

where C denotes the set of cells within the clump, h_c denotes the shape prior of cell c, β denotes a free parameter estimated by cross-validation, and t_c(x) denotes the distance from point x to the geometric centroid of the initial segmentation of cell c.
The above process can be simplified as:
(1) a distance transform;
(2) the watershed algorithm;
(3) a shape prior for the cells.
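The simplified pipeline above can be illustrated with a toy marker-growing stand-in for the watershed step: flooding the foreground by multi-source BFS from the nucleus markers assigns each foreground pixel to its geodesically nearest nucleus. A real implementation would use a gradient- or distance-driven watershed; this sketch only shows the marker-based partition idea:

```python
import numpy as np
from collections import deque

def label_by_nearest_marker(mask, markers):
    """Toy stand-in for marker-based watershed: multi-source BFS from the
    nucleus markers, so each foreground pixel joins the cell of its
    nearest (geodesic) nucleus. Labels start at 1; background stays 0."""
    labels = np.zeros(mask.shape, dtype=int)
    q = deque()
    for lab, (y, x) in enumerate(markers, start=1):
        labels[y, x] = lab
        q.append((y, x))
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                    and mask[ny, nx] and labels[ny, nx] == 0):
                labels[ny, nx] = labels[y, x]
                q.append((ny, nx))
    return labels

# one clump, two nuclei: the clump is partitioned between the two cells
mask = np.ones((6, 10), bool)
labels = label_by_nearest_marker(mask, [(3, 1), (3, 8)])
```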
As shown in FIG. 1, the simplified overall flow chart of the machine-vision-based cervical cell image segmentation method makes the processing clear: it divides into two parts, first cutting the original cervical cell image into small blocks for subsequent segmentation, and then running the core processing of the method on the resulting small images.
As shown in FIG. 2, the cytoplasm mass and nucleus detection flow adopts a separate-then-combine idea: the cytoplasm masses and the nuclei are detected separately, and the nuclei are then screened according to their positional relationship with the cytoplasm.
As shown in FIG. 3, the core algorithm structure is as follows: after the nuclei are finally screened using the cytoplasm masses, the masses and the final nuclei are combined, and the positions of the cells are determined with a distance transform, the watershed algorithm, and the cell shape prior, achieving cell segmentation.
while the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (9)
1. A cervical cell image segmentation method based on machine vision is characterized by comprising the following steps:
Step 1: dicing the original cervical cell image according to specific requirements to obtain a plurality of cervical cell subgraphs;
Step 2: denoising each cervical cell subgraph to remove objects that interfere with cell segmentation, including dye spots, dark streaks in the liquid, and burrs caused by uneven staining;
Step 3: for all denoised cervical cell subgraphs, further unifying the image data type and format and reducing the image size, obtaining all preprocessed cervical cell subgraphs;
Step 4: computing superpixels of all preprocessed cervical cell subgraphs with a clustering method, separating the foreground from the background with a threshold method based on the superpixels, and applying an opening operation to the cytoplasm masses to obtain cytoplasm regions;
Step 5: detecting nuclei in all preprocessed cervical cell subgraphs with an improved maximally stable extremal regions method to obtain the screened nuclei;
Step 6: combining the obtained cytoplasm regions with the screened nuclei to estimate the cell regions, and refining the estimate to obtain the cervical cell image segmentation result.
2. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein step 1 specifically comprises: on the basis of guaranteeing image resolution, dicing the original cervical cell image into blocks of different sizes according to different requirements and the processing capabilities of the devices used, to obtain a plurality of cervical cell subgraphs.
3. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein the denoising in step 2 uses a bilateral filtering method.
4. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein step 3 specifically comprises: for all denoised cervical cell subgraphs, further unifying the image data type and format, i.e., unifying the color space to RGB, unifying the image data values to 0-255, and reducing the image size, obtaining all preprocessed cervical cell subgraphs.
5. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein the step 4 specifically comprises the following sub-steps:
Step 41: performing quick-shift clustering on all preprocessed cervical cell subgraphs to obtain cluster labels;
Step 42: building a region adjacency graph from the cluster labels using average color intensity;
Step 43: merging similar regions of the region adjacency graph to obtain a superpixel image;
Step 44: obtaining the foreground with a threshold method based on the superpixel image, separating the cytoplasm masses from the background;
Step 45: applying a mathematical morphological opening operation to the cytoplasm masses to obtain cytoplasm regions.
6. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein the step 5 comprises the following sub-steps:
Step 51: converting all preprocessed cervical cell subgraphs to grayscale and detecting nuclei with the maximally stable extremal regions algorithm to obtain nucleus candidates;
Step 52: filtering the nucleus candidates according to cytological prior knowledge and scoring the candidates remaining after filtering;
Step 53: using the remaining candidates and their scores, removing duplicate candidates with a non-maximum suppression algorithm to obtain the screened nuclei.
7. The method for cervical cell image segmentation based on machine vision according to claim 1, wherein the step 6 comprises the following sub-steps:
Step 61: further screening out unqualified cell nuclei by performing an intersection operation between the obtained cytoplasm clumps and the screened cell nuclei, wherein the corresponding mathematical description is as follows:
NT = N ∩ M, NF = N \ NT
wherein N represents the screened cell nuclei, M represents the cytoplasm clumps, NT represents the cell nuclei retained by the further screening, and NF represents the cell nuclei removed by the further screening;
Step 62: evaluating the cell morphology in the image by combining the further screened cell nuclei with the foreground to obtain an evaluation result;
Step 63: refining the evaluation result to obtain the cervical cell image segmentation result.
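The set operations in step 61 reduce to boolean-mask intersections; a sketch with hypothetical masks, where a nucleus is kept only if it overlaps a cytoplasm clump:

```python
import numpy as np

# Hypothetical binary masks: M = cytoplasm clumps, N = screened nuclei.
M = np.zeros((60, 60), bool)
M[10:40, 10:40] = True                       # one cytoplasm clump

N = {1: np.zeros((60, 60), bool),            # nucleus id -> mask
     2: np.zeros((60, 60), bool)}
N[1][20:26, 20:26] = True                    # lies inside the clump
N[2][45:51, 45:51] = True                    # lies outside -> unqualified

# Step 61: NT = N ∩ M (retained), NF = N \ NT (removed).
NT = {k: m for k, m in N.items() if (m & M).any()}
NF = {k: m for k, m in N.items() if k not in NT}
```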
8. The method of claim 7, wherein the step 62 comprises the following sub-steps:
Step 621: associating each point of the foreground boundary with the cell nucleus at the closest straight-line distance;
Step 622: obtaining the cell boundary of each overlapping region by tracing the extremal boundary points associated with each cell nucleus using interpolation;
Step 623: performing a distance transformation on the cell boundaries to obtain the geometric centroid;
Step 624: computing a shape prior for each single cell from the cell boundary combined with the geometric centroid, the obtained shape prior being the evaluation result.
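Steps 621-623 can be sketched with SciPy; for simplicity the toy foreground uses two non-overlapping clumps so each nucleus owns a closed boundary, and the nucleus positions are hypothetical:

```python
import numpy as np
from scipy import ndimage

# Toy foreground with two cell clumps (real inputs would come from the
# cytoplasm and nucleus steps); nucleus positions are hypothetical.
fg = np.zeros((50, 50), bool)
fg[5:20, 5:20] = True
fg[28:45, 28:45] = True
nuclei = np.array([[12, 12], [36, 36]])      # (row, col) nucleus centers

# Step 621: associate each foreground-boundary point with the nearest nucleus.
boundary = np.argwhere(fg & ~ndimage.binary_erosion(fg))
dists = np.linalg.norm(boundary[:, None, :] - nuclei[None, :, :], axis=2)
owner = dists.argmin(axis=1)                 # nearest nucleus per point

# Steps 622-623: close each cell boundary, then use a distance transform
# to locate the geometric centroid of the enclosed region.
centroids = []
for k in range(len(nuclei)):
    mask = np.zeros_like(fg)
    pts = boundary[owner == k]
    mask[pts[:, 0], pts[:, 1]] = True
    cell = ndimage.binary_fill_holes(mask)        # region inside the boundary
    dist = ndimage.distance_transform_edt(cell)   # peaks at the cell center
    centroids.append(ndimage.center_of_mass(dist))
```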
9. The method of claim 8, wherein the shape prior in step 624 is calculated by the following formula:
h(x) = max_{c∈C} h_c(x), with h_c(x) = e^(−β·t(x))
where C represents the set of cells in the cluster, h_c represents the shape prior of each cell c, β represents a free parameter estimated by cross-validation, and t(·) represents the distance of the point x from the geometric centroid of the initial segmentation.
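The shape prior of claim 9 can be exercised numerically; the per-cell form h_c(x) = e^(−β·t(x)) is an assumption reconstructed from the symbols the claim defines (β, t(·)), and the β value and centroids below are hypothetical:

```python
import numpy as np

beta = 0.1                       # free parameter (would be cross-validated)

def t(x, centroid):
    """Distance of point x from the geometric centroid of a segmentation."""
    return np.linalg.norm(np.asarray(x, float) - np.asarray(centroid, float))

def shape_prior(x, centroids):
    """h(x) = max over cells c in the cluster of h_c(x) = exp(-beta * t(x))."""
    return max(np.exp(-beta * t(x, c)) for c in centroids)

centroids = [(10, 10), (30, 30)]  # hypothetical per-cell centroids
h = shape_prior((12, 11), centroids)
```

The prior decays with distance from a cell's centroid and takes its maximum value 1 exactly at a centroid, which is what makes it usable for refining overlapping-cell boundaries.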
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910725869.5A CN110544262B (en) | 2019-08-07 | 2019-08-07 | Cervical cell image segmentation method based on machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110544262A true CN110544262A (en) | 2019-12-06 |
CN110544262B CN110544262B (en) | 2023-05-02 |
Family
ID=68710079
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910725869.5A Active CN110544262B (en) | 2019-08-07 | 2019-08-07 | Cervical cell image segmentation method based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110544262B (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102831607A (en) * | 2012-08-08 | 2012-12-19 | 深圳市迈科龙生物技术有限公司 | Method for segmenting cervix uteri liquid base cell image |
CN107256558A (en) * | 2017-05-18 | 2017-10-17 | 深思考人工智能机器人科技(北京)有限公司 | The cervical cell image automatic segmentation method and system of a kind of unsupervised formula |
CN109815888A (en) * | 2019-01-21 | 2019-05-28 | 武汉兰丁医学高科技有限公司 | A kind of novel Papanicolau staining process and abnormal cervical cells automatic identifying method |
Non-Patent Citations (1)
Title |
---|
ZHENG Xin et al.: "Intelligent recognition method for cervical cell clusters based on the YOLO model", Chinese Journal of Liquid Crystals and Displays (《液晶与显示》) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111429761A (en) * | 2020-02-28 | 2020-07-17 | 中国人民解放军陆军军医大学第二附属医院 | Artificial intelligent simulation teaching system and method for bone marrow cell morphology |
CN111724381A (en) * | 2020-06-24 | 2020-09-29 | 武汉互创联合科技有限公司 | Microscopic image cell counting and posture identification method based on multi-view cross validation |
CN111724381B (en) * | 2020-06-24 | 2022-11-01 | 武汉互创联合科技有限公司 | Microscopic image cell counting and posture identification method based on multi-view cross validation |
CN116309280A (en) * | 2022-12-16 | 2023-06-23 | 上海药明康德新药开发有限公司 | Lymphocyte labeling method and system |
CN117253228A (en) * | 2023-11-14 | 2023-12-19 | 山东大学 | Cell cluster space constraint method and system based on nuclear image distance intra-coding |
CN117253228B (en) * | 2023-11-14 | 2024-02-09 | 山东大学 | Cell cluster space constraint method and system based on nuclear image distance intra-coding |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110544262A (en) | cervical cell image segmentation method based on machine vision | |
CN109522908B (en) | Image significance detection method based on region label fusion | |
WO2018107939A1 (en) | Edge completeness-based optimal identification method for image segmentation | |
US10229488B2 (en) | Method and system for determining a stage of fibrosis in a liver | |
Jiang et al. | A novel white blood cell segmentation scheme using scale-space filtering and watershed clustering | |
CN106815853B (en) | Method and device for segmenting retinal blood vessels in fundus image | |
CN110458835B (en) | Image processing method, device, equipment, system and medium | |
CN106327507B (en) | A kind of color image conspicuousness detection method based on background and foreground information | |
CN110009638B (en) | Bridge inhaul cable image appearance defect detection method based on local statistical characteristics | |
CN103984958A (en) | Method and system for segmenting cervical caner cells | |
CN111402267A (en) | Segmentation method, device and terminal for epithelial cell nucleus in prostate cancer pathological image | |
CN105828691B (en) | Image processing apparatus, image processing method | |
CN107492084B (en) | Typical clustering cell nucleus image synthesis method based on randomness | |
CN106504261B (en) | A kind of image partition method and device | |
CN111126162A (en) | Method, device and storage medium for identifying inflammatory cells in image | |
CN114494318B (en) | Cornea contour extraction method based on cornea dynamic deformation video of Ojin algorithm | |
CN114677525B (en) | Edge detection method based on binary image processing | |
CN112529853A (en) | Method and device for detecting damage of netting of underwater aquaculture net cage | |
JP6819445B2 (en) | Information processing equipment, control methods, and programs | |
CN109242854A (en) | A kind of image significance detection method based on FLIC super-pixel segmentation | |
CN104835155A (en) | Fractal-based early-stage breast cancer calcification point computer auxiliary detection method | |
JPWO2016117018A1 (en) | Image processing apparatus, image processing method, and image processing program | |
CN113850792A (en) | Cell classification counting method and system based on computer vision | |
CN113723314A (en) | Sugarcane stem node identification method based on YOLOv3 algorithm | |
CN111429461A (en) | Novel segmentation method for overlapped exfoliated epithelial cells |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||