CN112365471B - Cervical cancer cell intelligent detection method based on deep learning - Google Patents
- Publication number
- CN112365471B (application CN202011259420.3A)
- Authority
- CN
- China
- Prior art keywords
- cell
- classification
- area
- nucleus
- cells
- Prior art date
- Legal status
- Expired - Fee Related
Classifications
- G06T7/0012 — Biomedical image inspection (G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
- G06F18/24 — Classification techniques (G06F18/00 Pattern recognition; G06F18/20 Analysing)
- G06T7/10 — Segmentation; Edge detection
- G06T2207/10056 — Microscopic image (G06T2207/10 Image acquisition modality)
- G06T2207/20081 — Training; Learning (G06T2207/20 Special algorithmic details)
Abstract
An intelligent cervical cancer cell detection method based on deep learning. The invention relates to the classification of cell nuclei by deep learning. It aims to solve the low accuracy and long turnaround time of the conventional manual diagnosis of cancer cells. To this end, the invention provides an intelligent deep-learning-based cervical cancer cell screening method comprising four stages: first, data preparation; second, nucleus segmentation; third, nucleus classification; and fourth, cancer cell screening. In the nucleus classification stage, the data set is expanded and the classes subdivided by an active learning method; the model is based on ResNeSt, into which physicians' diagnostic experience is introduced by extracting quantified diagnostic indexes that act jointly with the learned features, yielding a more accurate model. Experiments show that the classification accuracy of this nucleus classification method exceeds that of the original model; in addition, the invention provides a more effective data preparation method for expanding the data and subdividing the classes. The invention is applied in the field of medical image classification.
Description
Technical Field
The invention relates to the classification of cell nuclei by deep learning methods.
Background
Cervical cancer is the second leading cancer killer of women, second only to breast cancer in morbidity and mortality. Cancer can be cured only when it is found early; once it has metastasized at a late stage it is incurable, so early detection and early treatment are the only effective response. Current cervical cancer diagnosis relies mainly on manual slide reading by physicians, which entails a heavy workload, a high misdiagnosis rate, and an inability to support large-scale screening. With the development of artificial intelligence, intelligent cervical cancer cell detection methods that have appeared in recent years can automatically capture pathological images and automatically analyze and identify cancer cells, effectively reducing physicians' diagnostic workload and improving diagnostic accuracy. However, no automatic pathological diagnosis system has yet been put into real clinical use, and many methods rely on deep learning alone, without incorporating the experience physicians apply in actual diagnosis. These problems pose a significant challenge to deploying intelligent cervical cancer cell detection methods.
This work focuses on an intelligent cervical cancer cell detection method that can be applied in practice and effectively assists physicians' diagnostic work. Only in this way can slide-reading time be reduced, preventing misdiagnosis caused by eye fatigue and helping physicians grasp a patient's cell data more accurately across the whole slide. In recent years, deep learning, computer vision, and pattern recognition have made breakthrough progress, and artificial intelligence has been explored in depth for medical and pathological images. As these technologies mature, national policies encouraging the combination of artificial intelligence and medicine have also been issued continuously. At present there are two approaches to intelligent cervical cancer cell detection: one segments cell nuclei and then classifies them to obtain cancer cells; the other directly detects and identifies cancer cells, ignoring the remaining normal cells. The key technologies of both are cell segmentation and cell detection and identification.
Currently, the classic segmentation method applied to medical images is U-Net, and in recent years many researchers have further improved the U-Net segmentation model, mainly in three directions: the loss function, attention mechanisms, and skip connections. There are also other segmentation methods such as FCN and ACNet. Likewise, many methods exist for image classification, the most common being ResNet, VGG, and AlexNet; among them, ResNet has become the backbone of many object detection networks. In 2019, Mu Li's team further improved the basic ResNet network by adding a Split-Attention mechanism, proposing the ResNeSt network, which performs more strongly in both segmentation and classification.
Another line of methods directly detects cervical cancer cells, e.g. the YOLO series and Mask R-CNN. The YOLO series takes the whole image as network input and directly regresses the positions of bounding boxes and their classes at the output layer, making it among the fastest detection methods. Mask R-CNN is a two-stage framework: the first stage scans the image and generates proposals (regions that may contain an object), and the second stage classifies the proposals and produces bounding boxes and masks, so it can perform both segmentation and object detection.
The segment-then-classify approach can analyze every cell of the patient comprehensively, but it is less efficient and detection takes longer; direct cancer cell recognition is efficient, but it cannot capture all of a patient's cell information on a whole-slide basis and may miss detections. Therefore, following physicians' specific requirements, this method adopts segment-then-classify detection, introduces physicians' diagnostic experience, and concatenates the features extracted by deep learning with hand-crafted features so that both act jointly on the classification model, classifying cervical cells more accurately.
Disclosure of Invention
The invention aims to solve the low accuracy, strong subjectivity, and long duration of manual diagnosis, so as to detect cancer cells intelligently, quickly, and accurately. The method comprises four stages: data preparation, nucleus segmentation, nucleus classification, and cancer cell screening. It first detects cell nuclei with an improved U-Net segmentation method, then classifies the detected nuclei, and finally screens cancer cells per class according to physicians' experience and the characteristics of each class; the overall flow is shown in FIG. 1.
The invention is mainly realized by the following technical scheme:
Step one, collect slide images of the patient under a microscope and convert all images to the size required by the model;
Step two, segment all images of the patient. For this step the invention provides an improved U-Net segmentation method: an attention mechanism is first added to the U-Net segmentation model, and the symmetric convolutions are then replaced by asymmetric convolutions, improving both the efficiency and the precision of the segmentation network. The improved U-Net network design is shown in FIG. 2;
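As a hedged illustration of the asymmetric-convolution idea (not the patent's actual network code): in practice networks stack 1×k and k×1 convolutions in place of a k×k one; for the special case of a rank-1 (separable) kernel the two-pass form is exactly equivalent while using 2k weights instead of k². The NumPy sketch below verifies that equivalence with a naive convolution.

```python
import numpy as np

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D cross-correlation (for illustration only)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
img = rng.random((16, 16))
col = rng.random((3, 1))   # 3x1 asymmetric kernel
row = rng.random((1, 3))   # 1x3 asymmetric kernel
k33 = col @ row            # rank-1 3x3 kernel = outer product (9 weights vs 3 + 3)

full = conv2d_valid(img, k33)                       # one symmetric pass
factored = conv2d_valid(conv2d_valid(img, col), row)  # two asymmetric passes
assert np.allclose(full, factored)
```

For general (non-separable) kernels the stacked asymmetric pair is an approximation with more nonlinearity per parameter, which is the efficiency/precision trade-off the improved network exploits.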
Step three, expand the classification data and subdivide the classes with an active learning method, as follows:
This step uses active learning to expand and subdivide the classified data. First, cell images of patients with different morphologies are prepared and segmented with the segmentation model of step two. Each segmented nucleus is then cropped and saved using the minimal bounding rectangle of its contour expanded outward by 10 px, yielding a batch of initial data D1. The data are first manually divided into four rough classes. D1 is then used to train an MLP classification model, which crops and roughly classifies a large number of new cell images. After rough classification, the data of each class are cleaned according to its classification results, and the classes are error-analyzed and improved, e.g. subdivided or merged. Finally, misclassified results are manually corrected or assigned to new classes, which are added to the training data set D1. This subdivision loop continues until each class of the data set reaches 2000 samples.
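The 10 px-expanded minimal-bounding-rectangle crop described above can be sketched as follows (function and argument names are illustrative, not from the patent); note the clamping to the image borders so nuclei near an edge still yield a valid patch:

```python
import numpy as np

def crop_nucleus(image, bbox, pad=10):
    """Crop a nucleus patch: the minimal bounding rectangle of the
    nucleus contour expanded outward by `pad` pixels, clamped to the
    image borders. bbox = (x_min, y_min, x_max, y_max)."""
    h, w = image.shape[:2]
    x0, y0, x1, y1 = bbox
    x0 = max(0, x0 - pad)
    y0 = max(0, y0 - pad)
    x1 = min(w, x1 + pad)
    y1 = min(h, y1 + pad)
    return image[y0:y1, x0:x1]

img = np.arange(100 * 100).reshape(100, 100)
patch = crop_nucleus(img, (40, 30, 60, 55), pad=10)   # interior nucleus
edge = crop_nucleus(img, (0, 0, 5, 5), pad=10)        # nucleus at the corner
```

Expanding the crop beyond the nucleus contour is what lets the later classifier see some surrounding cytoplasm, as step four of the description notes.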
After each class reaches 2000 samples, the ResNeSt classification model is trained with the newly adjusted training set D1 and the classification performance is observed; if the expected performance is reached, the process stops. Otherwise the training set is expanded, the data are balanced, and the classes are adjusted: the ResNeSt classification model keeps classifying new images, the classified nucleus images are cleaned and corrected, unreasonable class divisions or data imbalance are identified, and manual adjustments are made. The ResNeSt classification model is then retrained until the expected performance is achieved.
Finally, the class set with the best classification performance and the sample count of each class are determined. The flow for expanding the classification data and subdividing the classes is shown in FIG. 3.
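The expand-and-subdivide loop above can be sketched as plain control flow (the callables `train`, `classify`, and `review` are placeholders for the MLP rough classifier and the manual cleaning step, not the patent's code):

```python
def expand_dataset(dataset, unlabeled, train, classify, review, target=2000):
    """Grow a labeled nucleus data set by active learning: train a rough
    classifier on the current set, pre-label new crops, let a human
    reviewer confirm or reassign each label (possibly creating a new
    class), and repeat until every class holds `target` samples or the
    unlabeled pool is exhausted."""
    while unlabeled and min(len(v) for v in dataset.values()) < target:
        model = train(dataset)                            # e.g. the MLP rough classifier
        batch = [unlabeled.pop() for _ in range(min(100, len(unlabeled)))]
        for crop in batch:
            label = review(crop, classify(model, crop))   # manual cleaning/correction
            dataset.setdefault(label, []).append(crop)    # may introduce a new class
    return dataset

# Toy run with stub callables standing in for the model and the reviewer.
data = {"a": list(range(5)), "b": list(range(5))}
pool = list(range(40))
res = expand_dataset(data, pool,
                     train=lambda d: None,
                     classify=lambda m, c: "a" if c % 2 else "b",
                     review=lambda c, lbl: lbl,
                     target=20)
```

The human-in-the-loop `review` call is the point of the method: the model only proposes labels, so labeling effort is spent on correction rather than annotation from scratch.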
Step four, classify the nuclei with a ResNeSt classification model that incorporates physicians' diagnostic experience. Each nucleus segmented in step two is cropped with its bounding rectangle expanded outward by 10 px, which keeps the nucleus intact and carries some cytoplasm information. The nuclei are then classified with the experience-augmented ResNeSt classification model, as follows:
First, image features CFeature are extracted automatically with a ResNeSt classification model transferred from the ImageNet data set; then, diagnostic criteria that can be quantified as features MFeature are extracted from The Bethesda System for Reporting Cervical Cytology. The extracted criteria are:
(1) Nuclear hyperchromasia, with changes in nuclear size and shape.
(2) Chromatin evenly distributed, often coarsely granular; sometimes clumped or condensed and opaque.
(3) Nuclear enlargement to 2.5-3 times the size of a normal intermediate-cell nucleus.
The nucleus features MFeature are derived from the above criteria as follows:
(1) Nuclear area

Area_n = number of pixels of the nucleus,

where Area_n is the area of the nucleus.
(2) Nuclear area coefficient

AreaIndex = Area_n / A_mean,

where A_mean is the mean nuclear area of cells in the same layer and Area_n is the area of the nucleus.
(3) Nuclear atypia

Nuclear roundness, ellipticity, and elongation are combined into a nuclear atypia measure, where L is the length of the nucleus, D is the diameter of the nucleus, and Area_n is the area of the nucleus.
(4) Nuclear gray-level variance

Var = (1/n) * Σ_{i=1}^{n} (x_i − M)²,

where x_i is the gray value of pixel i of the nucleus, M is the mean gray value of the nucleus, and n is the number of nucleus pixels.
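A minimal NumPy sketch of computing the hand-crafted MFeature values from a nucleus crop and its binary mask (only the features whose formulas survive in the text are computed; treating the area coefficient as the ratio to the layer mean is an assumption, and the atypia/contour term is omitted):

```python
import numpy as np

def mfeature(patch, mask, layer_mean_area):
    """Hand-crafted nucleus features (illustrative sketch).
    patch: grayscale crop; mask: boolean nucleus mask of the same shape;
    layer_mean_area: mean nuclear area A_mean of cells in the same layer."""
    pixels = patch[mask].astype(float)
    area = int(mask.sum())                         # (1) nuclear area = pixel count
    area_index = area / layer_mean_area            # (2) area coefficient (assumed Area_n / A_mean)
    mean_gray = pixels.mean()                      # M, the mean nuclear gray value
    gray_var = ((pixels - mean_gray) ** 2).mean()  # (4) gray-level variance
    return np.array([area, area_index, gray_var])

# Toy nucleus: a 4x4 square of gray value 50 on a bright background.
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True
patch = np.full((8, 8), 100.0)
patch[2:6, 2:6] = 50.0
f = mfeature(patch, mask, layer_mean_area=8.0)
```

In practice such a vector would be concatenated with the CFeature vector before the fully connected layer, as the next paragraph describes.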
Finally, CFeature and MFeature are concatenated and passed through a fully connected layer for further classification; the classification network is designed as shown in FIG. 4. The MFeature combination that yields the highest model accuracy is selected.
Step five, obtain the classes of all the patient's nuclei according to the above steps, and screen cancer cells per class, with different strategies, taking the patient as the unit. Nuclei classified as cancer cells are directly recommended as cancer cells BC1. For the other classes, i.e. superficial cells, intermediate-layer nuclei, and atypical cells, the area means M_0, M_1, M_2 are computed, and cells whose area exceeds 2.5 times the corresponding mean are recommended as cancer cells BC2. The screening rule is:

BC2 = CArea_0 > 2.5 * M_0 or CArea_1 > 2.5 * M_1 or CArea_2 > 2.5 * M_2,

where CArea_i is the nuclear area of a cell of class i, and M_0, M_1, M_2 are the mean nuclear areas of superficial, intermediate-layer, and atypical nuclei respectively.
The score for BC1 and BC2 is computed as:

Score = Area_n / 10000 + AreaIndex,

where Area_n is the area of the nucleus and AreaIndex is its area coefficient.
Finally, the two sets are merged into the recommended cancer cell set BC:
BC=BC1+BC2。
Step six, sort the recommended cancer cells BC in descending order of score and recommend the first 30 to the physician; the physician may also set the number of recommended cells.
At this point, the cancer cell screening is complete.
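Steps five and six can be sketched end to end as follows (class names, the dict layout, and the AreaIndex fallback of 1.0 for the cancer class are assumptions made for illustration):

```python
def screen_cancer_cells(cells, top_k=30):
    """Screen recommended cancer cells for one patient.
    Each cell is a dict with "cls" (class name) and "area"
    (nuclear area in pixels)."""
    normal = ("surface", "middle", "atypical")
    means = {}
    for c in normal:                                   # M_0, M_1, M_2
        areas = [x["area"] for x in cells if x["cls"] == c]
        means[c] = sum(areas) / len(areas) if areas else float("inf")
    bc1 = [x for x in cells if x["cls"] == "cancer"]   # direct recommendation
    bc2 = [x for x in cells                            # 2.5x-mean area rule
           if x["cls"] in normal and x["area"] > 2.5 * means[x["cls"]]]
    for x in bc1 + bc2:                                # Score = Area_n/10000 + AreaIndex
        index = x["area"] / means[x["cls"]] if x["cls"] in means else 1.0
        x["score"] = x["area"] / 10000 + index
    ranked = sorted(bc1 + bc2, key=lambda x: x["score"], reverse=True)
    return ranked[:top_k]                              # top 30 by default

cells = [{"cls": "surface", "area": a} for a in (100, 100, 100, 1000)]
cells.append({"cls": "cancer", "area": 500})
res = screen_cancer_cells(cells)
```

With the toy data above, the surface mean is 325, so only the 1000 px nucleus passes the 2.5x rule and joins the directly recommended cancer cell in the ranked output.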
Effects of the invention
The invention provides an intelligent deep-learning-based cervical cancer cell screening method. Each cell nucleus is first segmented with an improved U-Net segmentation model and cropped with the minimal bounding rectangle of its contour expanded outward by 10 px; the nuclei are then classified, the data set being expanded and the classes subdivided by active learning in this stage. Features extracted from physicians' diagnostic experience are introduced into the ResNeSt classification model, making its classification more accurate. Finally, cancer cells meeting the criteria are screened per class. Experiments show that the invention lets physicians grasp the information of each of a patient's cells more effectively and screen cervical cancer cells more accurately according to the characteristics of each cell class. The classification accuracy is shown in Table 1.
Drawings
FIG. 1 is a flow chart of the overall detection method;
FIG. 2 is a network structure of a segmentation model method;
FIG. 3 is a diagram of an active learning method for augmenting classification data and performing class subdivision;
fig. 4 is a network structure of the classification model method.
Table 1 is a classification accuracy comparison table of the classification model method.
Detailed description of the invention
In this embodiment, the intelligent cervical cancer cell screening method based on deep learning is specifically implemented according to the following steps:
the invention is applied to a 'cancer cell automatic detection system' which is self-developed by people, and the system consists of a PC, a full-automatic microscope, a full-high-definition camera and software.
The automatic cancer cell identification part of the system applies the cervical cancer cell screening method. The user first places a prepared slide carrying the cell sample on the stage for scanning, acquiring all cell images of the patient, which are then identified as follows:
Step one, read all pictures in the patient folder into a queue of images to be segmented; read the segmentation queue and feed it to the U-Net segmentation model to obtain the nuclear area, bounding rectangle, and related information; crop each nucleus with the minimal bounding rectangle of its contour expanded outward by 10 px and place it in the classification queue. Record the cell contour, nuclear area, bounding rectangle, and related information of all cells in the image as B_0, B_1, ..., B_N.
Step two, prepare and expand the classification data (model training stage only), as follows:
First, patient cell images of different morphologies are prepared and segmented with the U-Net segmentation model. Each segmented nucleus is saved as a rectangular crop, the minimal bounding rectangle of its contour expanded outward by 10 px, yielding a batch of initial data D1. The data are first manually divided into four rough classes. D1 is then used to train an MLP classification model, which crops and roughly classifies a large number of new cell images. After rough classification, the data of each class are cleaned according to its classification results, and the classes are error-analyzed and improved, e.g. subdivided or merged. Finally, misclassified results are manually corrected or assigned to new classes, which are added to the training data set D1. This subdivision loop continues until each class of the data set reaches 2000 samples.
After each class reaches 2000 samples, the ResNeSt classification model is trained with the newly adjusted training set D1 and the classification performance is observed; if the expected performance is reached, the process stops. Otherwise the training set is expanded, the data are balanced, and the classes are adjusted: the ResNeSt classification model keeps classifying new images, the classified nucleus images are cleaned and corrected, unreasonable class divisions or data imbalance are identified, and manual adjustments are made. The ResNeSt classification model is then retrained until the expected performance is achieved.
Finally, 10 classes are determined: superficial cells, intermediate-layer nuclei, atypical cells, basal cells, neutrophil clusters, clusters of two cells, clusters of more than two cells, abnormal cells, and impurities, with a data volume of 8000 per class.
When the model is applied in practice it has already been trained, so step two is skipped and no classification data need be prepared.
Step three, read the classification queue to obtain a nucleus image and compute four features of each nucleus, i.e. the area, area coefficient, nuclear contour, and gray-level variance, recorded as MFeature. In parallel, extract features of the nucleus image with the ResNeSt classification model, recorded as CFeature; finally concatenate the two kinds of features, classify with a fully connected layer, and output the class and score.
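The feature connection and fully connected classification of step three can be sketched with plain NumPy (the 2048-dimensional CFeature, random weights, and the 10-class head are illustrative assumptions; in the real system the backbone and FC weights come from training):

```python
import numpy as np

rng = np.random.default_rng(0)
cfeature = rng.random(2048)   # deep features from the ResNeSt backbone (dimension assumed)
mfeature = rng.random(4)      # hand-crafted features: area, area coeff., contour, gray variance

x = np.concatenate([cfeature, mfeature])        # "feature connection"
W = rng.standard_normal((10, x.size)) * 0.01    # fully connected layer for the 10 classes
b = np.zeros(10)
logits = W @ x + b
probs = np.exp(logits - logits.max())
probs = probs / probs.sum()                     # softmax -> per-class scores
pred, score = int(probs.argmax()), float(probs.max())  # reported class and score
```

Because the FC layer sees the concatenated vector, the learned and hand-crafted features contribute jointly to the class decision, which is the point of introducing the physicians' indexes.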
Step four, after the nucleus images are classified, add the class information B_{N+1} to the statistics, merge the nucleus information of each image, and then statistically merge the nucleus information of all the patient's images.
Step five, screen cancer cells per class with different strategies, taking the patient as the unit. Nuclei in the abnormal cell class are directly recommended as cancer cells BC1, sorted by class score. For the other classes, i.e. superficial cells, intermediate-layer nuclei, and atypical cells, the area means M_0, M_1, M_2 are computed, and cells whose area exceeds 2.5 times the corresponding mean are recommended as cancer cells BC2. The screening rule is:

BC2 = CArea_0 > 2.5 * M_0 or CArea_1 > 2.5 * M_1 or CArea_2 > 2.5 * M_2,

where CArea_i is the area of a cell of class i, and M_0, M_1, M_2 are the mean areas of superficial, intermediate-layer, and atypical nuclei respectively.
The score for BC1 and BC2 is computed as:

Score = Area / 10000 + AreaIndex,

where Area is the area of the cell and AreaIndex is its area coefficient.
Finally, the two sets are merged into the recommended cancer cell set BC:
BC=BC1+BC2。
Step six, sort the recommended cancer cells BC in descending order of score, screen the first 30, and recommend them to the physician; the physician may also set the number of recommended cells.
The overall flow of the algorithm is shown in FIG. 1.
The present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof, and it should be understood that various changes and modifications can be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (4)
1. An intelligent cervical cancer cell detection method based on deep learning, characterized by comprising the following steps:
s1, segmenting cell nuclei by using a U-Net segmentation model;
s2, expanding classification data and subdividing the classification data by using an active learning method;
S3, perform nucleus classification with a ResNeSt classification model incorporating physicians' diagnostic experience: automatically extract features CFeature with the ResNeSt classification model and connect them with the hand-crafted index features MFeature, then further classify through a fully connected layer; select the MFeature combination that yields the highest model accuracy;
S4, compute statistical indexes and apply classification strategies to screen cancer cells: screen cancer cells per class, with different strategies, taking the patient as the unit; cells classified as cancer cells are directly recommended as cancer cells BC1; the area means M_0, M_1 and M_2 of the cells classified as superficial cells, intermediate-layer nuclei, and atypical cells respectively are computed, and cells whose area exceeds 2.5 times the corresponding mean are recommended as cancer cells BC2; the scores of the two kinds of cancer cells are then computed by formula, the cells are sorted in descending order of score, and the first 30 are recommended to the physician.
2. The intelligent cervical cancer cell detection method based on deep learning as claimed in claim 1, wherein the active learning method of step S2 expands the data and subdivides the classes as follows:
first, patient cell images of different morphologies are prepared and segmented with the U-Net segmentation model; each segmented nucleus is cropped and saved using the minimal bounding rectangle of its contour expanded outward by 10 px, yielding a batch of initial data D1; the data are first manually divided into four rough classes: abnormal cells, superficial cells, intermediate-layer cells, and atypical cells; the initial data D1 are then used to train an MLP classification model, which crops and roughly pre-classifies a large number of new cell images; after rough classification, the data of each class are cleaned according to its classification results, and the classes are error-analyzed and improved so as to subdivide or merge them; finally, misclassified results are manually corrected or assigned to new classes, which are added to the training data set D1; this training-and-adjustment loop continues until each class of the data set reaches 2000 samples; after each class reaches 2000 samples, the newly adjusted training set D1 is used to train the ResNeSt classification model and the classification performance is observed; if the expected performance is reached, the process stops; if not, the training set is expanded, the data are balanced, and the classes are adjusted, specifically: new images are classified with the ResNeSt classification model, the classified nucleus images are cleaned and corrected, unreasonable class divisions or data imbalance are identified, and manual adjustments are made; the ResNeSt classification model is trained again until the expected performance is achieved; finally, the class set with the best classification performance and the sample count of each class are determined.
3. The intelligent cervical cancer cell detection method based on deep learning of claim 1, wherein the nucleus classification with the ResNeSt classification model incorporating physicians' diagnostic experience in step S3 is as follows:
first, the feature CFeature is automatically extracted with a ResNeSt classification model pre-trained on ImageNet; then the hand-crafted index feature MFeature is designed according to physicians' diagnostic experience, with the following features and formulas:
(1) nuclear area

Area_n = number of pixel points of the nucleus,

wherein Area_n is the area of the nucleus;
(2) nuclear area coefficient

AreaIndex = Area_n / A_mean,

wherein A_mean is the mean nuclear area of the layer and Area_n is the area of the nucleus;
(3) nuclear contour

the nuclear roundness, ellipticity, and elongation are combined into a nuclear atypia measure, wherein L is the length of the nucleus, D is the diameter of the nucleus, and Area_n is the area of the nucleus;
(4) Nuclear gray variance
Var = (1/N) * Σ_{i=1..N} (x_i − M)²,
wherein x_i is the gray value of each pixel of the cell nucleus, M is the mean of the nucleus gray values, and N is the number of nucleus pixels;
finally, the feature CFeature and the feature MFeature are concatenated and further classified through a fully connected layer; the combination of MFeature features that yields the highest model accuracy is selected.
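The fully specified MFeature indices above (area, area coefficient, gray variance) can be sketched as follows. This is a NumPy sketch, not the patent's implementation: the ratio form of the area coefficient and the passing-in of the layer mean are assumptions, and the contour indices (roundness, ellipticity, stretching length) are omitted because their exact formulas are not reproduced in this text:

```python
import numpy as np

def mfeature(gray_patch: np.ndarray, mask: np.ndarray, layer_mean_area: float) -> np.ndarray:
    """Hand-crafted nuclear indices (a subset of MFeature):
    (1) Area_n = pixel count, (2) Area_index = Area_n / A_mean (assumed ratio
    form), (4) gray variance Var = mean((x_i - M)^2) over the nucleus pixels."""
    pixels = gray_patch[mask]                # gray values x_i inside the nucleus
    area_n = float(mask.sum())               # Area_n: number of nucleus pixels
    area_index = area_n / layer_mean_area    # nuclear area coefficient (assumed)
    m = pixels.mean()                        # M: mean nucleus gray value
    var = ((pixels - m) ** 2).mean()         # gray variance over N pixels
    return np.array([area_n, area_index, var])

# toy 2x2 nucleus patch, fully covered by the mask
patch = np.array([[10.0, 20.0], [30.0, 40.0]])
mask = np.ones((2, 2), dtype=bool)
print(mfeature(patch, mask, layer_mean_area=8.0).tolist())  # [4.0, 0.5, 125.0]
```

In the claimed method this MFeature vector would be concatenated with the ResNeSt feature CFeature before the final fully connected layer.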
4. The intelligent cervical cancer cell detection method based on deep learning of claim 1, wherein the statistical indicators in step S4 are used to formulate classification strategies for screening cancer cells as follows:
the method detects cancer cells in units of patients, with image analysis performed per patient image; firstly, all images of a patient are read, cell nucleus segmentation is performed on the images, and the cell nuclei are cropped using the circumscribed rectangle of the nucleus contour expanded outward by 10 px; the nucleus images are preprocessed, and the cell contour, nucleus area and nucleus bounding-rectangle information of each image are counted and recorded as B_0, B_1, …, B_N; the nucleus images are classified with the classification model of step S3 to obtain the class information B_{N+1}, which is added to the statistics; the nucleus information of each image is merged first, and then the nucleus information of all the patient's images is statistically combined; finally, taking the patient as the unit, cancer cells are screened with different strategies according to class; cells of the cancer-cell class are directly recommended as cancer cells BC1; for the remaining cell classes, namely superficial, intermediate and atypical nuclei, the area means M_0, M_1, M_2 are calculated, and cells whose area exceeds 2.5 times the corresponding mean are recommended as cancer cells BC2, selected as follows:
BC2 = CArea_0 > 2.5*M_0 or CArea_1 > 2.5*M_1 or CArea_2 > 2.5*M_2
wherein CArea_i is the nucleus area of a cell of class i, and M_0, M_1, M_2 are the mean nucleus areas of the superficial, intermediate and atypical nucleus classes, respectively;
the score calculation formulas for BC1 and BC2 are as follows:
Score = Area_n/10000 + Area_index
wherein Area_n is the area of the cell nucleus and Area_index is the nuclear area coefficient; finally, the two groups of cancer cells are merged into the recommended cancer cells BC; the formula is as follows:
BC=BC1+BC2
the recommended cancer cells BC are ranked in descending order of score, and the first 30 cancer cells are recommended to the physician.
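The per-patient screening and scoring strategy above can be sketched in pure Python as follows. Assumptions are flagged in the comments: the class labels are illustrative, and computing Area_index as the ratio of each nucleus area to its own class mean is an assumed reading of "the layer mean":

```python
from collections import defaultdict

# Illustrative class labels (not the patent's identifiers)
CANCER, SUPERFICIAL, INTERMEDIATE, ATYPICAL = "cancer", "superficial", "intermediate", "atypical"

def screen(cells):
    """cells: list of (cell class, nucleus area) for one patient.
    Returns the recommended cancer cells BC = BC1 + BC2, ranked by
    Score = Area_n/10000 + Area_index, keeping the top 30."""
    areas = defaultdict(list)
    for cls, area in cells:
        areas[cls].append(area)
    mean = {cls: sum(v) / len(v) for cls, v in areas.items()}

    bc = []
    for cls, area in cells:
        if cls == CANCER:                           # BC1: cancer class, recommended directly
            bc.append((cls, area))
        elif area > 2.5 * mean[cls]:                # BC2: area > 2.5 x class mean
            bc.append((cls, area))

    def score(item):
        cls, area = item
        # Area_index taken as area / class mean (assumption about "the layer")
        return area / 10000 + area / mean[cls]

    return sorted(bc, key=score, reverse=True)[:30]

# toy patient: one oversized superficial nucleus and one cancer-class nucleus
cells = [(SUPERFICIAL, 100)] * 4 + [(SUPERFICIAL, 900), (CANCER, 300)]
print(screen(cells))
```

Here the superficial nucleus of area 900 exceeds 2.5 times its class mean (260), so it joins the directly recommended cancer-class cell in BC; the score then ranks it first.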
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011259420.3A CN112365471B (en) | 2020-11-12 | 2020-11-12 | Cervical cancer cell intelligent detection method based on deep learning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112365471A CN112365471A (en) | 2021-02-12 |
CN112365471B (en) | 2022-06-24
Family
ID=74514417
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011259420.3A Expired - Fee Related CN112365471B (en) | 2020-11-12 | 2020-11-12 | Cervical cancer cell intelligent detection method based on deep learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112365471B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113052806B (en) * | 2021-03-15 | 2023-02-28 | 黑龙江机智通智能科技有限公司 | Canceration degree grading system |
CN112951427B (en) * | 2021-03-16 | 2023-12-08 | 黑龙江机智通智能科技有限公司 | System for classifying abnormal cells |
CN113516022B (en) * | 2021-04-23 | 2023-01-10 | 黑龙江机智通智能科技有限公司 | Fine-grained classification system for cervical cells |
CN113269724A (en) * | 2021-04-28 | 2021-08-17 | 西安交通大学 | Fine-grained cancer subtype classification method |
CN118333957A (en) * | 2024-04-07 | 2024-07-12 | 广州锟元方青医疗科技有限公司 | Specific image feature mining method and system for cervical cancer cell nuclei |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109102498A (en) * | 2018-07-13 | 2018-12-28 | South China University of Technology | Method for segmenting clustered cell nuclei in cervical smear images |
CN109190567A (en) * | 2018-09-10 | 2019-01-11 | Harbin University of Science and Technology | Automatic detection method for abnormal cervical cells based on deep convolutional neural networks |
CN110675411A (en) * | 2019-09-26 | 2020-01-10 | 重庆大学 | Cervical squamous intraepithelial lesion recognition algorithm based on deep learning |
CN111458269A (en) * | 2020-05-07 | 2020-07-28 | 厦门汉舒捷医疗科技有限公司 | Artificial intelligent identification method for peripheral blood lymph micronucleus cell image |
CN111783518A (en) * | 2020-05-14 | 2020-10-16 | 北京三快在线科技有限公司 | Training sample generation method and device, electronic equipment and readable storage medium |
CN112233085A (en) * | 2020-10-14 | 2021-01-15 | 南通大学 | Cervical cell image segmentation method based on pixel prediction enhancement |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11455718B2 (en) * | 2018-03-09 | 2022-09-27 | Case Western Reserve University | Predicting overall survival in early stage lung cancer with feature driven local cell graphs (FEDEG) |
CN111274903B (en) * | 2020-01-15 | 2022-12-06 | 合肥工业大学 | Cervical cell image classification method based on graph convolution neural network |
Non-Patent Citations (1)
Title |
---|
ResNeSt: Split-Attention Networks; Hang Zhang et al.; Computer Vision and Pattern Recognition; 2020-04-19; pp. 1-22 * |
Also Published As
Publication number | Publication date |
---|---|
CN112365471A (en) | 2021-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112365471B (en) | Cervical cancer cell intelligent detection method based on deep learning | |
US12008076B2 (en) | End-to-end attention pooling-based classification method for histopathology images | |
CN110245657B (en) | Pathological image similarity detection method and detection device | |
CN112508850B (en) | Deep learning-based method for detecting malignant area of thyroid cell pathological section | |
CN109447998B (en) | Automatic segmentation method based on PCANet deep learning model | |
Zhang et al. | Automated semantic segmentation of red blood cells for sickle cell disease | |
CN111986150B | Interactive annotation refinement method for digital pathological images | |
CN111985536A (en) | Gastroscope pathological image classification method based on weak supervised learning | |
CN112215117A (en) | Abnormal cell identification method and system based on cervical cytology image | |
Pan et al. | Mitosis detection techniques in H&E stained breast cancer pathological images: A comprehensive review | |
CN109635846A | Multi-class medical image classification method and system | |
CN101900737A (en) | Automatic identification system for urinary sediment visible components based on support vector machine | |
CN111079620A (en) | Leukocyte image detection and identification model construction method based on transfer learning and application | |
CN113378831B (en) | Mouse embryo organ identification and scoring method and system | |
CN110415250A | Overlapped chromosome segmentation method and device based on deep learning | |
CN113658174A (en) | Microkaryotic image detection method based on deep learning and image processing algorithm | |
CN113782184A (en) | Cerebral apoplexy auxiliary evaluation system based on facial key point and feature pre-learning | |
CN114037868B (en) | Image recognition model generation method and device | |
CN113177927A (en) | Bone marrow cell classification and identification method and system based on multiple features and multiple classifiers | |
CN114240822A (en) | Cotton cloth flaw detection method based on YOLOv3 and multi-scale feature fusion | |
CN115471838A (en) | Cervical squamous lesion cell detection method based on depth self-adaptive feature extraction | |
CN114898866A (en) | Thyroid cell auxiliary diagnosis method, equipment and storage medium | |
CN107194319A | Mitosis detection method based on support vector machine sorting and knowledge-based discrimination | |
Isidoro et al. | Automatic Classification of Cervical Cell Patches based on Non-geometric Characteristics. | |
CN111815554A (en) | Cervical cell image segmentation method based on edge search MRF model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220624 ||