CN117809301A - Analysis method based on ASCUS cervical sample pathology results - Google Patents

Analysis method based on ASCUS cervical sample pathology results

Info

Publication number
CN117809301A
Authority
CN
China
Prior art keywords
cell
image
ascus
color
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311867585.2A
Other languages
Chinese (zh)
Inventor
李�诚
郝宗杰
曹得华
严姗
庞宝川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Lanting Intelligent Medicine Co ltd
Original Assignee
Wuhan Lanting Intelligent Medicine Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Lanting Intelligent Medicine Co ltd filed Critical Wuhan Lanting Intelligent Medicine Co ltd
Priority to CN202311867585.2A priority Critical patent/CN117809301A/en
Publication of CN117809301A publication Critical patent/CN117809301A/en
Pending legal-status Critical Current

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Image Analysis (AREA)

Abstract

The application provides an analysis method based on ASCUS cervical sample pathology results, which comprises the following steps: S1, scanning the whole slide to obtain a digital image of cervical cells; S2, inputting images of a specific size into a target detection model to detect suspicious positive cells, and outputting and recording the positions of the cells in the image; S3, cropping the suspicious positive cells from the whole-slide image, resizing them to a fixed size, inputting them into a negative-positive cell classification model, and classifying each cell as negative or positive; S4, extracting the feature vectors of the positive cells from step S3 and inputting them into a positive sample qualitative model to determine their positive category; S5, inputting the positive cells and the auxiliary classification feature parameters obtained in step S4 into the negative-positive classification model, and further determining the pathological result corresponding to the ASCUS sample. By classifying ASCUS cells as negative or positive, diagnostic accuracy is improved, treatment decisions are guided, disease prediction capability is improved, and resource allocation is optimized.

Description

Analysis method based on ASCUS cervical sample pathology results
Technical Field
The invention relates to the field of medical image processing, in particular to an analysis method based on an ASCUS cervical sample pathology result.
Background
With the continuous advancement of technology, computer vision and machine learning are increasingly used in the medical field. Cell analysis is one of the important tasks in medical diagnosis and research: by analyzing the morphological characteristics and color differences of cells, cells can be classified and diagnosed, providing an important basis for early prediction and treatment of disease.
ASCUS is a cytology abbreviation that stands for "Atypical Squamous Cells of Undetermined Significance". It is a cytological finding generally used to describe atypical squamous cells observed in a specimen when it is not yet clear whether a potential lesion is present. It is a relatively common result that requires further evaluation and diagnosis to determine whether a lesion exists. By classifying ASCUS cells as negative or positive, the presence or absence of potential lesions can be determined more accurately, misdiagnosis or missed diagnosis can be avoided, treatment decisions can be guided, disease prediction capability can be improved, and resource allocation can be optimized, thereby providing better medical services for patients.
The nuclear area parameter, the color difference parameter and the edge infiltration evaluation parameter are characteristic parameters commonly used in cell analysis. The cell nucleus area parameter can be used for evaluating the size and morphological characteristics of cells, the color difference parameter can be used for evaluating the color difference between cell nuclei and cytoplasm, and the edge infiltration evaluation parameter can be used for evaluating the edge infiltration condition of the cell nuclei.
However, there may be limitations to using these parameters alone for cell classification. Therefore, combining these parameters with the classification model can improve the performance and accuracy of the classification model. Specifically, a feature fusion method can be adopted, the parameters are used as auxiliary classification standards, and the robustness of the classification model is improved by combining with other feature parameters; the method of regional marking or weight assignment can also be adopted, and the cells are marked or given different weights according to the area and color difference of the cell nuclei, so that the accuracy of the classification model is improved; in addition, the output result of the classification model can be analyzed and optimized by a result analysis and post-processing method, so that the classification accuracy is further improved.
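As an illustration of the feature fusion idea, the following minimal Python sketch (NumPy assumed) simply concatenates the three auxiliary parameters onto an existing feature vector so a downstream classifier can use them as extra evidence; the function name and interface are illustrative placeholders, not part of this application:

```python
import numpy as np

def fuse_features(cnn_features, nucleus_ratio, color_diff, infiltration):
    """Sketch of feature fusion: append the three auxiliary parameters to an
    existing 1-D feature vector before it is passed to the classifier."""
    aux = np.array([nucleus_ratio, color_diff, infiltration], dtype=np.float32)
    return np.concatenate([np.asarray(cnn_features, dtype=np.float32), aux])
```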
Disclosure of Invention
The invention aims to solve the technical problems in the background and provides an analysis method based on an ASCUS cervical sample pathology result, which comprises the following steps:
s1, scanning a whole glass slide to obtain a cervical cell digital image;
s2, inputting images of a specific size into a target detection model to detect suspicious positive cells, and outputting and recording the positions of the cells in the image;
s3, cropping the suspicious positive cell pictures into image patches of fixed size, inputting them into a negative-positive cell classification model, and classifying the cells as negative or positive;
s4, extracting feature vectors from the cells classified as positive in step S3, inputting them into a positive sample qualitative model, and determining the positive category of the cervical sample;
s5, further determining the pathological result corresponding to the ASCUS sample from the qualitative result and the feature vectors obtained in step S4.
In a preferred embodiment, in step S2, three classification steps are used, including: s21, separating and deleting impurity images;
s22, separating negative cell images;
s23, in parallel with S22, separating positive cell images;
the suspicious positive cell images are obtained from the above steps.
The technical effects are as follows: by adopting a three-step separation scheme, the accuracy of each step can be greatly improved; for the separation of negative and positive cell images, stricter threshold conditions can be adopted, which ensures accurate separation and further improves the accuracy of subsequent detection results.
In a preferred embodiment, the step S4 further includes the following steps:
s41, extracting a cell nucleus area ratio parameter, a color difference parameter and an edge infiltration evaluation parameter for each suspicious positive cell image;
s42, before the feature vector of the positive cell is input into the positive sample qualitative model, calculating to obtain ASCUS probability parameters of the sample according to the cell nucleus area ratio parameters, the color difference parameters and the edge infiltration evaluation parameters.
In a preferred embodiment, in step S41, the suspicious positive cell image is divided into a plurality of copies, and each copy is subjected to the processing of edge enhancement, area color homogenization and color brightness comprehensive range expansion.
In a preferred scheme, the edge strengthening comprises initially extracting the edge, comprehensively comparing the pixels near the edge, adding a preset value n1 to the pixels with larger values and subtracting a preset value n2 from the pixels with smaller values, and extracting the edge again from the newly generated image to obtain a more accurate edge profile;
the comprehensive comparison means that the color value and the brightness value of the image are summed according to different weights, and then the comparison is carried out;
the comprehensive comparison has different schemes, but the core idea is to set different weights for the color value and the brightness value according to parameters such as color, tone, brightness and the like of the image, and then sum the weights, so that the difference of edges is more obvious, which is beneficial to more accurately identifying the edge contour, the values of the color value and the brightness value can need to be taken in different color modes, such as taking the color value in an RGB mode, setting the weights according to the tone of the color, preferably taking a larger weight for the color opposite to the tone, such as yellow, giving a larger weight to the color value of blue, which is beneficial to making the edge contour more prominent. And then taking the brightness value in the LAB space.
The region color homogenization refers to an operation of filling a certain pixel number range with a color within a certain threshold interval, wherein the value of the filling color is determined by the value of the color with the largest pixel number in the pixel number range, so as to obtain a more accurate color region.
Preferably, before the region color is homogenized, impurity points are removed within the pixel-count range, that is, non-mainstream colors whose pixel count is smaller than a threshold are removed; for example, within a yellow hue range, cyan, blue, red and green pixels outside the yellow hue whose count is less than or equal to 3 are removed, and the removed pixels become pixels of the color to be filled.
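The following sketch illustrates region color homogenization on a single-channel image, assuming NumPy; the block size, threshold interval and impurity-count threshold are illustrative values only:

```python
import numpy as np

def homogenize_region_colors(gray, block=16, span=24, min_count=4):
    """Sketch: within each block, values inside a +/-span interval around the
    dominant value are replaced by that value; colors with fewer than
    min_count pixels ("impurity points") are absorbed into the fill color."""
    out = gray.copy()
    h, w = gray.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = out[y:y + block, x:x + block]        # view into `out`
            counts = np.bincount(patch.ravel(), minlength=256)
            dominant = int(counts.argmax())              # most frequent value
            rare = np.isin(patch, np.where(counts < min_count)[0])   # impurity points
            close = np.abs(patch.astype(int) - dominant) <= span     # threshold interval
            patch[rare | close] = dominant
    return out
```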
In a preferred scheme, the color-brightness comprehensive range expansion refers to extracting the brightness image from the LAB space and expanding the brightness space: the maximum brightness value of the current brightness image is set to the maximum value of the space and the minimum brightness value to the minimum value of the space (for example, 255 and 0), and the remaining values are expanded proportionally. The proportional expansion uses a midpoint (differential) method: the value of the current brightness image that corresponds to the midpoint of the expanded space is taken as the intermediate value, the intermediate value of each resulting sub-interval is then computed in the same way, and after several iterations the remaining values in each region are assigned so as to be uniformly distributed;
selecting the color of the dominant hue, extracting the color image of the dominant hue, expanding the color image space, and expanding the color image space by the same method as the brightness expansion;
and superposing the expanded color image and the brightness image to obtain a clearer edge infiltration image.
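A simplified sketch of the range expansion is shown below, assuming OpenCV's LAB conversion; the iterative midpoint expansion described above is approximated here by a plain linear min-max stretch, and the choice of the LAB b channel as the dominant-hue channel is an assumption:

```python
import cv2
import numpy as np

def stretch_channel(ch):
    """Min-max stretch of one 8-bit channel to the full 0-255 range."""
    lo, hi = float(ch.min()), float(ch.max())
    if hi <= lo:
        return ch.copy()
    return ((ch.astype(np.float32) - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def expand_color_brightness(bgr, hue_channel=2):
    """Sketch: stretch the LAB luminance and one dominant-hue channel,
    then superpose them to get a higher-contrast edge-infiltration image."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    luminance = stretch_channel(lab[:, :, 0])
    hue_img = stretch_channel(lab[:, :, hue_channel])
    # Superimpose the two expanded images with equal weights.
    return cv2.addWeighted(luminance, 0.5, hue_img, 0.5, 0)
```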
In a preferred embodiment, the calculation of the nuclear area parameter is as follows:
s4201, extracting an edge point set from the edge of the cell;
s4202, fitting the edge point set by using an ellipse fitting algorithm, and particularly, finding the optimal ellipse parameters by minimizing the distance from the edge point to the ellipse contour;
s4203, after fitting, obtaining ellipse parameters corresponding to the cells and ellipse parameters corresponding to the cell nuclei, which respectively comprise: a major axis, a minor axis, a center coordinate, and a rotation angle;
s4204, calculating the ratio of the product of the major axis and the minor axis of the ellipse corresponding to the cell nucleus to the product of the major axis and the minor axis of the ellipse corresponding to the cell, wherein the calculated result is the area ratio of the cell nucleus in the cell as the cell nucleus area parameter;
s4205, normalizing the cell nucleus area parameters.
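A minimal sketch of this calculation using OpenCV's ellipse fitting (cv2.fitEllipse) is given below; edge extraction and contour preparation are assumed to have been done already, and contours must contain at least five points:

```python
import cv2
import numpy as np

def nucleus_area_ratio(cell_contour, nucleus_contour):
    """Sketch of s4201-s4205: fit ellipses to the cell and nucleus edge point
    sets and return the normalized nucleus/cell area ratio.  Contours are
    OpenCV-style (N x 1 x 2) point arrays with N >= 5."""
    _c, (cell_w, cell_h), _a = cv2.fitEllipse(cell_contour)
    _cn, (nuc_w, nuc_h), _an = cv2.fitEllipse(nucleus_contour)
    # Ratio of major*minor products; the pi/4 factors cancel, so this equals
    # the ratio of the fitted ellipse areas.
    ratio = (nuc_w * nuc_h) / (cell_w * cell_h)
    return float(np.clip(ratio, 0.0, 1.0))  # normalize to [0, 1]
```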
In a preferred embodiment, the color difference parameter is calculated as follows:
s4211, converting RGB values of the cell image into LAB color space;
s4212, calculating the color difference between two pixels as their distance in LAB space, using the Euclidean distance, which is calculated by the following formula:
ΔE = √((L1 − L2)² + (a1 − a2)² + (b1 − b2)²)
wherein L1, a1, b1 are LAB values of a first pixel, L2, a2, b2 are LAB values of a second pixel, Δe represents a color difference between the two pixels, and a larger value represents a larger color difference;
s4213, normalizing the color difference parameters.
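The color difference computation can be sketched as follows, assuming an 8-bit OpenCV LAB image and a boolean nucleus mask; averaging the nucleus and cytoplasm colors and the normalization constant are illustrative choices, not specified in the application:

```python
import numpy as np

def delta_e(lab1, lab2):
    """Euclidean color difference (CIE76 Delta E) between two LAB values,
    as in the formula above."""
    (l1, a1, b1), (l2, a2, b2) = lab1, lab2
    return float(np.sqrt((l1 - l2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2))

def nucleus_cytoplasm_color_difference(lab_image, nucleus_mask):
    """Sketch of s4211-s4213: Delta E between the mean nucleus color and the
    mean cytoplasm color, normalized to [0, 1]."""
    nucleus_mean = lab_image[nucleus_mask].mean(axis=0)
    cytoplasm_mean = lab_image[~nucleus_mask].mean(axis=0)
    diff = delta_e(nucleus_mean, cytoplasm_mean)
    return diff / (255.0 * np.sqrt(3.0))  # rough normalization for 8-bit LAB channels
```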
In a preferred embodiment, the edge infiltration evaluation parameters are calculated as follows:
s4311, preparing a data set containing cell images, extracting the edge profile of the cells by using an edge detection algorithm, and finally obtaining the major axis, the minor axis and the inclination angle of the ellipse of the cells by fitting the ellipse to the edge profile of the cells;
s4312, manually marking the infiltration degree of each cell sample, using a label of 0 to represent non-infiltration, using a label of 1 to represent infiltration, and then performing image size normalization, graying and contrast increasing operation on the images and the labels;
s4313, respectively giving weights to the major axis, the minor axis and the inclination angle of the ellipse, multiplying the characteristic value of each cell with the corresponding weight, and summing to obtain the infiltration degree parameter of the cell;
s4314, normalizing the infiltration degree parameters of the cells.
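A minimal sketch of the weighted-sum evaluation is shown below; the weights and the per-feature scales are illustrative placeholders rather than values given in the application:

```python
import numpy as np

def infiltration_degree(major_axis, minor_axis, tilt_angle_deg,
                        weights=(0.5, 0.3, 0.2), scales=(100.0, 100.0, 90.0)):
    """Sketch of s4311-s4314: weighted sum of the fitted-ellipse major axis,
    minor axis and tilt angle, scaled to [0, 1]."""
    features = np.array([major_axis, minor_axis, tilt_angle_deg], dtype=float)
    scaled = np.clip(features / np.array(scales), 0.0, 1.0)  # per-feature normalization
    return float(np.dot(np.array(weights), scaled))
```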
In a preferred embodiment, step S42 specifically includes the following steps: corresponding weights are assigned to the edge infiltration degree, the color difference and the cell nucleus area of each sample; the edge infiltration evaluation parameter, the color difference parameter and the cell nucleus area parameter are multiplied by their corresponding weights to obtain the reference values w1, w2 and w3; finally, w1, w2 and w3 are added to obtain the ASCUS probability parameter of each sample, the samples are marked according to the ASCUS probability parameter, and the marked samples are input into the positive sample qualitative model.
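The corresponding computation can be sketched as follows; the weight values are placeholders to be tuned on real samples:

```python
def ascus_probability(infiltration, color_diff, nucleus_ratio,
                      weights=(0.4, 0.3, 0.3)):
    """Sketch of step S42: w1, w2, w3 are the weighted parameters and their
    sum is the ASCUS probability parameter."""
    w1 = weights[0] * infiltration   # edge infiltration contribution
    w2 = weights[1] * color_diff     # color difference contribution
    w3 = weights[2] * nucleus_ratio  # nucleus area contribution
    return w1 + w2 + w3
```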
The beneficial effects of the invention are as follows: combining the cell nucleus area parameter, the color difference parameter and the edge infiltration evaluation parameter with the classification model can improve the accuracy of negative-positive cell classification. These parameters serve as auxiliary classification criteria that help the model judge the cell type more accurately and reduce the likelihood of misclassification. The feature fusion method combines multiple feature parameters and improves the robustness of the model: by comprehensively considering the cell nucleus area, color difference, edge infiltration and other information, the model can evaluate cell characteristics more completely and reduce the influence of external interference. Using region marking or weight assignment, cells can be marked more finely or given different weights according to the characteristics of the cell nucleus. This makes the classification result easier to explain, helps the doctor understand how the model judges the cell type, and improves the interpretability of the result.
Drawings
FIG. 1 is a flow chart of an ASCUS cell-based pathology outcome prediction method in the present invention.
Detailed Description
The method for predicting pathological results based on ASCUS cells is shown in FIG. 1, and comprises the following specific steps:
step S1: the whole slide is scanned from the cervical cell sample to obtain a digital image.
Step S2: the resulting image is input into a target detection model, suspicious positive cells are detected, and their positions in the image are recorded.
Step S3: cutting the suspicious positive cell picture into image blocks with fixed sizes, inputting a yin-yang cell classification model, and carrying out yin-yang classification on cells.
Step S4: and (3) extracting a feature vector from the cells classified as positive in the step (S3), inputting a positive sample qualitative model, and judging the positive type of the cervical sample.
Step S5: and (3) further judging the pathological result of the ASCUS sample by using the qualitative result and the characteristic vector obtained in the step S4.
In step S2, it is further subdivided into three classification steps:
step S21: and separating and deleting impurities in the image to improve the accuracy of the subsequent steps.
Step S22: negative cell images were isolated.
Step S23: positive cell images are isolated and these images are further processed and classified.
In step S4, specific steps are added:
step S41: for each suspicious positive cell image, the nuclear area duty cycle parameter, the color difference parameter, and the edge infiltration evaluation parameter are extracted.
Step S42: and calculating to obtain ASCUS probability parameters of the sample according to the cell nucleus area occupation ratio parameters, the color difference parameters and the edge infiltration evaluation parameters.
These parameters will be used to further judge the positive class of cells.
In step S41, the suspicious positive cell image is divided into a plurality of copies, and the processing of strengthening the edges, homogenizing the color of the region, and expanding the comprehensive range of the color brightness is performed respectively, so as to improve the definition of the image and the visibility of the edge infiltration.
In step S42, the cell nucleus area parameter, the color difference parameter and the edge infiltration evaluation parameter are used as auxiliary classification criteria, and positive cells are classified more accurately according to these parameters, and the ASCUS probability parameter is calculated. The weights of the parameters can be adjusted according to actual conditions so as to better adapt to the characteristics of different samples.
Through the cell detection and classification process of the embodiment, the accuracy and the reliability of detection and classification are improved. The method and the technology have important significance for pathological diagnosis of cervical cells, and can help doctors to better judge the positive category and pathological result of the cells.
In one embodiment, a method of predicting pathological outcome based on an ASCUS cervical sample comprises the steps of:
1. image preprocessing: graying the cell image, and converting the color image into a gray image for subsequent processing; the gray scale image is sharpened to enhance details of the image, and some classical image enhancement algorithms, such as laplacian, may be used.
2. Cell segmentation: segment the cells from the background using an image segmentation algorithm, such as a threshold-based or edge-detection-based method; where the cell edges are unclear, edge enhancement algorithms such as Canny edge detection can be used to improve segmentation accuracy.
3. Cell classification: extracting characteristics of the cell images obtained by segmentation, wherein the characteristics comprise cell morphological characteristics and cell nuclear characteristics; cells are classified into two categories, positive and negative, using a machine learning algorithm or an image processing algorithm.
4. Judging whether the cells are ASCUS: judge whether the cells are of ASCUS grade according to morphological characteristics such as the shape and size of the cell nucleus and the degree of nuclear condensation; evaluate the color difference of the cell nucleus, where RGB values can be used to calculate a color difference parameter measuring the degree of color change in the nucleus; evaluate cell-edge infiltration by examining the degree of blurring of the cell edge; and calculate the proportion of the nuclear area to the total cell area from the area ratio of the cell nucleus.
5. Result judgment: make the final ASCUS judgment on the cells by comprehensively considering their morphological characteristics, nuclear characteristics and other clinical and pathological information.
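Steps 1 and 2 of this embodiment can be sketched as follows, assuming OpenCV and NumPy; the sharpening and segmentation choices (Laplacian subtraction, Otsu thresholding with inverted binarization for dark cells on a bright background, Canny edges) are illustrative stand-ins for the classical algorithms mentioned above:

```python
import cv2
import numpy as np

def preprocess(bgr):
    """Step 1 sketch: grayscale conversion plus Laplacian-based sharpening."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    lap = cv2.Laplacian(gray, cv2.CV_16S, ksize=3)
    sharpened = cv2.subtract(gray.astype(np.int16), lap)  # subtract Laplacian to sharpen
    return np.clip(sharpened, 0, 255).astype(np.uint8)

def segment_cells(gray):
    """Step 2 sketch: Otsu threshold plus Canny edges, then contour extraction."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    edges = cv2.Canny(gray, 50, 150)
    mask = cv2.bitwise_or(mask, edges)  # reinforce weak cell boundaries
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 50.0]  # drop tiny debris
```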
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. An analysis method based on ASCUS cervical sample pathology results is characterized by comprising the following steps:
s1, scanning a whole glass slide to obtain a cervical cell digital image;
s2, inputting images of a specific size into a target detection model to detect suspicious positive cells, and outputting and recording the positions of the cells in the image;
s3, cropping the suspicious positive cell pictures into image patches of fixed size, inputting them into a negative-positive cell classification model, and classifying the cells as negative or positive;
s4, extracting feature vectors from the cells classified as positive in step S3, inputting them into a positive sample qualitative model, and determining the positive category of the cervical sample;
s5, further determining the pathological result corresponding to the ASCUS sample from the qualitative result and the feature vectors obtained in step S4.
2. The method for analyzing pathological results based on ASCUS cervical samples according to claim 1, characterized in that: in step S2, three classification steps are employed, including: s21, separating and deleting impurity images;
s22, separating negative cell images;
s23, in parallel with S22, separating positive cell images;
the suspicious positive cell images are obtained from the above steps.
3. The method for analyzing pathological results based on ASCUS cervical samples according to claim 1, characterized in that: the step S4 further includes the following steps:
s41, extracting a cell nucleus area ratio parameter, a color difference parameter and an edge infiltration evaluation parameter for each suspicious positive cell image;
s42, before the feature vector of the positive cell is input into the positive sample qualitative model, calculating to obtain ASCUS probability parameters of the sample according to the cell nucleus area ratio parameters, the color difference parameters and the edge infiltration evaluation parameters.
4. The method for analyzing pathological results based on ASCUS cervical samples according to claim 1, characterized in that: in step S41, the suspicious positive cell image is divided into a plurality of copies, and each copy is subjected to the processing of strengthening the edge, homogenizing the color of the region and expanding the comprehensive range of the color and the brightness.
5. The method for analyzing pathological results based on an ASCUS cervical sample according to claim 4, wherein the method comprises the following steps: the strengthening edge comprises the steps of firstly initially extracting the edge, comprehensively comparing pixels near the edge, adding a preset value n1 to the pixel with larger value, subtracting the preset value n2 from the pixel with smaller value, and extracting the edge again for the newly generated image so as to obtain a more accurate edge profile;
the comprehensive comparison means that the color value and the brightness value of the image are summed according to different weights, and then the comparison is carried out;
the region color homogenization refers to an operation of filling a certain pixel number range with a color within a certain threshold interval, wherein the value of the filling color is determined by the value of the color with the largest pixel number in the pixel number range, so as to obtain a more accurate color region.
6. The method for analyzing pathological results based on an ASCUS cervical sample according to claim 4, wherein the method comprises the following steps: the color brightness comprehensive range expansion processing is to extract a brightness image from the LAB space, expand the brightness image space, namely, set the maximum brightness value of the current brightness image as the maximum value of the current space, set the minimum brightness value as the minimum value of the current space, and expand the rest colors in equal ratio;
selecting the color of the dominant hue, extracting the color image of the dominant hue, expanding the color image space, and expanding the color image space by the same method as the brightness expansion;
and superposing the expanded color image and the brightness image to obtain a clearer edge infiltration image.
7. The method for analyzing pathological results based on ASCUS cervical samples according to claim 2, characterized in that: the calculation process of the cell nucleus area parameter is as follows:
s4201, extracting an edge point set from the edge of the cell;
s4202, fitting the edge point set by using an ellipse fitting algorithm, and particularly, finding the optimal ellipse parameters by minimizing the distance from the edge point to the ellipse contour;
s4203, after fitting, obtaining ellipse parameters corresponding to the cells and ellipse parameters corresponding to the cell nuclei, which respectively comprise: a major axis, a minor axis, a center coordinate, and a rotation angle;
s4204, calculating the ratio of the product of the major axis and the minor axis of the ellipse corresponding to the cell nucleus to the product of the major axis and the minor axis of the ellipse corresponding to the cell, wherein the calculated result is the area ratio of the cell nucleus in the cell as the cell nucleus area parameter;
s4205, normalizing the cell nucleus area parameters.
8. The method for analyzing pathological results based on ASCUS cervical samples according to claim 2, characterized in that: the color difference parameter is calculated as follows:
s4211, converting RGB values of the cell image into LAB color space;
s4212, calculating the color difference between two pixels as their distance in LAB space, using the Euclidean distance, which is calculated by the following formula:
ΔE = √((L1 − L2)² + (a1 − a2)² + (b1 − b2)²)
wherein L1, a1, b1 are LAB values of a first pixel, L2, a2, b2 are LAB values of a second pixel, Δe represents a color difference between the two pixels, and a larger value represents a larger color difference;
s4213, normalizing the color difference parameters.
9. The method for analyzing pathological results based on ASCUS cervical samples according to claim 2, characterized in that: the edge infiltration evaluation parameters were calculated as follows:
s4311, preparing a data set containing cell images, extracting the edge profile of the cells by using an edge detection algorithm, and finally obtaining the major axis, the minor axis and the inclination angle of the ellipse of the cells by fitting the ellipse to the edge profile of the cells;
s4312, manually marking the infiltration degree of each cell sample, using a label of 0 to represent non-infiltration, using a label of 1 to represent infiltration, and then performing image size normalization, graying and contrast increasing operation on the images and the labels;
s4313, respectively giving weights to the major axis, the minor axis and the inclination angle of the ellipse, multiplying the characteristic value of each cell with the corresponding weight, and summing to obtain the infiltration degree parameter of the cell;
s4314, normalizing the infiltration degree parameters of the cells.
10. The method for analyzing pathological results based on ASCUS cervical samples according to claim 2, characterized in that: step S42 specifically comprises the following steps: corresponding weights are assigned to the edge infiltration degree, the color difference and the cell nucleus area of each sample; the edge infiltration evaluation parameter, the color difference parameter and the cell nucleus area parameter are multiplied by their corresponding weights to obtain the reference values w1, w2 and w3; finally, w1, w2 and w3 are added to obtain the ASCUS probability parameter of each sample, the samples are marked according to the ASCUS probability parameter, and the marked samples are input into the positive sample qualitative model.
CN202311867585.2A 2023-12-29 2023-12-29 Analysis method based on ASCUS cervical sample pathology results Pending CN117809301A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311867585.2A CN117809301A (en) 2023-12-29 2023-12-29 Analysis method based on ASCUS cervical sample pathology results

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311867585.2A CN117809301A (en) 2023-12-29 2023-12-29 Analysis method based on ASCUS cervical sample pathology results

Publications (1)

Publication Number Publication Date
CN117809301A true CN117809301A (en) 2024-04-02

Family

ID=90423264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311867585.2A Pending CN117809301A (en) 2023-12-29 2023-12-29 Analysis method based on ASCUS cervical sample pathology results

Country Status (1)

Country Link
CN (1) CN117809301A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination