CN112927215A - Automatic analysis method for digestive tract biopsy pathological section - Google Patents
Automatic analysis method for digestive tract biopsy pathological section
- Publication number: CN112927215A (application number CN202110281259.8A)
- Authority: CN (China)
- Prior art keywords: sub, matrix, region, image, feature
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/0012 — Biomedical image inspection
- G06F18/24 — Classification techniques
- G06F18/253 — Fusion techniques of extracted features
- G06N3/045 — Combinations of networks
- G06N3/08 — Learning methods (neural networks)
- G06T7/11 — Region-based segmentation
- G06T7/136 — Segmentation; edge detection involving thresholding
- G06T2207/10068 — Endoscopic image
- G06T2207/20081 — Training; learning
- G06T2207/30028 — Colon; small intestine
- G06T2207/30092 — Stomach; gastric
- G06V2201/03 — Recognition of patterns in medical or anatomical images
Abstract
The invention discloses an automatic analysis method for digestive tract biopsy pathological sections, comprising the following steps: dividing the pathological section image into several sub-regions; extracting features from each sub-region one by one with a feature extraction model to obtain a feature matrix per sub-region; classifying each feature vector in each feature matrix with an image classification model to obtain the corresponding lesion probability matrix; performing distance quantization on each feature vector in each feature matrix with a distance-transform algorithm to obtain a distance-quantization feature matrix; concatenating each sub-region's feature matrix with its distance-quantization feature matrix to obtain a fused feature matrix; generating a tissue structure graph for each sub-region from its fused feature matrix and lesion probability matrix; and classifying each tissue structure graph with a graph convolutional network model to generate a diagnosis result for each sub-region. The invention can screen biopsy pathology, automatically output a diagnostic conclusion, and assist the doctor in completing the work accurately and efficiently.
Description
Technical Field
The invention relates to the technical field of medical image processing, and in particular to an automatic analysis method for digestive tract biopsy pathological sections.
Background
Gastrointestinal tissue biopsy is an important means of screening for cancers of the digestive system, such as gastric and intestinal cancer; a pathologist completes the screening by examining the microscopic sections of the biopsy tissue one by one. However, gastrointestinal endoscopy produces a large number of cases, the pathologist's diagnostic workload is heavy, and long hours of high-intensity work easily lead to misdiagnosis. With the rapid development of computing and microscopic imaging technology, digital pathology images can now be acquired conveniently and quickly, and automatic computer analysis algorithms for digital whole-slide pathology images have become a research hotspot in recent years.
Because the resolution of digital pathology images is far higher than that of natural-scene images, it is difficult for a computer-vision algorithm to process a whole pathology image directly. To classify a whole slide, most existing algorithms tile the whole-slide image into blocks, first classify each local region, and then apply a specific strategy on top of the local predictions to classify the full slide. Three strategies are common:
1) whole-slice classification strategy based on Majority voting (Majority voting)
This strategy treats the classification result of each image block in the slide as a vote for the whole-slide category and directly outputs the category with the most votes as the category of the slide. The method is simple and intuitive, and it yields the correct result when the image blocks that are decisive for the diagnosis dominate numerically. In pathological diagnosis, however, the image blocks that determine the whole-slide diagnosis are sometimes not in the numerical majority, and may account for less than 1% of all blocks; in such cases the majority-voting strategy struggles to produce the correct whole-slide classification.
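The voting strategy and its failure mode can be sketched in a few lines; the label names are illustrative, not from the patent:

```python
from collections import Counter

def majority_vote(block_labels):
    """Whole-slide class = the most frequent image-block class."""
    return Counter(block_labels).most_common(1)[0][0]

# A slide where only 3% of the blocks are cancerous is labelled
# "normal" by majority voting, even though those few blocks should
# determine the diagnosis -- exactly the weakness described above.
slide = ["normal"] * 97 + ["cancer"] * 3
prediction = majority_vote(slide)
```

Here `prediction` is `"normal"`, illustrating why a numerically small but diagnostically decisive set of blocks defeats this strategy.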
2) Full-slice classification strategy based on convolutional neural network
This strategy arranges the image-block classification results or image-block features into a three-dimensional tensor according to their positions in the whole slide, then trains a convolutional neural network (CNN) with the tensor as the sample and the slide category as the label to classify the full slide, as shown in FIG. 1. The method effectively alleviates the problem of unequal block counts in strategy 1), but it is constrained by the CNN model: it adapts poorly to the pixel resolution and aspect ratio of the whole slide and has difficulty meeting the requirements of practical application.
3) Classification strategy based on key image block sampling
After block classification is completed, this strategy samples the image blocks according to some rule (for example, selecting blocks whose classification confidence exceeds a threshold T) to reduce the number of blocks that contribute little to, or even harm, the decision; a set-classification model such as multiple-instance learning is then built on the sampled block set to classify the whole slide. Compared with strategy 2), this increases adaptability to slide size, but the sampling discards both the absolute positions of the image blocks within the slide and the relative positions between blocks, which reduces classification accuracy.
To solve the problems of existing whole-slide classification strategies, providing an automatic analysis method for digestive tract biopsy pathological sections, based on automatic computer classification of digital whole-slide pathology images, that can screen gastrointestinal biopsy pathology effectively and improve classification accuracy has become an urgent problem for those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides an automatic analysis method for pathological sections of gastrointestinal biopsy, which can be used for pathological screening of gastrointestinal biopsy, automatically output a diagnosis conclusion, and assist a doctor to complete work accurately and efficiently.
In order to achieve the purpose, the invention adopts the following technical scheme:
An automatic analysis method for digestive tract biopsy pathological sections comprises the following steps:
dividing the pathological section image into several sub-regions and diagnosing each sub-region one by one;
determining the current diagnosis sub-region a, and extracting its features with a sliding-window method based on a pre-constructed feature extraction model to obtain the feature matrix F^(a);
classifying the feature vectors in the feature matrix F^(a) with a pre-constructed image classification model to obtain the lesion probability matrix P^(a);
performing distance quantization on each feature vector of F^(a) with a distance-transform algorithm to obtain the distance-quantization feature matrix H^(a), and concatenating F^(a) with H^(a) to obtain the fused feature matrix F̃^(a);
generating the tissue structure graph G^(a) of the current diagnosis sub-region a from the fused feature matrix F̃^(a) and the lesion probability matrix P^(a);
classifying the tissue structure graph G^(a) of the current diagnosis sub-region a with a pre-trained graph convolutional network model;
generating the diagnosis result c^(a) of the current diagnosis sub-region a from the classification prediction.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, dividing the pathological section image into several sub-regions and diagnosing each sub-region one by one comprises:
converting the pathological section image from an RGB three-channel image into a grayscale image;
processing the grayscale image with a threshold segmentation method to obtain the binary tissue-region template M;
applying a morphological closing operation to the binary tissue-region template M;
detecting connected regions in the closing result, and cropping a rectangular region from the tissue-region template M along the circumscribed rectangle of each connected region to obtain the sub-region binary templates M^(a);
diagnosing each sub-region binary template M^(a) one by one.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, the feature matrix F^(a) is expressed as a matrix whose entry in row i, column j is the feature vector F_ij of length d_f extracted from the sliding window at that position, with [*] denoting rounding down in the computation of the matrix dimensions. When the window in row i, column j does not contain any part of the current diagnosis sub-region, no feature extraction is performed for that window and F_ij = 0 is assigned directly.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, the lesion probability matrix P^(a) is expressed as the matrix whose entry in row i, column j is the vector of predicted probabilities of the image block in that window over the C lesion types, where C denotes the number of lesion types involved in the automatic classification task.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, performing distance quantization on each feature vector in the feature matrix with a distance-transform algorithm to obtain the distance-quantization feature matrix, and concatenating the feature matrix with the distance-quantization feature matrix to obtain the fused feature matrix, comprises:
using a distance-transform method to find, for the feature vector F_ij of any image block in the current diagnosis sub-region, the shortest coordinate distance d_ij within the feature matrix F^(a) to a zero vector or to the matrix boundary;
transforming d_ij into d̃_ij, which represents how close the current image block is to the tissue-region boundary, using a temperature coefficient τ set according to the practical application effect (here τ = 16);
quantizing d̃_ij into a code of length d_h, where H_ij denotes the resulting distance-quantization feature vector and h_ijk the value of its k-th element;
concatenating each image-block feature vector F_ij in the current diagnosis sub-region with its distance-quantization feature vector H_ij to obtain the fused feature vector F̃_ij of each image block, of length d = d_f + d_h.
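The steps above can be sketched as follows. The patent's transform and quantization formulas are not reproduced in this text, so the exponential squashing `exp(-d/tau)`, the Chebyshev coordinate distance, and the one-hot binning are all assumptions chosen to match the description (a bounded "closeness to boundary" value, quantized into a code of length `d_h`):

```python
import numpy as np

def distance_quantize_and_fuse(F, d_h, tau=16.0):
    """For every tissue cell (non-zero feature vector) compute the
    shortest coordinate distance d_ij to a zero-vector cell or to the
    matrix boundary, squash it with an assumed transform exp(-d/tau),
    one-hot encode the result into d_h bins (H_ij), and concatenate
    F_ij with H_ij into the fused feature vector of length d_f + d_h."""
    rows, cols, d_f = F.shape
    tissue = F.any(axis=2)
    fused = np.zeros((rows, cols, d_f + d_h))
    for i in range(rows):
        for j in range(cols):
            if not tissue[i, j]:
                continue
            # distance to the matrix boundary
            d = min(i + 1, j + 1, rows - i, cols - j)
            # distance to the nearest zero-vector cell (Chebyshev, assumed)
            for p in range(rows):
                for q in range(cols):
                    if not tissue[p, q]:
                        d = min(d, max(abs(p - i), abs(q - j)))
            closeness = np.exp(-d / tau)               # assumed transform
            k = min(int(closeness * d_h), d_h - 1)     # quantization bin
            H = np.zeros(d_h)
            H[k] = 1.0
            fused[i, j] = np.concatenate([F[i, j], H])
    return fused
```

The brute-force nearest-zero search keeps the sketch short; a real implementation would use a proper distance transform over the tissue mask.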
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, generating the tissue structure graph G^(a) of the current diagnosis sub-region a from the fused feature matrix F̃^(a) and the lesion probability matrix P^(a) comprises:
grid-sampling the fused feature matrix of the current diagnosis sub-region to obtain the grid sample set χ_grid, where the sampling step S is calculated from the number N of image blocks contained in the current diagnosis sub-region and the desired upper limit N_max on the number of image blocks after grid sampling, and |·| denotes the size of a set;
obtaining the N_conf image blocks with the highest lesion probability in the lesion probability matrix P^(a) of the current diagnosis sub-region and constructing the confidence sample set χ_conf, where α denotes the confidence sampling threshold and the block probabilities are produced by the pre-constructed image classification model;
taking the union χ of the grid sample set χ_grid and the confidence sample set χ_conf, where N_g = |χ| denotes the size of χ;
constructing the adjacency matrix A, where any element a_pq of A is determined by the Euclidean distance, in the coordinate space of the original feature matrix F̃^(a), between the coordinates (i_p, j_p) and (i_q, j_q) of the fused feature vectors F̃_p and F̃_q;
constructing the tissue structure graph G^(a) = (A, X) of the current diagnosis sub-region a.
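The grid-plus-confidence sampling and the coordinate-based adjacency can be sketched as follows. Because the patent's formulas for χ_grid, χ_conf, and A are not reproduced in this text, the Euclidean-distance threshold `radius` and the exact top-N_conf selection rule are assumptions:

```python
import numpy as np

def build_tissue_graph(fused, lesion_prob, step, n_conf, radius=1.5):
    """Take a regular grid sample of tissue cells (stride `step`), add
    the n_conf tissue cells with the highest lesion probability, and
    connect two sampled nodes when their grid coordinates are within
    `radius` of each other (assumed adjacency rule).
    Returns (A, X): adjacency matrix and stacked node features."""
    rows, cols, _ = fused.shape
    tissue = fused.any(axis=2)
    grid = {(i, j) for i in range(0, rows, step)
                   for j in range(0, cols, step) if tissue[i, j]}
    # confidence sampling: most lesion-like cells, regardless of the grid
    flat = [(lesion_prob[i, j], (i, j))
            for i in range(rows) for j in range(cols) if tissue[i, j]]
    conf = {ij for _, ij in sorted(flat, reverse=True)[:n_conf]}
    nodes = sorted(grid | conf)           # union of the two sample sets
    X = np.stack([fused[i, j] for i, j in nodes])
    A = np.zeros((len(nodes), len(nodes)))
    for p, (ip, jp) in enumerate(nodes):
        for q, (iq, jq) in enumerate(nodes):
            if p != q and np.hypot(ip - iq, jp - jq) <= radius:
                A[p, q] = 1.0
    return A, X
```

The union of the two sample sets is what gives small tissue fragments (covered mostly by the grid) and large ones (whose suspicious blocks are caught by confidence sampling) comparable representation in the graph.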
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, the sampling step S is calculated from N, the number of image blocks contained in the current diagnosis sub-region, and N_max, the desired upper limit on the number of image blocks after grid sampling.
The confidence sampling threshold α is calculated as follows:
for the image block in the sliding window at row i, column j, the predicted probabilities over the C lesion types and the lesion probability are
P_ij = [p_ij1, p_ij2, ..., p_ijC];
r_ij = 1 - p_ij1;
the lesion probabilities r_ij of all image blocks obtained by the sliding window are sorted in descending order into the set {r'_1, r'_2, ...};
the confidence sampling threshold α is then obtained from the lesion probability set {r'_1, r'_2, ...}.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, after obtaining the diagnosis result c^(a) of the current diagnosis sub-region a from the classification prediction, the method further comprises: generating the diagnosis results of the other sub-regions one by one, and sorting and outputting the diagnosis result of each sub-region according to a preset rule. The diagnosis process that yields c^(a) for the current sub-region a comprises:
constructing a graph convolutional network model based on DiffPool;
acquiring a training sample set and preprocessing it with a sliding-window method, the preprocessed set consisting of pairs (G_k, l_k), where G_k denotes the tissue-region graph of the k-th pathological section in the training sample set and l_k the corresponding slice label;
training the graph convolutional network model on the preprocessed training sample set;
classifying the tissue structure graph of the current diagnosis sub-region a with the trained graph convolutional network model to obtain z^(a) ∈ (0,1)^C, the probabilities of the graph G^(a) over the C categories;
generating the diagnosis result of the current diagnosis sub-region a as
c^(a) = argmax(z^(a)).
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, after classifying the tissue structure graph of the current diagnosis sub-region a with the pre-trained graph convolutional network model and obtaining the diagnosis result c^(a), the method further comprises:
generating a heat map of each sub-region from the lesion probability matrix P^(a) and superimposing it on the original image of the corresponding sub-region to obtain a diagnosis region map. The heat map is generated as follows:
if c^(a) > 1, the c^(a)-th channel of the lesion probability matrix P^(a) is extracted as the heat map; if c^(a) = 1, the corresponding sub-region is free of lesions and no heat map is output.
Preferably, in the above automatic analysis method for digestive tract biopsy pathological sections, the method further comprises:
sorting and outputting the diagnosis result and/or the diagnosis region map of each sub-region according to a preset rule, where n_a denotes the number of sub-regions contained in the pathological section image;
the diagnosis result of the whole pathological section image is then derived from the per-sub-region results, with Z_c denoting row c of the matrix Z of sub-region results, and the resulting vector giving the probability that the pathological section image belongs to each category.
According to the above technical scheme, compared with the prior art, the disclosed automatic analysis method for digestive tract biopsy pathological sections has the following beneficial effects:
The invention can classify digestive tract biopsy pathology and present the lesion area within the whole-slide image as a heat map, assisting doctors in pathological diagnosis.
By combining grid sampling with confidence sampling, the method balances the amount of information drawn from tissue regions of different sizes and therefore has a wider range of application.
The invention introduces the quantized distance to the tissue boundary into tissue-region classification to represent the absolute position of a feature within the tissue, describes the relative spatial positions between features with a tissue structure graph, and integrates this information with a graph convolutional network model to complete the tissue-region classification. The introduction of absolute and relative position information significantly improves classification accuracy for lesion types that are diagnosed from differences in tissue morphology distribution.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a flow diagram of the automatic analysis method for digestive tract biopsy pathological sections provided by the present invention;
FIG. 2 is a flow chart illustrating an application of the automatic analysis method for digestive tract biopsy pathological sections provided by the present invention;
FIG. 3 is a flow chart illustrating the construction of the whole-slide tissue structure graph according to the present invention;
FIG. 4 is a schematic diagram of the tissue regions, the doctor's annotations, and the lesion-region template of a pathological section in the training sample set provided by the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIGS. 1-3, an embodiment of the invention discloses an automatic analysis method for digestive tract biopsy pathological sections, comprising the following steps:
S1, dividing the pathological section image into several sub-regions, as shown in FIG. 2b, and diagnosing each sub-region one by one;
S2, determining the current diagnosis sub-region a (shown in FIG. 2c) and extracting its features with a sliding-window method based on a pre-constructed feature extraction model to obtain the feature matrix F^(a), as shown in FIG. 2d;
S3, classifying the feature vectors in F^(a) with a pre-constructed image classification model to obtain the lesion probability matrix P^(a), as shown in FIG. 2e;
S4, performing distance quantization on each feature vector of F^(a) with a distance-transform algorithm to obtain the distance-quantization feature matrix H^(a), as shown in FIG. 2f, and concatenating F^(a) with H^(a) to obtain the fused feature matrix F̃^(a);
S5, generating the tissue structure graph G^(a) of the current diagnosis sub-region a from the fused feature matrix F̃^(a) and the lesion probability matrix P^(a), as shown in FIG. 2h;
S6, classifying the tissue structure graph G^(a) of the current diagnosis sub-region a with a pre-trained graph convolutional network model;
S7, obtaining the diagnosis result c^(a) of the current diagnosis sub-region a from the classification prediction, as shown in FIG. 2i;
S8, repeating S2-S7 to generate the diagnosis results of the other sub-regions one by one;
S9, sorting and outputting the diagnosis results of the sub-regions according to a preset rule.
More advantageously, to help the doctor locate the lesion area quickly, the method further comprises:
S10, generating a heat map for each sub-region one by one from the probability values of its lesion probability matrix, and superimposing the heat map on the original image of the corresponding sub-region to obtain the diagnosis region map of each sub-region, as shown in FIG. 2j. The heat map is generated as follows:
if c^(a) > 1, the c^(a)-th channel of the lesion probability matrix P^(a) is extracted as the heat map; if c^(a) = 1, the current diagnosis sub-region has no lesion and no heat map is output.
The diagnosis results and diagnosis region maps of the whole pathological section image are sorted and output according to a preset rule.
The above steps are described in detail below:
S1, dividing the pathological section image into a plurality of sub-regions and diagnosing each sub-region one by one comprises the following steps:
In biopsy pathology, multiple layers of tissue are often placed side by side on the same glass slide (as shown in FIG. 2a), so the embodiment of the present invention predicts all sub-regions one by one. The method specifically comprises:
S111, converting the pathological section image I from an RGB three-channel image into a gray-scale image;
S112, processing the gray-scale image with a threshold segmentation method (the Otsu threshold segmentation algorithm) to obtain a binary template M of the tissue region, as shown in FIG. 2b;
S113, performing a morphological closing operation on the binary template M of the tissue region to obtain a closing result;
S114, performing connected-region detection on the closing result, and cropping a rectangular region from the binary template M using the bounding rectangle of each connected region as the boundary to obtain the sub-region binary templates, as shown in FIG. 2c or FIG. 3b; the binary template of the a-th sub-region is denoted M^(a), and the corresponding area in the original image I is denoted I^(a), as indicated by the rectangular box in FIG. 2a;
S115, diagnosing each sub-region binary template M^(a) one by one.
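The S111-S114 pipeline can be sketched end to end in plain NumPy; the tissue-darker-than-background polarity and the small square closing element are standard assumptions of this sketch (a production system would use an image library such as OpenCV, and `np.roll` wraps at the borders, which is adequate here only for interior tissue):

```python
import numpy as np

def otsu_threshold(gray):
    """S112: Otsu's method, choosing t to maximize between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum = np.cumsum(hist)
    cum_mean = np.cumsum(hist * np.arange(256))
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum[t], total - cum[t]
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = cum_mean[t] / w0, (cum_mean[-1] - cum_mean[t]) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binary_closing(mask, r=1):
    """S113: dilation then erosion with a (2r+1) x (2r+1) square element."""
    def dilate(m):
        out = np.zeros_like(m)
        for di in range(-r, r + 1):
            for dj in range(-r, r + 1):
                out |= np.roll(np.roll(m, di, 0), dj, 1)
        return out
    return ~dilate(~dilate(mask))   # erosion(x) = ~dilate(~x)

def connected_boxes(mask):
    """S114: 4-connected components; returns bounding boxes (i0, j0, i1, j1)."""
    H, W = mask.shape
    seen = np.zeros(mask.shape, dtype=bool)
    boxes = []
    for si in range(H):
        for sj in range(W):
            if mask[si, sj] and not seen[si, sj]:
                stack = [(si, sj)]
                seen[si, sj] = True
                i0 = i1 = si
                j0 = j1 = sj
                while stack:
                    i, j = stack.pop()
                    i0, i1 = min(i0, i), max(i1, i)
                    j0, j1 = min(j0, j), max(j1, j)
                    for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                        if 0 <= ni < H and 0 <= nj < W and mask[ni, nj] and not seen[ni, nj]:
                            seen[ni, nj] = True
                            stack.append((ni, nj))
                boxes.append((i0, j0, i1, j1))
    return boxes

def split_subregions(rgb):
    gray = rgb.mean(axis=2).astype(np.uint8)        # S111: RGB -> gray
    t = otsu_threshold(gray)
    mask = gray <= t                                # tissue assumed darker than background
    closed = binary_closing(mask)                   # S113
    return [(b, mask[b[0]:b[2] + 1, b[1]:b[3] + 1])  # S114: crops M^(a)
            for b in connected_boxes(closed)]
```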
The feature extraction model in S2 and the image classification model in S3 are constructed as follows:
1. Slice labeling and data collection
As shown in FIG. 4, the method of the present invention depends on labeling by a pathologist, who is required to outline typical lesion regions in the original section image; the outlined regions are then converted into a lesion region template through a closed-curve filling algorithm. Let I denote an RGB three-channel pathological whole-slide image with pixel resolution w × h at high microscopic magnification (e.g., a resolution of 0.46 um/pixel). K slices are labeled, and a training data set D = {(I_k, E_k, l_k)} is established, where I_k denotes the k-th slice image in the training set, E_k denotes the generated lesion region template, whose size is the same as that of the image I_k and which records the lesion category number of each pixel in the image, and l_k denotes the category number of the overall diagnosis of the labeled slice.
At the same time, the data in the training set D must be preprocessed. Specifically, each slice is divided into image blocks in a sliding-window manner, with window size t × t and sliding-window step t/2. Let T_n denote the n-th image block obtained by the sliding window, and let Y_n denote the sub-matrix of the lesion region template covering the same area as the image block. The image block data set created by the sliding window is denoted D_T = {(T_n, y_n)}, where the value of y_n is decided by a majority vote over all values in Y_n.
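The sliding-window preprocessing and the majority-vote labeling of D_T can be sketched as follows; function and variable names are illustrative, and numpy arrays stand in for the slide-reading stack:

```python
import numpy as np
from collections import Counter

def tile_with_labels(image, template, t):
    """Split a slide into t x t blocks with stride t//2 (as in the text) and
    label each block y_n by a majority vote over the lesion-template
    sub-matrix Y_n covering the same area."""
    blocks = []
    H, W = template.shape[:2]
    step = t // 2
    for i in range(0, H - t + 1, step):
        for j in range(0, W - t + 1, step):
            T_n = image[i:i + t, j:j + t]
            Y_n = template[i:i + t, j:j + t]
            # y_n: most frequent lesion category number inside the window
            y_n = Counter(Y_n.ravel().tolist()).most_common(1)[0][0]
            blocks.append((T_n, y_n))
    return blocks
```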
2. Model building
For any image block T_n in the image block training set D_T, extracting image features can be expressed by the formula:
f_n = M_F(T_n)
where M_F denotes the feature extraction model and f_n denotes a feature vector of length d_f. The present invention places no special restriction on the feature extraction model; a digital image feature extraction method may be selected as required, for example traditional image features such as texture features, color histogram features, shape features, and frequency-domain transformation features, or a machine-learning-based feature extraction method such as an auto-encoding network or a convolutional neural network.
On the basis of the image feature vector f_n, an image classification model is established to classify the lesion category of each image block. The image classification model is expressed as:
p_n = M_C(f_n)
where M_C denotes the image classification model, p_n ∈ (0,1)^C denotes the prediction probability, C denotes the number of lesion categories involved in the automatic classification task, and p_nc denotes the probability that the n-th image block belongs to category c, satisfying Σ_c p_nc = 1. For convenience of subsequent description, c = 1 denotes the normal region category and c > 1 denotes a lesion category. The present invention places no special requirement on the image classification model; a support vector machine, a random forest, or a neural-network-based classification model may be selected as required.
In order to obtain higher automatic classification accuracy, the present invention uses a convolutional neural network (CNN) to establish both M_F and M_C. A CNN is an end-to-end machine learning model; the image block data set D_T described above is used to complete its training. After training, the last fully-connected-Softmax structure of the CNN serves as the classification model M_C, and the entire network except this classification layer serves as the feature extraction model M_F.
The feature matrix F^(a) of the current diagnosis sub-region a in S2 is constructed as follows:
The feature extraction model M_F is applied with a sliding-window method to extract the image features of the current diagnosis sub-region, with window size t × t (the same as in the training-set preprocessing) and sliding-window step t. The resulting feature matrix is denoted F^(a), with [w/t] × [h/t] windows, where [*] denotes rounding down. The matrix element F_ij is the feature corresponding to the window in row i and column j. To avoid unnecessary computation, when the window in row i and column j contains no tissue region (judged according to the tissue region template shown in FIG. 3b), feature extraction is not performed on that window and F_ij = 0 is assigned directly.
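The construction of F^(a) with a stride-t window and zero assignment for non-tissue windows can be sketched as follows; the `extract` callable stands in for the feature extraction model M_F, which is an assumption of this sketch:

```python
import numpy as np

def build_feature_matrix(subregion, tissue_mask, t, d_f, extract):
    """Sketch of F^(a): stride-t windows over the sub-region; windows that
    contain no tissue (per the binary tissue mask) get F_ij = 0."""
    rows, cols = subregion.shape[0] // t, subregion.shape[1] // t  # floor = rounding down
    F = np.zeros((rows, cols, d_f))
    for i in range(rows):
        for j in range(cols):
            win = (slice(i * t, (i + 1) * t), slice(j * t, (j + 1) * t))
            if tissue_mask[win].any():        # skip non-tissue windows entirely
                F[i, j] = extract(subregion[win])
    return F
```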
In S3, the lesion probability matrix P^(a) is constructed as follows:
The image classification model M_C is applied to every element F_ij of F^(a) to obtain the lesion probability matrix P^(a) of the current diagnosis sub-region a, where the matrix element P_ij is the prediction probability of the image block in the window in row i and column j over the C lesion categories, as shown in FIG. 3d.
The distance transformation process of the current diagnosis sub-region a in S4 is as follows:
In order to describe the position of a local tissue area within the tissue block, the present invention adds a tissue-boundary distance quantization feature at the image block feature extraction stage. A distance transform is used to compute, for any feature F_ij, the shortest distance in the feature matrix F^(a) to a zero vector or to the boundary, denoted d_ij; the effect is shown in FIG. 3e.
In order to make the distance coding more sensitive to the boundary parts of the tissue region (corresponding to possible epithelial parts in histopathology), d_ij is transformed into d̃_ij,
where d̃_ij represents the degree to which the current image block is close to the boundary of the tissue region, and τ represents a temperature coefficient set according to the actual application effect; for digestive tract biopsy pathological sections, τ = 16 is preferred.
In order to make better use of this feature in subsequent machine learning models, d̃_ij is distance-quantization encoded,
where d_h denotes the length of the quantization code, H_ij denotes the distance-quantized feature vector, and h_ijk denotes the value of the k-th element of H_ij.
For convenience of description, the distance quantization coding process of the image block at the window in row i and column j is denoted H_ij. To avoid unnecessary computation, when the window in row i and column j contains no tissue region (judged according to the tissue region template shown in FIG. 3b), distance quantization is not performed on that window and H_ij = 0 is assigned directly.
Each image block feature vector F_ij in the current diagnosis sub-region is concatenated with its distance-quantized feature vector H_ij to obtain the fusion feature vector F̃_ij of each image block, whose length is d = d_f + d_h.
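The exact transformation and quantization formulas are given only as figure images in the source, so the following sketch assumes d̃_ij = exp(-d_ij/τ) and a one-hot binning of length d_h; both are assumptions consistent with the surrounding description (a "degree of closeness to the boundary", a temperature coefficient τ, and a quantization code of length d_h):

```python
import numpy as np

def boundary_distance(tissue, i, j):
    """Shortest grid distance from window (i, j) to a zero-feature window or
    to the matrix border, computed by brute force for clarity."""
    rows, cols = tissue.shape
    d = min(i + 1, j + 1, rows - i, cols - j)        # distance to the border
    zeros = np.argwhere(~tissue)                     # zero-vector windows
    if len(zeros):
        d = min(d, np.sqrt(((zeros - (i, j)) ** 2).sum(axis=1)).min())
    return d

def distance_quantization(d_ij, tau=16.0, d_h=8):
    """Assumed transform d~ = exp(-d/tau): 1 near the boundary, decaying
    inward; then a one-hot code H_ij over d_h uniform bins of [0, 1]."""
    d_tilde = np.exp(-d_ij / tau)
    k = min(int(d_tilde * d_h), d_h - 1)
    H = np.zeros(d_h)
    H[k] = 1.0
    return H
```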
The specific process of generating the tissue structure graph in S5 is as follows:
Considering that the size of pathological image tissue regions varies widely, a tissue region may in extreme cases contain hundreds of thousands of image blocks (windows); directly using all image block features to construct a graph would make the graph too large and affect the computation of the subsequent graph convolution network model. Therefore, the present invention balances the number of image blocks used for graph construction across different tissue slices by sampling.
1) First, grid sampling is performed: the fusion feature matrix F̃^(a) of the current diagnosis sub-region is grid-sampled with the following formula,
where χ_grid denotes the grid sampling set; S is the sampling step, calculated from the number N of image blocks contained in the current diagnosis sub-region and the upper limit N_max of the expected number of image blocks after grid sampling; |·| denotes the size of a set.
The sampling step S is calculated as follows:
where N denotes the number of image blocks contained in the current diagnosis sub-region and N_max denotes the upper limit of the expected number of image blocks after grid sampling.
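Grid sampling can be sketched as follows; the concrete step formula S = ⌈√(N/N_max)⌉ is an assumption (the source gives it only as a figure image), chosen so that the grid sample count is bounded by roughly N_max:

```python
import math

def grid_sample(coords, n_max):
    """Keep only windows whose (i, j) coordinates lie on a regular grid of
    step S, so |chi_grid| is at most about n_max."""
    n = len(coords)
    s = max(1, math.ceil(math.sqrt(n / n_max)))   # assumed step formula
    return {(i, j) for (i, j) in coords if i % s == 0 and j % s == 0}
```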
2) χ_grid only reduces the number of image blocks in terms of spatial structure. To ensure that the image blocks that are decisive for the classification of the slice participate in the graph construction, the embodiment of the present invention additionally adopts confidence sampling: the N_conf image blocks with the highest lesion probability in the lesion probability matrix P^(a) of the current diagnosis sub-region are taken to construct a confidence sampling set.
For convenience of description, P_ij is written out as P_ij = [p_ij1, p_ij2, ..., p_ijC]. The lesion probability of the image block in row i and column j obtained by the sliding window is defined as r_ij = 1 - p_ij1. The lesion probabilities {r_ij} of all image blocks obtained by the sliding window are sorted in descending order to obtain the set {r'_1, r'_2, ...}, on the basis of which the confidence sampling threshold α is defined. Based on the threshold α, the confidence sampling result is defined as:
where χ_conf denotes the confidence sampling set; α denotes the confidence sampling threshold; M_C denotes the pre-constructed image classification model.
The union of the grid sampling set χ_grid and the confidence sampling set χ_conf is taken to obtain the set χ, expressed as follows:
where N_g = |χ| denotes the size of the set χ.
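Confidence sampling can be sketched as follows (channel 0 is taken as the normal class, matching c = 1 in the text):

```python
import numpy as np

def confidence_sample(P, tissue, n_conf):
    """Take the n_conf tissue windows with the highest lesion probability
    r_ij = 1 - p_ij1, i.e. one minus the normal-class probability."""
    r = 1.0 - P[..., 0]
    cand = [(r[i, j], i, j) for i, j in np.argwhere(tissue)]
    cand.sort(reverse=True)                 # descending lesion probability
    return {(i, j) for _, i, j in cand[:n_conf]}
```

The graph node set is then the union χ = χ_grid | χ_conf (a plain set union in Python), with N_g = len(χ).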
The adjacency matrix A is then defined over the N_g sampled image blocks, where the element a_pq takes values according to the following rule:
a_pq denotes any element of the adjacency matrix A; the Euclidean distance between the fused feature vectors F̃_p and F̃_q is measured in the coordinate space of the original feature matrix F^(a); (i_p, j_p) and (i_q, j_q) denote the coordinates of F̃_p and F̃_q in the original feature matrix F^(a).
This completes the construction of the tissue structure graph, denoted G^(a) = (A, X), as shown in FIG. 3g.
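The source gives the value rule for a_pq only as a figure image; the following sketch assumes a_pq = 1 when the two sampled windows' grid coordinates lie within a Euclidean radius of each other (radius 1.5 links each block to its 8 spatial neighbours), which is one plausible reading of the coordinate-distance rule described above:

```python
import numpy as np

def build_adjacency(chi, radius=1.5):
    """Assumed rule: connect sampled windows p and q when the Euclidean
    distance between (i_p, j_p) and (i_q, j_q) is at most `radius`."""
    nodes = sorted(chi)
    n = len(nodes)
    A = np.zeros((n, n))
    for p in range(n):
        for q in range(p + 1, n):
            (ip, jp), (iq, jq) = nodes[p], nodes[q]
            if (ip - iq) ** 2 + (jp - jq) ** 2 <= radius ** 2:
                A[p, q] = A[q, p] = 1.0
    return nodes, A
```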
For convenience of description, the whole process of constructing the tissue region structure graph from the fusion feature matrix and the prediction probability matrix P is defined as a graph construction model, and the graph construction process is denoted accordingly.
Training and classification prediction of the graph convolution network model in S6:
A graph convolution network model is built for graph classification prediction, mapping the tissue region graph G to a lesion category; for digestive tract biopsy pathological sections, the present invention preferably uses the DiffPool model. The diagnosis result c^(a) of the current diagnosis sub-region a is obtained through the following diagnosis process:
constructing a graph convolution network model based on DiffPool;
acquiring a training sample set and preprocessing it with the sliding-window method; the preprocessed training sample set is expressed as follows:
where G_k denotes the tissue region graph of the k-th pathological section in the training sample set, and l_k denotes the slice label corresponding to G_k;
training the graph convolution network model with the preprocessed training sample set;
classifying the tissue structure graph of the current diagnosis sub-region a based on the trained graph convolution network model, expressed as follows:
where z^(a) ∈ (0,1)^C is the probability that the tissue structure graph G^(a) belongs to each of the C categories, produced by the trained graph convolution network model;
generating the diagnosis result c^(a) of the current diagnosis sub-region a with the following formula:
c^(a) = argmax(z^(a)).
The diagnosis results of the other sub-regions are generated one by one according to S2 to S7.
S11, outputting the diagnosis result of the case:
The prediction results z^(a) of all sub-regions of the screened pathological section image (sometimes distributed over multiple whole slides) are arranged into a matrix Z, where n_a denotes the number of sub-regions contained in the case, and the screening result of the whole case is defined on this basis,
where Z_c denotes the c-th row of Z. The probability that the case belongs to each category can then be output to the doctor user according to the application requirements and certain rules.
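The case-level aggregation formula is given only as a figure image in the source; a simple rule consistent with the description, assumed here, is to take the maximum probability per category over all sub-regions and report the argmax as the case category:

```python
import numpy as np

def case_result(Z):
    """Z: (n_a, C) matrix of sub-region predictions z^(a).
    Assumed aggregation: per-category max over sub-regions, then argmax
    (returned 1-based so that category 1 means 'normal')."""
    per_class = Z.max(axis=0)
    return per_class, int(np.argmax(per_class)) + 1
```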
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. An automatic analysis method for digestive tract biopsy pathological sections, characterized by comprising the following steps:
dividing the pathological section image into a plurality of sub-regions and diagnosing each sub-region one by one;
determining a current diagnosis sub-region a, and extracting features of the current diagnosis sub-region a with a sliding-window method based on a pre-constructed feature extraction model to obtain a feature matrix F^(a);
classifying the feature vectors in the feature matrix F^(a) based on a pre-constructed image classification model to obtain a lesion probability matrix P^(a);
performing distance quantization on each feature vector in the feature matrix F^(a) based on a distance transformation algorithm to obtain a distance-quantized feature matrix H^(a); concatenating the feature matrix F^(a) and the distance-quantized feature matrix H^(a) to obtain a fusion feature matrix F̃^(a);
generating a tissue structure graph G^(a) of the current diagnosis sub-region a based on the fusion feature matrix F̃^(a) and the lesion probability matrix P^(a);
performing classification prediction on the tissue structure graph G^(a) of the current diagnosis sub-region a based on a pre-trained graph convolution network model;
generating a diagnosis result c^(a) of the current diagnosis sub-region a based on the classification prediction result.
2. The automatic analysis method for digestive tract biopsy pathological sections according to claim 1, wherein dividing the pathological section image into a plurality of sub-regions and diagnosing each sub-region one by one comprises:
converting the pathological section image from an RGB three-channel image into a gray-scale image;
processing the gray-scale image with a threshold segmentation method to obtain a binary template M of the tissue region;
performing a morphological closing operation on the binary template M of the tissue region to obtain a closing result;
performing connected-region detection on the closing result, and cropping a rectangular region from the tissue region binary template M using the bounding rectangle of each connected region as the boundary to obtain the sub-region binary templates M^(a);
diagnosing each sub-region binary template M^(a) one by one.
3. The automatic analysis method for digestive tract biopsy pathological sections according to claim 1, wherein the feature matrix F^(a) is expressed as follows:
where F_ij is the feature vector of length d_f corresponding to the window in row i and column j; [*] denotes rounding down; when the window in row i and column j does not contain the current diagnosis sub-region, feature extraction is not performed on that window and F_ij = 0 is assigned directly.
4. The automatic analysis method according to claim 3, wherein the lesion probability matrix P^(a) is expressed as follows:
5. The automatic analysis method for digestive tract biopsy pathological sections according to claim 4, wherein performing distance quantization on each feature vector in the feature matrix based on the distance transformation algorithm to obtain the distance-quantized feature matrix, and concatenating the feature matrix and the distance-quantized feature matrix to obtain the fusion feature matrix, comprises:
computing, based on the distance transform method, the shortest distance d_ij from the feature vector F_ij of any image block in the current diagnosis sub-region to a zero vector or to the boundary in the feature matrix F^(a);
transforming d_ij with the following formula:
where d̃_ij represents the degree to which the current image block is close to the boundary of the tissue region, and τ represents a temperature coefficient set according to the actual application effect, with τ = 16;
where d_h denotes the length of the quantization code, H_ij denotes the distance-quantized feature vector, and h_ijk denotes the value of the k-th element of H_ij;
concatenating each image block feature vector F_ij in the current diagnosis sub-region with its distance-quantized feature vector H_ij to obtain the fusion feature vector F̃_ij of each image block, whose length is d = d_f + d_h.
6. The automatic analysis method according to claim 5, wherein generating the tissue structure graph G^(a) of the current diagnosis sub-region a based on the fusion feature matrix F̃^(a) and the lesion probability matrix P^(a) comprises:
performing grid sampling on the fusion feature matrix F̃^(a) of the current diagnosis sub-region with the following formula:
where χ_grid denotes the grid sampling set; S is the sampling step, calculated from the number N of image blocks contained in the current diagnosis sub-region and the upper limit N_max of the expected number of image blocks after grid sampling; |·| denotes the size of a set;
taking the N_conf image blocks with the highest lesion probability in the lesion probability matrix P^(a) of the current diagnosis sub-region and constructing a confidence sampling set; the confidence sampling set is expressed as follows:
where χ_conf denotes the confidence sampling set; α denotes the confidence sampling threshold; the pre-constructed image classification model is applied;
taking the union of the grid sampling set χ_grid and the confidence sampling set χ_conf to obtain the set χ, expressed as follows:
constructing the adjacency matrix A with the following formula:
where a_pq denotes any element of the adjacency matrix A; the Euclidean distance between the fused feature vectors F̃_p and F̃_q is measured in the coordinate space of the original feature matrix F^(a); (i_p, j_p) and (i_q, j_q) denote the coordinates of F̃_p and F̃_q in the original feature matrix F^(a);
constructing the tissue structure graph G^(a) of the current diagnosis sub-region a, G^(a) = (A, X).
7. The automatic analysis method according to claim 6, wherein the sampling step S is calculated as follows:
where N denotes the number of image blocks contained in the current diagnosis sub-region and N_max denotes the upper limit of the expected number of image blocks after grid sampling;
the confidence sampling threshold α is calculated as follows:
the prediction probability P_ij of the image block in the window in row i and column j over the C lesion categories, and its lesion probability r_ij, are calculated with the following formulas:
P_ij = [p_ij1, p_ij2, ..., p_ijC];
r_ij = 1 - p_ij1;
the lesion probabilities {r_ij} of all image blocks obtained by the sliding window are sorted in descending order to obtain the set {r'_1, r'_2, ...}.
8. The automatic analysis method according to claim 6, wherein the diagnosis result c^(a) of the current diagnosis sub-region a is obtained through the following diagnosis process:
constructing a graph convolution network model based on DiffPool;
acquiring a training sample set and preprocessing it with the sliding-window method; the preprocessed training sample set is expressed as follows:
where G_k denotes the tissue region graph of the k-th pathological section in the training sample set, and l_k denotes the slice label corresponding to G_k;
training the graph convolution network model with the preprocessed training sample set;
classifying the tissue structure graph of the current diagnosis sub-region a based on the trained graph convolution network model, expressed as follows:
where z^(a) ∈ (0,1)^C is the probability that the tissue structure graph G^(a) belongs to each of the C categories, produced by the trained graph convolution network model;
generating the diagnosis result c^(a) of the current diagnosis sub-region a with the following formula:
c^(a) = argmax(z^(a)).
9. The automatic analysis method according to claim 6, wherein after performing classification prediction on the tissue structure graph of the current diagnosis sub-region a with the pre-trained graph convolution network model and obtaining the diagnosis result c^(a) of the current diagnosis sub-region a, the method further comprises:
generating a heat map of each sub-region one by one based on the probability values in the lesion probability matrix of each sub-region, and superimposing each heat map on the original image of the corresponding sub-region to obtain a diagnosis region map of each sub-region; the heat map is generated as follows:
if c^(a) > 1, the c^(a)-th channel of the lesion probability matrix P^(a) is extracted as the heat map; if c^(a) = 1, the corresponding sub-region is free of lesions and no heat map is output.
10. The automatic analysis method according to claim 9, further comprising:
sorting and outputting the diagnosis result and/or the diagnosis region map of each sub-region according to a preset rule; the diagnosis results of the sub-regions are expressed by the following formula,
where n_a denotes the number of sub-regions contained in the pathological section image;
the diagnosis result of the pathological section image is expressed by the following formula:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110281259.8A CN112927215A (en) | 2021-03-16 | 2021-03-16 | Automatic analysis method for digestive tract biopsy pathological section |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112927215A true CN112927215A (en) | 2021-06-08 |
Family
ID=76175566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110281259.8A Pending CN112927215A (en) | 2021-03-16 | 2021-03-16 | Automatic analysis method for digestive tract biopsy pathological section |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112927215A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170236271A1 (en) * | 2015-08-06 | 2017-08-17 | Lunit Inc. | Classification apparatus for pathologic diagnosis of medical image, and pathologic diagnosis system using the same |
US20200365268A1 (en) * | 2019-05-14 | 2020-11-19 | Tempus Labs, Inc. | Systems and methods for multi-label cancer classification |
CN111985536A (en) * | 2020-07-17 | 2020-11-24 | 万达信息股份有限公司 | Gastroscope pathological image classification method based on weak supervised learning |
Non-Patent Citations (1)
Title |
---|
Cui Haoyang et al., "Histopathological image classification based on cell graph convolution", Computer Engineering and Applications, 17 November 2020 (2020-11-17), pages 223 - 228 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114140445A (en) * | 2021-12-06 | 2022-03-04 | 上海派影医疗科技有限公司 | Breast cancer pathological image identification method based on key attention area extraction |
CN114140445B (en) * | 2021-12-06 | 2022-10-28 | 上海派影医疗科技有限公司 | Breast cancer pathological image identification method based on key attention area extraction |
CN114359280A (en) * | 2022-03-18 | 2022-04-15 | 武汉楚精灵医疗科技有限公司 | Gastric mucosa image boundary quantification method, device, terminal and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111985536B (en) | Gastroscope pathological image classification method based on weakly supervised learning | |
CN109886179B (en) | Image segmentation method and system of cervical cell smear based on Mask-RCNN | |
WO2021139258A1 (en) | Image recognition based cell recognition and counting method and apparatus, and computer device | |
CN107665492B (en) | Colorectal panoramic digital pathological image tissue segmentation method based on depth network | |
CN106056595B (en) | Based on the pernicious assistant diagnosis system of depth convolutional neural networks automatic identification Benign Thyroid Nodules | |
CN108830326B (en) | Automatic segmentation method and device for MRI (magnetic resonance imaging) image | |
CN112101451B (en) | Breast cancer tissue pathological type classification method based on generation of antagonism network screening image block | |
CN110021425B (en) | Comparison detector, construction method thereof and cervical cancer cell detection method | |
CN112733950A (en) | Power equipment fault diagnosis method based on combination of image fusion and target detection | |
Yang et al. | Colon polyp detection and segmentation based on improved MRCNN | |
CN112380900A (en) | Deep learning-based cervical fluid-based cell digital image classification method and system | |
CN112085714B (en) | Pulmonary nodule detection method, model training method, device, equipment and medium | |
CN108830149B (en) | Target bacterium detection method and terminal equipment | |
CN108305253A (en) | A kind of pathology full slice diagnostic method based on more multiplying power deep learnings | |
CN110796661B (en) | Fungal microscopic image segmentation detection method and system based on convolutional neural network | |
CN110766670A (en) | Mammary gland molybdenum target image tumor localization algorithm based on deep convolutional neural network | |
CN115909006B (en) | Mammary tissue image classification method and system based on convolution transducer | |
CN112132166A (en) | Intelligent analysis method, system and device for digital cytopathology image | |
CN110021019B (en) | AI-assisted hair thickness distribution analysis method for AGA clinical image | |
CN113378792A (en) | Weak supervision cervical cell image analysis method fusing global and local information | |
CN112927215A (en) | Automatic analysis method for digestive tract biopsy pathological section | |
WO2020066257A1 (en) | Classification device, classification method, program, and information recording medium | |
CN116740435A (en) | Breast cancer ultrasonic image classifying method based on multi-mode deep learning image group science | |
CN114445356A (en) | Multi-resolution-based full-field pathological section image tumor rapid positioning method | |
CN116563647A (en) | Age-related maculopathy image classification method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||