CN111458269A - Artificial intelligent identification method for peripheral blood lymph micronucleus cell image - Google Patents
- Publication number
- Publication number: CN111458269A (application number CN202010384474.6A)
- Authority
- CN
- China
- Prior art keywords
- micronucleus
- cell image
- foreground
- cells
- peripheral blood
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G01N15/00 — Investigating characteristics of particles; investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10 — Investigating individual particles
- G01N15/1023 — Microstructural devices for non-optical measurement
- G01N15/01 — Specially adapted for biological cells, e.g. blood cells
- G01N2015/1006 — Investigating individual particles for cytology
- G01N2015/1024 — Counting particles by non-optical means
- G01N2015/103 — Particle shape
- G06F18/214 — Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06N3/045 — Combinations of networks
- G06T5/40 — Image enhancement or restoration using histogram techniques
- G06T7/194 — Segmentation; edge detection involving foreground-background segmentation
- G06V20/69 — Microscopic objects, e.g. biological cells or cellular parts
Abstract
The invention provides an artificial intelligence identification method for peripheral blood lymph micronucleus cell images, comprising the steps of: collecting a sample to be analyzed through an acquisition device to obtain a cell image; segmenting the obtained cell image to determine the foreground region to be detected; identifying, classifying, and counting the cells in the foreground region; saving the cell type information of the foreground region together with its position on the cell image to form traceable information; and generating a graphic report from the traceable information, which is stored as learning and training material for the artificial intelligence system. The method can automatically identify and classify images of peripheral blood lymphoblasts, micronucleus cells, and naked nucleus cells. Compared with the existing manual microscopic examination method, the statistics are accurate and the detection speed is markedly improved.
Description
Technical Field
The invention relates to the field of lymph micronucleus cell examination, in particular to an artificial intelligent identification method for peripheral blood lymph micronucleus cell images.
Background
Human peripheral blood lymphocytes are mostly in the G0 phase of the cell cycle. After in vitro culture in a medium containing PHA, lymphocytes originally in G0 transform into lymphoblasts and regain the ability to divide. During cell division, chemicals or radiation can damage the chromosomes of lymphoblasts, causing chromosome breaks; chromosome fragments lacking a centromere cannot move into the daughter nuclei along with the chromosomes, and as a result micronuclei form in the cytoplasm.
Peripheral blood lymph micronucleus cell detection is therefore one of the essential methods for assessing radiation damage, and is widely used in occupational disease screening of workers whose daily duties expose them to radiation or radioactive materials, such as in radiology, nuclear power plants, and customs.
For a long time, examination of lymph micronucleus cells has relied on visual identification by a clinical laboratory physician under an ordinary optical microscope; finishing one sample typically requires 30-60 minutes of visual inspection and counting. The examining physician's workload is heavy and efficiency low, with only 5-10 sample examinations completed per day.
Disclosure of Invention
To solve the problems of heavy workload, low efficiency, and poor traceability of results in the existing human-eye detection method described in the background art, the invention provides an artificial intelligence identification method for peripheral blood lymph micronucleus cell images, characterized in that the method comprises the following steps:
s10, collecting a sample to be analyzed through a collecting device to obtain a cell image;
s20, carrying out segmentation processing on the obtained cell image to obtain a region to be detected with a determined foreground; then identifying, classifying and counting cells on the foreground to-be-detected region;
s30, storing the cell type information on the foreground to-be-detected area and the position information of the cell type information on the cell image to form traceable information;
and S40, forming a graphic report according to the traceable information, and storing the graphic report as an artificial intelligence system learning training material.
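The steps S10-S40 above can be sketched as a pipeline. The following is a minimal illustrative skeleton, not the patented implementation: every function body is a placeholder assumption, and the names (`acquire_image`, `segment_foreground`, `classify_region`, `run_pipeline`) are invented for illustration.

```python
# Hypothetical sketch of the S10-S40 pipeline; all function bodies are stubs.

def acquire_image(sample_id):
    # S10: stand-in for the acquisition device (scanner / CCD microscope).
    return {"sample": sample_id, "pixels": [[255] * 4 for _ in range(4)]}

def segment_foreground(image):
    # S20a: return foreground regions to be detected as (x, y, w, h) boxes.
    return [(0, 0, 2, 2), (2, 2, 2, 2)]

def classify_region(image, box):
    # S20b: classify one region; the real method uses a CNN with an FC head.
    return "micronucleus cell"

def run_pipeline(sample_id):
    image = acquire_image(sample_id)                      # S10
    boxes = segment_foreground(image)                     # S20
    records = [{"type": classify_region(image, b),        # S30: traceable info
                "x": b[0], "y": b[1], "w": b[2], "h": b[3]} for b in boxes]
    counts = {}
    for r in records:                                     # S40: report counts
        counts[r["type"]] = counts.get(r["type"], 0) + 1
    return {"sample": sample_id, "records": records, "counts": counts}

report = run_pipeline("demo-001")
```

Each record keeps both the class and the box geometry, so the S40 report can be traced back to positions on the original image.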
Further, the step S20 is specifically as follows:
s21, marking the cells on the cell image, and determining the classification names of the marked cells;
s22, separating the marked cell entities, and acquiring images again through an acquisition device;
s23, performing foreground segmentation on the cell image acquired after separation, and determining a foreground to-be-detected region;
s24, identifying, classifying and counting cells in the foreground to-be-detected area to obtain a micronucleus area image and calculate the micronucleus cell rate.
Further, the classification names in step S21 include lymphoblasts, micronucleus cells, naked nucleus cells, and impurities.
Further, the foreground region to be detected is determined by at least one of the following methods: the RPN method and the histogram method.
Further, the RPN method is specifically as follows:
using a shared convolutional network as the backbone, inputting the images acquired in step S22 into the network and extracting a feature map;
generating candidate boxes;
processing the generated candidate boxes with NMS to remove redundant boxes and obtain the optimal detection boxes, i.e., the foreground region to be detected.
Further, the shared convolutional network includes VGG, DenseNet, ResNet, ResNeSt, SENet, and YOLO.
Further, the histogram method is specifically as follows:
performing histogram threshold analysis on the plurality of images acquired in step S22 to distinguish two parts, the background and the foreground cells; since the background is near-white, the three demarcation points represent, from low to high, the foreground center threshold, the foreground-background segmentation boundary, and the background center threshold;
setting a histogram interval 1 and a histogram interval 2 according to the obtained three demarcation points, and extracting a lymphoblast image, a micronucleus cell image and a naked nucleus cell image according to the histogram interval 1; extracting a background image according to the histogram interval 2;
performing pixel analysis on the extracted lymphoblast, micronucleus cell, and naked nucleus cell images, and setting an object connectivity value so that each pixel must be connected to adjacent non-zero-value pixels, the number of connections being N;
setting area ranges for the lymphoblast, micronucleus cell, and naked nucleus cell images; identifying those images according to the area ranges and removing oversized or undersized suspect objects, thereby forming the foreground region to be detected.
Further, the obtained foreground region to be detected is classified and identified through an FC (fully connected) layer.
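As a toy illustration of classification through a fully connected layer, the sketch below applies one dense layer plus softmax to a flattened feature vector. This is not the patent's trained network: the weights, feature values, and the mapping to the four class names are invented for demonstration.

```python
import math

# Illustrative FC classification head; all numeric values are made up.

CLASSES = ["lymphoblast", "micronucleus cell", "naked nucleus cell", "impurity"]

def fc_layer(features, weights, biases):
    # One dense layer: logits[j] = sum_i features[i] * weights[i][j] + biases[j]
    return [sum(f * weights[i][j] for i, f in enumerate(features)) + biases[j]
            for j in range(len(biases))]

def softmax(logits):
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

features = [0.2, 0.9, 0.1]               # toy flattened feature vector
weights = [[0.1, 0.8, 0.0, 0.0],
           [0.0, 1.5, 0.2, 0.1],
           [0.3, 0.0, 0.4, 0.2]]
biases = [0.0, 0.0, 0.0, 0.0]
probs = softmax(fc_layer(features, weights, biases))
predicted = CLASSES[probs.index(max(probs))]
```

The class with the largest softmax probability is taken as the region's label.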
Further, the collecting device comprises one or more of the following: automatic scanning microscope, digital section scanner, CCD optical microscope.
The artificial intelligence identification method for peripheral blood lymph micronucleus cell images provided by the invention can automatically identify and classify images of peripheral blood lymphoblasts, micronucleus cells, and naked nucleus cells (hereinafter blasts, micronuclei, and naked nuclei for short; binucleate cases are referred to as binuclear blasts, binuclear micronuclei, and binuclear naked nuclei). The identification and classification process is fully automatic, fast, and highly accurate. Compared with the existing manual microscopic examination method, the detection base is enlarged and working efficiency improved; moreover, because no manual intervention occurs during operation, the automatic results show no significant difference from manual judgment in false-positive or false-negative rate, while counting accuracy is higher and detection speed is markedly improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flow chart of an artificial intelligence recognition method for a peripheral blood lymph micronucleus cell image provided by the present invention;
FIG. 2 shows a binucleate cell image (left) and a mononucleate cell image (right);
FIG. 3 is an example of a labeled mononucleate cell;
FIG. 4 is a diagram of feature extraction in the RPN method;
FIG. 5 is a DenseNet working diagram;
FIG. 6 is a block diagram of the generation of multiple candidates on an anchor point;
FIG. 7 is a diagram of the RPN network operation;
FIG. 8 is a flow chart of a candidate box generation algorithm;
FIG. 9 is a schematic diagram of NMS algorithm processing candidate boxes;
FIG. 10 is a diagram illustrating the effect of the practical application of FIG. 9;
FIG. 11 is a diagram of NMS removing redundant candidate blocks;
FIG. 12 is a method of 5 multi-feature fusion;
FIG. 13 is a schematic diagram of extracting histograms;
FIG. 14 shows the original image, the blast/micronucleus and naked nucleus image extracted via the histogram, and the extracted background image;
FIG. 15 is an enlarged view of the middle panel of FIG. 14;
FIG. 16 is an enlarged view of the right drawing of FIG. 14;
FIG. 17 is a diagram illustrating the resulting segmentation thresholds for generating masks in the merged histogram method;
FIG. 18 is a diagram of an artificial intelligent CNN model;
FIG. 19 is an anchor point diagram obtained after artificial intelligence CNN analysis;
FIG. 20 is a FC full connectivity layer classification diagram;
FIG. 21 is a schematic diagram of a convolution model in a FC full join layer;
FIG. 22 is a schematic diagram of the fast-rcnn network;
FIG. 23 is a schematic view of cell uniformity;
FIG. 24 is a plot of the loss function equation.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides an artificial intelligence identification method for a peripheral blood lymph micronucleus cell image, which includes the following steps:
s10, collecting a sample to be analyzed through a collecting device to obtain a cell image; in the step, the acquisition device acquires images and obtains cell images based on various microscopes; acquisition devices include, but are not limited to, an automatic scanning microscope, a digital slide scanner, a CCD optical microscope; the obtained cell image can be in color or black and white; meanwhile, before image acquisition, sample culture is required, and the culture process of the peripheral blood lymph micronucleus cells is as follows: 2ml of peripheral blood is extracted under aseptic operation, the peripheral blood is injected into lymphocyte culture solution under aseptic operation, then the lymphocyte culture solution is placed in a carbon dioxide incubator at 37 ℃ for 72 hours, and then centrifugation, paper sheet and staining are carried out to obtain a detection sample slide;
S20, carrying out segmentation processing on the obtained cell image to obtain the foreground region to be detected; then identifying, classifying, and counting the cells and background in the foreground region;
s30, storing the cell type information on the foreground to-be-detected area and the position information of the cell type information on the cell image to form traceable information; in the step, the position, coordinates (x, y) and sizes (w, h) of each identification image in the whole image are required to be recorded and can be traced to the source and audited;
and S40, forming a graphic report according to the traceable information, and storing the graphic report as an artificial intelligence system learning training material.
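One possible shape for the traceable information of S30 — cell type plus position (x, y) and size (w, h) within the whole image — is sketched below. The field names and serialization format are assumptions for illustration, not specified by the patent.

```python
import json

# Hypothetical record format for S30 traceability; field names are invented.

def make_record(sample_id, cell_type, x, y, w, h):
    return {"sample": sample_id, "type": cell_type,
            "x": x, "y": y, "w": w, "h": h}

records = [
    make_record("S-001", "binuclear micronucleus cell", 1024, 768, 96, 90),
    make_record("S-001", "lymphoblast", 400, 220, 80, 82),
]

# Serializing the records lets the S40 report be audited and traced to source.
report_json = json.dumps(records)
restored = json.loads(report_json)
```

Because every identified cell carries its coordinates and size, a reviewer can jump back to the exact location on the original scan.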
The artificial intelligence identification method for peripheral blood lymph micronucleus cell images provided by this embodiment can automatically identify and classify images of peripheral blood lymphoblasts, micronucleus cells, and naked nucleus cells (hereinafter blasts, micronuclei, and naked nuclei for short; binucleate cases are referred to as binuclear blasts, binuclear micronuclei, and binuclear naked nuclei). The identification and classification process is fully automatic, fast, and highly accurate. Compared with the existing manual microscopic examination method, the detection base is enlarged and working efficiency improved; moreover, because no manual intervention occurs during operation, the automatic results show no significant difference from manual judgment in false-positive or false-negative rate, while counting accuracy is higher and detection speed is markedly improved. The method covers both peripheral blood mononucleate lymph micronucleus cell analysis (mononuclear analysis for short) and peripheral blood binucleate lymph micronucleus cell analysis (binuclear analysis for short).
Moreover, the analysis method carries out accurate query through the design that the detection information can be traced.
Image features applicable to image analysis of peripheral blood lymph micronucleus cells:
the collected images in the analysis method are divided into two specifications of color and black and white;
The image background is a near-white or off-white background color. The image foreground includes three types of cells: lymph micronucleus cells, lymphoblasts, and naked nucleus cells; it also contains impurities produced during slide preparation. As shown in fig. 2, the left image is a binucleate cell image and the right image is a mononucleate cell image;
the mononuclear and binuclear lymphocyte images were characterized as follows:
1. the image background is approximately white, unlike the background of a natural-color photograph, and is easy to distinguish;
2. the foreground objects are cells photographed without parallax, whereas objects in natural-color photographs generally exhibit parallax;
3. the foreground objects are free of shadows and interference such as shading angles;
4. image brightness is essentially uniform, because the backlight brightness and color temperature of the scanning microscope are fixed during capture.
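Property 4 can be verified cheaply before analysis. The check below is not from the patent — it is an illustrative sketch that compares mean brightness across grayscale tiles against an assumed tolerance.

```python
# Illustrative brightness-uniformity check; tolerance value is an assumption.

def mean_brightness(tile):
    pixels = [p for row in tile for p in row]
    return sum(pixels) / len(pixels)

def brightness_uniform(tiles, tolerance=10.0):
    # True if the spread of per-tile mean brightness is within tolerance.
    means = [mean_brightness(t) for t in tiles]
    return max(means) - min(means) <= tolerance

tiles = [[[250, 252], [251, 249]],   # near-white background tiles
         [[248, 250], [252, 250]]]
uniform = brightness_uniform(tiles)
```

If this check fails, acquisition settings (backlight, color temperature) would need recalibration before the images suit the analysis below.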
The analysis method provided by the embodiment of the invention is designed based on the image characteristics.
In a specific implementation, the step S20 is specifically as follows:
S21, marking the cells on the cell image and determining the classification names of the marked cells. In this step the objects of interest are marked: each detected object must be labeled and classified, for example labeled "mononucleate cell" (shaped as shown in fig. 3), and a classification name is determined for the labeled object. The label names for mononuclear analysis are: mononuclear micronucleus cells, mononuclear blasts, and mononuclear naked nucleus cells; the label names for binuclear analysis are: binuclear micronucleus cells, binuclear blasts, and binuclear naked nucleus cells.
S22, separating the marked cell entities and acquiring images again through the acquisition device. In this step the entities are separated by vibration or similar means; the purpose of vibration is to prevent overly dense cells from affecting the detection result;
s23, performing foreground segmentation on the cell image acquired after separation, and determining a foreground to-be-detected region; in this step, the foreground of the image is segmented, i.e. possible foreground regions to be detected are selected, and called anchor points (anchors). The method for determining the foreground to-be-detected region comprises at least one of the following steps: the RPN (region pro-polysal network) method and the histogram method, wherein:
(1) the method for selecting the anchor point by the RPN (region pro common network) comprises the following steps:
The RPN method extracts a feature map through a shared convolutional network used as the backbone. Taking a color image as an example (a color image has 3 channels; a black-and-white image has a single channel and is not discussed separately), the working principle is shown in fig. 4;
Several candidate backbones can be selected; the candidate artificial intelligence convolutional network models include, but are not limited to, the VGG, DenseNet, ResNeXt, SENet, and YOLO series. Although each backbone has its own unique advantages, this project uses only their common basic function, namely selection of candidate anchors. Because the precision and speed differences between them are large, one of them is selected and its parameters tuned.
Taking DenseNet as an example (as shown in fig. 5 and fig. 7):
BN-ReLU: batch normalization followed by the ReLU activation function
Conv: convolution
Xn: the input image or feature map of layer n
Hn: convolution + activation layer
Transition Layer
DenseNet characteristic: each layer accepts the outputs of all previous layers, similar to a memory function. This particular characteristic, however, is not required for this project. Likewise, other models have their own personalized features; here we are concerned only with their common function, in which they are basically equivalent and all applicable.
Setting parameters:
IoU: foreground/background thresholding.
pretrained:
Selecting a pre-training set; or directly using the image number of the user without using a pre-training set
Accordingly, the effect is better, but a greater amount of training is required.
stages: the number is set.
Other parameters are specifically adjusted according to the selected model, and are not described herein again.
The machine-learned RPN network generates the required anchors. As an example of RPN candidate-box production: for each possible anchor point, multiple candidate boxes are produced (as shown in fig. 6); CNN feature extraction is then performed, followed by classification and regression on the extracted features. The candidate-box generation algorithm is shown in fig. 8;
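Multi-scale, multi-ratio candidate-box generation at an anchor point, in the spirit of fig. 6, can be sketched as follows. The specific scales and aspect ratios here are illustrative assumptions, not values from the patent.

```python
# Sketch of anchor candidate-box generation; scales/ratios are illustrative.

def generate_candidates(cx, cy, scales=(32, 64), ratios=(0.5, 1.0, 2.0)):
    """Return (x, y, w, h) boxes centered at anchor point (cx, cy)."""
    boxes = []
    for s in scales:
        area = s * s
        for r in ratios:
            # Choose w, h so that w * h == area and w / h == r.
            w = (area * r) ** 0.5
            h = area / w
            boxes.append((cx - w / 2, cy - h / 2, w, h))
    return boxes

candidates = generate_candidates(100, 100)
```

Each anchor point thus yields len(scales) × len(ratios) candidate boxes, which the classification and regression stages then score and refine.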
using a loss function including but not limited to that shown in fig. 24;
NMS (non-maximum suppression), as shown in fig. 11, post-processes the large number of generated candidate boxes, removing redundant boxes to obtain the optimal detection box.
NMS mainly judges two pieces of information, the score and the IoU: the candidate box with the highest score is picked, i.e., the best-fitting anchor is retained (as shown in fig. 9); its practical application is shown in fig. 10.
The NMS algorithm formula (a Gaussian score-decay, soft-NMS form) is s_i ← s_i · e^(−iou(M, b_i)² / σ), where:
b_i ∈ B, the set of prediction boxes;
s_i is the matching-degree score between the object region and prediction box b_i; multiple boxes give multiple scores, and each b_i corresponds to one s_i;
iou is the overlap ratio between the prediction box and the ground-truth box, ranging from 0 to 100%;
σ is the variance, understood as a constant within each NMS pass;
M is formed from the sets of scores s and boxes b: through iteration the highest-scoring s is found, the others are removed from M to set D, the highest score is retained, and duplicate boxes are removed.
The operation flow of the iterative formula is as follows:
1. find the highest-scoring s in M;
2. remove the corresponding box from M and b;
3. add the removed box to D;
4. delete from b all other boxes whose overlap area with the box corresponding to M exceeds the threshold Nt (a preset constant);
5. repeat steps 1-4.
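Steps 1-5 above describe classical hard NMS. A minimal self-contained sketch of that procedure (with a simple IoU helper; the Gaussian soft-NMS variant would instead decay scores rather than delete boxes) might look like this — the box format (x1, y1, x2, y2) and threshold value are assumptions:

```python
# Minimal hard-NMS sketch following steps 1-5; boxes are (x1, y1, x2, y2).

def iou(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    ix = max(0.0, min(ax2, bx2) - max(ax1, bx1))   # intersection width
    iy = max(0.0, min(ay2, by2) - max(ay1, by1))   # intersection height
    inter = ix * iy
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if inter else 0.0

def nms(boxes, scores, nt=0.5):
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)            # steps 1-3: move top-scoring box to D
        keep.append(best)
        order = [i for i in order      # step 4: drop boxes overlapping > Nt
                 if iou(boxes[best], boxes[i]) <= nt]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores)
```

Here the second box overlaps the first heavily (IoU ≈ 0.68 > Nt) and is suppressed, while the distant third box survives.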
The quality of anchor identification depends on multi-feature fusion; several feature layers need to be fused, the goal being to minimize impurity interference and omissions. As shown in fig. 12, one of the 5 multi-feature fusion methods may be selected. Because image acquisition varies, no single method can be fixed; it must be chosen according to the characteristics of the actually acquired images. Selecting a combination of 2 of them often works well. This step may be omitted if operating efficiency needs to be improved. At this point anchor analysis is finished and a number of candidate boxes (anchors) have been obtained.
(2) Selecting an anchor point based on a histogram method:
the histogram method of selecting anchor points is slightly more complex to implement, but compared with the RPN method it does not generate a large number of candidate frames, judges efficiently, and completes anchor point detection and candidate frame identification for the whole picture in one pass.
The method for selecting the anchor point based on the histogram method specifically comprises the following steps:
firstly, extracting a histogram: as shown in fig. 13, the left image is the acquired original image and the right image is its histogram. The histogram is bimodal; 3 demarcation points are found through quartic (4th-order) curve-fitting analysis, distinguishing the background distribution centred at 197 from the cell distribution centred at 51, with the demarcation point at 117. From low to high, the three cut points represent the foreground (cell) center threshold, the foreground-background segmentation limit, and the background center threshold.
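The demarcation-point extraction just described can be sketched as follows; the histogram smoothing width and the peak/valley search are illustrative simplifications of the patent's curve-fitting step, assuming dark cells on a bright background:

```python
import numpy as np

def histogram_cut_points(gray, smooth=5):
    """Return (foreground_center, boundary, background_center) from a
    bimodal grayscale histogram, as in the fig. 13 analysis above.
    Assumes cells are the dark peak and background the bright peak."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    kernel = np.ones(smooth) / smooth
    h = np.convolve(hist, kernel, mode="same")     # smooth the histogram
    mid = 128
    fg_center = int(np.argmax(h[:mid]))            # dark peak: cell center
    bg_center = int(mid + np.argmax(h[mid:]))      # bright peak: background center
    # boundary: the valley between the two peaks
    boundary = int(fg_center + np.argmin(h[fg_center:bg_center + 1]))
    return fg_center, boundary, bg_center
```

On an image like the one described (cells near gray level 51, background near 197), the three returned values approximate the 51 / 117 / 197 cut points.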
Then, histogram analysis is performed as follows (taking mononuclear analysis as the example; the binuclear analysis process is identical). Attention-object extraction: as shown in fig. 14, the left image is the original; the middle image uses histogram interval 1, set according to the three demarcation points of the histogram, to extract mother cell, micronucleus cell and naked-nucleus cell anchor points; the right image uses histogram interval 2, set according to the same three demarcation points, to extract the background. Next, a connectivity value and an area range are set for each object. The connectivity value defines a connected object: each pixel must be connected with adjacent non-zero-value pixels, the number of connected neighbours being between 1 and 8 (settable according to actual needs); zero-value pixels correspond to the white background. The area range limits the area of each identified object, the goal being to remove oversized and undersized suspected objects.
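The connectivity-plus-area-range filtering above can be sketched with a plain connected-component labelling pass; 8-connectivity and the area limits are parameters, as the patent allows them to be set according to actual needs:

```python
from collections import deque

def labeled_objects(mask, min_area, max_area, connectivity=8):
    """Group non-zero pixels of a binary mask into connected objects and
    keep only those whose pixel count falls inside [min_area, max_area],
    discarding oversized and undersized suspected objects."""
    h, w = len(mask), len(mask[0])
    if connectivity == 8:
        nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                (0, 1), (1, -1), (1, 0), (1, 1)]
    else:  # 4-connectivity
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                q, comp = deque([(y, x)]), []
                seen[y][x] = True
                while q:  # breadth-first flood fill of one object
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for dy, dx in nbrs:
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if min_area <= len(comp) <= max_area:
                    objects.append(comp)
    return objects
```

A single stray pixel (area 1) is filtered out while a compact cell-sized blob survives.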
For naked-nucleus cells, a roundness (circle-likeness) judgment is also included, to remove non-round and aggregated cells. In this step, the roundness judgment method is as follows:

circularity = actual area / minimum circumscribed circle area

minimum circumscribed circle area = πr²
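A minimal sketch of the roundness formula above; note that the true minimum circumscribed circle requires e.g. Welzl's algorithm, so as an assumed simplification the radius here is taken as the largest centroid-to-pixel distance:

```python
import math

def circularity(pixels):
    """Roundness per the formula above: object area divided by the area
    of an enclosing circle (approximated from the centroid)."""
    n = len(pixels)
    cy = sum(p[0] for p in pixels) / n
    cx = sum(p[1] for p in pixels) / n
    # +0.5 treats each pixel as a unit square rather than a point
    r = max(math.hypot(p[0] - cy, p[1] - cx) for p in pixels) + 0.5
    return n / (math.pi * r * r)
```

A filled disc scores close to 1, while a thin elongated object scores near 0, which is what lets the step reject non-round and aggregated cells.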
Through the above steps, the coordinate data of micronucleus cells, mother cells and naked-nucleus cells can be extracted; the data information is merged, artificial intelligence training is carried out, and the unclassified cells are identified (taking binuclear analysis as the example; mononuclear analysis is identical). The specific method is as follows:
as shown in fig. 17, the upper left image is the original; the upper right image is the mask generated by the histogram analysis and detection step; the lower left image is the segmented image extracted according to the mask, including cells, impurities and the like; the lower right graph is the histogram with its threshold, where the vertical bar at 174 indicates the appropriate segmentation threshold;
then standard artificial intelligence CNN analysis is used to filter and select the alternative anchor points. The CNN analysis model is shown in fig. 18 and adopts standard convolution analysis:
the convolution layers may number 3-7;
the input is the candidate anchor points obtained in step 9.2;
the output passes through 1-3 FC fully connected layers;
through this CNN artificial intelligence convolution analysis, the candidate anchor points obtained in the histogram process are filtered, and the required information is extracted: object coordinates, shapes, areas and the like. The resulting image is shown in fig. 19.
Anchor point identification is carried out by the RPN (Region Proposal Network) method or the histogram method; after the alternative anchor points are obtained, they are classified. In this example, classification into lymphoblasts, micronucleus cells, naked-nucleus cells and impurities is achieved by an FC fully connected classification layer (as shown in fig. 20). The convolution analysis in this step takes a batch of 256 feature maps as an example: as shown in fig. 21, there is an intermediate layer based on the feature map, which is replaced by a convolutional layer of 256 3×3 convolution kernels; the example number 256 is chosen to agree with the number of channels of the feature map. (Note: a multi-channel map is convolved by 1×1 kernels, i.e. the corresponding pixels of all channels are multiplied by weights and summed, which is equivalent to communicating all channels; a single channel uses a 3×3 convolution kernel.)
The invention provides an artificial intelligent identification method for peripheral blood lymph micronucleus cell images which uses an object-detection neural network for classification, mainly of the RCNN family; this family mainly comprises Faster-RCNN, Mask-RCNN, SSD, the YOLO series and the like, whose working principles are very similar. The principle of the Faster-RCNN network is shown in fig. 22.
The invention provides an artificial intelligent identification method for peripheral blood lymph micronucleus cell images: for sample pictures collected by all microscopes, linear (planar) digital slice scanners and CCD optical microscopes, lymphoblasts, micronucleus cells and naked-nucleus cells are intelligently identified and the micronucleus cell rate and micronucleus rate are calculated. The examining doctor no longer needs to read the slide under the microscope with the naked eye, but only to check and review, and issues a traceable image-text report, greatly improving the examining doctor's working efficiency. The prior art for calculating the micronucleus cell rate is not described herein again.
In this embodiment, image marking and scan-area preview are also required in the earlier stage before image acquisition, while verification of the data-set marking, audit of the marking, correction of misclassifications and statistics are performed in the later review stage.
As an important part of deep learning, the marking can adopt existing marking methods, but it must be accurate; the marking classification scheme, the marking mode, the statistical-sampling randomness of the marking data, and the agreement of the marked samples with the objective facts of the real classes must all be considered during marking. The marked data needs to include lymphoblasts, micronucleus cells, naked-nucleus cells and impurities. If, based on the images and the analysis requirements, the impurities can be identified and counted, then an impurity class needs to be added; otherwise they are ignored and need no label analysis.
Among them, the 3 kinds of data, i.e., lymphoblasts, micronucleus cells and naked-nucleus cells, belong to the data required by clinical medicine (or the necessary information is calculated from these data). The impurities belong to data that is discarded after classification, yet they still need to be classified and identified. Because the RPN method and the histogram method differ as algorithms, the impurities they extract and segment are not the same, so the impurity classification must be designed, selected and marked according to the specific situation.
In addition, the scan preview checks whether the cell distribution in the full-image scanning area is uniform, neither overlapping nor sparse. If the cell density is too high or too low, the image is not suitable for computer analysis, and the slide needs to be repositioned appropriately and rescanned. A moderately dense, uniformly distributed image is shown in fig. 23.
Verification of the data marking is one of the necessary processes for ensuring that the marking data are correct. Verification can locate each marked cell within the whole image based on the traceable information, so that the background can conveniently be checked during verification; it can be done manually or by artificial intelligence.
Finally, the manually labeled micronucleus cells can also be reviewed: expert doctor A marks the disputed cell images, expert doctor B then audits them, the two negotiate and perform the final audit, errors in the marks are corrected, and the result is returned to the artificial intelligence system as learning material. Only the micronucleus cells confirmed correct after review are retained; the others can be discarded.
Most existing image detection methods combine image enhancement with feature extraction to assist judgment; they cannot adapt well to differences in visual form among cells of the same type, in image acquisition operations, and in image interference, reflecting how strongly such image methods depend on highly consistent data. Clinical tests show that detection methods combining common image enhancement and feature extraction achieve a recognition accuracy of no more than 50% and an omission ratio (false negative rate) of at least 50%, and cannot meet clinical application requirements.
The artificial intelligent identification method for peripheral blood lymph micronucleus cell images changes the working mode of image identification and adapts well to the characteristics and variation of actually collected cell images: the omission (false negative rate) of the AI analysis can be reduced to about 5%, clinical work changes qualitatively into reviewing a small amount of data while the AI completes the bulk of the work in advance, omissions are reduced, time efficiency improves 10-50 fold, and inspection accuracy improves at the same time.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (9)
1. An artificial intelligent identification method for peripheral blood lymph micronucleus cell image is characterized in that: the method comprises the following steps:
s10, collecting a sample to be analyzed through a collecting device to obtain a cell image;
s20, carrying out segmentation processing on the obtained cell image to obtain a region to be detected with a determined foreground; then identifying, classifying and counting cells on the foreground to-be-detected region;
s30, storing the cell type information on the foreground to-be-detected area and the position information of the cell type information on the cell image to form traceable information;
and S40, forming a graphic report according to the traceable information, and storing the graphic report as an artificial intelligence system learning training material.
2. The method for artificial intelligence recognition of the peripheral blood lymph micronucleus cell image according to claim 1, characterized in that: the step S20 is specifically as follows:
s21, marking the cells on the cell image, and determining the classification names of the marked cells;
s22, separating the marked cell entities, and acquiring images again through an acquisition device;
s23, performing foreground segmentation on the cell image acquired after separation, and determining a foreground to-be-detected region;
s24, identifying, classifying and counting cells in the foreground to-be-detected area to obtain a micronucleus area image and calculate the micronucleus cell rate.
3. The method for artificial intelligence recognition of the peripheral blood lymph micronucleus cell image according to claim 2, characterized in that: the classification names in the step S21 include lymphoblasts, micronucleus cells, naked-nucleus cells, and impurities.
4. The method for artificial intelligence recognition of the peripheral blood lymph micronucleus cell image according to claim 2, characterized in that: the method for determining the foreground to-be-detected region comprises at least one of the following steps: RPN method and histogram method.
5. The method for artificial intelligence identification of the peripheral blood lymph micronucleus cell image according to claim 4, wherein: the RPN method is specifically as follows:
inputting the images acquired in the step S22 into a shared convolution network by taking the shared convolution network as a backbone, and extracting a characteristic diagram;
generating one or more candidate boxes on a feature map;
and processing the generated candidate frames through NMS, removing the redundant frames to obtain an optimal detection frame, namely the foreground to-be-detected region.
6. The method of claim 5, wherein the shared convolutional network comprises VGG, DenseNet, ResNet, ResNest, SENet, YOLO.
7. The method for artificial intelligence identification of the peripheral blood lymph micronucleus cell image according to claim 4, wherein: the histogram method is as follows:
performing histogram threshold analysis on the plurality of images acquired in the step S22, and distinguishing two parts, namely background and foreground cells, wherein the three demarcation points represent, from low to high, a foreground central threshold, a foreground-background segmentation limit and a background central threshold;
setting a histogram interval 1 and a histogram interval 2 according to the obtained three demarcation points, and extracting a lymphoblast image, a micronucleus cell image and a naked nucleus cell image according to the histogram interval 1; extracting a background image according to the histogram interval 2;
performing pixel analysis on the extracted lymphoblast image, micronucleus cell image and naked-nucleus cell image, and setting an object connectivity value to ensure that each pixel is connected with adjacent non-zero-value pixels, wherein the number of connections is N;
setting the area ranges of the images of lymphoblasts, micronucleus cells and naked-nucleus cells, identifying lymphoblasts images, micronucleus cells images and naked-nucleus cells images according to the area ranges, removing suspected lymphoblasts images, micronucleus cells images and naked-nucleus cells images, and forming a foreground to-be-detected area.
8. The method for artificial intelligence recognition of the peripheral blood lymph micronucleus cell image according to claim 5 or 7, characterized in that: and carrying out classified identification on the obtained foreground to-be-detected area through FC full connection.
9. The method for artificial intelligence recognition of the peripheral blood lymph micronucleus cell image according to claim 1, characterized in that: the collection device comprises one or more of the following: automatic scanning microscope, digital section scanner, CCD optical microscope.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010384474.6A CN111458269A (en) | 2020-05-07 | 2020-05-07 | Artificial intelligent identification method for peripheral blood lymph micronucleus cell image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010384474.6A CN111458269A (en) | 2020-05-07 | 2020-05-07 | Artificial intelligent identification method for peripheral blood lymph micronucleus cell image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111458269A true CN111458269A (en) | 2020-07-28 |
Family
ID=71684748
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010384474.6A Pending CN111458269A (en) | 2020-05-07 | 2020-05-07 | Artificial intelligent identification method for peripheral blood lymph micronucleus cell image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111458269A (en) |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1553166A (en) * | 2003-12-19 | 2004-12-08 | 武汉大学 | Microscopic multispectral marrow and its peripheral blood cell auto-analyzing instrument and method |
CN101226155A (en) * | 2007-12-21 | 2008-07-23 | 中国人民解放军第八一医院 | Intelligentize lung cancer early cell pathological picture recognition processing method |
CN101639941A (en) * | 2009-01-13 | 2010-02-03 | 中国人民解放军军事医学科学院放射与辐射医学研究所 | Method for extracting binuclear lymphocyte accurately and quickly in CB method micronucleated cell image |
CN102297873A (en) * | 2011-05-03 | 2011-12-28 | 杭州一二八医院 | Method for identifying cancer cell images by soft X-ray microscopic imaging |
CN103489187A (en) * | 2013-09-23 | 2014-01-01 | 华南理工大学 | Quality test based segmenting method of cell nucleuses in cervical LCT image |
CN104751461A (en) * | 2015-03-29 | 2015-07-01 | 嘉善加斯戴克医疗器械有限公司 | White cell nucleus segmentation method based on histogram threshold and low rank representation |
CN106248559A (en) * | 2016-07-14 | 2016-12-21 | 中国计量大学 | A kind of leukocyte five sorting technique based on degree of depth study |
CN107527028A (en) * | 2017-08-18 | 2017-12-29 | 深圳乐普智能医疗器械有限公司 | Target cell recognition methods, device and terminal |
CN107730499A (en) * | 2017-10-31 | 2018-02-23 | 河海大学 | A kind of leucocyte classification method based on nu SVMs |
CN107977682A (en) * | 2017-12-19 | 2018-05-01 | 南京大学 | Lymph class cell sorting method and its device based on the enhancing of polar coordinate transform data |
US20180158189A1 (en) * | 2016-12-07 | 2018-06-07 | Samsung Electronics Co., Ltd. | System and method for a deep learning machine for object detection |
CN108885899A (en) * | 2017-04-01 | 2018-11-23 | 深圳前海达闼云端智能科技有限公司 | Processing method, device and the electronic equipment of medical image transmission data |
CN109191470A (en) * | 2018-08-18 | 2019-01-11 | 北京洛必达科技有限公司 | Image partition method and device suitable for big data image |
CN109636782A (en) * | 2018-11-30 | 2019-04-16 | 苏州深析智能科技有限公司 | A kind of cell type analysis model training method, device and analysis method |
CN110120056A (en) * | 2019-05-21 | 2019-08-13 | 闽江学院 | Blood leucocyte dividing method based on self-adapting histogram threshold value and contour detecting |
CN110132844A (en) * | 2019-05-13 | 2019-08-16 | 贵州大学 | A kind of cell image data collection system and method, information data processing terminal |
CN110261271A (en) * | 2019-07-03 | 2019-09-20 | 安徽科创中光科技有限公司 | A kind of horizontal pollution based on laser radar big data is traced to the source artificial intelligence identifying system |
CN110263656A (en) * | 2019-05-24 | 2019-09-20 | 南方科技大学 | A kind of cancer cell identification methods, devices and systems |
CN110473170A (en) * | 2019-07-10 | 2019-11-19 | 苏州卓融新能源科技有限公司 | A kind of artificial intelligence detection method suitable for the true and false determining defects of pcb board |
CN110598724A (en) * | 2019-01-17 | 2019-12-20 | 西安理工大学 | Cell low-resolution image fusion method based on convolutional neural network |
CN111062346A (en) * | 2019-12-21 | 2020-04-24 | 电子科技大学 | Automatic leukocyte positioning detection and classification recognition system and method |
Non-Patent Citations (1)
Title |
---|
Wang Jifen et al., China Machine Press *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111882548A (en) * | 2020-07-31 | 2020-11-03 | 北京小白世纪网络科技有限公司 | Method and device for counting cells in pathological image based on deep learning |
CN112365471A (en) * | 2020-11-12 | 2021-02-12 | 哈尔滨理工大学 | Cervical cancer cell intelligent detection method based on deep learning |
CN112365471B (en) * | 2020-11-12 | 2022-06-24 | 哈尔滨理工大学 | Cervical cancer cell intelligent detection method based on deep learning |
CN113011306A (en) * | 2021-03-15 | 2021-06-22 | 中南大学 | Method, system and medium for automatic identification of bone marrow cell images in continuous maturation stage |
CN113807319A (en) * | 2021-10-15 | 2021-12-17 | 云从科技集团股份有限公司 | Face recognition optimization method, device, equipment and medium |
CN117253229A (en) * | 2023-11-17 | 2023-12-19 | 浙江大学海南研究院 | Deep learning-based marine mussel micronucleus cell identification and counting method and application |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111458269A (en) | Artificial intelligent identification method for peripheral blood lymph micronucleus cell image | |
CN110245657B (en) | Pathological image similarity detection method and detection device | |
CN111524137A (en) | Cell identification counting method and device based on image identification and computer equipment | |
CN109360196A (en) | Method and device based on deep learning processing oral cavity radiation image | |
CN107230203A (en) | Casting defect recognition methods based on human eye vision attention mechanism | |
CN112580748B (en) | Method for counting classified cells of stain image | |
CN109886932A (en) | Gear ring of wheel speed sensor detection method of surface flaw based on SVM | |
CN112365497A (en) | High-speed target detection method and system based on Trident Net and Cascade-RCNN structures | |
CN113470041B (en) | Immunohistochemical cell image cell nucleus segmentation and counting method and system | |
CN110796661A (en) | Fungal microscopic image segmentation detection method and system based on convolutional neural network | |
CN113160185A (en) | Method for guiding cervical cell segmentation by using generated boundary position | |
CN115294377A (en) | System and method for identifying road cracks | |
CN110414317B (en) | Full-automatic leukocyte classification counting method based on capsule network | |
CN111950544A (en) | Method and device for determining interest region in pathological image | |
CN114235539A (en) | PD-L1 pathological section automatic interpretation method and system based on deep learning | |
CN114037671A (en) | Microscopic hyperspectral leukocyte detection method based on improved fast RCNN | |
CN115541578B (en) | High-flux super-resolution cervical cell pathological section rapid scanning analysis system | |
Tosta et al. | Application of evolutionary algorithms on unsupervised segmentation of lymphoma histological images | |
CN115272055A (en) | Chromosome image analysis method based on knowledge representation | |
Amitha et al. | Developement of computer aided system for detection and classification of mitosis using SVM | |
CN113705531A (en) | Method for identifying alloy powder inclusions based on microscopic imaging | |
EP3895060A1 (en) | Classification of cell nuclei | |
CN109658382A (en) | Tongue body localization method based on image clustering and Gray Projection | |
CN117496276B (en) | Lung cancer cell morphology analysis and identification method and computer readable storage medium | |
Marcuzzo et al. | A hybrid approach for Arabidopsis root cell image segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200728 |