CN113887503A - Improved attention convolution neural network-based five-classification method for white blood cells - Google Patents

Improved attention convolution neural network-based five-classification method for white blood cells

Info

Publication number
CN113887503A
CN113887503A
Authority
CN
China
Prior art keywords
attention
output
network
white blood
convolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111231648.6A
Other languages
Chinese (zh)
Other versions
CN113887503B (en)
Inventor
王慧慧
邵卫东
张旭
曾凡一
康家铭
张春旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Polytechnic University
Original Assignee
Dalian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Polytechnic University filed Critical Dalian Polytechnic University
Priority to CN202111231648.6A priority Critical patent/CN113887503B/en
Publication of CN113887503A publication Critical patent/CN113887503A/en
Application granted granted Critical
Publication of CN113887503B publication Critical patent/CN113887503B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Investigating Or Analysing Biological Materials (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the field of medical microscopic image classification and provides a five-class white blood cell classification method based on an improved attention convolutional neural network, which uses deep learning to recognize blood cell images. The method takes ResNeXt-50 as the backbone network; the residual modules use grouped convolution to reduce the number of model parameters, and an independent attention module is added in parallel at the end of each stage of the network. For the white blood cell feature maps output by the convolutional neural network at different stages, the attention part of each attention module extracts the key region of the white blood cell, and the output part of the attention module produces a predicted class and a confidence score. The final output of the network model is obtained by a confidence-weighted average of these class predictions, realizing five-class white blood cell classification through improvements under the original ResNeXt-50 network framework. The invention uses the parallel attention modules to output class predictions and confidence scores at different stages of the network, thereby improving the accuracy of leukocyte classification.

Description

Improved attention convolution neural network-based five-classification method for white blood cells
Technical Field
The invention belongs to the field of medical microscopic image classification and relates to a leukocyte classification method based on a convolutional neural network with attention modules embedded in parallel.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Leukocytes are part of the immune system and are responsible for destroying and removing old or abnormal cells and cell debris, as well as attacking pathogens and foreign bodies. The leukocytes commonly found in the blood are mature neutrophils, lymphocytes and monocytes, together with smaller numbers of eosinophils and basophils. An increase or decrease in the number of leukocytes may signal the onset of certain diseases, and the morphology and proportion of the various leukocytes can reflect a person's health status. The precise classification of leukocytes therefore plays a crucial role in clinical diagnosis and therapy. Manual microscopic examination is the "gold standard" of clinical leukocyte detection: it can accurately classify leukocytes and reveal their pathological changes. However, this method requires the blood sample to be processed into a blood smear that is then examined under a microscope by a professional examiner; the procedure is complicated, time-consuming, and highly repetitive, which can lead to examiner fatigue and to misjudged or missed leukocyte categories, thereby affecting the diagnosis and treatment of diseases.
In recent years, convolutional neural networks have been widely used in the field of medical image classification because they perform well on data presented as multidimensional arrays. However, many current deep learning methods cannot fully exploit key features and important information when performing convolution operations, which greatly limits the performance of neural networks. Attention mechanisms have now been widely applied in artificial-intelligence-related fields such as natural language processing and computer vision. An attention mechanism lets the model learn where to attend, ignoring irrelevant information and focusing on critical information, thereby improving the performance of computer vision tasks. Some researchers have used convolutional neural networks with added attention mechanisms for the classification of leukocytes, mainly including:
the patent "automatic recognition method of multiple types of white blood cells based on deep convolutional neural network" (CN 110059568A) proposes a deep convolutional neural network with attention added to classify white blood cells. The deep convolutional neural network is formed by cascading 9 initiation modules, auxiliary classifiers are added to the 4 th initiation module and the 7 th initiation module respectively, and the attention of an SE-Net channel is added when each initiation module is cascaded. And pre-training the model in an ILSVRC data set, and finely adjusting the network on a training set to realize the classification of the white blood cells.
The disadvantages are as follows: the auxiliary classifiers added in this method use two intermediate feature layers to predict classes only to prevent problems such as vanishing gradients and provide little additional help for the final decision of the network; in addition, the Inception network has too many hyperparameters and poor generalization capability, and does not necessarily yield better results.
The patent "Method and system for realizing automatic classification of leukocytes based on mixed attention residual network" (CN 113343799A) proposes a residual network with a mixed attention mechanism to automatically classify leukocytes. The mixed attention residual network is formed by stacking residual attention modules, and the feature map obtained after convolution is passed through the mixed attention modules to extract key features and classify the leukocytes.
The disadvantages are as follows: the network is formed by stacking multiple residual attention modules, which ignores the influence of noise on low-level features after they traverse several residual attention connections; low-level features (such as textures) are important in fine-grained recognition and can help to distinguish two similar classes.
Disclosure of Invention
Aiming at the problems in existing methods of noise introduced by stacking residual attention modules and of failing to fully utilize features from different levels, the invention provides a five-class white blood cell classification method based on an improved attention convolutional neural network. ResNeXt-50 is used as the backbone network, in which grouped convolution greatly reduces the number of model parameters; independent attention modules are embedded in the backbone network in parallel to extract key information from both low-level and high-level feature maps; the predicted class and confidence score of each feature-map level are obtained through its attention module; finally, the model output class is obtained by confidence-weighted averaging of all class predictions, thereby improving the accuracy of white blood cell classification.
In order to achieve the purpose, the invention adopts the technical scheme that:
a leukocyte classification method based on an improved attention convolution neural network comprises the following steps:
step (1): collecting a leukocyte image, cutting the complete blood microscopic image into an individual image, and labeling the leukocyte by a blood inspection expert;
step (2): carrying out image enhancement operation on the leukocyte micrographs acquired in the step (1), and carrying out pretreatment;
and (3): dividing the white blood cell image data set processed in the step (2) into a training set and a testing set randomly according to a certain proportion;
and (4): constructing an improved attention convolution neural network model, and performing patrol on the model by using the training set divided in the step (3), wherein the process is a forward propagation process;
and (5): after one-time forward propagation, calculating an error between a predicted value and a true value by using a cross entropy loss function, continuously reducing the loss error by using an Adam algorithm, and updating parameters of each layer of the network model, wherein the process is a one-time backward propagation process;
and (6): repeatedly performing the forward propagation in the step (4) and the backward propagation in the step (5), continuously updating the network layer parameters, converging the network model when the number of training rounds reaches the set maximum number of training rounds, finishing the training, and storing the network model with the highest accuracy of the training set as the optimal network model;
and (7): and (4) carrying out white blood cell classification by using the optimal network model stored in the step (6), inputting a white blood cell image into the trained model, and outputting the category of the white blood cells.
The specific implementation of step (1) comprises: first, leukocyte microscopic images are collected with a microscope from blood smears provided by a hospital; a 256 x 256 white blood cell image centered on a complete white blood cell is cropped from the whole microscopic image, and the class of each cropped white blood cell image is labeled by a blood test specialist.
The specific implementation of step (2) comprises: performing data enhancement operations on the white blood cell images, including cropping (up, down, left and right), random rotation, image contrast enhancement and mirror flipping.
In step (3), the image data set processed in step (2) is randomly divided into a training set and a test set at a ratio of 7:3.
The network model takes the residual network ResNeXt-50 as its skeleton and reduces the number of model parameters with grouped convolution. A lightweight attention mechanism is embedded in parallel at each stage of ResNeXt-50; it generates spatial attention heat maps for feature maps at different levels of the network model and outputs class predictions and confidence scores based on local information. The final output prediction is a weighted combination of all class predictions with the normalized confidence scores.
The adopted attention mechanism can be added after any convolutional layer and does not change the overall structure of the network. The attention module consists of two sub-modules: an attention head H, which extracts the region of the feature map most relevant to the class decision; and an output head O, which generates a class prediction through global pooling and a fully connected layer and outputs a confidence gate score for each attention head. Each attention module yields a class prediction and a confidence score, and finally all class predictions are weighted and averaged by the confidence scores to obtain the final predicted class.
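For illustration only, the following is a minimal PyTorch sketch of such an attention module with one spatial attention map per module; the class name, layer choices and channel arguments are assumptions for the sketch, not the patent's actual implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelAttentionModule(nn.Module):
    """Attention head + output head attached in parallel to one backbone stage (sketch)."""

    def __init__(self, in_channels: int, num_classes: int = 5):
        super().__init__()
        # Attention head: 1x1 convolution producing a single-channel spatial map.
        self.att_conv = nn.Conv2d(in_channels, 1, kernel_size=1)
        # Output head: class prediction from the attended feature vector.
        self.fc = nn.Linear(in_channels, num_classes)
        # Confidence gate: inner product with a learned weight vector -> scalar score c.
        self.gate = nn.Linear(in_channels, 1)

    def forward(self, z: torch.Tensor):
        b, c, h, w = z.shape
        # Spatial softmax over the h*w positions gives the attention heat map M.
        m = F.softmax(self.att_conv(z).view(b, 1, h * w), dim=-1).view(b, 1, h, w)
        # Broadcast-multiply M with every channel of Z and pool spatially (attention head output H).
        attended = (z * m).sum(dim=(2, 3))            # shape (b, c)
        class_pred = self.fc(attended)                # class prediction o
        confidence = self.gate(attended).squeeze(-1)  # confidence score c
        return class_pred, confidence
```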
Compared with the prior art, the invention has the following advantages:
1. The improved attention convolutional neural network applies an attention mechanism to both low-level and high-level features to extract their most informative parts, and uses these features to output class predictions and confidence scores that assist the final classification decision of the model.
2. The invention adopts a residual network model and uses grouped convolution in the residual modules, so that the network learns diverse features while the number of network parameters is reduced and the model accuracy is improved.
3. The invention adopts a lightweight attention mechanism that adds little parameter overhead to the model; the attention modules are flexible, can be extended at different depths and widths, and show good performance.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 shows the residual block of ResNeXt-50 (32 x 4d) used in the present invention.
FIG. 3 is a schematic view of an attention module for use in the present invention.
FIG. 4 is a schematic diagram of an attention convolution neural network in accordance with the present invention.
Fig. 5 is a microscopic image of five types of leukocytes.
Detailed Description
The following further describes a specific embodiment of the present invention with reference to the drawings and technical solutions.
As shown in fig. 1, the general steps of the present invention are as follows:
step 1: collection and preparation of data sets, blood smears made by blood test experts were subjected to the collection of leucocyte microscopic images under the same conditions using a biomicroscope (magnification 1000) equipped with an industrial camera. The whole image was cropped to have a size of 256 × 256 white blood cell image centered on the whole single white blood cell, and the blood test specialist labeled the white blood cell image with a size of 256 × 256 to accurately classify neutrophils, eosinophils, basophils, monocytes, and lymphocytes.
Step 2: data enhancement of the leukocyte images collected and labeled in step 1. Specifically, 224 x 224 leukocyte images are cropped from the four corners and the center of each image, mirror flipping is applied, the images are rotated by 30 and 60 degrees, and the image contrast is enhanced. This yields leukocyte images at different positions and under different conditions, prevents overfitting of the model, and increases its generalization capability.
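A possible torchvision-based sketch of this augmentation; only the 224 x 224 five-crop, mirror flipping, 30/60-degree rotations and contrast change come from the text, while the probabilities and jitter strength are assumptions:

```python
from torchvision import transforms

five_crop = transforms.FiveCrop(224)                 # four corners + center of a 256x256 image

per_crop = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),          # mirror flipping
    transforms.RandomChoice([
        transforms.RandomRotation((30, 30)),         # rotate by exactly 30 degrees
        transforms.RandomRotation((60, 60)),         # rotate by exactly 60 degrees
        transforms.RandomRotation(0),                # or leave unrotated
    ]),
    transforms.ColorJitter(contrast=0.3),            # contrast adjustment (assumed strength)
    transforms.ToTensor(),
])

def augment(pil_image):
    """Return a list of augmented 224x224 tensors for one 256x256 cell image."""
    return [per_crop(crop) for crop in five_crop(pil_image)]
```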
Step 3: randomly divide the leukocyte data set enhanced in step 2 into a training set and a test set at a ratio of 7:3; the training set is used for the parameter training process of the convolutional network model and for updating the parameter weights, and the test set is used to check the performance of the whole five-class leukocyte recognition algorithm.
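A minimal sketch of the 7:3 random split, assuming the labeled images are organized in one folder per class (the folder path and random seed are assumptions):

```python
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms

# Hypothetical folder layout: data/leukocytes/<class_name>/<image>.png
dataset = datasets.ImageFolder("data/leukocytes", transform=transforms.ToTensor())

n_train = int(0.7 * len(dataset))                    # 7:3 split described above
train_set, test_set = random_split(
    dataset, [n_train, len(dataset) - n_train],
    generator=torch.Generator().manual_seed(0),      # fixed seed for a reproducible split
)
```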
Step 4: construct the improved attention convolutional neural network model. The network model designed by the invention takes ResNeXt-50 as the backbone network, and an attention mechanism is embedded in parallel after each stage to generate class predictions and confidence scores that exploit the most useful parts of features at different levels and assist the decision of the final model.
1) The ResNeXt-50 network combines the strengths of Inception, ResNet and VGG to obtain a powerful network with a simple structure. ResNeXt-50 consists of an ordinary convolution structure, a number of residual blocks, and a fully connected layer. As shown in fig. 2, the left half of each residual block is a convolution operation composed of 1 x 1 and 3 x 3 convolution kernels, the right half is a shortcut connection, and the results of the two parts are added and passed through an activation function. Specifically, in the convolution path of the left half, a 1 x 1 convolution first reduces and later restores the overall dimensionality; following the idea of grouped convolution, the channels are divided into 32 branches, the 4-channel feature map of each of the 32 branches is processed by a 3 x 3 convolution, and the resulting transformed feature maps are aggregated. Similar to ResNet, the entire ResNeXt-50 has 4 stages: layer1 contains 3 residual blocks, layer2 contains 4 residual blocks, layer3 contains 6 residual blocks, and layer4 contains 3 residual blocks.
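For reference, a hedged PyTorch sketch of one such bottleneck block with cardinality 32 (the channel widths and stride are illustrative; this is not the patent's code):

```python
import torch
import torch.nn as nn

class ResNeXtBottleneck(nn.Module):
    """1x1 reduce -> 3x3 grouped conv (32 groups of 4 channels) -> 1x1 expand, plus shortcut."""

    def __init__(self, in_channels: int, bottleneck_width: int = 128,
                 out_channels: int = 256, cardinality: int = 32, stride: int = 1):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, bottleneck_width, 1, bias=False),
            nn.BatchNorm2d(bottleneck_width),
            nn.ReLU(inplace=True),
            # Grouped 3x3 convolution: 32 branches, bottleneck_width / 32 channels per branch.
            nn.Conv2d(bottleneck_width, bottleneck_width, 3, stride=stride,
                      padding=1, groups=cardinality, bias=False),
            nn.BatchNorm2d(bottleneck_width),
            nn.ReLU(inplace=True),
            nn.Conv2d(bottleneck_width, out_channels, 1, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        # Shortcut connection: identity when shapes match, 1x1 projection otherwise.
        self.shortcut = (nn.Identity() if stride == 1 and in_channels == out_channels
                         else nn.Sequential(
                             nn.Conv2d(in_channels, out_channels, 1, stride=stride, bias=False),
                             nn.BatchNorm2d(out_channels)))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Add the two paths and pass the sum through the activation function.
        return self.relu(self.conv(x) + self.shortcut(x))
```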
2) Attention modules are embedded at the end of different stages of ResNeXt-50, as shown in FIG. 3, and each attention module comprises two main parts: an attention head and an output head. The feature map Z obtained after convolution is convolved with a 1 x 1 kernel and a spatial softmax outputs an attention heat map M; M is a 2-dimensional plane, and the spatial softmax lets the model learn the most relevant region of the image. The product of the attention heat map M with each channel of the input feature map Z, computed through broadcasting, gives the output H of the attention head. The output head of each attention module consists of a spatial dimensionality reduction layer (namely a global pooling layer) applied to H, followed by a fully connected layer that generates a class prediction o; each attention module makes its class prediction o from its own local information. However, in some cases local features are not sufficient to output a good prediction. To alleviate this problem, each attention module and the backbone network output also predict a confidence score c through an inner product with a weight matrix; the confidence scores are then normalized by a softmax function to obtain weights g. The final output of the network is the confidence-weighted sum of the class predictions of all outputs, calculated as
output = g_net · output_net + Σ_l Σ_k g_l^k · o_l^k
where output is the final output of the entire network model, g_net is the normalized confidence score of the backbone network output, output_net is the class prediction of the backbone network, g_l^k is the normalized confidence score of the output of each attention module, and o_l^k is the class prediction of each attention module.
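A minimal sketch of this confidence-weighted fusion in PyTorch; the tensor shapes and function name are assumptions, and one reasonable reading of the description is that all confidence scores (backbone and attention modules) are normalized jointly with a single softmax:

```python
import torch
import torch.nn.functional as F

def fuse_predictions(backbone_logits, backbone_conf, module_logits, module_confs):
    """output = g_net * output_net + sum_l sum_k g_l^k * o_l^k  (sketch).

    backbone_logits: (batch, num_classes)  class prediction of the backbone
    backbone_conf:   (batch,)              raw confidence score of the backbone
    module_logits:   list of (batch, num_classes), one per attention module
    module_confs:    list of (batch,), one per attention module
    """
    # Stack raw confidence scores and normalize them with softmax to obtain the gates g.
    confs = torch.stack([backbone_conf] + list(module_confs), dim=1)      # (batch, 1+M)
    gates = F.softmax(confs, dim=1)                                       # (batch, 1+M)

    preds = torch.stack([backbone_logits] + list(module_logits), dim=1)   # (batch, 1+M, C)
    # Weighted sum of all class predictions with the normalized confidences.
    return (gates.unsqueeze(-1) * preds).sum(dim=1)                       # (batch, C)
```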
Step 5: train the network model designed in step 4 with the leukocyte training set data obtained in step 3; this process is a forward propagation process.
Step 6: after one forward propagation, the error between the predicted value and the true value is calculated with a cross-entropy loss function, the loss error is continuously reduced with a stochastic gradient descent algorithm, and the parameters of each layer of the network model are updated. A fixed-step learning rate decay strategy is adopted, with a step size of 7 epochs and a gamma coefficient of 0.1. This process is one backward propagation process.
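A hedged sketch of such a training loop in PyTorch, using stochastic gradient descent as stated in this paragraph (the summary step (5) mentions Adam instead); the learning rate and momentum values are assumptions, while the StepLR step size of 7 epochs and gamma of 0.1 come from the text:

```python
import torch
import torch.nn as nn

def train(model: nn.Module, train_loader, num_epochs: int = 25, device: str = "cpu"):
    """One possible training loop matching the description above (a sketch)."""
    model.to(device)
    criterion = nn.CrossEntropyLoss()
    # Optimizer choice and hyperparameters are assumptions for the sketch.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
    # Fixed-step decay: multiply the learning rate by gamma=0.1 every 7 epochs.
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

    for _ in range(num_epochs):
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)   # cross-entropy error
            loss.backward()                           # backward propagation
            optimizer.step()                          # update layer parameters
        scheduler.step()
    return model
```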
Step 7: the forward propagation of step 5 and the backward propagation of step 6 are carried out repeatedly and the network layer parameters are continuously updated; when the number of training rounds reaches the set maximum, the network model converges, training ends, and the network model with the highest training-set accuracy is saved as the optimal network model.
Step 8: five-class prediction is performed on the leukocyte test set with the optimal model saved in step 7; the test accuracy is shown in the following table.
[Table: five-class test accuracy results, presented as an image in the original publication]
It should be understood that the parts of the specification not described in detail belong to the prior art. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made without departing from the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (3)

1. A five-class white blood cell classification method based on an improved attention convolutional neural network, characterized by comprising the following steps:
step (1): collecting leukocyte images, cropping the complete blood microscopic image into separate single-cell images, and labeling the leukocytes with categories including neutrophils, eosinophils, basophils, monocytes and lymphocytes;
step (2): performing image enhancement operations on the leukocyte microscopic images acquired in step (1) and preprocessing them;
step (3): randomly dividing the leukocyte microscopic image data set processed in step (2) into a training set and a test set in proportion;
step (4): constructing an improved attention convolutional neural network model and training it with the training set divided in step (3), this process being one forward propagation;
the improved attention convolutional neural network model takes ResNeXt-50 as the backbone network, and an attention mechanism is embedded in parallel after each stage to generate class predictions and confidence scores that exploit the most useful parts of features at different levels and assist the decision of the final model;
1) ResNeXt-50 consists of an ordinary convolution structure, a number of residual blocks and a fully connected layer; the left half of each residual block is a convolution operation composed of 1 x 1 and 3 x 3 convolution kernels, the right half is a shortcut connection, and the results of the two parts are added and passed through an activation function;
2) attention modules are embedded at the end of different stages of ResNeXt-50, and each attention module comprises an attention head and an output head; the feature map Z obtained after convolution is convolved with a 1 x 1 kernel and a spatial softmax outputs an attention heat map M; M is a 2-dimensional plane, and the spatial softmax lets the model learn the most relevant region of the image; the product of the attention heat map M with each channel of the input feature map Z, computed through broadcasting, gives the attention output H; the output head of each attention module consists of a spatial dimensionality reduction layer, namely a global pooling layer, applied to H, and a class prediction o is generated through a fully connected layer; each attention module makes its class prediction o from its own local information; however, in some cases local features are not sufficient to output a good prediction; to alleviate this problem, each attention module and the backbone network output also predict a confidence score c through an inner product with a weight matrix; the confidence scores are then normalized by a softmax function to obtain weights g; the final output of the network is the confidence-weighted sum of the class predictions of all outputs, calculated as
output = g_net · output_net + Σ_l Σ_k g_l^k · o_l^k
where output is the final output of the entire network model, g_net is the normalized confidence score of the backbone network output, output_net is the class prediction of the backbone network, g_l^k is the normalized confidence score of the output of each attention module, and o_l^k is the class prediction of each attention module;
training the network model designed in step (4) with the leukocyte training set data obtained in step (3), this process being one forward propagation;
step (5): after one forward propagation, calculating the error between the predicted value and the true value with a cross-entropy loss function, continuously reducing the loss error with the Adam algorithm, and updating the parameters of each layer of the network model, this process being one backward propagation;
step (6): repeating the forward propagation of step (4) and the backward propagation of step (5) and continuously updating the network layer parameters; when the number of training rounds reaches the set maximum, the network model converges, training ends, and the network model with the highest training-set accuracy is saved as the optimal network model;
step (7): classifying white blood cells with the optimal network model saved in step (6): a white blood cell image is input into the trained model and the category of the white blood cell is output.
2. The five-class white blood cell classification method based on the improved attention convolutional neural network according to claim 1, characterized in that the data enhancement operations performed on the white blood cell images in step (2) include cropping (up, down, left and right), random rotation, image contrast enhancement and mirror flipping.
3. The five-class white blood cell classification method based on the improved attention convolutional neural network according to claim 1 or 2, characterized in that in step (3) the image data set processed in step (2) is randomly divided into a training set and a test set at a ratio of 7:3.
CN202111231648.6A 2021-10-22 2021-10-22 Improved attention convolution neural network-based five-classification method for white blood cells Active CN113887503B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111231648.6A CN113887503B (en) 2021-10-22 2021-10-22 Improved attention convolution neural network-based five-classification method for white blood cells

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111231648.6A CN113887503B (en) 2021-10-22 2021-10-22 Improved attention convolution neural network-based five-classification method for white blood cells

Publications (2)

Publication Number Publication Date
CN113887503A (en) 2022-01-04
CN113887503B (en) 2022-06-14

Family

ID=79004139

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111231648.6A Active CN113887503B (en) 2021-10-22 2021-10-22 Improved attention convolution neural network-based five-classification method for white blood cells

Country Status (1)

Country Link
CN (1) CN113887503B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114638878A (en) * 2022-03-18 2022-06-17 北京安德医智科技有限公司 Two-dimensional echocardiogram pipe diameter detection method and device based on deep learning
CN117422645A (en) * 2023-11-14 2024-01-19 中国科学院长春光学精密机械与物理研究所 Confidence aggregation-based radar point cloud shape completion method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751462A (en) * 2015-03-29 2015-07-01 嘉善加斯戴克医疗器械有限公司 White cell segmentation method based on multi-feature nonlinear combination
CN106897682A (en) * 2017-02-15 2017-06-27 电子科技大学 Leucocyte automatic identifying method in a kind of leukorrhea based on convolutional neural networks
CN109034045A (en) * 2018-07-20 2018-12-18 中南大学 A kind of leucocyte automatic identifying method based on convolutional neural networks
CN110059568A (en) * 2019-03-21 2019-07-26 中南大学 Multiclass leucocyte automatic identifying method based on deep layer convolutional neural networks
CN110059656A (en) * 2019-04-25 2019-07-26 山东师范大学 The leucocyte classification method and system for generating neural network are fought based on convolution
US20200340908A1 (en) * 2017-12-22 2020-10-29 Imec Vzw Fast and Robust Fourier Domain-Based Cell Differentiation
US20210139960A1 (en) * 2017-06-23 2021-05-13 FUNDAClÓ INSTITUT DE CIÈNCIES FOTÒNIQUES Method for quantifying protein copy-number
CN113343975A (en) * 2021-04-22 2021-09-03 山东师范大学 Deep learning-based white blood cell classification system and method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751462A (en) * 2015-03-29 2015-07-01 嘉善加斯戴克医疗器械有限公司 White cell segmentation method based on multi-feature nonlinear combination
CN106897682A (en) * 2017-02-15 2017-06-27 电子科技大学 Leucocyte automatic identifying method in a kind of leukorrhea based on convolutional neural networks
US20210139960A1 (en) * 2017-06-23 2021-05-13 FUNDAClÓ INSTITUT DE CIÈNCIES FOTÒNIQUES Method for quantifying protein copy-number
US20200340908A1 (en) * 2017-12-22 2020-10-29 Imec Vzw Fast and Robust Fourier Domain-Based Cell Differentiation
CN109034045A (en) * 2018-07-20 2018-12-18 中南大学 A kind of leucocyte automatic identifying method based on convolutional neural networks
CN110059568A (en) * 2019-03-21 2019-07-26 中南大学 Multiclass leucocyte automatic identifying method based on deep layer convolutional neural networks
CN110059656A (en) * 2019-04-25 2019-07-26 山东师范大学 The leucocyte classification method and system for generating neural network are fought based on convolution
CN113343975A (en) * 2021-04-22 2021-09-03 山东师范大学 Deep learning-based white blood cell classification system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DARIA BORTOLOTTI et al.: "Impact of HLA-G analysis in prevention, diagnosis and treatment of pathological conditions", WORLD JOURNAL OF METHODOLOGY *
CHEN Chang et al.: "Classification of peripheral blood leukocytes based on convolutional neural network", Chinese Journal of Biomedical Engineering (中国生物医学工程学报) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114638878A (en) * 2022-03-18 2022-06-17 北京安德医智科技有限公司 Two-dimensional echocardiogram pipe diameter detection method and device based on deep learning
CN114638878B (en) * 2022-03-18 2022-11-11 北京安德医智科技有限公司 Two-dimensional echocardiogram pipe diameter detection method and device based on deep learning
CN117422645A (en) * 2023-11-14 2024-01-19 中国科学院长春光学精密机械与物理研究所 Confidence aggregation-based radar point cloud shape completion method

Also Published As

Publication number Publication date
CN113887503B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
Dong et al. Classification of cataract fundus image based on deep learning
CN113887503B (en) Improved attention convolution neural network-based five-classification method for white blood cells
Baby et al. Leukocyte classification based on feature selection using extra trees classifier: A transfer learning approach
CN110969191B (en) Glaucoma prevalence probability prediction method based on similarity maintenance metric learning method
CN111400536B (en) Low-cost tomato leaf disease identification method based on lightweight deep neural network
Bani-Hani et al. Classification of leucocytes using convolutional neural network optimized through genetic algorithm
CN111951246A (en) Multidirectional X-ray chest radiography pneumonia diagnosis method based on deep learning
Ji et al. Research on urine sediment images recognition based on deep learning
CN111104961A (en) Method for classifying breast cancer based on improved MobileNet network
CN112581450B (en) Pollen detection method based on expansion convolution pyramid and multi-scale pyramid
CN113343799A (en) Method and system for realizing automatic classification of white blood cells based on mixed attention residual error network
Shivaprasad et al. Deep learning-based plant leaf disease detection
Muhamad et al. A comparative evaluation of deep learning methods in automated classification of white blood cell images
Magpantay et al. A transfer learning-based deep CNN approach for classification and diagnosis of acute lymphocytic leukemia cells
Yadav Feature Fusion based Deep Learning method for Leukemia cell classification
Alam et al. Benchmarking deep learning frameworks for automated diagnosis of OCULAR TOXOPLASMOSIS: A comprehensive approach to classification and segmentation
Özdem et al. A ga-based cnn model for brain tumor classification
Ridoy et al. A lightweight convolutional neural network for white blood cells classification
Zou et al. Deep learning and its application in diabetic retinopathy screening
Sevinç et al. An effective medical image classification: transfer learning enhanced by auto encoder and classified with SVM
Taha et al. Automatic identification of malaria-infected cells using deep convolutional neural network
Rede et al. White blood cell image classification for assisting pathologist using deep machine learning: the comparative approach
Titoriya et al. PVT-CASCADE network on skin cancer dataset
Fawwaz et al. The Optimization of CNN Algorithm Using Transfer Learning for Marine Fauna Classification
Ahammed et al. Inception V3 Based Transfer Learning Model for the Prognosis of Acute Lymphoblastic Leukemia from Microscopic Images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant