CN112668624A - Breast ultrasound image tumor classification method based on attention neural network - Google Patents

Breast ultrasound image tumor classification method based on attention neural network

Info

Publication number
CN112668624A
CN112668624A (application number CN202011533794.XA)
Authority
CN
China
Prior art keywords
layer
neural network
classification
breast ultrasound
attention
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011533794.XA
Other languages
Chinese (zh)
Inventor
屈晓磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Erxiang Foil Technology Co ltd
Original Assignee
Suzhou Erxiang Foil Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Erxiang Foil Technology Co., Ltd.
Priority to CN202011533794.XA
Publication of CN112668624A
Legal status: Withdrawn

Landscapes

  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention provides a breast ultrasound image tumor classification method based on an attention neural network, comprising the following steps. S1, data expansion: image enhancement is applied to the patients' breast ultrasound images in the training set to expand the training data set. S2, feature recalibration: an attention mechanism module SE operating on the neuron dimension is designed to recalibrate neuron features. S3, overfitting mitigation: a classification module containing a global average pooling layer, a fully connected layer, a batch normalization layer, a ReLU activation layer and a Dropout layer is designed to reduce parameters and alleviate overfitting. S4, testing and evaluation: the model is trained on the training set, the test data set is input into the final model, model performance is evaluated with multiple classification indices, and the result is compared with the traditional VGG16 network before improvement. By applying this improved, attention-based convolutional neural network to breast ultrasound images, the images can be classified automatically.

Description

Breast ultrasound image tumor classification method based on attention neural network
Technical Field
The invention relates to the technical field of biomedicine, in particular to a breast ultrasound image tumor classification method based on an attention neural network.
Background
Breast cancer has become the most common cancer among women worldwide; early screening and prompt treatment can effectively reduce mortality. At present, the mainstream breast cancer screening methods in China are breast imaging technologies, mainly mammography, magnetic resonance imaging (MRI) and ultrasound imaging. X-ray mammography involves ionizing radiation, which can harm the human body; because it produces only two-dimensional images, tumors are difficult to distinguish in dense breasts; and the compression applied by the device causes the patient pain. MRI takes a long time, appointments are difficult to obtain, and examinations are expensive. Ultrasound imaging involves little ionizing radiation, can produce three-dimensional images, causes the patient no pain during screening, is fast, and is economical. However, as the probe is slid across the breast during screening, a large number of ultrasound images are acquired, and the doctor must read them to judge whether a tumor is benign or malignant. Prolonged image reading fatigues doctors and leads to misreadings, and doctors of differing experience introduce subjective factors into the screening process. Therefore, a method is needed that can help physicians automatically classify breast tumors as benign or malignant.
Currently, many networks have been proposed for image classification tasks. The VGG16 network was proposed by the Visual Geometry Group at the University of Oxford and demonstrated that increasing network depth can, to a certain extent, improve a network's final performance.
Traditional breast ultrasound image classification algorithms generally require manual feature engineering; the process is complex and the robustness poor. To address these shortcomings, a deep-learning method that automatically classifies breast ultrasound image tumors is needed. The invention applies an improved, attention-based VGG16 network to the breast ultrasound image classification task so that the network automatically learns image features, improving classification accuracy and robustness.
Disclosure of Invention
The invention aims to disclose a breast ultrasound image tumor classification method based on an attention neural network that improves classification accuracy and robustness by having the network automatically learn image features.
In order to achieve the above object, the present invention provides a breast ultrasound image tumor classification method based on an attention neural network, comprising the following steps:
s1, data expansion: image enhancement is applied to the patients' breast ultrasound images in the training set to expand the training data set;
s2, feature recalibration: an attention mechanism module SE operating on the neuron dimension is designed to recalibrate neuron features;
s3, overfitting mitigation: a classification module containing a global average pooling layer, a fully connected layer, a batch normalization layer, a ReLU activation layer and a Dropout layer is designed to reduce parameters and alleviate overfitting;
and S4, testing and evaluation: the model is trained on the training set, the test data set is input into the final model, model performance is evaluated with multiple classification indices, and the result is compared with the traditional VGG16 network before improvement.
As a further improvement of the present invention, the specific steps in S1 include: expanding the data set by applying random rotation, vertical flipping, horizontal flipping, size reduction, size enlargement, vertical translation, horizontal translation, random noise addition and elastic deformation to each breast ultrasound image in the training set.
As a further improvement of the present invention, S2 includes: learning a series of weights with the attention mechanism module SE and using them to screen the originally input neurons.
As a further improvement of the present invention, the attention mechanism module SE in S2 comprises: a global pooling layer, a first fully connected layer, a second fully connected layer and an activation layer.
As a further improvement of the present invention, S3 includes: replacing the fully connected layers with a global average pooling layer to reduce model parameters, and adding batch normalization, ReLU activation and Dropout layers to alleviate the overfitting problem.
As a further improvement of the present invention, in S3, features must first be extracted with the conventional VGG16 network before the overfitting-mitigating classification module is designed.
As a further improvement of the present invention, the specific steps in S4 include: combining the attention mechanism module SE of S2 and the classification module of S3 with the conventional VGG16 network, the final network structure producing its output through a final activation layer.
As a further improvement of the invention, the activation layer adopts a Sigmoid activation function.
As a further improvement of the present invention, S4 is a binary classification task whose output is the predicted probability of the positive class, a value in the closed interval [0, 1].
As a further improvement of the present invention, quantitative comparison and qualitative ROC-curve comparison are adopted in S4.
Compared with the prior art, the invention has the beneficial effects that:
(1) The neural network is applied to the classification of breast ultrasound images, automatically classifying breast tumors as benign or malignant and enabling automatic breast cancer screening, with classification accuracy higher than that of the traditional VGG16.
(2) Compared with the traditional VGG16 network, the invention designs an attention mechanism module SE on the neuron dimension to screen neurons, further avoiding overfitting.
(3) Compared with the traditional VGG16 network, the invention designs a classification module comprising a global average pooling layer, a batch normalization layer, a ReLU activation layer and a Dropout layer in the classification part, reducing model parameters and mitigating overfitting.
Drawings
FIG. 1 is an example of image enhancement in the breast ultrasound image tumor classification method based on an attention neural network according to the present invention;
FIG. 2 is a structural diagram of the attention mechanism module SE in the method;
FIG. 3 is a diagram of the classification module in the method;
FIG. 4 is a structural diagram of the final model in the method;
FIG. 5 is a comparison of the ROC curves of the final model of the method and of VGG16.
Detailed Description
The present invention is described in detail with reference to the embodiments shown in the drawings, but it should be understood that these embodiments do not limit the present invention; functional, methodological, or structural equivalents or substitutions derived from these embodiments by those skilled in the art fall within the scope of the present invention.
Please refer to fig. 1 to 5, which illustrate an embodiment of a breast ultrasound image tumor classification method based on an attention neural network according to the present invention.
In this embodiment, a breast ultrasound image tumor classification method based on an attention neural network includes the following steps. S1, data expansion: image enhancement is applied to the patients' breast ultrasound images in the training set to expand the training data set. Specifically, each ultrasound image sample in the training set is randomly rotated, vertically flipped, horizontally flipped, reduced in size, enlarged in size, translated vertically, translated horizontally, corrupted with random noise, elastically deformed, and so on, continuously expanding the data set and enhancing the fitting capacity and robustness of the model. An example of the enhancement results for a single sample image is shown in fig. 1.
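A minimal sketch of the S1 expansion step, assuming images are NumPy arrays normalized to [0, 1]. Only flips, translations and additive noise are shown; the rotation, scaling and elastic deformation listed above would typically come from a library such as scipy.ndimage or albumentations. The function name, shift range and noise level here are illustrative, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image, rng):
    """Apply a random subset of the S1 augmentations (flips, shifts,
    additive Gaussian noise) to one ultrasound image in [0, 1]."""
    out = image.copy()
    if rng.random() < 0.5:                        # horizontal flip
        out = np.fliplr(out)
    if rng.random() < 0.5:                        # vertical flip
        out = np.flipud(out)
    dy, dx = rng.integers(-10, 11, size=2)        # vertical/horizontal shift
    out = np.roll(out, (dy, dx), axis=(0, 1))
    out = out + rng.normal(0.0, 0.02, out.shape)  # random noise
    return np.clip(out, 0.0, 1.0)                 # keep valid intensity range

# Expand a toy training set: 8 augmented copies per original image.
images = rng.random((4, 128, 128))                # stand-in for real scans
expanded = np.stack([augment(im, rng) for im in images for _ in range(8)])
print(expanded.shape)                             # (32, 128, 128)
```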
S2, feature recalibration: an attention mechanism module SE operating on the neuron dimension is designed to recalibrate neuron features. It should be noted that, because neurons carry different weight information, some neurons are unimportant for the classification task; the SE module learns a series of weights, screens the originally input neurons, and to a certain extent helps avoid overfitting. The structure of the neuron-dimension attention mechanism module SE is shown in fig. 2.
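The recalibration of S2 follows the standard Squeeze-and-Excitation pattern: squeeze with global pooling, excite with two fully connected layers (a reduction bottleneck with ReLU, then Sigmoid), and rescale each channel by its learned weight. A NumPy sketch with illustrative weights and reduction ratio (the patent does not specify them):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(features, w1, w2):
    """Squeeze-and-Excitation recalibration over the channel/neuron axis.

    features : (H, W, C) feature map
    w1, w2   : weights of the two fully connected layers (C x C/r, C/r x C)
    Returns the input rescaled by learned per-channel weights in (0, 1).
    """
    # Squeeze: global average pooling -> one descriptor per channel.
    z = features.mean(axis=(0, 1))                 # shape (C,)
    # Excitation: bottleneck FC -> ReLU -> FC -> Sigmoid.
    s = sigmoid(np.maximum(z @ w1, 0.0) @ w2)      # (C,) weights in (0, 1)
    # Recalibrate: scale each channel by its learned importance.
    return features * s

rng = np.random.default_rng(1)
C, r = 64, 16                                      # channels, reduction ratio
x = rng.random((8, 8, C))
w1 = rng.normal(size=(C, C // r)) * 0.1
w2 = rng.normal(size=(C // r, C)) * 0.1
y = se_block(x, w1, w2)
print(y.shape)                                     # (8, 8, 64)
```

Because the excitation weights lie strictly in (0, 1), the block can only attenuate channels, which is how it down-weights neurons that carry little information for the classification task.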
S3, overfitting mitigation: a classification module containing a global average pooling layer, a batch normalization layer, a ReLU activation layer and a Dropout layer is designed to reduce parameters and alleviate overfitting. Specifically, after the VGG16 feature extraction network, a classification module that mitigates overfitting is designed: the fully connected layers are replaced with a global average pooling layer to reduce model parameters, and batch normalization, ReLU activation and Dropout layers are added to further alleviate overfitting. The structure of the classification module is shown in fig. 3.
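A forward-pass sketch of this classification head in NumPy (inference-mode batch normalization with running statistics; dropout active only during training). The layer widths and weight scales are illustrative assumptions, not values from the patent:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classifier_head(feature_map, w_fc, gamma, beta, mean, var,
                    w_out, drop_rate=0.5, training=False, rng=None):
    """Sketch of the S3 classification module: global average pooling
    -> dense -> batch norm -> ReLU -> dropout -> 1-unit Sigmoid output
    (interpreted as the probability that the tumor is malignant)."""
    x = feature_map.mean(axis=(0, 1))                    # global average pooling
    x = x @ w_fc                                         # fully connected layer
    x = gamma * (x - mean) / np.sqrt(var + 1e-5) + beta  # batch normalization
    x = np.maximum(x, 0.0)                               # ReLU activation
    if training:                                         # inverted dropout
        x = x * (rng.random(x.shape) > drop_rate) / (1.0 - drop_rate)
    return float(sigmoid(x @ w_out))                     # probability in (0, 1)

rng = np.random.default_rng(2)
C, H = 512, 256                                          # assumed widths
fmap = rng.random((7, 7, C))                             # stand-in VGG16 features
p = classifier_head(fmap,
                    w_fc=rng.normal(size=(C, H)) * 0.05,
                    gamma=np.ones(H), beta=np.zeros(H),
                    mean=np.zeros(H), var=np.ones(H),
                    w_out=rng.normal(size=(H, 1)) * 0.05)
print(0.0 < p < 1.0)                                     # True
```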
S4, testing and evaluation: the model is trained on the training set, the test data set is input into the final model, model performance is evaluated with multiple classification indices, and the result is compared with the traditional VGG16 network before improvement. The SE module of S2 and the classification module of S3 are combined with the conventional VGG16 network, and the final network structure is obtained through a final activation layer, as shown in fig. 4. The activation function is Sigmoid; since this is a binary classification task, the final output is the predicted probability of the positive class, a value in [0, 1].
Classification experiments were performed with the final model of the present invention and the conventional VGG16. The quantitative classification results of the two methods are shown in Table 1, and the qualitative ROC-curve comparison is shown in fig. 5. In the quantitative comparison, the classification accuracy of the proposed method exceeds that of VGG16 on every index; in the qualitative comparison, the area under the ROC curve of the present invention is larger than that of VGG16. These results show that the present invention improves classification accuracy compared with VGG16.
TABLE 1 improved model vs. VGG16 classification accuracy
[Table 1 appears only as an image in the original publication; its values are not available as text.]
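The patent does not enumerate the "various classification indices" of S4 beyond the accuracy comparison and ROC curve. A plain-NumPy sketch of the indices typically used for such a binary task (accuracy, sensitivity, specificity, and AUC via the rank-sum identity) might look like this; the toy labels and scores below are illustrative:

```python
import numpy as np

def evaluate(y_true, y_prob, threshold=0.5):
    """Compute typical binary classification indices: accuracy,
    sensitivity, specificity, and ROC-AUC (rank-sum formulation)."""
    y_true = np.asarray(y_true)
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    acc = (tp + tn) / len(y_true)
    sens = tp / (tp + fn)                 # recall on malignant cases
    spec = tn / (tn + fp)                 # recall on benign cases
    # AUC equals the probability that a random positive outranks a
    # random negative (Mann-Whitney rank-sum identity).
    ranks = np.argsort(np.argsort(y_prob)) + 1
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    auc = (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    return acc, sens, spec, auc

y_true = [1, 1, 1, 0, 0, 0]               # 1 = malignant, 0 = benign
y_prob = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]   # model outputs from the Sigmoid
print(evaluate(y_true, y_prob))           # ~ (0.667, 0.667, 0.667, 0.889)
```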
The above-listed detailed description is only a specific description of a possible embodiment of the present invention, and they are not intended to limit the scope of the present invention, and equivalent embodiments or modifications made without departing from the technical spirit of the present invention should be included in the scope of the present invention.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment contains only a single technical solution; such description is merely for clarity. Those skilled in the art should take the description as a whole, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.

Claims (10)

1. A breast ultrasound image tumor classification method based on an attention neural network is characterized by comprising the following steps:
s1, data expansion: image enhancement is applied to the patients' breast ultrasound images in the training set to expand the training data set;
s2, feature recalibration: an attention mechanism module SE operating on the neuron dimension is designed to recalibrate neuron features;
s3, overfitting mitigation: a classification module containing a global average pooling layer, a fully connected layer, a batch normalization layer, a ReLU activation layer and a Dropout layer is designed to reduce parameters and alleviate overfitting;
and S4, testing and evaluation: the model is trained on the training set, the test data set is input into the final model, model performance is evaluated with multiple classification indices, and the result is compared with the traditional VGG16 network before improvement.
2. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein the specific steps in S1 include: expanding the data set by applying random rotation, vertical flipping, horizontal flipping, size reduction, size enlargement, vertical translation, horizontal translation, random noise addition and elastic deformation to each breast ultrasound image in the training set.
3. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein S2 comprises: learning a series of weights with the attention mechanism module SE and using them to screen the originally input neurons.
4. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein S3 comprises: replacing the fully connected layers with a global average pooling layer to reduce model parameters, and adding batch normalization, ReLU activation and Dropout layers to alleviate the overfitting problem.
5. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein the attention mechanism module SE in S2 comprises: a global pooling layer, a first fully connected layer, a second fully connected layer and an activation layer.
6. The method of claim 1, wherein in S3, features must first be extracted with the conventional VGG16 network before the classification module that alleviates overfitting is designed.
7. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein the specific steps in S4 include: combining the attention mechanism module SE of S2 and the classification module of S3 with the conventional VGG16 network, the final network structure producing its output through a final activation layer.
8. The method for classifying breast ultrasound image tumors based on the attention neural network as claimed in claim 5 or 7, wherein the activation layer adopts a Sigmoid activation function.
9. The method as claimed in claim 7, wherein S4 is a binary classification task whose output is the predicted probability of the positive class, a value in the closed interval [0, 1].
10. The method for classifying tumors in breast ultrasound images based on an attention neural network as claimed in claim 1, wherein quantitative comparison and qualitative ROC-curve comparison are adopted in S4.
CN202011533794.XA 2020-12-21 2020-12-21 Breast ultrasound image tumor classification method based on attention neural network Withdrawn CN112668624A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011533794.XA CN112668624A (en) 2020-12-21 2020-12-21 Breast ultrasound image tumor classification method based on attention neural network


Publications (1)

Publication Number Publication Date
CN112668624A (en) 2021-04-16

Family

ID=75407853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011533794.XA Withdrawn CN112668624A (en) 2020-12-21 2020-12-21 Breast ultrasound image tumor classification method based on attention neural network

Country Status (1)

Country Link
CN (1) CN112668624A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113421240A (en) * 2021-06-23 2021-09-21 深圳大学 Mammary gland classification method and device based on ultrasonic automatic mammary gland full-volume imaging
CN113421240B (en) * 2021-06-23 2023-04-07 深圳大学 Mammary gland classification method and device based on ultrasonic automatic mammary gland full-volume imaging
CN113688931A (en) * 2021-09-01 2021-11-23 什维新智医疗科技(上海)有限公司 Ultrasonic image screening method and device based on deep learning
CN113688931B (en) * 2021-09-01 2024-03-29 什维新智医疗科技(上海)有限公司 Deep learning-based ultrasonic image screening method and device

Similar Documents

Publication Publication Date Title
Jadoon et al. Three‐class mammogram classification based on descriptive CNN features
Wei et al. A benign and malignant breast tumor classification method via efficiently combining texture and morphological features on ultrasound images
Cao et al. Breast mass detection in digital mammography based on anchor-free architecture
CN116097302A (en) Connected machine learning model with joint training for lesion detection
Hizukuri et al. Computer-aided diagnosis scheme for distinguishing between benign and malignant masses on breast DCE-MRI images using deep convolutional neural network with Bayesian optimization
CN112668624A (en) Breast ultrasound image tumor classification method based on attention neural network
Safdarian et al. Detection and classification of breast cancer in mammography images using pattern recognition methods
Chen et al. Breast tumor classification in ultrasound images by fusion of deep convolutional neural network and shallow LBP feature
Cabral et al. Fractal analysis of breast masses in mammograms
Zhou et al. Deep learning-based breast region extraction of mammographic images combining pre-processing methods and semantic segmentation supported by Deeplab v3+
Ramadhani A Review Comparative Mamography Image Analysis on Modified CNN Deep Learning Method
Altan Breast cancer diagnosis using deep belief networks on ROI images
Sarosa et al. Breast cancer classification using GLCM and BPNN
Girija Mammogram pectoral muscle removal and classification using histo-sigmoid based ROI clustering and SDNN
Alzahrani et al. Deep learning approach for breast ultrasound image segmentation
Boudouh et al. Breast cancer: New mammography dual-view classification approach based on pre-processing and transfer learning techniques
Midya et al. Edge weighted local texture features for the categorization of mammographic masses
Wei et al. Multi-feature fusion for ultrasound breast image classification of benign and malignant
Wisudawati et al. Feature extraction optimization with combination 2D-discrete wavelet transform and gray level co-occurrence matrix for classifying normal and abnormal breast tumors
Chugh et al. TransNet: a comparative study on breast carcinoma diagnosis with classical machine learning and transfer learning paradigm
Hizukuri et al. Computerized Segmentation Method for Nonmasses on Breast DCE-MRI Images Using ResUNet++ with Slice Sequence Learning and Cross-Phase Convolution
Hassan et al. A deep learning model for mammography mass detection using mosaic and reconstructed multichannel images
Saini et al. DMAeEDNet: Dense Multiplicative Attention Enhanced Encoder Decoder Network for Ultrasound-Based Automated Breast Lesion Segmentation
Dong et al. A classification method for breast images based on an improved VGG16 network model
Abed et al. Detection and Segmentation of Breast Cancer Using Auto Encoder Deep Neural Networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210416