CN108052977B - Mammary gland molybdenum target image deep learning classification method based on lightweight neural network


Info

Publication number: CN108052977B (granted publication of application CN108052977A)
Application number: CN201711343994.7A
Authority: CN (China)
Prior art keywords: image, breast, neural network, layer, pixels
Inventors: 时鹏, 钟婧
Assignee (original and current): Fujian Normal University
Application filed by: Fujian Normal University
Other languages: Chinese (zh)
Legal status: Active (granted)

Classifications

    • G06F18/24 — Pattern recognition; analysing; classification techniques
    • G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F18/25 — Pattern recognition; analysing; fusion techniques
    • G06V10/267 — Image preprocessing; segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V2201/03 — Recognition of patterns in medical or anatomical images

Abstract

The invention relates to a deep learning classification method for mammary gland molybdenum target (mammography) images based on a lightweight neural network. The method applies a deep-learning image classification algorithm to breast density classification of molybdenum target images, built on a lightweight-neural-network deep learning framework. It markedly improves adaptability on small-scale image data sets, raises both the accuracy and the processing speed of breast density classification, and enables automatic breast density classification of mammary molybdenum target images.

Description

Mammary gland molybdenum target image deep learning classification method based on lightweight neural network
Technical Field
The invention belongs to the field of biomedicine, and particularly relates to a mammary gland molybdenum target image deep learning classification method based on a lightweight neural network.
Background
The mammary gland molybdenum target examination, in full mammary gland molybdenum-target X-ray radiography (mammography), is at present the first-choice and the simplest, most reliable non-invasive means of diagnosing breast disease. It causes relatively little discomfort, is simple to perform, offers high resolution and good repeatability, and the retained images allow before-and-after comparison regardless of the patient's age or body shape, so it is now used as a routine examination. As a comparatively non-invasive method, it can comprehensively and accurately reflect the gross anatomical structure of the whole breast and permits dynamic observation of the influence of physiological factors such as the menstrual cycle, pregnancy and lactation on breast structure. It assists in distinguishing benign breast lesions from malignant tumours; it can detect suspicious lesions early and track them by regular follow-up imaging; and it is used to follow up breast cancer patients after endocrine therapy, radiotherapy and chemotherapy, to observe treatment response, and to regularly monitor the healthy breast.
The mammary molybdenum target examination is currently the most important non-invasive means of breast cancer screening, and the Breast Imaging Reporting and Data System (BI-RADS) divides molybdenum target breast density into four grades as an important diagnostic basis. However, medical molybdenum target image samples are few in number, highly variable, and uneven in density distribution. Manual identification can only roughly delineate the breast region boundary and qualitatively estimate the density within it, and cannot meet the precision and speed requirements of breast density classification. Traditional automatic density classification methods for molybdenum target images also have defects that seriously affect the analysis result: breast shapes vary widely, so traditional shape-model-based methods have difficulty segmenting the various tissues and delineate the boundary between the breast and the image background inaccurately; and the density distribution of the tissues within the breast is extremely uneven, so density distribution histograms of different images are easily near-identical, statistical analysis of the overall breast density goes wrong, and both the judgement precision and the processing speed of molybdenum target density classification suffer.
Disclosure of Invention
The invention aims to provide a deep learning classification method for mammary gland molybdenum target images based on a lightweight neural network. Starting from the digital molybdenum target images produced by routine breast examinations, the images are processed and analysed with deep learning so that density classification is performed automatically, reducing the workload of imaging physicians and improving the diagnosis rate of breast diseases.
In order to achieve the purpose, the technical scheme of the invention is as follows: a mammary gland molybdenum target image deep learning classification method based on a lightweight neural network comprises the following steps,
(I) carrying out pixel gray gradient weight calculation on all original images in the mammary gland molybdenum target data set with known density classification to obtain a corresponding gradient weight map;
(II) performing erosion and dilation operations on the closed regions of the gradient weight maps to remove artificial interferents from the images and obtain foreground region images containing only the breast and pectoral muscle;
(III) fusing every original image in the density-labelled mammary molybdenum target data set with its corresponding foreground region image to obtain an image training set containing only the pectoral muscle and mammary gland;
(IV) constructing a deep learning framework based on a lightweight neural network, comprising 12 layers in sequence: an input layer; a convolutional layer with convolution kernels and a rectified linear unit (ReLU) activation function; a pooling layer with a maximum-sampling (Maxpooling) function; a second convolutional layer with a ReLU activation function; a second Maxpooling pooling layer; a third convolutional layer with a ReLU activation function; a third Maxpooling pooling layer; a data flattening layer; a 64-unit fully connected layer; a data dropout layer with a dropout ratio of 0.5; a 4-unit fully connected layer; and an activation layer with a normalized exponential (Softmax) activation function as the output layer;
(V) increasing the number of training samples input to the deep learning framework through sample expansion; the neural network automatically computes a classification result, compares it with the true classification, and feeds the resulting error back to correct each convolution kernel parameter; the corrected network then recomputes the classification of the training set images, compares again and feeds the error back again, and this cycle is repeated 200 times to complete the training process;
(VI) performing pixel grey gradient weight calculation on each unclassified image to obtain its corresponding gradient weight map;
(VII) performing erosion and dilation of the closed regions on the gradient weight map of the unclassified image to remove artificial interferents and obtain a foreground region image containing only the breast and pectoral muscle;
(VIII) fusing the unclassified original image with its corresponding foreground region image to obtain a test image containing only the pectoral muscle and mammary gland;
(IX) inputting the test image into the trained neural network, which automatically computes the classification result, completing the test process.
In an embodiment of the present invention, the specific implementation process of step (I) is as follows:
a) traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
b) the gradient weight of a single pixel is the reciprocal of the gradient of the single pixel, and the gradient weights of all the pixels form a gradient weight image with the same size as the original image.
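The two sub-steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's exact implementation: the `eps` guard against division by zero in perfectly flat regions is an assumption, since the text does not say how zero gradients are handled.

```python
import numpy as np

def gradient_weight_map(img, eps=1.0):
    """Per-pixel gradient weight: reciprocal of the summed horizontal
    and vertical absolute differences (sub-steps a and b). `eps` avoids
    division by zero in flat regions (an assumption; the patent does
    not specify this guard)."""
    img = img.astype(np.float64)
    # difference with the right-hand neighbour (horizontal direction)
    dx = np.abs(np.diff(img, axis=1, append=img[:, -1:]))
    # difference with the neighbour below (vertical direction)
    dy = np.abs(np.diff(img, axis=0, append=img[-1:, :]))
    grad = dx + dy                 # gradient carrying both directions' change info
    return 1.0 / (grad + eps)      # weight = reciprocal of the gradient
```

A flat image yields a uniform weight map of the same size as the input, as sub-step b) requires.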
In an embodiment of the present invention, the specific implementation process of step (II) is as follows:
a) performing an erosion operation on the gradient weight map, using a 5-pixel diamond (rhombus) as the structuring element, eroding the edges of the closed regions of the image to remove linear objects narrower than 10 pixels and separate the foreground region containing the breast and pectoral muscle from the artificial interferents;
b) performing a dilation operation on the gradient weight map from which the sub-10-pixel linear objects have been removed, again using a 5-pixel diamond as the structuring element, dilating the edges of the closed regions to restore the original boundary of the main structure in the image;
c) since the breast and pectoral muscle region is the main structure of a molybdenum target image, retaining the largest-area structure in the gradient weight map yields the foreground region containing only the breast and pectoral muscle, whose boundary is the foreground-background boundary.
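A minimal sketch of the erode / dilate / keep-largest-region procedure using SciPy's binary morphology. Two interpretive assumptions are made: the "5-pixel rhombus" is taken as a diamond of radius 2 (5 pixels across), and the weight map is binarised with a caller-supplied threshold, which the patent does not specify.

```python
import numpy as np
from scipy import ndimage

def diamond(radius):
    """Diamond (rhombus) structuring element; radius 2 gives a shape
    5 pixels across (one reading of the patent's '5-pixel rhombus')."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return (np.abs(x) + np.abs(y)) <= radius

def foreground_mask(weight_map, thresh):
    """Sub-steps a-c: binarise, erode to cut thin artefacts, dilate to
    restore the body's boundary, then keep only the largest connected
    region. `thresh` is an assumed binarisation threshold."""
    binary = weight_map > thresh
    se = diamond(2)                                    # 5-pixel diamond element
    eroded = ndimage.binary_erosion(binary, structure=se)
    opened = ndimage.binary_dilation(eroded, structure=se)
    labels, n = ndimage.label(opened)                  # connected components
    if n == 0:
        return np.zeros_like(binary)
    sizes = ndimage.sum(opened, labels, index=range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)            # largest structure only
```

On a toy image, a 1-pixel-wide line (a stand-in for an artificial interferent) is removed by the erosion, while the large block survives as the foreground region.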
In an embodiment of the present invention, the specific implementation process of step (III) is as follows:
a) converting the binarized Mask to the bit depth of the corresponding original image;
b) performing an element-wise (matrix dot) multiplication between each original image in the mammary molybdenum target data set and its same-sized foreground region image; the resulting matrix is the foreground image;
c) repeating the multiplication for every image in the database to obtain an image training set containing only the pectoral muscle and mammary gland.
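The fusion in sub-steps a) and b) reduces to an element-wise product once the mask is cast to the original image's dtype, as in this short sketch (names are illustrative, not from the patent):

```python
import numpy as np

def apply_mask(original, mask):
    """Sub-steps a-b: promote the binary mask to the original image's
    bit depth, then multiply element-wise ('matrix dot multiplication')
    so background pixels become 0 and foreground pixels keep their
    grey value."""
    mask = mask.astype(original.dtype)   # 0/1 at the original bit depth
    return original * mask               # element-wise product = foreground image
```

Sub-step c) is then a loop (or list comprehension) applying `apply_mask` over every image/mask pair in the data set.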
In an embodiment of the present invention, the specific implementation process of step (IV) is as follows:
a) adding an input layer of 200 × 200 pixels;
b) adding a convolutional layer (CNN) with 32 convolution kernels of 3 × 3 pixels and a ReLU activation function;
c) adding a pooling layer with 32 kernels of 2 × 2 pixels using Maxpooling;
d) adding a convolutional layer (CNN) with 32 convolution kernels of 3 × 3 pixels and a ReLU activation function;
e) adding a pooling layer with 32 kernels of 2 × 2 pixels using Maxpooling;
f) adding a convolutional layer (CNN) with 64 convolution kernels of 3 × 3 pixels and a ReLU activation function;
g) adding a pooling layer with 64 kernels of 2 × 2 pixels using Maxpooling;
h) adding a data flattening layer;
i) adding a 64-unit fully connected layer;
j) adding a data dropout layer with a dropout ratio of 0.5;
k) adding a 4-unit fully connected layer;
l) adding an activation layer with a Softmax activation function as the output layer;
m) thereby constructing the lightweight-neural-network deep learning framework comprising these 12 layers.
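The 12-layer stack above can be checked with a small shape-propagation sketch in pure Python (no deep learning framework needed). It assumes unpadded ("valid") 3 × 3 convolutions and non-overlapping 2 × 2 max-pooling, which the patent does not state explicitly; under those assumptions the flattened feature vector has exactly the 33856 elements reported for the Flatten layer in Fig. 3.

```python
def conv(h, w, k=3):
    """Unpadded ('valid') k x k convolution output size (an assumed
    padding mode; the patent does not specify it)."""
    return h - k + 1, w - k + 1

def pool(h, w, k=2):
    """Non-overlapping k x k max-pooling (floor division)."""
    return h // k, w // k

# input layer: 200 x 200 pixels
h, w = 200, 200
h, w = conv(h, w); h, w = pool(h, w)   # conv 32@3x3 + ReLU, pool 2x2 -> 99 x 99
h, w = conv(h, w); h, w = pool(h, w)   # conv 32@3x3 + ReLU, pool 2x2 -> 48 x 48
h, w = conv(h, w); h, w = pool(h, w)   # conv 64@3x3 + ReLU, pool 2x2 -> 23 x 23
flat = h * w * 64                      # Flatten size before the 64-unit Dense
# flat == 33856, matching the Flatten layer size given for Fig. 3
```

The remaining layers (Dense 64, Dropout 0.5, Dense 4, Softmax) operate on this flattened vector and do not change the spatial reasoning above.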
In an embodiment of the present invention, the specific implementation process of step (V) is as follows:
a) randomly selecting 1 sample from the training set images, inputting it into the deep learning framework, and applying random transformations, including rotation, width scaling, length scaling and cropping, to generate 32 corresponding samples;
b) inputting the 32 randomly generated samples into the deep learning framework; the neural network automatically computes a classification result, compares it with the true classification to obtain the corresponding accuracy and information loss, and stores the accuracy, loss value and error;
c) feeding the obtained error back to the neural network to correct each convolution kernel parameter;
d) randomly selecting 1 sample again from the training set, applying the random transformations, recomputing the classification of the resulting 32 samples with the corrected network, comparing with the true classification and feeding the error back for correction; repeating this process 200 times completes the training;
e) plotting and saving an accuracy curve and a loss curve from the accuracy and loss values recorded at each training iteration.
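The sample expansion of sub-step a) can be sketched with NumPy as below. The concrete transforms used here (90-degree rotations, axis flips, and a random crop padded back to size) are simplified stand-ins for the rotation, width-scaling, length-scaling and clipping transforms named in the text, chosen so the sketch stays self-contained.

```python
import numpy as np

def expand_sample(img, n=32, seed=0):
    """Generate n randomly transformed copies of one (square) training
    image, as in sub-step a). The transforms are illustrative stand-ins
    for the patent's rotation / scaling / clipping transformations."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    out = []
    for _ in range(n):
        s = np.rot90(img, k=rng.integers(0, 4))       # random rotation
        if rng.random() < 0.5:
            s = np.flip(s, axis=rng.integers(0, 2))   # random flip
        dy, dx = rng.integers(0, 4, size=2)           # random crop offset
        s = s[dy:, dx:]
        canvas = np.zeros((h, w), dtype=img.dtype)    # pad back to h x w
        canvas[:s.shape[0], :s.shape[1]] = s
        out.append(canvas)
    return np.stack(out)                              # (n, h, w) batch
```

Each call turns one labelled image into a 32-sample batch sharing the same density label, which is what lets a small molybdenum target data set feed 200 training iterations.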
In an embodiment of the present invention, the specific implementation process of step (VI) is as follows:
a) traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
b) the gradient weight of a single pixel is the reciprocal of the gradient of the single pixel, and the gradient weights of all the pixels form a gradient weight image with the same size as the original image.
In an embodiment of the present invention, the specific implementation process of step (VII) is as follows:
a) performing an erosion operation on the gradient weight map, using a 5-pixel diamond (rhombus) as the structuring element, eroding the edges of the closed regions to remove linear objects narrower than 10 pixels and separate the foreground region containing the breast and pectoral muscle from most of the artificial interferents;
b) performing a dilation operation on the gradient weight map from which the sub-10-pixel linear objects have been removed, again using a 5-pixel diamond as the structuring element, dilating the edges of the closed regions to restore the original boundary of the main structure;
c) since the breast and pectoral muscle region is the main structure of a molybdenum target image, retaining the largest-area structure in the gradient weight map yields the foreground region containing only the breast and pectoral muscle, whose boundary is the foreground-background boundary.
In an embodiment of the present invention, the specific implementation process of step (VIII) is as follows:
a) converting the binarized foreground region image to the bit depth of the corresponding original image;
b) performing an element-wise (matrix dot) multiplication between the unclassified original image and its same-sized mask; the resulting matrix is the foreground image;
c) repeating the multiplication for every unclassified image to obtain test images containing only the pectoral muscle and mammary gland.
In an embodiment of the present invention, the specific implementation process of step (IX) is as follows:
a) inputting the test image into the trained neural network, which automatically computes the classification result, completing the test process;
b) comparing the automatically computed classification with the expert classification, and computing and recording the classification accuracy.
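The accuracy bookkeeping of sub-step b) is a one-line comparison once predictions and expert labels are arrays; this sketch (names illustrative) assumes both are BI-RADS density grades 1-4:

```python
import numpy as np

def classification_accuracy(predicted, expert):
    """Sub-step b): fraction of test images whose automatically computed
    BI-RADS density grade (1-4) matches the expert grade."""
    predicted = np.asarray(predicted)
    expert = np.asarray(expert)
    return float(np.mean(predicted == expert))
```

For example, three matches out of four test images gives an accuracy of 0.75.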
Compared with the prior art, the invention has the following beneficial effects:
(1) the method converts the image classification problem into a machine learning problem, realizing automatic, intelligent classification of mammary molybdenum target images with high speed and efficiency while guaranteeing the precision of breast density classification;
(2) the segmentation preprocessing that extracts the foreground containing the breast and pectoral muscle effectively removes the influence of artificial interferents on the classification result and improves classification accuracy;
(3) sample expansion applies random transformations to the limited original images, enlarging the training set and improving both neural network training efficiency and classification accuracy;
(4) the neural network of a traditional deep learning framework is simplified into an optimized structure with 3 CNN convolutional layers, reducing structural complexity and improving training efficiency and classification accuracy;
(5) the method supports online, real-time molybdenum target breast density classification.
Drawings
Figure 1 is a schematic representation of the steps of the present invention.
Fig. 2 illustrates segmentation of the foreground containing the breast and pectoral muscle according to the invention: a) an original mammary molybdenum target image without artificial interferents; b) the gradient weight map corresponding to a); c) the initial foreground region map corresponding to a), where the red line is the foreground-background boundary; d) an original image with artificial interferents; e) the gradient weight map corresponding to d); f) the foreground region map corresponding to d), where the red line is the foreground-background boundary.
FIG. 3 is a schematic diagram of the lightweight-neural-network deep learning classification framework for mammary molybdenum target images, composed of layers of different types. The input layer takes images normalized to 200 × 200 pixels; the first convolutional layer (CNN) has 32 kernels of 3 × 3 pixels; the first pooling layer (Maxpooling) has 32 kernels of 2 × 2 pixels; the second convolutional layer has 32 kernels of 3 × 3 pixels; the second pooling layer has 32 kernels of 2 × 2 pixels; the third convolutional layer has 64 kernels of 3 × 3 pixels; the third pooling layer has 64 kernels of 2 × 2 pixels; the data flattening layer (Flatten) outputs a one-dimensional vector of 33856 elements; the first fully connected layer (Dense) has 64 units; the dropout layer (Dropout) retains 64 units; and the second fully connected layer has 4 units and serves as the output layer.
FIG. 4 is a schematic diagram illustrating the comparison of classification results of the breast molybdenum target image analysis database (MIAS) with or without foreground segmentation preprocessing according to an embodiment of the present invention.
FIG. 5 is a comparison diagram of the classification results of the breast molybdenum target image analysis database (MIAS) with or without sample expansion according to the embodiment of the present invention.
Fig. 6 is a schematic diagram illustrating comparison of classification results of a deep learning framework using neural networks with different hierarchical structures on a breast molybdenum target image analysis database (MIAS) according to an embodiment of the present invention.
Detailed Description
Through extensive and intensive research, the inventors obtained a deep learning classification method for mammary gland molybdenum target images based on a lightweight neural network. The method applies a deep-learning image classification algorithm to breast density classification of molybdenum target images; owing to the lightweight-neural-network deep learning framework, it markedly improves adaptability on small-scale image data sets, further raises the accuracy and processing speed of breast density classification, and enables automatic density classification of mammary molybdenum target images.
Before the present invention is described, it is to be understood that this invention is not limited to the particular methodology and experimental conditions described, as such methodologies and conditions may vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
The method performs fast foreground segmentation using the grey gradient weight map, takes the preprocessed mammary molybdenum target images as the input data set of the deep learning framework, trains the neural network parameters with the training set images, and, after training, automatically performs density classification on test sets of unknown class. This greatly reduces the workload of imaging physicians in breast density assessment and facilitates the clinical diagnosis of breast disease.
The core idea of the invention is to introduce deep learning into the automatic, intelligent classification of mammary molybdenum target images: the lightweight neural network divides the molybdenum target samples into four classes by density distribution, corresponding to breast density grades 1-4 of the Breast Imaging Reporting and Data System (BI-RADS) standard, saving imaging physicians a great deal of labour and providing a reference for the clinical diagnosis of breast diseases including breast cancer. First, a molybdenum target segmentation method based on the gradient weight map removes artificial interferents from the images and yields foreground region images containing only the mammary gland and pectoral muscle; then the deep learning framework built on the lightweight neural network is trained with the preprocessed training set; finally, the test set images are input to the trained network, which automatically computes the final density classification of each molybdenum target image.
Breast density classification is a difficult point in mammary molybdenum target image analysis. The method markedly improves the discrimination precision and processing speed of molybdenum target density classification, can be applied to breast disease diagnosis and screening, provides an effective and reliable analysis tool for related clinical applications and scientific research, and has broad economic and social benefits.
The technical scheme adopted by the invention for solving the technical problem mainly comprises the following steps:
1. Training on a mammary molybdenum target data set with known density classification: all original images are preprocessed by grey gradient weight calculation to obtain foreground region images containing only the breast and pectoral muscle as the training set; a lightweight deep learning framework is constructed; the neural network is trained with the training set images as input; and training ends after 200 iterations. The training process consists of the following 5 steps:
1.1, calculating pixel gray gradient weights of all original images in a mammary molybdenum target data set to obtain a corresponding gradient weight map;
1.2, performing erosion and dilation on the closed regions of the gradient weight map to remove artificial interferents and obtain a foreground region image containing only the breast and pectoral muscle;
1.3, fusing each original image in the mammary molybdenum target data set with its corresponding foreground region to obtain an image data set containing only the pectoral muscle and mammary gland;
1.4, constructing a deep learning framework based on a lightweight neural network, comprising 12 layers in sequence: an input layer of 200 × 200 pixels; a convolutional layer (Convolutional Neural Network, CNN) with 32 convolution kernels (Core) of 3 × 3 pixels and a rectified linear unit (ReLU) activation function; a pooling layer (Pooling) with 32 kernels of 2 × 2 pixels and a maximum-sampling (Maxpooling) function; a convolutional layer with 32 kernels of 3 × 3 pixels and a ReLU activation function; a pooling layer with 32 kernels of 2 × 2 pixels using Maxpooling; a convolutional layer with 64 kernels of 3 × 3 pixels and a ReLU activation function; a pooling layer with 64 kernels of 2 × 2 pixels using Maxpooling; a data flattening layer (Flatten); a 64-unit fully connected layer (Dense); a data dropout layer (Dropout) with a dropout ratio of 0.5; a 4-unit fully connected layer; and an activation layer (Activation) with a normalized exponential (Softmax) activation function as the output layer;
1.5, increasing the number of training samples input to the deep learning framework through sample expansion; the neural network automatically computes a classification result, compares it with the true classification, and feeds the error back to correct each convolution kernel parameter; the corrected network recomputes the classification of the training set images, compares and feeds back again, and this cycle is repeated 200 times to complete the training process.
2. Classification testing of mammary molybdenum target images of unknown density class: the original images are preprocessed by grey gradient weight calculation to obtain foreground region images containing only the breast and pectoral muscle as test images, which are input to the lightweight deep learning framework; the neural network automatically computes the classification result, completing the test. The test process consists of the following 4 steps:
2.1, performing pixel gray gradient weight calculation on the unclassified image to obtain the corresponding gradient weight map;
2.2, performing erosion and dilation operations on closed regions of the gradient weight map of the unclassified image, removing artificial interferents from the image, and obtaining a foreground region image (Mask) containing only the breast and pectoral muscle;
2.3, fusing the unclassified original image with its corresponding Mask to obtain a test image containing only the pectoral muscle and mammary gland;
2.4, inputting the test image into the trained neural network; the neural network automatically calculates the classification result, completing the test process.
In a preferred embodiment of the present invention, the method for deep learning and classifying mammary molybdenum target images based on a lightweight neural network according to the present invention mainly comprises the following steps:
(1) Performing pixel gray gradient weight calculation on all original images in the breast molybdenum target data set with known density classification to obtain the corresponding gradient weight maps.
(2) Performing erosion and dilation operations on closed regions of the gradient weight maps from step (1), removing artificial interferents from the images, and obtaining foreground region images (Masks) containing only the breast and pectoral muscle.
(3) Fusing all original images in the known-density-classification breast molybdenum target data set from step (1) with the corresponding Masks from step (2) to obtain an image training set containing only the pectoral muscle and mammary gland.
(4) Constructing a deep learning framework based on a lightweight neural network, which comprises 12 layers in sequence: an input layer with a size of 200 × 200 pixels; a convolutional layer (Convolutional Neural Network, CNN) comprising 32 convolution kernels (Kernels) of 3 × 3 pixels and adopting a rectified linear unit (ReLU) activation function; a pooling layer (Pooling) with 2 × 2 pixel windows over the 32 feature maps, adopting a maximum sampling (Maxpooling) function; a convolutional layer comprising 32 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function; a pooling layer with 2 × 2 pixel windows over the 32 feature maps, adopting Maxpooling; a convolutional layer comprising 64 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function; a pooling layer with 2 × 2 pixel windows over the 64 feature maps, adopting Maxpooling; a data flattening layer (Flatten); a 64-unit fully connected layer (Dense); a data discarding layer (Dropout) with a discard ratio of 0.5; a 4-unit fully connected layer; and an activation layer (Activation) adopting a normalized exponential (Softmax) activation function as the output layer.
(5) Through sample expansion, the training-set images from step (3) are used to increase the number of samples input to the deep learning framework; the neural network automatically calculates a classification result and compares it with the true classification, the resulting error is fed back to the neural network to correct each convolution kernel parameter, the corrected network recalculates the classification result for the training-set images and compares it with the true classification again, and the error is again fed back for correction; this cycle is repeated 200 times to complete the training process.
(6) Performing pixel gray gradient weight calculation on the unclassified images to obtain the corresponding gradient weight maps.
(7) Performing erosion and dilation operations on closed regions of the gradient weight maps from step (6), removing artificial interferents from the images, and obtaining foreground region images (Masks) containing only the breast and pectoral muscle.
(8) Fusing the unclassified breast molybdenum target original images from step (6) with the corresponding Masks from step (7) to obtain test images containing only the pectoral muscle and breast.
(9) Inputting the test images from step (8) into the neural network trained in step (5); the neural network automatically calculates the classification results, completing the test process.
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the method for calculating the pixel gray gradient weight of the mammary molybdenum target image comprises the following steps:
a) traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
b) the gradient weight of a single pixel is the reciprocal of the gradient of the single pixel, and the gradient weights of all the pixels form a gradient weight image with the same size as the original image.
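As a concrete reading of steps a) and b), the gradient weight map can be sketched in NumPy as follows. This is an illustrative sketch rather than the patent's own code, and the small constant `eps` is an added assumption that avoids division by zero in perfectly flat regions:

```python
import numpy as np

def gradient_weight_map(img, eps=1e-6):
    """Gradient weight map as described in steps a) and b).

    Each pixel's gradient is the sum of its horizontal and vertical
    differences to the neighbouring pixels; its weight is the
    reciprocal of that gradient. `eps` is an added assumption, not
    part of the original text.
    """
    img = img.astype(np.float64)
    dx = np.zeros_like(img)
    dy = np.zeros_like(img)
    # difference to the right-hand neighbour (last column stays 0)
    dx[:, :-1] = np.abs(img[:, 1:] - img[:, :-1])
    # difference to the neighbour below (last row stays 0)
    dy[:-1, :] = np.abs(img[1:, :] - img[:-1, :])
    grad = dx + dy             # combined horizontal + vertical change
    return 1.0 / (grad + eps)  # weight = reciprocal of the gradient
```

Flat regions thus receive large weights and strong edges small ones, which is what the subsequent morphological segmentation exploits.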
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the foreground region image (Mask) segmentation method only including breast and chest muscles is as follows:
a) performing an erosion operation on the gradient weight image, using a diamond of 5 pixels in size as the structuring element, eroding the edges of closed regions in the image to remove linear objects less than 10 pixels wide, thereby separating the foreground region containing the breast and pectoral muscle from most artificial interferents;
b) performing a dilation operation on the gradient weight image after the linear objects less than 10 pixels wide have been removed, applying the same 5-pixel diamond structuring element to the edges of the closed regions to restore the original boundary of the main structure in the image;
c) since the breast and pectoral muscle region is the main structure of the molybdenum target image, the largest-area structure in the gradient weight image is retained as the foreground region Mask containing only the breast and pectoral muscle; the boundary of this region is the boundary between foreground and background.
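The erosion / dilation / largest-region sequence above can be sketched with SciPy's morphology routines. The binarisation threshold `thresh` and the mapping of the "5-pixel diamond" to a radius-2 element are assumptions filled in for illustration:

```python
import numpy as np
from scipy import ndimage

def diamond(radius=2):
    """Diamond structuring element; radius 2 gives a 5-pixel-wide diamond."""
    r = np.arange(-radius, radius + 1)
    return (np.abs(r[:, None]) + np.abs(r[None, :])) <= radius

def foreground_mask(weight_map, thresh):
    """Sketch of the Mask segmentation: binarise (an assumed step),
    erode then dilate with the diamond element to remove thin linear
    interferents, and keep the largest connected region as the
    breast / pectoral-muscle foreground."""
    binary = weight_map > thresh
    se = diamond(2)
    opened = ndimage.binary_dilation(ndimage.binary_erosion(binary, se), se)
    labels, n = ndimage.label(opened)
    if n == 0:
        return opened
    sizes = ndimage.sum(opened, labels, index=range(1, n + 1))
    return labels == (int(np.argmax(sizes)) + 1)
```

Opening (erosion followed by dilation) removes thin linear objects while the dilation restores the main structure's boundary, matching steps a) to c).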
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the method for establishing the image training set only comprising the breast muscle and the mammary gland comprises the following steps:
a) converting the binarized Mask to the same bit depth as the corresponding original image;
b) performing element-wise (matrix dot) multiplication of each original image in the breast molybdenum target data set with its corresponding Mask of the same size; the matrix after this multiplication is the foreground image;
c) repeating the multiplication for all images in the database to obtain an image training set containing only the pectoral muscle and mammary gland.
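The fusion in steps a) to c), where the "matrix dot multiplication" denotes an element-wise (Hadamard) product, might be sketched as:

```python
import numpy as np

def apply_mask(original, mask):
    """Element-wise product of an original image with its binary Mask;
    pixels outside the foreground region are zeroed out."""
    return original * mask.astype(original.dtype)

def build_training_set(images, masks):
    # repeat the fusion for every original image / Mask pair
    return [apply_mask(img, m) for img, m in zip(images, masks)]
```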
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the lightweight neural network deep learning framework establishing method comprises the following steps:
a) adding an input layer of 200 × 200 pixels in size;
b) adding a convolutional layer (CNN) containing 32 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
c) adding a pooling layer with 2 × 2 pixel windows over the 32 feature maps, adopting Maxpooling;
d) adding a convolutional layer (CNN) containing 32 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
e) adding a pooling layer with 2 × 2 pixel windows over the 32 feature maps, adopting Maxpooling;
f) adding a convolutional layer (CNN) containing 64 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
g) adding a pooling layer with 2 × 2 pixel windows over the 64 feature maps, adopting Maxpooling;
h) adding a data flattening layer (Flatten);
i) adding a 64-unit fully connected layer (Dense);
j) adding a data discarding layer (Dropout) with a discard ratio of 0.5;
k) adding a 4-unit fully connected layer;
l) adding an activation layer (Activation) adopting a Softmax activation function as the output layer;
m) the deep learning framework of the lightweight neural network comprising the above 12 layers is thus constructed.
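Assuming Keras-style "valid" 3 × 3 convolutions and non-overlapping 2 × 2 max-pooling (stride and padding are not stated in the text, so these are assumptions), the feature-map sizes flowing through the 12 layers can be traced with a few lines of Python:

```python
def conv_out(size, kernel=3):
    # 'valid' convolution: the map shrinks by kernel - 1
    return size - (kernel - 1)

def pool_out(size, window=2):
    # non-overlapping max-pooling: integer division by the window size
    return size // window

side = 200                      # 200 x 200 input layer
trace = [("input", side)]
for n_kernels in (32, 32, 64):  # the three conv + pool pairs
    side = conv_out(side)
    trace.append(("conv%d" % n_kernels, side))
    side = pool_out(side)
    trace.append(("pool", side))
flat = side * side * 64         # features entering the 64-unit Dense layer
# trace of sizes: 200 -> 198 -> 99 -> 97 -> 48 -> 46 -> 23, flat = 33856
```

Under these assumptions the Flatten layer hands 33,856 features to the 64-unit fully connected layer, which illustrates why the framework stays lightweight compared with deeper architectures.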
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the method for sample expansion and neural network training of the input image comprises the following steps:
a) randomly selecting 1 sample from the training-set images, inputting it into the deep learning framework, and applying random transformations, including rotation (angle range ±20 degrees), width scaling (range ±0.2 × width), length scaling (range ±0.2 × length) and cropping (range ±0.2 × area), to generate 32 corresponding samples;
b) inputting the 32 randomly generated samples into the deep learning framework; the neural network automatically calculates the classification results and compares them with the true classification to obtain the corresponding accuracy, information loss value and error, and these 3 parameters are stored;
c) feeding the obtained error back to the neural network to correct each convolution kernel parameter in the network;
d) randomly selecting 1 sample again from the training-set images, applying the random transformations, recalculating the classification results of the 32 newly generated samples with the corrected network, comparing them with the true classification, and feeding the error back to the neural network for correction; this process is repeated 200 times to complete the training;
e) plotting and saving an accuracy curve and a loss curve from the accuracy and loss values obtained in each training iteration.
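The random transformation ranges of step a) can be sketched as a parameter sampler. In a Keras-style pipeline these would correspond to data-generator arguments, but the dictionary keys below are illustrative names, not the patent's API:

```python
import random

def sample_augmentation(rng):
    """Draw one random transform from the ranges in step a):
    rotation +/-20 degrees, width and length scaling +/-0.2,
    cropping +/-0.2 of the area. Key names are assumptions."""
    return {
        "rotation_deg": rng.uniform(-20, 20),
        "width_scale":  1 + rng.uniform(-0.2, 0.2),
        "length_scale": 1 + rng.uniform(-0.2, 0.2),
        "crop_factor":  1 + rng.uniform(-0.2, 0.2),
    }

def expand_sample(n=32, seed=0):
    # one selected training image is expanded into 32 random variants
    rng = random.Random(seed)
    return [sample_augmentation(rng) for _ in range(n)]
```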
In a preferred embodiment of the present invention, the breast molybdenum target image deep learning classification method based on the lightweight neural network is characterized in that: the classification test method of the unknown classification mammary gland molybdenum target image comprises the following steps:
a) carrying out pixel gray gradient weight calculation on the images which are not classified to obtain a corresponding gradient weight map;
b) carrying out erosion and expansion operation on a closed region on a gradient weight value image of an unclassified image, removing artificial interferents in the image, and obtaining a foreground region image (Mask) only containing breast and breast muscles;
c) fusing the unclassified original image with a Mask corresponding to the unclassified original image to obtain a test image only containing breast muscles and mammary glands;
d) inputting the test image into the trained neural network, and automatically calculating a classification result by the neural network to complete the test process;
e) comparing the automatically calculated classification results with the expert classification results, and calculating and recording the classification accuracy.
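The accuracy bookkeeping of step e) reduces to comparing the two label lists; a minimal sketch:

```python
def classification_accuracy(predicted, expert):
    """Fraction of test images whose automatically calculated class
    matches the expert label, as recorded in step e)."""
    assert len(predicted) == len(expert) and predicted
    correct = sum(p == e for p, e in zip(predicted, expert))
    return correct / len(predicted)
```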
The invention is further illustrated by the following figures and examples.
Example 1: classification of Mammographic Image Analysis Society (MIAS) dataset images
Breast molybdenum target X-ray radiographic examination, also known as mammography, is performed on the mammary glands to obtain digital breast molybdenum target images.
The method for deeply learning and classifying the mammary gland molybdenum target image based on the lightweight neural network is used for training known classifications in the obtained mammary gland molybdenum target image and testing and analyzing unknown classifications, and mainly comprises the following steps as shown in figure 1:
1. Training on a breast molybdenum target data set with known density classification: all original images undergo gray gradient weight preprocessing to obtain foreground region images containing only the breast and pectoral muscle as the training set; a lightweight deep learning framework is constructed; the neural network is trained with the training-set images as input through sample expansion, and training is complete after 200 iterations. The training process is realized in five steps (see figure 1):
1.1, calculating pixel gray gradient weights of all original images in a mammary molybdenum target data set to obtain a corresponding gradient weight map;
1.2, carrying out erosion and expansion operation on the gradient weight image in a closed region, removing artificial interferents in the image, and obtaining a foreground region image only containing breast and breast muscles;
1.3, fusing all original images in the mammary gland molybdenum target data set with corresponding foreground regions to obtain an image data set only containing breast muscles and mammary glands;
1.4, constructing a deep learning framework based on a lightweight neural network, wherein the deep learning framework comprises 12 layers of various types of layers;
1.5, increasing the number of training-set samples input to the deep learning framework through sample expansion; the neural network automatically calculates a classification result and compares it with the true classification, the resulting error is fed back to the neural network to correct each convolution kernel parameter, the corrected network recalculates the classification result for the training-set images and compares it with the true classification again, and the error is again fed back for correction; this cycle is repeated 200 times to complete the training process.
2. Classification testing of breast molybdenum target images of unknown density classification: the original images are preprocessed by gray gradient weight calculation to obtain foreground region images containing only the breast and pectoral muscle as test images, which are input into the lightweight deep learning framework; the neural network automatically calculates the classification results to complete the test process. The test process is realized in four steps (see figure 1):
2.1, performing pixel gray gradient weight calculation on the unclassified image to obtain the corresponding gradient weight map;
2.2, performing erosion and dilation operations on closed regions of the gradient weight map of the unclassified image, removing artificial interferents from the image, and obtaining a foreground region image (Mask) containing only the breast and pectoral muscle;
2.3, fusing the unclassified original image with its corresponding Mask to obtain a test image containing only the pectoral muscle and mammary gland;
2.4, inputting the test image into the trained neural network; the neural network automatically calculates the classification result, completing the test process.
3. Computing the gray gradient weight image corresponding to the preprocessed, denoised and enhanced breast molybdenum target image, realized in the following two steps (see figure 2):
3.1 traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
3.2 the gradient weight of a single pixel is the inverse of its gradient, and the gradient weights of all pixels constitute a gradient weight image (shown in fig. 2b and 2 e) with the same size as the original image.
4. Performing erosion and dilation operations on closed regions of the gradient weight image, detecting the inflection points of the boundary between the breast and adhering artificial interferents, removing the artificial interferents from the image, and obtaining the boundary between the image background and the foreground region containing only the breast and pectoral muscle; the process is realized in the following three steps (see figure 2):
4.1, carrying out erosion operation on the gradient weight image, taking a rhombus with the size of 5 pixels as a structural element object, carrying out erosion operation on the edge of the image closed region, removing a linear object with the width less than 10 pixels in the image, and separating a foreground region containing breasts and breast muscles from most of artificial interferents;
4.2, performing expansion operation on the gradient weight image after the linear object with the width less than 10 pixels is removed, and performing expansion operation on the edge of the image closed region by taking a rhombus with the size of 5 pixels as a structural element object to recover the original boundary of the main body structure in the image;
4.3 Since the breast and pectoral muscle region is the main structure of the molybdenum target image, the largest-area structure in the gradient weight image (shown in fig. 2c and 2f) is retained, i.e. the foreground region containing only the breast and pectoral muscle; the gray contour in the figure represents the boundary between foreground and background.
5. Sequentially adding various layers, and constructing a lightweight neural network deep learning framework which totally comprises 12 layers of various layers, wherein the specific implementation process comprises the following thirteen steps (see the attached figure 3):
5.1 adding an input layer of 200 x 200 pixels size;
5.2 adding a convolution layer CNN containing 32 convolution kernels with the size of 3 multiplied by 3 pixels and adopting a ReLU activation function;
5.3 adding a pooling layer containing 32 convolution kernels of 2 x 2 pixel size and using Maxpooling;
5.4 adding a convolution layer CNN containing 32 convolution kernels with the size of 3 multiplied by 3 pixels and adopting a ReLU activation function;
5.5 adding a pooling layer containing 32 convolution kernels of 2 x 2 pixel size and using Maxpooling;
5.6 adding a convolution layer CNN containing 64 convolution kernels with the size of 3 multiplied by 3 pixels and adopting a ReLU activation function;
5.7 adding a pooling layer containing 64 convolution kernels of size 2 x 2 pixels and using Maxpooling;
5.8 adding a data planarization layer (Flatten);
5.9 adding a 64-bit full connection layer (Dense);
5.10 adding a data discard layer (Dropout) with a discard ratio of 0.5;
5.11 adding a 4-bit full-connection layer;
5.12 adding an Activation layer (Activation) adopting a Softmax Activation function as an output layer;
5.13 constructing a deep learning framework of the light weight neural network comprising the 12 layers of various hierarchies, wherein the complete framework is shown in FIG. 3.
Supplementary results 1: as shown in fig. 4, this embodiment compares the classification results on the Mammographic Image Analysis Society (MIAS) database with and without foreground segmentation preprocessing. In each sub-figure, the smoother curve is the training curve and the more sharply varying curve is the test curve; a) is the accuracy curve for classification with the unsegmented original images, b) is the corresponding loss curve, c) is the accuracy curve for classification with the segmented foreground images containing the breast and pectoral muscle regions, and d) is the corresponding loss curve.
Supplementary results 2: as shown in fig. 5, this embodiment compares the classification results on the MIAS database with and without sample expansion. In each sub-figure, the smoother curve is the training curve and the more sharply varying curve is the test curve; a) is the accuracy curve for classification of the original images without sample expansion, b) is the corresponding loss curve, c) is the accuracy curve for classification of the randomly generated image set with sample expansion, and d) is the corresponding loss curve.
Supplementary results 3: as shown in fig. 6, this embodiment compares the classification results of deep learning frameworks with neural network structures of different depths on the MIAS database. In each sub-figure, the smoother curve is the training curve and the more sharply varying curve is the test curve; a) is the accuracy curve of the 2-layer CNN, b) is the loss curve of the 2-layer CNN, c) is the accuracy curve of the 3-layer CNN, d) is the loss curve of the 3-layer CNN, e) is the accuracy curve of the 16-layer CNN, and f) is the loss curve of the 16-layer CNN.
As can be seen from the above examples on the Mammographic Image Analysis Society (MIAS) database, the deep learning framework adopting the foreground segmentation step, the sample expansion step and the 3-layer CNN structure has a clear advantage in classification accuracy; after 200 rounds of training and the corresponding tests, the final classification accuracy is about 84.8%.
All documents referred to herein are incorporated by reference into this application as if each were individually incorporated by reference. Furthermore, it should be understood that various changes and modifications of the present invention can be made by those skilled in the art after reading the above teachings of the present invention, and these equivalents also fall within the scope of the present invention as defined by the appended claims.
The above are preferred embodiments of the present invention, and all changes made according to the technical scheme of the present invention that produce functional effects do not exceed the scope of the technical scheme of the present invention belong to the protection scope of the present invention.

Claims (8)

1. A mammary gland molybdenum target image deep learning classification method based on a lightweight neural network is characterized by comprising the following steps: comprises the following steps of (a) carrying out,
(I) carrying out pixel gray gradient weight calculation on all original images in the mammary gland molybdenum target data set with known density classification to obtain a corresponding gradient weight map; the specific implementation process is as follows:
a) traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
b) the gradient weight of a single pixel is the reciprocal of the gradient of the single pixel, and the gradient weights of all the pixels form a gradient weight image with the same size as the original image;
(II) carrying out erosion and expansion operation on the closed region on the gradient weight map, removing artificial interferents in the image, and obtaining a foreground region image only containing breast and breast muscles;
(III) fusing all original images in the mammary gland molybdenum target data set with known density classification with foreground region images corresponding to the original images to obtain an image training set only containing breast muscles and mammary glands;
(IV) constructing a deep learning framework based on a lightweight neural network, which comprises 12 layers in sequence: an input layer; a convolutional layer containing convolution kernels and adopting a rectified linear unit (ReLU) activation function; a pooling layer adopting a maximum sampling (Maxpooling) function; a convolutional layer containing convolution kernels and adopting a ReLU activation function; a pooling layer adopting Maxpooling; a convolutional layer containing convolution kernels and adopting a ReLU activation function; a pooling layer adopting Maxpooling; a data flattening layer; a 64-unit fully connected layer; a data discarding layer with a discard ratio of 0.5; a 4-unit fully connected layer; and an activation layer adopting a normalized exponential activation function as the output layer;
(V) through sample expansion, the training-set images are used to increase the number of samples input to the deep learning framework; the neural network automatically calculates a classification result and compares it with the true classification, the resulting error is fed back to the neural network to correct each convolution kernel parameter, the corrected network recalculates the classification result for the training-set images and compares it with the true classification again, and the error is again fed back for correction; this cycle is repeated 200 times to complete the training process;
(VI) carrying out pixel gray gradient weight calculation on the unclassified image to obtain a corresponding gradient weight map; the specific implementation process is as follows:
a) traversing each pixel of the image from top to bottom and from left to right, calculating the difference between each pixel and the adjacent pixels in the horizontal direction and the vertical direction, and adding the two obtained differences to obtain the gradient containing the change information in the horizontal direction and the vertical direction;
b) the gradient weight of a single pixel is the reciprocal of the gradient of the single pixel, and the gradient weights of all the pixels form a gradient weight image with the same size as the original image;
(VII) carrying out erosion and expansion operation of a closed region on the gradient weight map of the unclassified image, removing artificial interferents in the image, and obtaining a foreground region image only containing breast and breast muscles;
(VIII) fusing the unclassified original image with its corresponding foreground region image to obtain a test image containing only breast muscle and mammary gland;
(IX) inputting the test image to a trained neural network, and automatically calculating a classification result by the neural network to finish the test process.
2. The method of claim 1, wherein: the specific implementation process of the step (II) is as follows:
a) carrying out erosion operation on the gradient weight map, taking a rhombus with the size of 5 pixels as a structural element object, carrying out erosion operation on the edge of the image closed region, removing a linear object with the width less than 10 pixels in the image, and separating a foreground region containing breast and breast muscles from the artificial interferent;
b) performing expansion operation on the gradient weight image with the linear object with the width less than 10 pixels removed, performing expansion operation on the edge of the image closed area by taking a rhombus with the size of 5 pixels as a structural element object, and recovering the original boundary of the main body structure in the image;
c) because the breast and chest muscle area is the main body structure of the molybdenum target image, the structure with the largest area in the gradient weight image is reserved, namely the foreground area only containing the breast and chest muscles, and the boundary of the area is the boundary between the foreground and the background.
3. The method of claim 1, wherein: the specific implementation process of the step (III) is as follows:
a) converting the binarized foreground region image Mask, which contains only the breast and pectoral muscle, to the same bit depth as the corresponding original image;
b) performing matrix dot multiplication operation on all original images in the mammary gland molybdenum target data set and foreground area images with the same size in one-to-one correspondence, wherein a matrix after the dot multiplication operation is a foreground image;
c) and repeating the dot product operation on all the images in the database to obtain an image training set only containing the breast muscle and the mammary gland.
4. The method of claim 1, wherein: the specific implementation process of the step (IV) is as follows:
a) adding an input layer of 200 × 200 pixels in size;
b) adding a convolutional layer (CNN) containing 32 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
c) adding a pooling layer with 2 × 2 pixel windows over the 32 feature maps, adopting Maxpooling;
d) adding a convolutional layer (CNN) containing 32 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
e) adding a pooling layer with 2 × 2 pixel windows over the 32 feature maps, adopting Maxpooling;
f) adding a convolutional layer (CNN) containing 64 convolution kernels of 3 × 3 pixels and adopting a ReLU activation function;
g) adding a pooling layer with 2 × 2 pixel windows over the 64 feature maps, adopting Maxpooling;
h) adding a data flattening layer;
i) adding a 64-unit fully connected layer;
j) adding a data discarding layer with a discard ratio of 0.5;
k) adding a 4-unit fully connected layer;
l) adding an activation layer adopting a Softmax activation function as the output layer;
m) the deep learning framework of the lightweight neural network comprising the above 12 layers is thus constructed.
5. The method of claim 1, wherein: the specific implementation process of the step (V) is as follows:
a) randomly selecting 1 sample from the training-set images, inputting it into the deep learning framework, and applying random transformations, including rotation, width scaling, length scaling and cropping, to generate 32 corresponding samples;
b) inputting the 32 randomly generated samples into a deep learning frame, automatically calculating a classification result by a neural network, comparing the classification result with the real classification to obtain corresponding accuracy and information loss values, and storing 3 parameters of the accuracy, the information loss values and the errors;
c) feeding the obtained error back to the neural network to correct each convolution kernel parameter in the neural network;
d) randomly selecting 1 sample again from the training set image, inputting the sample for random transformation, recalculating a classification result of the corrected network of the obtained 32 samples generated randomly, comparing the recalculated classification result with the real classification, feeding back an error to the neural network for correction, and repeating the process for 200 times to complete the training process;
e) and respectively drawing a correct rate curve and a loss curve according to the correct rate and the loss value obtained in each training process and storing the correct rate curve and the loss value.
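A toy sketch of the augmentation in step a), under stated assumptions: the patent names rotation, width/length scaling and cropping but gives no parameters, so rotation is approximated with 90-degree steps and scaling/cropping with a random 192 × 192 crop. The `augment` helper is hypothetical, not the patent's code:

```python
import random
import numpy as np

def augment(sample, n=32, rng=random):
    """Generate n randomly transformed copies of a single training sample,
    as in step a); each of the 200 training iterations would feed one such
    32-sample batch through the network and back-propagate the error."""
    variants = []
    for _ in range(n):
        img = np.rot90(sample, k=rng.randrange(4))            # rotation (90-degree steps)
        top, left = rng.randrange(8), rng.randrange(8)
        variants.append(img[top:top + 192, left:left + 192])  # scaling/cropping stand-in
    return variants

rng = random.Random(0)                       # seeded for repeatability
sample = np.zeros((200, 200), dtype=np.uint8)
batch = augment(sample, n=32, rng=rng)       # 32 variants of one sample
```

Generating the 32 variants afresh from a randomly chosen sample at every iteration means the network rarely sees the exact same input twice, which compensates for the small size of typical mammography data sets.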
6. The method of claim 1, wherein: the specific implementation process of the step (VII) is as follows:
a) performing an erosion operation on the gradient weight image, using a rhombus (diamond) of 5-pixel size as the structuring element: eroding the edges of the closed regions in the image removes linear objects narrower than 10 pixels and separates the foreground region containing the breast and chest muscle from most artificial interference objects;
b) performing a dilation operation on the gradient weight image from which the linear objects narrower than 10 pixels have been removed, again using a rhombus of 5-pixel size as the structuring element, so that the original boundary of the main structure in the image is restored;
c) because the breast and chest muscle area is the main structure of the molybdenum target image, the structure with the largest area in the gradient weight image is retained; this is the foreground region containing only the breast and chest muscle, and its boundary is the boundary between foreground and background.
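Steps a) and b) together form a morphological opening. A pure-NumPy sketch under stated assumptions (a Manhattan-distance diamond as the 5-pixel rhombus; a real pipeline would more likely use scipy.ndimage or OpenCV):

```python
import numpy as np

def diamond(radius=2):
    """Rhombus structuring element; radius 2 gives the 5x5 diamond
    corresponding to the 5-pixel rhombus in the claim."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    return (np.abs(x) + np.abs(y)) <= radius

def _shifted_windows(img, se):
    """Yield one view of img per structuring-element offset."""
    r = se.shape[0] // 2
    padded = np.pad(img.astype(bool), r, constant_values=False)
    h, w = img.shape
    for dy, dx in zip(*np.nonzero(se)):
        yield padded[dy:dy + h, dx:dx + w]

def erode(img, se):
    """A pixel survives only if the whole structuring element fits in
    the foreground: thin linear objects are removed entirely."""
    out = np.ones(img.shape, dtype=bool)
    for win in _shifted_windows(img, se):
        out &= win
    return out

def dilate(img, se):
    """Grows the eroded regions back, approximately restoring the
    original boundary of the main structure."""
    out = np.zeros(img.shape, dtype=bool)
    for win in _shifted_windows(img, se):
        out |= win
    return out

# toy mask: a 7x7 block (main structure) plus a 1-pixel-wide line (interference)
mask = np.zeros((12, 20), dtype=bool)
mask[2:9, 2:9] = True       # block survives the opening
mask[5, 12:19] = True       # thin line is removed by the erosion
se = diamond(2)
opened = dilate(erode(mask, se), se)
```

Step c) would then follow with a connected-component labeling pass that keeps only the largest remaining region.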
7. The method of claim 1, wherein: the specific implementation process of the step (VIII) is as follows:
a) converting the binarized foreground area image to the bit depth of the corresponding original image;
b) performing a matrix dot-multiplication operation between every original image in the mammary gland molybdenum target data set and its mask of the same size, the matrix resulting from the dot-multiplication operation being the foreground image;
c) repeating the dot-multiplication operation for all images in the database to obtain an image training set containing only the breast muscle and the mammary gland.
8. The method of claim 1, wherein: the specific implementation process of the step (IX) is as follows:
a) inputting the test images into the trained neural network; the neural network automatically calculates the classification results, completing the test process;
b) comparing the automatically calculated classification results with the expert classification results, and calculating and recording the classification accuracy.
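The accuracy computation in step b) reduces to a label comparison against the expert results. A minimal sketch with hypothetical 4-class labels (the class scheme itself is not detailed in this claim):

```python
def classification_accuracy(predicted, expert):
    """Fraction of test images whose automatically calculated class
    matches the expert classification."""
    assert len(predicted) == len(expert) and predicted
    return sum(p == e for p, e in zip(predicted, expert)) / len(predicted)

# hypothetical labels for six test images, four classes 0-3
pred = [0, 1, 2, 3, 1, 2]
truth = [0, 1, 2, 2, 1, 2]
acc = classification_accuracy(pred, truth)   # 5 of 6 correct
```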
CN201711343994.7A 2017-12-15 2017-12-15 Mammary gland molybdenum target image deep learning classification method based on lightweight neural network Active CN108052977B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711343994.7A CN108052977B (en) 2017-12-15 2017-12-15 Mammary gland molybdenum target image deep learning classification method based on lightweight neural network

Publications (2)

Publication Number Publication Date
CN108052977A CN108052977A (en) 2018-05-18
CN108052977B true CN108052977B (en) 2021-09-14

Family

ID=62132261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711343994.7A Active CN108052977B (en) 2017-12-15 2017-12-15 Mammary gland molybdenum target image deep learning classification method based on lightweight neural network

Country Status (1)

Country Link
CN (1) CN108052977B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2574372B (en) * 2018-05-21 2021-08-11 Imagination Tech Ltd Implementing Traditional Computer Vision Algorithms As Neural Networks
CN108830282A (en) * 2018-05-29 2018-11-16 电子科技大学 A kind of the breast lump information extraction and classification method of breast X-ray image
CN109002831A (en) * 2018-06-05 2018-12-14 南方医科大学南方医院 A kind of breast density classification method, system and device based on convolutional neural networks
CN109035267B (en) * 2018-06-22 2021-07-27 华东师范大学 Image target matting method based on deep learning
CN111091527B (en) * 2018-10-24 2022-07-05 华中科技大学 Method and system for automatically detecting pathological change area in pathological tissue section image
WO2020107167A1 (en) * 2018-11-26 2020-06-04 深圳先进技术研究院 Method and apparatus for automatic grading of mammary gland density
CN109636780A (en) * 2018-11-26 2019-04-16 深圳先进技术研究院 Breast density automatic grading method and device
CN111401396B (en) * 2019-01-03 2023-04-18 阿里巴巴集团控股有限公司 Image recognition method and device
CN109840906A (en) * 2019-01-29 2019-06-04 太原理工大学 The method that a kind of pair of mammography carries out classification processing
CN110009600A (en) * 2019-02-14 2019-07-12 腾讯科技(深圳)有限公司 A kind of medical image area filter method, apparatus and storage medium
CN109902682A (en) * 2019-03-06 2019-06-18 太原理工大学 A kind of mammary gland x line image detection method based on residual error convolutional neural networks
CN110059717A (en) * 2019-03-13 2019-07-26 山东大学 Convolutional neural networks automatic division method and system for breast molybdenum target data set
CN111724450A (en) * 2019-03-20 2020-09-29 上海科技大学 Medical image reconstruction system, method, terminal and medium based on deep learning
CN109993732A (en) * 2019-03-22 2019-07-09 杭州深睿博联科技有限公司 The pectoral region image processing method and device of mammography X
CN109919254B (en) * 2019-03-28 2021-08-17 上海联影智能医疗科技有限公司 Breast density classification method, system, readable storage medium and computer device
CN110232338B (en) * 2019-05-29 2021-02-05 北京邮电大学 Lightweight Web AR (augmented reality) identification method and system based on binary neural network
CN110223280B (en) * 2019-06-03 2021-04-13 Oppo广东移动通信有限公司 Venous thrombosis detection method and venous thrombosis detection device
CN110619947A (en) * 2019-09-19 2019-12-27 南京工程学院 Lung CT auxiliary screening system and method based on lightweight deep learning
CN111598862B (en) * 2020-05-13 2021-05-25 推想医疗科技股份有限公司 Breast molybdenum target image segmentation method, device, terminal and storage medium
CN115909006B (en) * 2022-10-27 2024-01-19 武汉兰丁智能医学股份有限公司 Mammary tissue image classification method and system based on convolution transducer
CN116188488B (en) * 2023-01-10 2024-01-16 广东省第二人民医院(广东省卫生应急医院) Gray gradient-based B-ultrasonic image focus region segmentation method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102147857A (en) * 2011-03-22 2011-08-10 黄晓华 Image processing method for detecting similar round by using improved hough transformation
CN102708550A (en) * 2012-05-17 2012-10-03 浙江大学 Blind deblurring algorithm based on natural image statistic property
CN103985108A (en) * 2014-06-03 2014-08-13 北京航空航天大学 Method for multi-focus image fusion through boundary detection and multi-scale morphology definition measurement
CN104683767A (en) * 2015-02-10 2015-06-03 浙江宇视科技有限公司 Fog penetrating image generation method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ImageNet Classification with Deep Convolutional Neural Networks; Alex Krizhevsky; Communications of the ACM; 2017-06-30; entire document *
Application of the Improved Gradient Inverse Weighted Algorithm to Image Smoothing; Wang Hongliang; Infrared Technology; 2003-07-31; Vol. 25, No. 4; entire document *

Similar Documents

Publication Publication Date Title
CN108052977B (en) Mammary gland molybdenum target image deep learning classification method based on lightweight neural network
CN107886514B (en) Mammary gland molybdenum target image lump semantic segmentation method based on depth residual error network
Gunasekara et al. A systematic approach for MRI brain tumor localization and segmentation using deep learning and active contouring
CN109584254A (en) A kind of heart left ventricle's dividing method based on the full convolutional neural networks of deep layer
CN107203989A (en) End-to-end chest CT image dividing method based on full convolutional neural networks
CN109670510A (en) A kind of gastroscopic biopsy pathological data screening system and method based on deep learning
CN111553892B (en) Lung nodule segmentation calculation method, device and system based on deep learning
CN109300136B (en) Automatic segmentation method for organs at risk based on convolutional neural network
CN111681230A (en) System and method for scoring high-signal of white matter of brain
Zhao et al. Cascade and fusion of multitask convolutional neural networks for detection of thyroid nodules in contrast-enhanced CT
Heydarheydari et al. Auto-segmentation of head and neck tumors in positron emission tomography images using non-local means and morphological frameworks
Soleymanifard et al. Segmentation of whole tumor using localized active contour and trained neural network in boundaries
CN114332572B (en) Method for extracting breast lesion ultrasonic image multi-scale fusion characteristic parameters based on saliency map-guided hierarchical dense characteristic fusion network
Nayan et al. A deep learning approach for brain tumor detection using magnetic resonance imaging
Manikandan et al. Segmentation and Detection of Pneumothorax using Deep Learning
CN109214388B (en) Tumor segmentation method and device based on personalized fusion network
CN115661152B (en) Target development condition analysis method based on model prediction
Jagadeesh et al. Brain Tumour Classification using CNN Algorithm
CN115880245A (en) Self-supervision-based breast cancer disease classification method
CN113689950B (en) Method, system and storage medium for identifying blood vessel distribution pattern of liver cancer IHC staining pattern
CN114331996A (en) Medical image classification method and system based on self-coding decoder
Alamin et al. Improved framework for breast cancer detection using hybrid feature extraction technique and ffnn
Lakra et al. A comparative analysis of MRI brain tumor segmentation technique
Barik et al. Cancer detection using cellular automata based segmentation techniques
Anisa et al. Automatic Identification of Cancer Affect in Lungs Using Machine Learning Algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant