CN113033667A - Ultrasound image two-stage deep learning breast tumor classification method and device - Google Patents

Ultrasound image two-stage deep learning breast tumor classification method and device

Info

Publication number
CN113033667A
Authority
CN
China
Prior art keywords
neural network
deep neural
rads
breast tumor
label
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110327899.8A
Other languages
Chinese (zh)
Other versions
CN113033667B (en)
Inventor
张彩彩
梅梅
崔宗敏
梅茁林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Institute of Mechanical and Electrical Engineering Co Ltd
Original Assignee
Jiujiang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiujiang University filed Critical Jiujiang University
Priority to CN202110327899.8A priority Critical patent/CN113033667B/en
Publication of CN113033667A publication Critical patent/CN113033667A/en
Application granted granted Critical
Publication of CN113033667B publication Critical patent/CN113033667B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Abstract

The invention discloses a breast tumor classification method and device based on ultrasound image two-stage deep learning. The method comprises the following steps: obtaining a BI-RADS label image set D1 from breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1; training a deep neural network M1 and a deep neural network M2 with the BI-RADS label image set D1 and the biopsy label image set D2, respectively; initializing the parameters of the part of the deep neural network M2 that shares the network structure of the deep neural network M1 with the trained parameters of M1; classifying the BI-RADS label of a breast tumor ultrasound image to be classified; and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the image to determine whether the breast tumor is benign or malignant. The method can effectively improve the accuracy and intelligence level of classification diagnosis of breast tumor ultrasound images, and can be used to assist sonographers in fields such as medical diagnosis.

Description

Ultrasound image two-stage deep learning breast tumor classification method and device
Technical Field
The invention relates to the technical field of computer-aided automatic diagnosis of tumors in breast ultrasound images, and in particular to a breast tumor classification method and device based on ultrasound image two-stage deep learning.
Background
Conventional image-based computer-aided diagnosis systems generally perform benign/malignant classification using manually defined image features of breast tumors. Recently, deep learning has been used to diagnose breast tumor ultrasound images, which has two advantages. First, deep learning can discover features directly from ultrasound images, compensating for the knowledge limitations of manually defined features. Second, deep learning learns image features in an end-to-end manner, so the system is more automated and performs better.
Applying a deep learning method requires a breast ultrasound image training set containing tumor classification labels. Since most ultrasound images in electronic medical records do not have accurate benign/malignant labels, only BI-RADS (Breast Imaging Reporting and Data System) rating labels are available. To obtain more training data with benign/malignant labels, many studies directly use the BI-RADS rating of an image to generate a benign/malignant label for the breast tumor. Ballin et al. and Cao et al. directly treated samples rated BI-RADS 2-4a as benign and samples rated BI-RADS 4b-6 as malignant. However, since the BI-RADS rating only represents the probability that a breast tumor is malignant, the generated benign/malignant label may not match the facts, which leads to what is known in machine learning as the noisy-label problem.
Liu et al. directly treated samples rated BI-RADS 2-3 as benign and samples rated BI-RADS 5-6 as malignant, and biopsied all BI-RADS 4 samples to obtain benign/malignant labels, minimizing noisy labels in the training samples. Jiang et al. designed a sample weighting scheme that trains on accurately labeled data and noisy data simultaneously while giving more weight to the accurately labeled samples. Shen et al. proposed a sample selection strategy: at each training iteration, the highest-loss portion of the data is deleted, and the model parameters are updated to minimize the loss function on the remaining training data. The assumption is that, as the model gradually converges into a classifier with good discriminative knowledge, incorrectly labeled training samples will show increasingly high loss values as training progresses. However, sample selection schemes usually suffer from sample selection bias, which can instead cause the trained network to learn incorrect knowledge.
Noisy labels exert a negative influence on system performance that cannot be eliminated. Although sample weighting and sample selection can reduce the influence of noisy labels to some extent on general images, in breast tumor ultrasound image sets the amount of accurately labeled data is far smaller than the amount of noisily labeled data. It is therefore difficult to design a reasonable sample weighting scheme for breast tumor ultrasound training data, and sample selection schemes usually suffer from sample selection bias, which may degrade system performance.
Disclosure of Invention
The embodiment of the invention provides a breast tumor classification method and device for ultrasound image two-stage deep learning, which are used to solve the problems described in the background above.
The embodiment of the invention provides a breast tumor classification method for ultrasound image two-stage deep learning, which comprises the following steps:
acquiring breast tumor ultrasound images; obtaining a BI-RADS label image set D1 from the breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1;
constructing a deep neural network M1 and a deep neural network M2;
training the deep neural network M1 on the BI-RADS label image set D1 to obtain a first-stage deep neural network model for BI-RADS label classification; training the deep neural network M2 on the biopsy label image set D2 to obtain a second-stage deep neural network model for benign/malignant classification; and initializing the parameters of the part of the deep neural network M2 that has the same network structure as the deep neural network M1 with the trained parameters of the deep neural network M1;
classifying the BI-RADS label of a breast tumor ultrasound image to be classified with the first-stage deep neural network model; and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the breast tumor ultrasound image to be classified with the second-stage deep neural network model, thereby determining whether the breast tumor ultrasound image to be classified is benign or malignant.
Further, the breast tumor classification method for ultrasound image two-stage deep learning provided by the embodiment of the present invention includes:
obtaining a lesion area image set from the BI-RADS label image set D1 to form the training samples of the deep neural network M1; and
obtaining a lesion area image set from the biopsy label image set D2 to form the training samples of the deep neural network M2.
Further, the breast tumor classification method for ultrasound image two-stage deep learning provided by the embodiment of the present invention includes:
using the remaining biopsy-label images among the breast tumor ultrasound images as test samples.
Further, in the method,
the deep neural network M1 comprises a deep convolutional neural network layer for extracting abstract image features and a classification output layer for BI-RADS label judgment;
the deep neural network M2 comprises a deep convolutional neural network layer for extracting abstract image features and a classification output layer for biopsy label judgment.
Further, the deep convolutional neural network layers in the deep neural network M1 and the deep neural network M2 have the same structure.
Further, the breast tumor classification method for ultrasound image two-stage deep learning provided by the embodiment of the present invention includes:
when the BI-RADS label is grade 1-2, classifying the breast tumor ultrasound image to be classified as a benign breast tumor ultrasound image.
The embodiment of the present invention further provides a breast tumor classification device for ultrasound image two-stage deep learning, including:
a label grouping module for acquiring breast tumor ultrasound images and labels for training the ultrasound image two-stage deep learning models, obtaining a BI-RADS label image set D1 from the breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1;
a network construction module for constructing a deep neural network M1 and a deep neural network M2;
a model building module for training the deep neural network M1 on the BI-RADS label image set D1 to obtain a first-stage deep neural network model for BI-RADS label classification, training the deep neural network M2 on the biopsy label image set D2 to obtain a second-stage deep neural network model for benign/malignant classification, and initializing the parameters of the part of the deep neural network M2 that has the same network structure as the deep neural network M1 with the trained parameters of the deep neural network M1; and
a tumor classification module for classifying the BI-RADS label of a breast tumor ultrasound image to be classified with the first-stage deep neural network model, and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the breast tumor ultrasound image to be classified with the second-stage deep neural network model to determine whether it is benign or malignant.
The embodiment of the invention provides a breast tumor classification method and device for ultrasound image two-stage deep learning, which have the following beneficial effects compared with the prior art:
In the method, two groups of training samples are obtained from the collected breast tumor ultrasound image set according to whether both a BI-RADS label and a biopsy label are available: one group is the image set with only a BI-RADS label, and the other is the image set with both a BI-RADS label and a biopsy label. During training of the breast tumor ultrasound image deep learning models, this avoids confusing BI-RADS labels with biopsy labels and eliminates the unpredictable negative influence that noisy data exerts on system performance in conventional training methods.
In addition, two independent neural networks are constructed to perform BI-RADS label classification prediction and biopsy result classification prediction on breast tumor ultrasound images, respectively, and the convolutional neural networks used for abstract image feature extraction in the two networks have the same structure. Training first trains the BI-RADS label classification prediction network, then uses the parameters of its convolutional neural network to initialize the part of the biopsy result classification prediction network with the same structure, and finally trains the biopsy result classification prediction network. This makes full use of the information in both the BI-RADS labels and the biopsy labels without introducing noisy data, and improves the performance of automatic benign/malignant detection on breast tumor ultrasound images.
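The parameter hand-off described above (train M1, copy the shared convolutional parameters into M2, then train M2) can be sketched as follows; the parameter names, shapes, and dict-of-arrays representation are assumptions for illustration, not taken from the patent:

```python
# Sketch: networks as plain dicts of named NumPy parameter arrays. Only the
# parameters M1 and M2 share by name and shape (the convolutional backbone)
# are copied; M2's own biopsy-label head keeps its fresh initialisation.

import numpy as np

def init_m2_from_m1(m1_params, m2_params):
    """Copy every parameter shared by name and shape from M1 into M2."""
    transferred = []
    for name, value in m1_params.items():
        if name in m2_params and m2_params[name].shape == value.shape:
            m2_params[name] = value.copy()
            transferred.append(name)
    return transferred

rng = np.random.default_rng(0)
m1 = {"conv1.w": rng.normal(size=(16, 1, 3, 3)),   # shared backbone layer
      "fc_birads.w": rng.normal(size=(6, 16))}     # BI-RADS head (M1 only)
m2 = {"conv1.w": np.zeros((16, 1, 3, 3)),          # to be initialised from M1
      "fc_biopsy.w": rng.normal(size=(2, 16))}     # biopsy head (M2 only)

copied = init_m2_from_m1(m1, m2)  # only the shared backbone is copied
```

Matching by both name and shape means the task-specific output layers are automatically left alone, which is the behaviour the two-stage scheme relies on.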
Drawings
Fig. 1 is a schematic diagram of a two-stage deep learning model of a breast tumor ultrasound image data set according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a breast tumor classification method for ultrasound image two-stage deep learning, which specifically includes:
Step 1: acquire breast tumor ultrasound images and labels for training the ultrasound image two-stage deep learning models; annotate the collected breast tumor ultrasound images with BI-RADS labels to generate a BI-RADS label image set D1; and extract the images with biopsy result labels from the collected breast tumor ultrasound images to generate a biopsy label image set D2.
It should be noted that two sets of training samples are obtained from the collected breast tumor ultrasound image set according to whether both a BI-RADS label and a biopsy label are available: one set contains the images with only a BI-RADS label, and the other contains the images with both a BI-RADS label and a biopsy label.
Step 2: construct the two deep neural networks M1 and M2 shown in FIG. 1. M1 contains a deep convolutional neural network layer for extracting abstract image features and a classification output layer for BI-RADS label judgment; M2 contains a deep convolutional neural network layer for extracting abstract image features and a classification output layer for biopsy label judgment. The deep convolutional neural network layers contained in M1 and M2 have identical structures.
Step 3: train the parameters of M1 and M2 in two stages. In the first stage, the images in the D1 set are preprocessed to obtain a lesion area image set, which forms the training samples of the deep neural network M1 and yields the deep neural network model for BI-RADS label classification. In the second stage, the parameters of the part of M2 whose network structure is the same as M1's are initialized with the trained parameters of M1; the images in the D2 set are preprocessed to obtain a lesion area image set, which forms the training samples of the deep neural network M2 and yields the deep neural network model for benign/malignant classification.
Step 4: preprocess the breast tumor ultrasound image to be classified to obtain a lesion area image, and output its BI-RADS label with the M1 model. If the output BI-RADS label is grade 1-2, the breast tumor is judged benign; if the BI-RADS label is grade 3 or above, the M2 model outputs the benign/malignant prediction label of the image to be classified.
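The two-stage decision rule of step 4 can be sketched as below; the predictor callables stand in for the trained M1 and M2 models, and the label encodings are assumptions for illustration:

```python
# Minimal sketch of the cascaded inference in step 4.

def classify_breast_tumor(lesion_image, predict_birads, predict_biopsy):
    """Stage 1 gives a BI-RADS grade; stage 2 runs only for grade 3 and up."""
    grade = predict_birads(lesion_image)    # M1: BI-RADS grade, 1..6
    if grade <= 2:                          # grades 1-2: benign, M2 skipped
        return "benign"
    return predict_biopsy(lesion_image)     # M2: "benign" or "malignant"

# Stub predictors for demonstration only.
low = classify_breast_tumor("img_a", lambda img: 2, lambda img: "malignant")
high = classify_breast_tumor("img_b", lambda img: 4, lambda img: "malignant")
```

Note that M2 is never consulted for grade 1-2 images, so its noisy-label-free training data only ever has to cover the higher-risk grades.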
Example analysis:
The breast tumor classification method based on ultrasound image two-stage deep learning, also called BTCTDL (Breast Tumor Classification based on Two-phase Deep Learning for ultrasound images), is illustrated below on a sample set of 1900 breast ultrasound images, including 930 samples with biopsy result labels (470 benign and 460 malignant). The specific steps are as follows:
1) Extract the lesion area images from the 1900 breast ultrasound images and annotate their BI-RADS labels to generate a BI-RADS label training sample set D1. From D1, randomly select 235 samples with a benign biopsy label and 230 samples with a malignant biopsy label as the training set D2; the remaining 465 samples with biopsy labels are used for testing.
2) Construct the two deep neural networks M1 and M2 shown in FIG. 1. M1 contains a deep convolutional neural network layer for extracting abstract image features and a classification output layer for BI-RADS label judgment; M2 contains a deep convolutional neural network layer for extracting abstract image features and a classification output layer for biopsy label judgment. The deep convolutional neural network layers contained in M1 and M2 have identical structures.
3) Train the parameters of M1 and M2 in two stages. In the first stage, the images in the D1 set are preprocessed to obtain a lesion area image set, which forms the training samples of the deep neural network M1 and yields the deep neural network model for BI-RADS label classification. In the second stage, the parameters of the part of M2 whose network structure is the same as M1's are initialized with the trained parameters of M1; the images in the D2 set are preprocessed to obtain a lesion area image set, which forms the training samples of the deep neural network M2 and yields the deep neural network model for benign/malignant classification.
4) Preprocess the breast tumor ultrasound image to be classified to obtain a lesion area image, and output its BI-RADS label with the M1 model. If the output BI-RADS label is grade 1-2, the breast tumor is judged benign; otherwise, the M2 model outputs the benign/malignant prediction label of the image to be classified.
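The sample split of step 1) — 930 biopsy-labelled samples, of which 235 benign and 230 malignant are randomly selected for D2 and the remaining 465 are held out for testing — can be sketched as follows; the helper name and string IDs are assumptions for the sketch:

```python
# Sketch of the random training/test split over the biopsy-labelled samples.

import random

def split_biopsy_samples(benign, malignant, n_benign_train, n_malignant_train,
                         seed=0):
    """Randomly split each class, keeping the stated per-class train counts."""
    rng = random.Random(seed)
    benign, malignant = benign[:], malignant[:]   # copy before shuffling
    rng.shuffle(benign)
    rng.shuffle(malignant)
    train = benign[:n_benign_train] + malignant[:n_malignant_train]
    test = benign[n_benign_train:] + malignant[n_malignant_train:]
    return train, test

benign_ids = [f"b{i}" for i in range(470)]      # 470 benign biopsy samples
malignant_ids = [f"m{i}" for i in range(460)]   # 460 malignant biopsy samples
d2_train, biopsy_test = split_biopsy_samples(benign_ids, malignant_ids, 235, 230)
```

Splitting per class keeps D2 roughly class-balanced (235 benign vs. 230 malignant), matching the counts given in step 1).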
Experimental analysis:
To verify the effectiveness of the method, the proposed BTCTDL method is compared with a conventional breast tumor classification method based on learning from ultrasound images with noisy labels (Breast Tumor Classification based on ultrasound images with Noisy Labels, BTCNL), mainly in terms of biopsy label prediction accuracy on the 465 test samples.
The network model of the BTCNL method contains only the M2 part of the BTCTDL method; its training samples are the 1900 breast ultrasound images together with benign/malignant labels generated from their BI-RADS ratings. Specifically, a sample rated BI-RADS 2-4a is defined as benign and a sample rated BI-RADS 4b-6 as malignant. A benign/malignant label generated this way may not match the true label, i.e., noisy labels are present.
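The noisy-label generation rule of the BTCNL baseline can be written down directly; representing BI-RADS grades as strings is an assumption of the sketch:

```python
# BTCNL's label-generation rule: BI-RADS 2-4a -> benign, 4b-6 -> malignant.
# Grades outside the stated rule raise, since the baseline defines no
# benign/malignant label for them.

def birads_to_noisy_label(grade):
    if grade in {"2", "3", "4a"}:
        return "benign"
    if grade in {"4b", "4c", "5", "6"}:
        return "malignant"
    raise ValueError(f"no benign/malignant rule for BI-RADS grade {grade}")
```

Because a BI-RADS grade only expresses a malignancy probability, some of these generated labels inevitably disagree with the biopsy ground truth, which is exactly the noise the two-stage method avoids.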
In the invention, AlexNet is adopted as the deep convolutional neural network structure contained in the M1 and M2 networks.
TABLE 1 Comparison of the two methods in terms of accuracy
Method Accuracy
BTCTDL 78.2%
BTCNL 72.5%
Table 1 shows that, compared with the BTCNL method, the proposed BTCTDL method greatly improves the classification accuracy for benign/malignant breast tumor ultrasound images, for two reasons: (1) distinguishing the BI-RADS labels from the benign/malignant labels removes the negative influence of noisy labels on the system; (2) because the network parameters for image feature extraction in the benign/malignant label prediction network M2 are initialized with the trained parameters of the M1 network, the M2 network largely inherits the knowledge learned by the M1 network, so the BTCTDL method makes full use of both kinds of information, the BI-RADS label and the biopsy label, without introducing noisy labels. In short, the method can effectively improve the accuracy and intelligence level of classification diagnosis of breast tumor ultrasound images, and can be used in technical fields such as assisting sonographers in medical diagnosis.
The embodiment of the invention also provides a breast tumor classification device for ultrasound image two-stage deep learning, which comprises:
a label grouping module for acquiring breast tumor ultrasound images and labels for training the ultrasound image two-stage deep learning models, obtaining a BI-RADS label image set D1 from the breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1;
a network construction module for constructing a deep neural network M1 and a deep neural network M2;
a model building module for training the deep neural network M1 on the BI-RADS label image set D1 to obtain a first-stage deep neural network model for BI-RADS label classification, training the deep neural network M2 on the biopsy label image set D2 to obtain a second-stage deep neural network model for benign/malignant classification, and initializing the parameters of the part of the deep neural network M2 that has the same network structure as the deep neural network M1 with the trained parameters of the deep neural network M1; and
a tumor classification module for classifying the BI-RADS label of a breast tumor ultrasound image to be classified with the first-stage deep neural network model, and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the breast tumor ultrasound image to be classified with the second-stage deep neural network model to determine whether it is benign or malignant.
Since the breast tumor classification device for ultrasound image two-stage deep learning provided by the embodiment of the invention is based on the same inventive concept as the breast tumor classification method for ultrasound image two-stage deep learning, the specific description of the classification device is not repeated.
Although the embodiments of the present invention have been disclosed in the foregoing for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying drawings.

Claims (7)

1. A breast tumor classification method for ultrasound image two-stage deep learning is characterized by comprising the following steps:
acquiring breast tumor ultrasound images; obtaining a BI-RADS label image set D1 from the breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1;
constructing a deep neural network M1 and a deep neural network M2;
training the deep neural network M1 on the BI-RADS label image set D1 to obtain a first-stage deep neural network model for BI-RADS label classification; training the deep neural network M2 on the biopsy label image set D2 to obtain a second-stage deep neural network model for benign/malignant classification; and initializing the parameters of the part of the deep neural network M2 that has the same network structure as the deep neural network M1 with the trained parameters of the deep neural network M1;
classifying the BI-RADS label of a breast tumor ultrasound image to be classified with the first-stage deep neural network model; and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the breast tumor ultrasound image to be classified with the second-stage deep neural network model, thereby determining whether the breast tumor ultrasound image to be classified is benign or malignant.
2. The method for classifying breast tumors by ultrasound image two-stage deep learning according to claim 1, further comprising:
obtaining a lesion area image set from the BI-RADS label image set D1 to form the training samples of the deep neural network M1; and
obtaining a lesion area image set from the biopsy label image set D2 to form the training samples of the deep neural network M2.
3. The method for classifying breast tumors by ultrasound image two-stage deep learning according to claim 1, further comprising:
and the rest biopsy label images in the breast tumor ultrasonic image are test samples.
4. The breast tumor classification method for ultrasound image two-stage deep learning according to claim 1, wherein
the deep neural network M1 comprises a deep convolutional neural network layer for extracting abstract image features and a classification output layer for BI-RADS label judgment; and
the deep neural network M2 comprises a deep convolutional neural network layer for extracting abstract image features and a classification output layer for biopsy label judgment.
5. The breast tumor classification method for ultrasound image two-stage deep learning according to claim 4, wherein the deep convolutional neural network layers in the deep neural network M1 and the deep neural network M2 have the same structure.
6. The method for classifying breast tumors by ultrasound image two-stage deep learning according to claim 1, further comprising:
and when the BI-RADS label is 1-2 grade, the breast tumor ultrasonic image to be classified is a benign breast tumor ultrasonic image.
7. A breast tumor classification device for ultrasound image two-stage deep learning is characterized by comprising:
a label grouping module for acquiring breast tumor ultrasound images, obtaining a BI-RADS label image set D1 from the breast tumor ultrasound images, and obtaining a biopsy label image set D2 from the BI-RADS label image set D1;
a network construction module for constructing a deep neural network M1 and a deep neural network M2;
a model building module for training the deep neural network M1 on the BI-RADS label image set D1 to obtain a first-stage deep neural network model for BI-RADS label classification, training the deep neural network M2 on the biopsy label image set D2 to obtain a second-stage deep neural network model for benign/malignant classification, and initializing the parameters of the part of the deep neural network M2 that has the same network structure as the deep neural network M1 with the trained parameters of the deep neural network M1; and
a tumor classification module for classifying the BI-RADS label of a breast tumor ultrasound image to be classified with the first-stage deep neural network model, and, when the BI-RADS label is grade 3 or above, classifying the biopsy label of the breast tumor ultrasound image to be classified with the second-stage deep neural network model to determine whether it is benign or malignant.
CN202110327899.8A 2021-03-26 2021-03-26 Ultrasound image two-stage deep learning breast tumor classification method and device Active CN113033667B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110327899.8A CN113033667B (en) 2021-03-26 2021-03-26 Ultrasound image two-stage deep learning breast tumor classification method and device


Publications (2)

Publication Number Publication Date
CN113033667A true CN113033667A (en) 2021-06-25
CN113033667B CN113033667B (en) 2023-04-18

Family

ID=76472605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110327899.8A Active CN113033667B (en) 2021-03-26 2021-03-26 Ultrasound image two-stage deep learning breast tumor classification method and device

Country Status (1)

Country Link
CN (1) CN113033667B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113743463A (en) * 2021-08-02 2021-12-03 中国科学院计算技术研究所 Tumor benign and malignant identification method and system based on image data and deep learning
CN114219807A (en) * 2022-02-22 2022-03-22 成都爱迦飞诗特科技有限公司 Mammary gland ultrasonic examination image grading method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717554A (en) * 2018-05-22 2018-10-30 复旦大学附属肿瘤医院 A kind of thyroid tumors histopathologic slide image classification method and its device
CN109308488A (en) * 2018-08-30 2019-02-05 深圳大学 Breast ultrasound image processing apparatus, method, computer equipment and storage medium
CN109614993A (en) * 2018-11-26 2019-04-12 深圳先进技术研究院 The mechanized classification method and device of mammary gland medical ultrasonic image
CN110728674A (en) * 2019-10-21 2020-01-24 清华大学 Image processing method and device, electronic equipment and computer readable storage medium
CN111768366A (en) * 2020-05-20 2020-10-13 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging system, BI-RADS classification method and model training method
CN111783792A (en) * 2020-05-31 2020-10-16 浙江大学 Method for extracting significant texture features of B-ultrasonic image and application thereof
CN112101451A (en) * 2020-09-14 2020-12-18 北京联合大学 Breast cancer histopathology type classification method based on generative adversarial network screening of image blocks
CN112102343A (en) * 2020-08-12 2020-12-18 海南大学 Ultrasound image-based PTC diagnostic system
CN112334076A (en) * 2018-06-29 2021-02-05 皇家飞利浦有限公司 Biopsy prediction and guidance using ultrasound imaging and associated devices, systems, and methods
CN112508943A (en) * 2020-12-25 2021-03-16 四川工商学院 Breast tumor identification method based on ultrasonic image

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Song Qian et al.: "Computer-aided diagnosis research on breast ultrasound images based on BI-RADS", Progress in Biomedical Engineering, vol. 30, no. 1, 31 December 2009 (2009-12-31), pages 9-13 *
Cao Zhantao et al.: "Breast ultrasound image classification based on corrected label distribution", Journal of University of Electronic Science and Technology of China, vol. 49, no. 4, 31 July 2020 (2020-07-31), pages 597-602 *
Man Rui et al.: "A survey of breast cancer histopathological image classification methods", Computer Science, vol. 47, no. 11, 30 November 2020 (2020-11-30), pages 145-150 *
Zhan Xiang et al.: "Deep-learning-based experimental method for breast pathology image classification", Journal of Computer Applications, vol. 39, no. 2, 30 December 2019 (2019-12-30), page 119 *
Gong Xun et al.: "Classification method for easily confused hard samples in breast ultrasound images", vol. 25, no. 7, pages 1490-1500 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113743463A (en) * 2021-08-02 2021-12-03 中国科学院计算技术研究所 Tumor benign and malignant identification method and system based on image data and deep learning
CN113743463B (en) * 2021-08-02 2023-09-26 中国科学院计算技术研究所 Tumor benign and malignant recognition method and system based on image data and deep learning
CN114219807A (en) * 2022-02-22 2022-03-22 成都爱迦飞诗特科技有限公司 Mammary gland ultrasonic examination image grading method, device, equipment and storage medium
CN114219807B (en) * 2022-02-22 2022-07-12 成都爱迦飞诗特科技有限公司 Mammary gland ultrasonic examination image grading method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113033667B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN112101451B (en) Breast cancer tissue pathological type classification method based on generation of antagonism network screening image block
CN111985536B (en) Based on weak supervised learning gastroscopic pathology image Classification method
CN107748900B (en) Mammary gland tumor classification device and storage medium based on discriminative convolutional neural network
CN108898595B (en) Construction method and application of positioning model of focus region in chest image
CN109064455B (en) BI-RADS-based classification method for breast ultrasound image multi-scale fusion
CN107993221B (en) Automatic identification method for vulnerable plaque of cardiovascular Optical Coherence Tomography (OCT) image
CN110245657B (en) Pathological image similarity detection method and detection device
CN106056595A (en) Method for automatically identifying whether thyroid nodule is benign or malignant based on deep convolutional neural network
CN112365464B (en) GAN-based medical image lesion area weak supervision positioning method
CN113033667B (en) Ultrasound image two-stage deep learning breast tumor classification method and device
CN1934589A (en) Systems and methods providing automated decision support for medical imaging
CN110838110A (en) System for identifying benign and malignant tumor based on ultrasonic imaging
CN112116571A (en) X-ray lung disease automatic positioning method based on weak supervised learning
CN111192660A (en) Image report analysis method, equipment and computer storage medium
CN111353978B (en) Method and device for identifying heart anatomy structure
CN111310719B (en) Unknown radiation source individual identification and detection method
CN113159223A (en) Carotid artery ultrasonic image identification method based on self-supervision learning
CN113855063B (en) Heart sound automatic diagnosis system based on deep learning
Shakeel et al. Classification of breast cancer from mammogram images using deep convolution neural networks
CN112085742B (en) NAFLD ultrasonic video diagnosis method based on context attention
CN113902702A (en) Pulmonary nodule benign and malignant auxiliary diagnosis system based on computed tomography
CN111325282B (en) Mammary gland X-ray image identification method and device adapting to multiple models
CN113314215A (en) Ultrasonic thyroid nodule sample abundance and benign and malignant automatic auxiliary identification system
CN112861881A (en) Honeycomb lung recognition method based on improved MobileNet model
CN116664932A (en) Colorectal cancer pathological tissue image classification method based on active learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220316

Address after: No. 528, binwen Road, Binjiang Higher Education Park, Hangzhou, Zhejiang, 310053

Applicant after: ZHEJIANG INSTITUTE OF MECHANICAL & ELECTRICAL ENGINEERING

Address before: 332005 No. 551, East Qianjin Road, Lianxi District, Jiujiang City, Jiangxi Province

Applicant before: JIUJIANG University

GR01 Patent grant