WO2021042690A1 - Breast cancer assisted diagnosis method and device based on a deep convolutional neural network - Google Patents

Breast cancer assisted diagnosis method and device based on a deep convolutional neural network

Info

Publication number
WO2021042690A1
WO2021042690A1 · PCT/CN2020/078146 · CN2020078146W
Authority
WO
WIPO (PCT)
Prior art keywords
breast cancer
neural network
image
convolutional neural
deep convolutional
Prior art date
Application number
PCT/CN2020/078146
Other languages
English (en)
French (fr)
Inventor
秦传波
宋子玉
曾军英
王璠
林靖殷
何伟钊
邓建祥
Original Assignee
五邑大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 五邑大学 filed Critical 五邑大学
Publication of WO2021042690A1 publication Critical patent/WO2021042690A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion

Definitions

  • the present invention relates to the technical field of medical devices, in particular to a breast cancer assisted diagnosis method and device based on a deep convolutional neural network.
  • the purpose of the present invention is to solve at least one of the technical problems existing in the prior art, and to provide a breast cancer assisted diagnosis method and device based on a deep convolutional neural network, which can effectively improve the efficiency of breast cancer diagnosis.
  • the first aspect of the present invention provides a breast cancer assisted diagnosis method based on a deep convolutional neural network, which includes the following steps:
  • the above-mentioned breast cancer assisted diagnosis method based on a deep convolutional neural network has at least the following beneficial effects: lesions in the breast cancer CT image are segmented by the medical image segmentation algorithm of the deep convolutional neural network; because the deep convolutional neural network is an algorithm model obtained after training on a large number of related images, the results output after the corresponding images are segmented by this model have a high accuracy rate.
  • before the doctor makes a diagnosis, the above algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
  • the acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes: adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion, and then normalizing and standardizing the breast cancer CT image.
  • the method for assisting breast cancer diagnosis based on a deep convolutional neural network, after preprocessing the breast cancer CT image, further includes performing an enhancement operation on the data, and the enhancement operation includes mirroring, scaling, and elastically deforming the data.
  • the breast cancer CT image data that has been preprocessed and passed through the enhancement operation is input to a multi-channel deep convolutional neural network for retraining.
  • training iterates continuously through the Adam and SGD optimizers, and the hyperparameters are tuned at the same time.
  • Using the preprocessed breast cancer CT images as the retraining data set can improve the accuracy and robustness of the deep convolutional neural network for breast cancer segmentation.
  • in the method for assisting breast cancer diagnosis based on deep convolutional neural networks, segmenting lesions in breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network includes:
  • a multi-scale input stage, in which the breast cancer CT image data is scaled and fed into each encoding layer;
  • an encoding stage, in which breast cancer CT image features are extracted through multi-dimensional convolution and down-sampling operations;
  • a decoding stage, in which features are recombined through skip connections and contextual information to locate the lesion, and restored through multi-dimensional convolution and up-sampling operations to strengthen gradient propagation, with the segmentation result output at the last layer of the network.
  • the second aspect of the present invention provides a breast cancer auxiliary diagnosis device based on a deep convolutional neural network, including:
  • the image preprocessing module is used to obtain breast cancer CT images and preprocess the breast cancer CT image data;
  • the image analysis module is used to segment the breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network;
  • the result output module is used to feed back and output the segmentation result in the form of a picture, and transmit the output data to backend storage.
  • the above-mentioned breast cancer auxiliary diagnosis device based on a deep convolutional neural network has at least the following beneficial effects: lesions in the breast cancer CT image are segmented by the medical image segmentation algorithm of the deep convolutional neural network.
  • before the doctor makes a diagnosis, the above algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
  • the acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes: adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion, and then normalizing and standardizing the breast cancer CT image.
  • the deep convolutional neural network-based breast cancer auxiliary diagnosis device, after preprocessing the breast cancer CT image, further performs an enhancement operation on the data, and the enhancement operation includes mirroring, scaling, and elastically deforming the data.
  • a breast cancer auxiliary diagnosis device based on a deep convolutional neural network
  • a computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to cause a computer to execute the aforementioned breast cancer assisted diagnosis method based on a deep convolutional neural network.
  • FIG. 1 is a flowchart of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a breast cancer auxiliary diagnosis device based on a deep convolutional neural network according to an embodiment of the present invention
  • FIG. 3 is a preprocessing flowchart of an embodiment of the present invention;
  • FIG. 4 is a data enhancement flowchart of an embodiment of the present invention;
  • FIG. 5 is a flowchart of deep convolutional neural network training according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a breast cancer assisted diagnosis device based on a deep convolutional neural network according to an embodiment of the present invention
  • FIG. 7 is an overall schematic diagram of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention.
  • FIG. 8 is a specific schematic diagram of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention.
  • the cloud server includes a database, a data processing and analysis module, and an intelligent assisted diagnosis module.
  • the patient is scanned by a CT scanner to obtain corresponding image data.
  • after the image data is analyzed and processed by the data processing and analysis module, it is recognized by the intelligent auxiliary diagnosis module, and the recognition results are finally output to the doctor; the image records produced after the doctor's diagnosis are also stored, through the hospital's network, in the database on the cloud server.
  • this database can be provided to the hospital as a clinical history reference, and can also be used as training data for the intelligent auxiliary diagnosis module.
  • the cloud server is a cloud server that supports tensor operations.
  • a specific implementation of a breast cancer assisted diagnosis method based on a deep convolutional neural network includes a diagnosis process, a data processing process, a data enhancement process, and a data training process.
  • the patient’s breast cancer CT image is collected, processed by a pre-processing algorithm, and input to the trained deep convolutional neural network model for identification processing.
  • the processed result is fed back to the doctor, who uses the feedback as an auxiliary reference and, combined with professional judgment, diagnoses the patient and labels the corresponding image.
  • the labeled image enters the database of the cloud server, is preprocessed by the preprocessing algorithm, then enhanced by the data enhancement algorithm, and finally the data is input to the training model of the deep convolutional neural network.
  • the network model is retrained in this way, improving the accuracy and robustness of the deep convolutional neural network for breast cancer segmentation.
  • an embodiment of the present invention discloses a breast cancer assisted diagnosis method based on a deep convolutional neural network, which includes the following steps:
  • Step S100 Obtain a breast cancer CT image, and preprocess the breast cancer CT image
  • Step S200 Use a deep convolutional neural network medical image segmentation algorithm to segment breast cancer CT images
  • Step S300 feedback and output the segmentation result in a picture format, and transmit the output data to the background storage.
  • lesions in the breast cancer CT image are segmented by the medical image segmentation algorithm of the deep convolutional neural network. Since the above-mentioned deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the above algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
  • the image data obtained from the database or the CT scanner is sequentially processed by window adjustment, histogram equalization, normalization, and standardization.
  • the window width and window level of the breast cancer CT image are adjusted to a suitable size, the image is converted into an 8-bit grayscale image, and histogram equalization is performed to highlight the lesion; the breast cancer CT image is then normalized and standardized so that the image feature distributions are approximately the same.
  • adjusting the window width and window level of the breast cancer CT image to a suitable size, converting it into an 8-bit grayscale image, and then performing histogram equalization to highlight the lesion enhances the image contrast; normalizing and standardizing the image so that the feature distributions are approximately the same facilitates the training and prediction of the neural network.
  • after the preprocessing operation is performed on the breast cancer CT image, an enhancement operation is also performed on the image.
  • the above enhancement operation specifically includes mirroring, scaling, and elastically deforming the data.
  • the deep convolutional neural network model is also retrained: specifically, the breast cancer CT image data that has been preprocessed and enhanced is input to a multi-channel deep convolutional neural network for retraining, iterating continuously through the Adam and SGD optimizers while tuning the hyperparameters.
  • the medical image segmentation algorithm of the deep convolutional neural network is used to segment lesions in breast cancer CT images, including:
  • a multi-scale input stage, in which the breast cancer CT image data is scaled and fed into each encoding layer;
  • an encoding stage, in which breast cancer CT image features are extracted through multi-dimensional convolution and down-sampling operations;
  • a decoding stage, in which features are recombined through skip connections and contextual information to locate the lesion, and restored through multi-dimensional convolution and up-sampling operations to strengthen gradient propagation, with the segmentation result output at the last layer of the network.
  • an embodiment of the present invention also provides a breast cancer auxiliary diagnosis device 1 based on a deep convolutional neural network, including:
  • the image preprocessing module 10 is used for obtaining breast cancer CT images and preprocessing breast cancer CT image data
  • the image analysis module 20 is used to segment the breast cancer CT image by using the medical image segmentation algorithm of the deep convolutional neural network;
  • the result output module 30 is used to feed back and output the segmentation result in a picture format, and transmit the output data to the background storage.
  • lesions in the breast cancer CT image are segmented by the medical image segmentation algorithm of the deep convolutional neural network. Since the above-mentioned deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the above algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
  • acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
  • after preprocessing the breast cancer CT image, the device also performs an enhancement operation on the data, and the enhancement operation includes mirroring, scaling, and elastically deforming the data.
  • an embodiment of the present invention also provides a breast cancer assisted diagnosis device 2000 based on a deep convolutional neural network.
  • the breast cancer assisted diagnosis device 2000 based on a deep convolutional neural network can be any type of smart terminal, for example a handset, a tablet computer, or a personal computer.
  • the breast cancer auxiliary diagnosis device 2000 based on a deep convolutional neural network includes: one or more control processors 2100 and a memory 2200.
  • one control processor 2100 is taken as an example.
  • the control processor 2100 and the memory 2200 may be connected through a bus or in other ways. In FIG. 6, the connection through a bus is taken as an example.
  • the memory 2200 can be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the breast cancer auxiliary diagnosis device 2000 based on a deep convolutional neural network in the embodiment of the present invention.
  • the control processor 2100 executes the various functional applications and data processing of the breast cancer auxiliary diagnosis device 1 based on the deep convolutional neural network by running the non-transitory software programs, instructions, and modules stored in the memory 2200, thus realizing the breast cancer assisted diagnosis method based on a deep convolutional neural network of the above method embodiment.
  • the memory 2200 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the feature extraction device based on an improved convolution block, and the like.
  • the memory 2200 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage devices.
  • the memory 2200 may optionally include memories located remotely relative to the control processor 2100, and these remote memories may be connected to the deep convolutional neural network-based breast cancer auxiliary diagnosis device 2000 through a network.
  • networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the aforementioned one or more modules are stored in the memory 2200 and, when executed by the one or more control processors 2100, perform the breast cancer assisted diagnosis method based on a deep convolutional neural network in the aforementioned method embodiment.
  • the embodiment of the present invention also provides a computer-readable storage medium that stores computer-executable instructions; when these instructions are executed by one or more control processors 2100, for example by one of the control processors 2100 shown in FIG. 6, they can cause the one or more control processors 2100 to execute the breast cancer assisted diagnosis method based on a deep convolutional neural network in the above method embodiments, for example, to perform steps S100 to S300 of the method in FIG. 1 described above and implement the functions of the devices 10 to 30 in FIG. 2.
  • the device embodiments described above are merely illustrative, and the devices described as separate components may or may not be physically separated, that is, they may be located in one place, or they may be distributed on multiple network devices. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each implementation manner can be implemented by means of software plus a general hardware platform.
  • all or part of the processes in the methods of the above embodiments can be completed by a computer program instructing the relevant hardware.
  • the program can be stored in a computer-readable storage medium, and when the program is executed, it may include the processes of the embodiments of the above method.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Theoretical Computer Science (AREA)
  • Primary Health Care (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a breast cancer assisted diagnosis method and device based on a deep convolutional neural network, including the following steps: acquiring a breast cancer CT image and preprocessing the breast cancer CT image; segmenting lesions in the breast cancer CT image with a medical image segmentation algorithm based on a deep convolutional neural network; and feeding back and outputting the segmentation result in the form of a picture, and transmitting the output data to backend storage. By segmenting lesions in the breast cancer CT image with the medical image segmentation algorithm of the deep convolutional neural network, and because the deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.

Description

Breast cancer assisted diagnosis method and device based on a deep convolutional neural network
Technical Field
The present invention relates to the technical field of medical devices, and in particular to a breast cancer assisted diagnosis method and device based on a deep convolutional neural network.
Background Art
The global incidence of breast cancer has been rising steadily since the late 1970s. In the United States, one in eight women will develop breast cancer in her lifetime. China is not a high-incidence country for breast cancer, but there is no room for optimism: in recent years the growth rate of breast cancer incidence in China has been 1 to 2 percentage points higher than that of high-incidence countries. According to the 2009 breast cancer incidence data published in 2012 by the National Cancer Center and the Disease Prevention and Control Bureau of the Ministry of Health, breast cancer ranked first among female malignant tumors in the national tumor registration areas; the crude incidence of female breast cancer was 42.55 per 100,000 nationwide, 51.91 per 100,000 in urban areas, and 23.12 per 100,000 in rural areas. Breast cancer has become a major public health problem in today's society.
Breast cancers vary widely in type and shape, and most lesion areas are small, which makes the clinical diagnosis process very difficult and requires doctors to spend a great deal of time observing and examining in order to determine the type, morphology, and regional status of the breast cancer.
Summary of the Invention
The purpose of the present invention is to solve at least one of the technical problems existing in the prior art and to provide a breast cancer assisted diagnosis method and device based on a deep convolutional neural network that can effectively improve the efficiency of breast cancer diagnosis.
A first aspect of the present invention provides a breast cancer assisted diagnosis method based on a deep convolutional neural network, including the following steps:
acquiring a breast cancer CT image and preprocessing the breast cancer CT image;
segmenting lesions in the breast cancer CT image with a medical image segmentation algorithm based on a deep convolutional neural network;
feeding back and outputting the segmentation result in the form of a picture, and transmitting the output data to backend storage.
The above breast cancer assisted diagnosis method based on a deep convolutional neural network has at least the following beneficial effects: lesions in the breast cancer CT image are segmented with the medical image segmentation algorithm of the deep convolutional neural network; because the deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
According to the breast cancer assisted diagnosis method based on a deep convolutional neural network of the first aspect of the present invention, acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion;
then normalizing and standardizing the breast cancer CT image. Adjusting the window width and window level of the breast cancer CT image to a suitable size, converting it into an 8-bit grayscale image, and then performing histogram equalization to highlight the lesion enhances the image contrast, while normalizing and standardizing the image so that the image feature distributions are approximately the same facilitates the training and prediction of the neural network.
According to the breast cancer assisted diagnosis method based on a deep convolutional neural network of the first aspect of the present invention, after the breast cancer CT image is preprocessed, the method further includes performing an enhancement operation on the data; the enhancement operation includes mirroring, scaling, and elastically deforming the data.
According to the breast cancer assisted diagnosis method based on a deep convolutional neural network of the first aspect of the present invention, the method further includes:
inputting the breast cancer CT image data that has been preprocessed and enhanced into a multi-channel deep convolutional neural network for retraining, iterating continuously with the Adam and SGD optimizers while tuning the hyperparameters. Using the preprocessed breast cancer CT images as the retraining dataset can improve the accuracy and robustness of the deep convolutional neural network for breast cancer segmentation.
According to the breast cancer assisted diagnosis method based on a deep convolutional neural network of the first aspect of the present invention, segmenting lesions in breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network includes:
a multi-scale input stage, in which the breast cancer CT image data is scaled and fed into each encoding layer;
an encoding stage, in which breast cancer CT image features are extracted through multi-dimensional convolution and down-sampling operations;
a decoding stage, in which features are recombined through skip connections and contextual information to locate the lesion, and restored through multi-dimensional convolution and up-sampling operations to strengthen gradient propagation, with the segmentation result output at the last layer of the network.
A second aspect of the present invention provides a breast cancer auxiliary diagnosis device based on a deep convolutional neural network, including:
an image preprocessing module for acquiring breast cancer CT images and preprocessing the breast cancer CT image data;
an image analysis module for segmenting lesions in the breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network;
a result output module for feeding back and outputting the segmentation result in the form of a picture and transmitting the output data to backend storage.
The above breast cancer auxiliary diagnosis device based on a deep convolutional neural network has at least the following beneficial effects: lesions in the breast cancer CT image are segmented with the medical image segmentation algorithm of the deep convolutional neural network; because the deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
According to the breast cancer auxiliary diagnosis device based on a deep convolutional neural network of the second aspect of the present invention, acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion;
then normalizing and standardizing the breast cancer CT image.
According to the breast cancer auxiliary diagnosis device based on a deep convolutional neural network of the second aspect of the present invention, after the breast cancer CT image is preprocessed, the device further performs an enhancement operation on the data; the enhancement operation includes mirroring, scaling, and elastically deforming the data.
A third aspect of the present invention provides a breast cancer assisted diagnosis apparatus based on a deep convolutional neural network, including at least one control processor and a memory communicatively connected with the at least one control processor, wherein the memory stores instructions executable by the at least one control processor, and the instructions are executed by the at least one control processor so that the at least one control processor can execute the above breast cancer assisted diagnosis method based on a deep convolutional neural network.
A fourth aspect of the present invention provides a computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being configured to cause a computer to execute the above breast cancer assisted diagnosis method based on a deep convolutional neural network.
Brief Description of the Drawings
The present invention is further described below with reference to the drawings and embodiments.
FIG. 1 is a flowchart of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a breast cancer auxiliary diagnosis device based on a deep convolutional neural network according to an embodiment of the present invention;
FIG. 3 is a preprocessing flowchart of an embodiment of the present invention;
FIG. 4 is a data enhancement flowchart of an embodiment of the present invention;
FIG. 5 is a flowchart of deep convolutional neural network training according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a breast cancer assisted diagnosis apparatus based on a deep convolutional neural network according to an embodiment of the present invention;
FIG. 7 is an overall schematic diagram of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention;
FIG. 8 is a specific schematic diagram of a breast cancer assisted diagnosis method based on a deep convolutional neural network according to an embodiment of the present invention.
Detailed Description of the Embodiments
This section describes specific embodiments of the present invention in detail. Preferred embodiments of the present invention are shown in the accompanying drawings, whose role is to supplement the textual description of the specification with graphics so that each technical feature and the overall technical solution of the present invention can be understood intuitively and vividly, but they shall not be construed as limiting the scope of protection of the present invention.
In the description of the present invention, unless otherwise expressly defined, words such as "arranged", "installed", and "connected" shall be understood in a broad sense, and those skilled in the art can reasonably determine the specific meanings of these words in the present invention in light of the specific content of the technical solution.
It should be noted that, provided there is no conflict, the features in the embodiments of the present invention may be combined with one another, all within the scope of protection of the present invention. In addition, although functional modules are divided in the device schematic diagram and a logical order is shown in the flowchart, in some cases the steps shown or described may be performed with a module division different from that in the device, or in an order different from that in the flowchart.
Referring to FIG. 7, in an overall implementation of the breast cancer assisted diagnosis method based on a deep convolutional neural network, the cloud server includes a database, a data processing and analysis module, and an intelligent auxiliary diagnosis module. The patient is scanned by a CT scanner to obtain corresponding image data; after the image data is analyzed and processed by the data processing and analysis module, it is recognized by the intelligent auxiliary diagnosis module, and the recognition results are finally output to the doctor. The image records produced after the doctor's diagnosis are also stored, through the hospital's network, in the database on the cloud server, where they can be provided to the hospital as a clinical history reference and can also serve as training data for the intelligent auxiliary diagnosis module. The cloud server is a cloud server that supports tensor operations.
Referring to FIG. 8, a specific implementation of the breast cancer assisted diagnosis method based on a deep convolutional neural network includes a diagnosis process, a data processing process, a data enhancement process, and a data training process. Specifically, the patient's breast cancer CT image is collected, processed by the preprocessing algorithm, and input to the trained deep convolutional neural network model for recognition; the result is then fed back to the doctor, who uses it as an auxiliary reference and, combined with professional judgment, diagnoses the patient and labels the corresponding image. The labeled image enters the database of the cloud server, is preprocessed by the preprocessing algorithm, then enhanced by the data enhancement algorithm, and finally the data is input to the training model of the deep convolutional neural network to retrain the network model, thereby improving the accuracy and robustness of the deep convolutional neural network for breast cancer segmentation.
Referring to FIG. 1, an embodiment of the present invention discloses a breast cancer assisted diagnosis method based on a deep convolutional neural network, including the following steps:
Step S100: acquiring a breast cancer CT image and preprocessing the breast cancer CT image;
Step S200: segmenting lesions in the breast cancer CT image with a medical image segmentation algorithm based on a deep convolutional neural network;
Step S300: feeding back and outputting the segmentation result in the form of a picture, and transmitting the output data to backend storage.
Lesions in the breast cancer CT image are segmented with the medical image segmentation algorithm of the deep convolutional neural network; because the deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
Regarding the preprocessing of the image data, referring to FIG. 3, the image data obtained from the database or the CT scanner is sequentially subjected to window adjustment, histogram equalization, normalization, and standardization. Specifically, the window width and window level of the breast cancer CT image are adjusted to a suitable size, the image is converted into an 8-bit grayscale image, and histogram equalization is performed to highlight the lesion; the breast cancer CT image is then normalized and standardized so that the image feature distributions are approximately the same. Adjusting the window width and window level to a suitable size, converting the image to 8-bit grayscale, and performing histogram equalization to highlight the lesion enhances the image contrast, while normalizing and standardizing the image so that the feature distributions are approximately the same facilitates the training and prediction of the neural network.
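The patent names these preprocessing steps but does not disclose concrete window values or code; as one possible reading, the NumPy sketch below chains window adjustment, conversion to an 8-bit grayscale image, histogram equalization, and normalization/standardization for a single CT slice. The window center/width defaults and the function name are illustrative assumptions, not values from the patent.

```python
import numpy as np

def preprocess_ct_slice(hu_slice, window_center=40.0, window_width=400.0):
    """Windowing -> 8-bit grayscale -> histogram equalization -> normalize/standardize.

    `hu_slice` is a 2-D array of CT values in Hounsfield units; the window
    center/width defaults are illustrative, not values disclosed in the patent.
    """
    # 1. Window adjustment: clip to [center - width/2, center + width/2].
    lo, hi = window_center - window_width / 2, window_center + window_width / 2
    windowed = np.clip(hu_slice, lo, hi)

    # 2. Convert to an 8-bit grayscale image.
    gray = ((windowed - lo) / (hi - lo) * 255.0).astype(np.uint8)

    # 3. Histogram equalization to emphasize the lesion region.
    hist, _ = np.histogram(gray.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255.0
    equalized = cdf[gray].astype(np.float32)

    # 4. Normalize to [0, 1], then standardize to zero mean / unit variance.
    normalized = equalized / 255.0
    return (normalized - normalized.mean()) / (normalized.std() + 1e-8)
```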
After the preprocessing operation is performed on the breast cancer CT image, an enhancement operation is also performed on the image; referring to FIG. 4, the enhancement operation specifically includes mirroring, scaling, and elastically deforming the data.
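The three enhancement operations are only named, not parameterized; the sketch below applies mirroring, scaling, and a Gaussian-smoothed random elastic deformation to an image/mask pair with NumPy and SciPy. The scale factor and deformation parameters (alpha, sigma) are assumed values for illustration, and a real pipeline would typically apply these operations with random choices rather than all at once.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates, zoom

def augment(image, mask, scale=1.1, alpha=30.0, sigma=4.0, rng=None):
    """Mirror, scale, and elastically deform a 2-D image/mask pair identically."""
    rng = np.random.default_rng() if rng is None else rng

    # Mirroring (horizontal flip).
    image, mask = np.fliplr(image), np.fliplr(mask)

    # Scaling (same zoom factor for both; nearest-neighbour interpolation for the mask).
    image = zoom(image, scale, order=1)
    mask = zoom(mask, scale, order=0)

    # Elastic deformation: random displacement field smoothed by a Gaussian.
    dx = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    dy = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    y, x = np.meshgrid(np.arange(image.shape[0]), np.arange(image.shape[1]), indexing="ij")
    coords = np.array([y + dy, x + dx])
    image = map_coordinates(image, coords, order=1, mode="reflect")
    mask = map_coordinates(mask, coords, order=0, mode="reflect")
    return image, mask
```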
Referring to FIG. 5, in order to improve the accuracy and robustness of the deep convolutional neural network for breast cancer segmentation, the deep convolutional neural network model is also retrained: specifically, the breast cancer CT image data that has been preprocessed and enhanced is input to a multi-channel deep convolutional neural network for retraining, iterating continuously with the Adam and SGD optimizers while tuning the hyperparameters.
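The patent names the Adam and SGD optimizers and hyperparameter tuning but gives no schedule; one plausible reading, sketched below in PyTorch, trains first with Adam and then fine-tunes with SGD. The learning rates, loss function, and epoch split are assumptions made for illustration, not settings disclosed in the patent.

```python
import torch
from torch import nn, optim

def retrain(model: nn.Module, loader, epochs_adam=30, epochs_sgd=10, device="cuda"):
    """Retrain a segmentation model: Adam first, then an SGD fine-tuning phase (assumed split)."""
    model.to(device)
    criterion = nn.BCEWithLogitsLoss()             # binary lesion / background masks (float targets)
    adam = optim.Adam(model.parameters(), lr=1e-4)
    sgd = optim.SGD(model.parameters(), lr=1e-5, momentum=0.9)

    for epoch in range(epochs_adam + epochs_sgd):
        optimizer = adam if epoch < epochs_adam else sgd
        for images, masks in loader:               # preprocessed + augmented CT batches
            images, masks = images.to(device), masks.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), masks)
            loss.backward()
            optimizer.step()
    return model
```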
Further, segmenting lesions in breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network includes the following stages (see the sketch after this list):
a multi-scale input stage, in which the breast cancer CT image data is scaled and fed into each encoding layer;
an encoding stage, in which breast cancer CT image features are extracted through multi-dimensional convolution and down-sampling operations;
a decoding stage, in which features are recombined through skip connections and contextual information to locate the lesion, and restored through multi-dimensional convolution and up-sampling operations to strengthen gradient propagation, with the segmentation result output at the last layer of the network.
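No layer counts, channel widths, or exact connection pattern are disclosed, so the PyTorch sketch below is only a minimal two-level reading of the described architecture: each encoding level also receives a downscaled copy of the input (multi-scale input), features pass through convolution and down-sampling (encoding), and skip connections plus up-sampling restore resolution before the final segmentation layer (decoding). Every size and name in it is an assumption.

```python
import torch
from torch import nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class MultiScaleSegNet(nn.Module):
    """Illustrative two-level multi-scale-input encoder-decoder with skip connections."""

    def __init__(self, in_ch=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base + in_ch, base * 2)         # + in_ch: downscaled input copy
        self.bottleneck = conv_block(base * 2, base * 4)
        self.dec2 = conv_block(base * 4 + base * 2, base * 2)  # skip connection from enc2
        self.dec1 = conv_block(base * 2 + base, base)          # skip connection from enc1
        self.head = nn.Conv2d(base, 1, 1)                      # last layer: segmentation output

    def forward(self, x):
        e1 = self.enc1(x)                                      # full-resolution encoding
        x_half = F.interpolate(x, scale_factor=0.5, mode="bilinear", align_corners=False)
        e2 = self.enc2(torch.cat([F.max_pool2d(e1, 2), x_half], dim=1))
        b = self.bottleneck(F.max_pool2d(e2, 2))
        d2 = self.dec2(torch.cat([F.interpolate(b, scale_factor=2, mode="bilinear",
                                                align_corners=False), e2], dim=1))
        d1 = self.dec1(torch.cat([F.interpolate(d2, scale_factor=2, mode="bilinear",
                                                align_corners=False), e1], dim=1))
        return self.head(d1)                                   # logits for the lesion mask
```

Under this reading, `MultiScaleSegNet()(torch.randn(1, 1, 256, 256))` returns a 1×1×256×256 logit map that can be thresholded to obtain the lesion mask fed back to the doctor.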
Referring to FIG. 2, an embodiment of the present invention also provides a breast cancer auxiliary diagnosis device 1 based on a deep convolutional neural network, including:
an image preprocessing module 10 for acquiring breast cancer CT images and preprocessing the breast cancer CT image data;
an image analysis module 20 for segmenting lesions in the breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network;
a result output module 30 for feeding back and outputting the segmentation result in the form of a picture and transmitting the output data to backend storage. Lesions in the breast cancer CT image are segmented with the medical image segmentation algorithm of the deep convolutional neural network; because the deep convolutional neural network is an algorithm model obtained after training on a large number of relevant images, the results output after the corresponding images are segmented by this model have a high accuracy rate. Before the doctor makes a diagnosis, the algorithm can quickly produce a preliminary result, which is then fed back to the doctor in the form of a picture, providing an important reference for the doctor's diagnosis and thereby effectively improving the efficiency of breast cancer diagnosis.
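As an illustration of how the three modules could be composed, the sketch below wires a preprocessing function, a segmentation model, and a storage callback into one pipeline; the class, method, and parameter names are placeholders standing in for the module behaviours described above, not identifiers from the patent.

```python
class BreastCancerAidDevice:
    """Illustrative composition of the three modules described above (hypothetical names)."""

    def __init__(self, preprocess_fn, model, store_fn):
        self.preprocess_fn = preprocess_fn   # image preprocessing module 10
        self.model = model                   # image analysis module 20 (segmentation network)
        self.store_fn = store_fn             # backend storage used by the result output module 30

    def diagnose(self, ct_image):
        tensor = self.preprocess_fn(ct_image)   # window/equalize/normalize the CT slice
        mask = self.model(tensor)               # lesion segmentation
        self.store_fn(ct_image, mask)           # persist the input and result in backend storage
        return mask                             # fed back to the doctor in picture form
```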
Further, acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion;
then normalizing and standardizing the breast cancer CT image.
Further, after the breast cancer CT image is preprocessed, the method also includes performing an enhancement operation on the data; the enhancement operation includes mirroring, scaling, and elastically deforming the data.
It should be noted that, since the breast cancer auxiliary diagnosis device based on a deep convolutional neural network in this embodiment is based on the same inventive concept as the above breast cancer assisted diagnosis method based on a deep convolutional neural network, the corresponding content of the method embodiment also applies to this device embodiment and is not described in detail here.
Referring to FIG. 6, an embodiment of the present invention also provides a breast cancer assisted diagnosis apparatus 2000 based on a deep convolutional neural network, which can be any type of smart terminal, such as a handset, a tablet computer, or a personal computer.
Specifically, the breast cancer assisted diagnosis apparatus 2000 based on a deep convolutional neural network includes one or more control processors 2100 and a memory 2200; one control processor 2100 is taken as an example in FIG. 6. The control processor 2100 and the memory 2200 may be connected through a bus or in other ways; connection through a bus is taken as an example in FIG. 6.
As a non-transitory computer-readable storage medium, the memory 2200 can be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the breast cancer assisted diagnosis apparatus 2000 based on a deep convolutional neural network in the embodiment of the present invention. By running the non-transitory software programs, instructions, and modules stored in the memory 2200, the control processor 2100 executes the various functional applications and data processing of the breast cancer auxiliary diagnosis device 1 based on a deep convolutional neural network, thereby implementing the breast cancer assisted diagnosis method based on a deep convolutional neural network of the above method embodiment.
The memory 2200 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the feature extraction device based on an improved convolution block, and the like. In addition, the memory 2200 may include a high-speed random access memory and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage devices.
In some embodiments, the memory 2200 may optionally include memories located remotely relative to the control processor 2100, and these remote memories may be connected to the breast cancer assisted diagnosis apparatus 2000 based on a deep convolutional neural network through a network. Examples of the above network include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
The above one or more modules are stored in the memory 2200 and, when executed by the one or more control processors 2100, perform the breast cancer assisted diagnosis method based on a deep convolutional neural network in the above method embodiment.
An embodiment of the present invention also provides a computer-readable storage medium storing computer-executable instructions; when these computer-executable instructions are executed by one or more control processors 2100, for example by one control processor 2100 in FIG. 6, they can cause the one or more control processors 2100 to execute the breast cancer assisted diagnosis method based on a deep convolutional neural network in the above method embodiment, for example, to perform method steps S100 to S300 in FIG. 1 described above and implement the functions of the devices 10 to 30 in FIG. 2.
The device embodiments described above are merely illustrative, and the devices described as separate components may or may not be physically separated, that is, they may be located in one place or distributed over multiple network devices. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solution of this embodiment.
From the description of the above embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by means of software plus a general hardware platform. Those skilled in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and when the program is executed, it may include the processes of the embodiments of the above method. The storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), or the like.
The embodiments of the present invention have been described in detail above with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can also be made within the scope of knowledge possessed by a person of ordinary skill in the art without departing from the gist of the present invention.

Claims (10)

  1. A breast cancer assisted diagnosis method based on a deep convolutional neural network, characterized by including the following steps:
    acquiring a breast cancer CT image and preprocessing the breast cancer CT image;
    segmenting lesions in the breast cancer CT image with a medical image segmentation algorithm based on a deep convolutional neural network;
    feeding back and outputting the segmentation result in the form of a picture, and transmitting the output data to backend storage.
  2. The breast cancer assisted diagnosis method based on a deep convolutional neural network according to claim 1, characterized in that acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
    adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion;
    then normalizing and standardizing the breast cancer CT image.
  3. The breast cancer assisted diagnosis method based on a deep convolutional neural network according to claim 1, characterized by further including, after the breast cancer CT image is preprocessed, performing an enhancement operation on the data, wherein the enhancement operation includes mirroring, scaling, and elastically deforming the data.
  4. The breast cancer assisted diagnosis method based on a deep convolutional neural network according to claim 3, characterized by further including:
    inputting the enhanced breast cancer CT image data into a multi-channel deep convolutional neural network for retraining, iterating continuously with the Adam and SGD optimizers while tuning the hyperparameters.
  5. The breast cancer assisted diagnosis method based on a deep convolutional neural network according to claim 1, characterized in that segmenting lesions in breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network includes:
    a multi-scale input stage, in which the breast cancer CT image data is scaled and fed into each encoding layer;
    an encoding stage, in which breast cancer CT image features are extracted through multi-dimensional convolution and down-sampling operations;
    a decoding stage, in which features are recombined through skip connections and contextual information to locate the lesion, and restored through multi-dimensional convolution and up-sampling operations to strengthen gradient propagation, with the segmentation result output at the last layer of the network.
  6. A breast cancer auxiliary diagnosis device based on a deep convolutional neural network, characterized by including:
    an image preprocessing module for acquiring breast cancer CT images and preprocessing the breast cancer CT image data;
    an image analysis module for segmenting lesions in the breast cancer CT images with the medical image segmentation algorithm of the deep convolutional neural network;
    a result output module for feeding back and outputting the segmentation result in the form of a picture and transmitting the output data to backend storage.
  7. The breast cancer auxiliary diagnosis device based on a deep convolutional neural network according to claim 6, characterized in that acquiring a breast cancer CT image and preprocessing the breast cancer CT image includes:
    adjusting the window width of the breast cancer CT image and performing histogram equalization to highlight the lesion;
    then normalizing and standardizing the breast cancer CT image.
  8. The breast cancer auxiliary diagnosis device based on a deep convolutional neural network according to claim 6, characterized in that, after the breast cancer CT image is preprocessed, the device further performs an enhancement operation on the data, wherein the enhancement operation includes mirroring, scaling, and elastically deforming the data.
  9. A breast cancer assisted diagnosis apparatus based on a deep convolutional neural network, characterized by including at least one control processor and a memory communicatively connected with the at least one control processor, wherein the memory stores instructions executable by the at least one control processor, and the instructions are executed by the at least one control processor so that the at least one control processor can execute the breast cancer assisted diagnosis method based on a deep convolutional neural network according to any one of claims 1 to 5.
  10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to cause a computer to execute the breast cancer assisted diagnosis method based on a deep convolutional neural network according to any one of claims 1 to 5.
PCT/CN2020/078146 2019-09-05 2020-03-06 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置 WO2021042690A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910835262.2A CN110751621B (zh) 2019-09-05 2019-09-05 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置
CN201910835262.2 2019-09-05

Publications (1)

Publication Number Publication Date
WO2021042690A1 true WO2021042690A1 (zh) 2021-03-11

Family

ID=69276168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078146 WO2021042690A1 (zh) 2019-09-05 2020-03-06 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置

Country Status (2)

Country Link
CN (1) CN110751621B (zh)
WO (1) WO2021042690A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220248880A1 (en) * 2020-04-28 2022-08-11 Boe Technology Group Co., Ltd. Intelligent vase system, flower recognition and presentation method and electronic apparatus

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751621B (zh) * 2019-09-05 2023-07-21 五邑大学 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置
CN111899828A (zh) * 2020-07-31 2020-11-06 青岛百洋智能科技股份有限公司 一种知识图谱驱动的乳腺癌诊疗方案推荐系统
CN111968147B (zh) * 2020-08-06 2022-03-15 电子科技大学 一种基于关键点检测的乳腺癌病理图像综合分析系统
CN111899259A (zh) * 2020-08-27 2020-11-06 海南大学 一种基于卷积神经网络的前列腺癌组织微阵列分级方法
CN112950624A (zh) * 2021-03-30 2021-06-11 太原理工大学 基于深度卷积神经网络的直肠癌t分期自动诊断方法及设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108257135A (zh) * 2018-02-01 2018-07-06 浙江德尚韵兴图像科技有限公司 基于深度学习方法解读医学图像特征的辅助诊断系统
CN109002831A (zh) * 2018-06-05 2018-12-14 南方医科大学南方医院 一种基于卷积神经网络的乳腺密度分类方法、系统及装置
CN109727243A (zh) * 2018-12-29 2019-05-07 无锡祥生医疗科技股份有限公司 乳腺超声图像识别分析方法及系统
CN110751621A (zh) * 2019-09-05 2020-02-04 五邑大学 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9589374B1 (en) * 2016-08-01 2017-03-07 12 Sigma Technologies Computer-aided diagnosis system for medical images using deep convolutional neural networks
CN106339591B (zh) * 2016-08-25 2019-04-02 汤一平 一种基于深度卷积神经网络的预防乳腺癌自助健康云服务系统
CN110047082B (zh) * 2019-03-27 2023-05-16 深圳大学 基于深度学习的胰腺神经内分泌肿瘤自动分割方法及系统
CN110189323B (zh) * 2019-06-05 2022-12-13 深圳大学 一种基于半监督学习的乳腺超声图像病灶分割方法
CN111028242A (zh) * 2019-11-27 2020-04-17 中国科学院深圳先进技术研究院 一种肿瘤自动分割系统、方法及电子设备


Also Published As

Publication number Publication date
CN110751621B (zh) 2023-07-21
CN110751621A (zh) 2020-02-04

Similar Documents

Publication Publication Date Title
WO2021042690A1 (zh) 基于深度卷积神经网络的乳腺癌辅助诊断方法及装置
Rahane et al. Lung cancer detection using image processing and machine learning healthcare
WO2022063199A1 (zh) 一种肺结节自动检测方法、装置及计算机系统
CN110084318B (zh) 一种结合卷积神经网络和梯度提升树的图像识别方法
CN114565761B (zh) 一种基于深度学习的肾透明细胞癌病理图像肿瘤区域的分割方法
CN111915584B (zh) 一种基于ct影像的病灶随访评估方法及系统
CN109389584A (zh) 基于cnn的多尺度鼻咽肿瘤分割方法
CN107563434B (zh) 一种基于三维卷积神经网络的脑部mri图像分类方法、装置
EP4177828A1 (en) Method and system for domain knowledge augmented multi-head attention based robust universal lesion detection
WO2023045231A1 (zh) 一种解耦分治的面神经分割方法和装置
CN108648182B (zh) 一种基于分子亚型的乳腺癌核磁共振图像肿瘤区域分割方法
US11836997B2 (en) Convolutional localization networks for intelligent captioning of medical images
CN109785311B (zh) 一种疾病诊断装置、电子设备及存储介质
Yang et al. Towards unbiased COVID-19 lesion localisation and segmentation via weakly supervised learning
CN115471470A (zh) 一种食管癌ct图像分割方法
CN114140465A (zh) 基于宫颈细胞切片图像的自适应的学习方法和学习系统
CN113313680A (zh) 一种结直肠癌病理图像预后辅助预测方法及系统
JP2024143991A (ja) マルチタスク学習ネットワークにおける画像分割方法及びシステム
CN116721289A (zh) 基于自监督聚类对比学习的宫颈oct图像分类方法及系统
Singh et al. Detection and classification of brain tumor using hybrid feature extraction technique
Ramamurthy et al. A novel two‐staged network for skin disease detection using atrous residual convolutional networks
CN116823848A (zh) 基于图像融合技术的多模态脑肿瘤分割方法
Kalaivani et al. A Deep Ensemble Model for Automated Multiclass Classification Using Dermoscopy Images
CN112861916A (zh) 一种浸润性宫颈癌病理图像分类方法和系统
CN110570943A (zh) 智能推荐mdt入组的方法及装置、电子设备、存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20861814

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20861814

Country of ref document: EP

Kind code of ref document: A1