CN112820377B - Deep learning-based automatic generation method for radiotherapy plan - Google Patents

Deep learning-based automatic generation method for radiotherapy plan

Info

Publication number
CN112820377B
CN112820377B (application CN202110142971.XA)
Authority
CN
China
Prior art keywords
convolution
radiotherapy
module
flux map
deep learning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110142971.XA
Other languages
Chinese (zh)
Other versions
CN112820377A (en)
Inventor
杨益东
袁曾泰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202110142971.XA priority Critical patent/CN112820377B/en
Publication of CN112820377A publication Critical patent/CN112820377A/en
Application granted granted Critical
Publication of CN112820377B publication Critical patent/CN112820377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G06T7/0014Biomedical image inspection using an image reference approach
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10088Magnetic resonance imaging [MRI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Urology & Nephrology (AREA)
  • Surgery (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiation-Therapy Devices (AREA)

Abstract

The invention discloses a deep learning-based method for automatically generating radiotherapy plans, comprising the following steps: step A, preprocessing a case database to obtain case feature images and case flux maps; step B, training a flux map prediction neural network with the case feature images and case flux maps, and obtaining the optimal flux map prediction neural network through cross-validation; step C, reading and preprocessing the medical Digital Imaging and Communications in Medicine (DICOM) files of a patient awaiting radiotherapy to obtain the patient's feature images; and step D, feeding the patient's feature images into the prediction neural network to obtain the patient's predicted flux map, and transmitting the predicted flux map to a planning system to generate the patient's predicted radiotherapy plan. By introducing deep learning with convolutional neural networks, the invention generates directly usable radiotherapy plans, reduces dependence on the personal experience of the medical physicist, reduces human prediction error, shortens the time needed to prepare a radiotherapy plan, and provides personalized treatment plans.

Description

Deep learning-based automatic generation method for radiotherapy plan
Technical Field
The invention relates to the technical field of medical radiotherapy, and in particular to a deep learning-based method for automatically generating radiotherapy plans.
Background
Emerging techniques such as intensity-modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) have become common radiotherapy modalities. They offer great advantages in delivering highly conformal dose distributions that achieve better coverage of the planning target volume (PTV) while sparing organs at risk, substantially improving radiotherapy plan quality. However, producing a qualified plan requires the medical physicist to exchange feedback with the attending physician many times, involving a great deal of manual intervention and consuming a great deal of time. Meanwhile, plan quality is limited by the physicist's accumulated experience, and consistent plan quality across institutions is difficult to ensure.
In recent years, several research teams have worked on automated radiotherapy planning. In 2019, Nguyen et al. published "3D radiotherapy dose prediction on head and neck cancer patients with a hierarchically densely connected U-net deep learning architecture" in Medical Physics, proposing accurate prediction of the dose distribution of head-and-neck cancer patients based on a hierarchically densely connected U-net neural network. In 2020, Chen et al. published "A feasibility study on an automated method to generate patient-specific dose distributions for radiotherapy using deep learning" in Medical Physics, proposing a deep-learning-based method for accurate dose prediction.
These methods first compute a dose objective function from the predicted dose distribution and then feed the objective function into the treatment planning system, which calculates a usable radiotherapy plan. They do not generate a usable plan directly; the process is complex and requires repeated calculation. In view of these limitations of existing methods, researchers in the field have sought better solutions.
Disclosure of Invention
The invention aims to provide a deep learning-based method for automatically generating radiotherapy plans, solving the problem that the prior art cannot directly obtain a flux map usable for a radiotherapy plan.
To achieve this aim, the invention provides a deep learning-based automatic radiotherapy plan generation method comprising the following steps:
step A: preprocessing a case database to obtain a case feature image and a case flux map;
Step B: training a flux map prediction neural network using the case feature images and case flux maps, and obtaining the optimal flux map prediction neural network through cross-validation;
step C: reading and preprocessing a medical digital image storage and communication standard file of a radiotherapy patient to obtain a characteristic image of a person to be treated with radiotherapy;
step D: and transmitting the characteristic image of the radiotherapeutic person to a prediction neural network to obtain a predicted flux map of the radiotherapeutic person, and transmitting the predicted flux map of the radiotherapeutic person to a planning system so as to generate a predicted radiotherapeutic plan of the radiotherapeutic person.
Optionally, the neural network comprises, connected in sequence: convolution modules, an atrous spatial pyramid pooling (ASPP) module, a global reasoning networks (GloRe) module, deep residual network (ResNet) modules, transposed convolution modules and an output module.
Optionally, each convolution module is a cascaded convolution module comprising four convolutional layers of kernel size 3×3; the stride of the first three layers is 1 and that of the last layer is 2, and the dilation rates of the middle two layers are 2 and 4, respectively. Each convolution is followed by an activation function and a batch normalization operation, the activation function in each layer being:
f(x)=max(x,0)
where x denotes the input and f(x) the output of the layer.
Optionally, the ASPP module is a parallel convolution module with the following branches: a convolutional layer of kernel size 1×1; three convolutional layers of kernel size 3×3 with dilation rates 6, 12 and 18, respectively; and a branch consisting of a global pooling operation, a 1×1 convolution and linear-interpolation upsampling.
Optionally, the GloRe module consists of three main steps: graph projection, graph convolution and graph re-projection.
Optionally, each ResNet module is a dense convolution module comprising six convolutional layers composed of 1×1 and 3×3 convolutions, all with stride 2.
Optionally, each transposed convolution module comprises a transposed convolutional layer of kernel size 3×3 with stride 2, followed by a convolutional layer of kernel size 3×3 with stride 1.
Optionally, the case database is built from clinical intensity-modulated radiotherapy plans for cervical cancer and can be generalized to other tumors.
Optionally, the case DICOM files comprise CT images, target-volume and organ-at-risk contours, and the prescription dose; the preprocessed feature images comprise target-volume projection maps, organ-at-risk projection maps and the actual flux map. The feature images are 512 × 512 with a spatial resolution of 5 × 5 mm². The DICOM file content is not limited to CT images and may be magnetic resonance images or other forms of images or data.
Optionally, preprocessing the case database specifically comprises: filling the target-volume and organ-at-risk contours into the pixels of the corresponding regions of the CT (or other) images, labeling the target-volume and organ-at-risk regions with distinct positive integers and filling non-target, non-organ regions with 0; projecting each case's target volumes at all beam angles to obtain the projection maps; and filling the actual flux map into the corresponding field to obtain the clinical flux map.
Optionally, the case target volumes and organs at risk comprise: the cervical-cancer clinical target volume, the planning target volume, bladder, rectum, small intestine, body trunk, left femoral head and right femoral head; these may vary when the method is generalized to other tumors.
Optionally, the planning system is the Pinnacle planning system developed by Philips; the method can be generalized to other planning systems.
The deep learning-based automatic radiotherapy plan design method comprises: constructing a flux map prediction neural network; collecting radiotherapy plan data of past cases and building a case database, the database being clinical radiotherapy plan data for a given cancer and comprising at least the case flux maps and case DICOM files; preprocessing the database to obtain feature images on which the flux map prediction neural network can be trained; training the network on these feature images and selecting the optimal network by cross-validation; reading the target volumes and organs at risk from the DICOM files of a patient awaiting radiotherapy and processing them into projection maps; feeding the projection maps to the trained network to obtain its predicted flux map; and transmitting the predicted flux map to a planning system to obtain the predicted radiotherapy plan. The method thus predicts flux maps automatically, generates directly usable radiotherapy plans and, combined with a planning system, realizes a fully automatic personalized planning workflow.
The invention has the advantages and positive effects that:
according to the deep learning-based radiotherapy plan automatic method, the radiotherapy plan which can be directly used is generated through the deep learning of the convolutional neural network, dependence on personal experience of a physical engineer is reduced, artificial prediction errors are reduced, time for making the radiotherapy plan is shortened, and a personalized treatment plan is provided.
Drawings
FIG. 1 is a flowchart of an automatic generation method of a radiotherapy plan based on deep learning according to an embodiment of the present invention;
FIG. 2 is a diagram of a flux map prediction network according to an embodiment of the present invention;
FIG. 3 is a block diagram of a convolution module according to an embodiment of the present invention;
FIG. 4 is a block diagram of an ASPP module according to an embodiment of the invention;
FIG. 5 is a block diagram of a GloRe module according to one embodiment of the present invention;
FIG. 6 is a block diagram of a ResNet module according to one embodiment of the present invention;
FIG. 7 is a block diagram of a transpose convolution module provided in accordance with one embodiment of the present invention;
FIG. 8 is a comparison of a predicted flux map and an actual flux map provided by an embodiment of the present invention.
Detailed Description
The deep learning-based automatic radiotherapy plan design method provided by the invention is described in further detail below with reference to the accompanying drawings and specific embodiments. Advantages and features of the invention will become more apparent from the following description and claims. It should be noted that the drawings are in greatly simplified form and imprecise scale, serving merely to aid in describing the embodiments conveniently and clearly.
Referring to fig. 1, the embodiment provides an automatic radiotherapy plan generation method based on deep learning, which includes the following steps:
step A: collecting radiotherapy plan data of past cases, constructing a case database based on clinical intensity modulated radiotherapy plan of cervical cancer, and preprocessing the data to obtain characteristic images and case flux diagrams which can be trained by a neural network; the case database can also be generalized to other tumors. The case database is clinical radiotherapy plan data of a certain cancer, the radiotherapy plan data at least comprise a case flux map and a case medical digital image storage and communication standard (DICOM) file, and the case medical digital image storage and communication standard file comprises a cervical cancer flux map, a CT image, a target area, a organs at risk outline and a prescription dose.
In this embodiment, seven-field intensity-modulated radiation therapy (IMRT) plan data from 80 cervical cancer patients are taken as an example. The collected data set comprises the cervical-cancer flux maps, CT images, target-volume and organ-at-risk contours, and prescription doses, stored in the DICOM format common in the medical field. An in-house program written in Python extracts the CT images and the target-volume and organ-at-risk contour points from the raw data and connects the points to obtain structure maps. The case target volumes and organs at risk comprise: the cervical-cancer clinical target volume, the planning target volume, bladder, rectum, small intestine, body trunk, left femoral head and right femoral head; these may vary when the method is generalized to other tumors.
The data are preprocessed into the corresponding feature images, comprising target-volume projection maps, organ-at-risk projection maps and the actual flux maps. All feature images have the same size of 512 × 512 and a spatial resolution of 5 × 5 mm². The DICOM file content is not limited to CT images and may be magnetic resonance images or other forms of images or data.
The case database is preprocessed as follows: the specific region of each target volume is obtained, its contour is filled into the pixels of the corresponding region of the CT (or other) image, and the target volume is labeled 1; the organ-at-risk contours are likewise filled into the pixels of their corresponding regions and labeled with distinct positive integers, while non-target, non-organ regions are filled with 0; the case's target volumes are then projected at all beam angles to obtain the projection maps; and the actual flux map is filled into the corresponding field to obtain the clinical flux map. In this embodiment, the bladder, rectum, small intestine, body trunk, left femoral head and right femoral head are labeled with the integers 2 through 7, respectively.
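The labeling scheme above can be sketched as follows. This is a minimal illustration using hypothetical pre-rasterized binary masks; the actual conversion of DICOM contour points into masks is not shown here.

```python
import numpy as np

def build_structure_image(target_mask, oar_masks):
    """Combine binary contour masks into one integer-labeled image.

    target_mask: bool array (H, W) for the target volume (label 1).
    oar_masks:   list of bool arrays, one per organ at risk,
                 labeled 2, 3, ... in list order.
    Pixels outside all structures remain 0.
    """
    labeled = np.zeros(target_mask.shape, dtype=np.int32)
    labeled[target_mask] = 1
    for i, mask in enumerate(oar_masks, start=2):
        # earlier labels take precedence where structures overlap
        labeled[mask & (labeled == 0)] = i
    return labeled
```

The precedence rule for overlapping structures is an assumption; the patent does not specify how overlaps between the target and organ contours are resolved.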
in this embodiment, the seven angles of the radiation field are 30 °, 90 °, 140 °, 180 °, 220 °, 275 °, and 330 °. Firstly, carrying out three-dimensional reconstruction on the obtained target area and the organs at risk to obtain a three-dimensional information matrix of a patient anatomy structure, and then respectively carrying out projection on the three-dimensional information matrix under seven portal angles to obtain a projection diagram of the target area and the organs at the portal angles;
the actual flux map is converted into a two-dimensional matrix and normalized, and the specific method is to unify the data by multiplying the prescription dose normalization factors of all patients by the two-dimensional matrix of the flux map, so that differences among patients are balanced, and the training difficulty of a neural network is reduced.
Step B: constructing the flux map prediction neural network.
the flux map prediction neural network comprises a plurality of cascade convolutions, a hole convolution and a transpose convolution, please refer to fig. 2, which shows that the flux map prediction neural network specifically comprises five convolution modules, a hole space convolution pooling pyramid (Atrous Spatial Pyramid Pooling, ASPP) module, a global inference network (Global Reasoning Networks, gloRe) module, three depth residual error network (Deep Residual Network, resNet) modules, five transpose convolution modules and an output module, and the relationships between the modules are as follows:
the convolution module 32 takes one first feature map and outputs two second feature maps;
the convolution module 64 takes one second feature map and outputs two third feature maps;
the convolution module 128 takes one third feature map and outputs two fourth feature maps;
the convolution module 512 takes one fourth feature map and outputs two fifth feature maps;
the convolution module 1024 takes one fifth feature map and outputs two sixth feature maps; the ResNet module 128 takes one fourth feature map and outputs one seventh feature map; the ResNet module 512 takes one fifth feature map and outputs one eighth feature map; the ResNet module 1024 takes one sixth feature map and outputs one ninth feature map; the ASPP module 2048 takes one sixth feature map and outputs one tenth feature map; the GloRe module takes one tenth feature map and outputs one eleventh feature map;
the transposed convolution module 1024 takes one eleventh and one ninth feature map and outputs one twelfth feature map;
the transposed convolution module 512 takes one eighth and one twelfth feature map and outputs one thirteenth feature map;
the transposed convolution module 128 takes one seventh and one thirteenth feature map and outputs one fourteenth feature map;
the transposed convolution module 64 takes one third and one fourteenth feature map and outputs one fifteenth feature map;
the transposed convolution module 32 takes one second and one fifteenth feature map and outputs one predicted flux map.
Referring to fig. 3, the convolution module is a cascaded convolution module comprising four convolutional layers of kernel size 3×3; the stride of the first three layers is 1 and that of the last layer is 2, and the dilation rates of the middle two layers are 2 and 4, respectively. Each convolution is followed by an activation function and a batch normalization operation, the activation function in each layer being:
f(x)=max(x,0)
where x denotes the input and f(x) the output of the layer.
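The activation above is the standard rectified linear unit (ReLU), which in numpy is a one-liner:

```python
import numpy as np

def relu(x):
    """f(x) = max(x, 0), applied element-wise after each convolution."""
    return np.maximum(x, 0.0)
```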
Referring to fig. 4, the ASPP module is a parallel convolution module with the following branches: a convolutional layer of kernel size 1×1; a 3×3 convolutional layer with dilation rate 6; a 3×3 convolutional layer with dilation rate 12; a 3×3 convolutional layer with dilation rate 18; and a branch consisting of a global pooling operation, a 1×1 convolution and linear-interpolation upsampling.
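The three dilation rates let parallel 3×3 kernels cover widely different spatial contexts at the same cost. The effective extent of a dilated kernel follows the standard formula k_eff = k + (k − 1)(d − 1):

```python
def dilated_kernel_extent(kernel_size, dilation):
    """Effective spatial extent of a dilated (atrous) convolution kernel:
    k_eff = k + (k - 1) * (d - 1)."""
    return kernel_size + (kernel_size - 1) * (dilation - 1)
```

With kernel 3, dilation rates 6, 12 and 18 give effective extents of 13, 25 and 37 pixels, respectively.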
referring to fig. 5, the GloRe module is mainly divided into three steps, namely, image projection, image convolution and image back projection;
referring to fig. 6, the res net module is a dense convolution module, and includes six convolution layers, including convolutions with convolution kernel sizes of 1*1 and 3*3, where the step sizes are all 2;
referring to fig. 7, the transpose convolution module includes a layer of convolution kernel 3*3, a transpose convolution layer with a step size of 2, and a convolution layer with a convolution kernel size of 3*3, a step size of 1;
the feature image is input into the flux map prediction neural network in the form of 8-channel images, the size is 512 x 512, the number of image channels is gradually increased through a convolution module, the number of image channels is sequentially 32, 64, 128, 512, 1024 and 2048 from top to bottom, the resolution of the image is gradually reduced, 256 x 256, 128 x 128, 64 x 64, 32 x 32, 16 x 16 and 8 x 8 from top to bottom, the resolution of the image is restored to 512 x 512 through a transposition convolution module, and the predicted flux map is output, namely a single-precision floating point number matrix of 512 x 512.
The flux map prediction neural network is then trained with the case feature images and case flux maps obtained in step A, and the optimal flux map prediction neural network is obtained through cross-validation.
five-fold cross validation is carried out on the data set, in the embodiment, the collected data of 80 cervical cancer patients are divided into 5 parts, 16 patient plans are selected randomly to be used as a test set, the remaining four parts are used as training sets for training the flux map prediction neural network, and a flux map prediction model with the best performance is selected according to each group of training results.
The specific training process in this embodiment is as follows. The deep-learning framework TensorFlow 2.0, developed by Google, is used for training, and each training iteration proceeds as follows: the feature images of the training set are first propagated forward through the flux map prediction neural network; the loss between the predicted and actual flux maps is then computed; and the loss value is back-propagated to update the network's parameter weights. One pass over all data in the training set constitutes one epoch, and fully training the flux map prediction neural network requires at least thousands of epochs.
The neural network is trained with the Adam optimizer at an initial learning rate of 1e-3. When the loss shows no appreciable decrease over 200 epochs, the flux map prediction neural network is considered to have converged; training is then stopped and the model is saved.
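The 200-epoch convergence criterion amounts to patience-based early stopping. A framework-independent sketch follows; min_delta, the threshold for what counts as an appreciable decrease, is an assumed value:

```python
class EarlyStopping:
    """Stop training when the loss has not improved by more than
    min_delta for `patience` consecutive epochs (200 in this patent)."""

    def __init__(self, patience=200, min_delta=1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.stale = 0

    def step(self, loss):
        """Record one epoch's loss; return True when training should stop."""
        if loss < self.best - self.min_delta:
            self.best = loss
            self.stale = 0
        else:
            self.stale += 1
        return self.stale >= self.patience
```

In a TensorFlow 2.0 training loop, `step` would be called once per epoch with the mean training loss, breaking the loop and saving the model when it returns True.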
Step C: reading the DICOM files of a patient awaiting radiotherapy and preprocessing them to obtain the patient's feature images.
step D: and transmitting the characteristic image of the radiotherapeutic person to a trained prediction neural network, outputting a predicted flux map of the radiotherapeutic person by the trained prediction neural network, and finally transmitting the predicted flux map of the radiotherapeutic person to a Pinnacle planning system developed by philips to obtain a predicted radiotherapeutic plan of the radiotherapeutic person. The planning system is a Philips developed (Picnacle) planning system, and can be popularized to other planning systems.
The deep learning-based automatic radiotherapy plan generation method can produce a directly usable radiotherapy plan through a neural network. As the comparison of the predicted flux map in fig. 8a with the actual flux map in fig. 8b shows, the automatic planning can complete the full preparation of a cervical cancer radiotherapy plan, remarkably shortening the planning time and reducing the workload of the medical physicist. Generally, accurate deep learning predictions require a very large training data set; this embodiment uses only 80 patients, and the flux map prediction neural network would yield still more accurate results if the database were enlarged.
In summary, the deep learning-based automatic radiotherapy plan generation method provided by the invention comprises: constructing a flux map prediction neural network; collecting radiotherapy plan data of past cases to build a case database, the case database containing clinical radiotherapy plan data of various cancers, where the radiotherapy plan data comprise at least the case flux maps and the case DICOM files; preprocessing the case database to obtain feature images on which the flux map prediction neural network can be trained; performing deep learning on the feature images to train the flux map prediction neural network, and selecting the optimal network through cross-validation; reading the target volume from the DICOM file of the patient to be treated and processing it to obtain projection maps; feeding the projection maps to the trained neural network to obtain the predicted flux map; and transmitting the predicted flux map to a planning system to obtain the predicted radiotherapy plan. The method thus realizes automatic prediction of the flux map, generates a directly usable radiotherapy plan, and, combined with a planning system, achieves a fully automatic workflow for personalized radiotherapy plan design.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. The automatic radiotherapy plan generation method based on deep learning is characterized by comprising the following steps:
step A: preprocessing a case database to obtain a case feature image and a case flux map;
Step B: training a flux map prediction neural network using the case feature images and the case flux maps, and obtaining the optimal flux map prediction neural network through cross-validation;
Step C: reading and preprocessing the Digital Imaging and Communications in Medicine (DICOM) file of the patient to be treated to obtain the feature image of the patient to be treated;
Step D: inputting the feature image of the patient to be treated into the prediction neural network to obtain the predicted flux map of the patient, and sending the predicted flux map to a planning system so as to generate the predicted radiotherapy plan of the patient;
the neural network structure comprises, connected in sequence, a convolution module, an atrous spatial pyramid pooling module, a global reasoning network module, a deep residual network module, a transposed convolution module and an output module;
the preprocessing of the case database is specifically as follows: the target volume and organ-at-risk contours are filled onto the pixels of the corresponding regions of the CT (or other) images and marked with distinct positive integers, while non-target and non-organ-at-risk regions are filled with 0; the case target volume is then projected at all beam angles to obtain projection maps; and the actual flux map is filled into the corresponding field to obtain the clinical flux map.
2. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the convolution module is a cascade convolution module comprising four convolution layers; the convolution kernel size of each layer is 3×3, the stride of the first three layers is 1 and that of the last layer is 2, and the dilation rates of the middle two layers of convolution kernels are 2 and 4 respectively; each convolution operation is followed by an activation function and a batch normalization operation, the activation function in each layer being the rectified linear unit:
f(x) = max(x, 0);
where x represents the input to the activation function and f(x) represents its output.
3. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the atrous spatial pyramid pooling module is a parallel convolution module whose operators are: a convolution layer with a 1×1 kernel; three convolution layers with 3×3 kernels and dilation rates of 6, 12 and 18 respectively; and a branch composed of global pooling, a convolution with a 1×1 kernel, and linear-interpolation upsampling.
4. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the global reasoning network module comprises three steps: graph projection, graph convolution and graph back-projection.
5. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the deep residual network module is a dense convolution module comprising six convolution layers, each composed of convolutions with kernel sizes of 1×1 and 3×3, all with a stride of 2.
6. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the transposed convolution module comprises one transposed convolution layer with a 3×3 kernel and a stride of 2, and one convolution layer with a 3×3 kernel and a stride of 1.
7. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the case database is constructed from clinical intensity-modulated radiotherapy plans for cervical cancer, and can be generalized to other tumors.
8. The deep learning-based automatic radiotherapy plan generation method according to claim 1, wherein the DICOM file of the patient to be treated comprises the CT images, the target volume and organ-at-risk contours, and the prescription dose; the preprocessed feature images comprise the target volume projection map, the organ-at-risk projection map and the actual flux map; the feature image size is 512 × 512 and the spatial resolution is 5 mm × 5 mm; the content of the DICOM file is not limited to CT images, and may also be magnetic resonance images or other forms of images or data.
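The contour-filling and projection preprocessing described in claim 1 can be illustrated on a toy 2-D slice; the array size, label values and the nearest-neighbour rotation used for the beam-angle projection are all illustrative assumptions, not details from the patent.

```python
import numpy as np

# Toy 8x8 slice: target and organ-at-risk (OAR) regions are filled
# with distinct positive integers, everything else with 0.
mask = np.zeros((8, 8), dtype=np.int32)
mask[2:5, 2:5] = 1        # target volume labelled 1
mask[6:8, 0:3] = 2        # an OAR labelled 2

def project(mask, angle_deg):
    """Sum the labelled mask along rows after rotating it to a given
    beam angle (nearest-neighbour rotation, assumed for simplicity)."""
    theta = np.deg2rad(angle_deg)
    h, w = mask.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices(mask.shape)
    src_y = np.round(cy + (ys - cy) * np.cos(theta)
                        - (xs - cx) * np.sin(theta)).astype(int)
    src_x = np.round(cx + (ys - cy) * np.sin(theta)
                        + (xs - cx) * np.cos(theta)).astype(int)
    valid = (src_y >= 0) & (src_y < h) & (src_x >= 0) & (src_x < w)
    rotated = np.zeros_like(mask)
    rotated[valid] = mask[src_y[valid], src_x[valid]]
    return rotated.sum(axis=0)   # one projection profile per beam angle

proj0 = project(mask, 0.0)       # projection at gantry angle 0
```

Repeating `project` over all planned beam angles yields the stack of projection maps that, per claim 8, forms the feature image fed to the network.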
CN202110142971.XA 2021-02-02 2021-02-02 Deep learning-based automatic generation method for radiotherapy plan Active CN112820377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110142971.XA CN112820377B (en) 2021-02-02 2021-02-02 Deep learning-based automatic generation method for radiotherapy plan


Publications (2)

Publication Number Publication Date
CN112820377A CN112820377A (en) 2021-05-18
CN112820377B true CN112820377B (en) 2023-06-20




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant