CN115345954A - Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning - Google Patents

Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning

Info

Publication number
CN115345954A
Authority
CN
China
Prior art keywords
mri
image
deep learning
prostate cancer
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202211008134.9A
Other languages
Chinese (zh)
Inventor
冯朝燕
闵祥德
张配配
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202211008134.9A priority Critical patent/CN115345954A/en
Publication of CN115345954A publication Critical patent/CN115345954A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Abstract

The invention relates to the technical field of modern medicine and discloses a deep learning-based whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer, which comprises the following steps: S1, retrospective data collection: WB-MRI data are retrospectively collected to construct a training set and a validation set; 200 WB-MRI examinations are retrospectively included according to the enrollment criteria and randomly divided into a training set of 180 cases and a validation set of 20 cases; the training set data are used to train the neural network model, and the validation set data are used to adjust model parameters and optimize the model. By applying deep learning to WB-MRI fast imaging, the invention shows great potential for accelerating imaging and improving image quality. By introducing transfer learning, the project uses knowledge learned in undersampled reconstruction of cardiac MRI (DC-CNN network) and super-resolution reconstruction of natural images (RDN network) as a starting point and applies it to WB-MRI fast imaging, which reduces computation and training time and improves the quality of WB-MRI image reconstruction.

Description

Deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer
Technical Field
The invention relates to the technical field of modern medicine, in particular to a whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning.
Background
The Chinese expert consensus on chemotherapy for metastatic prostate cancer (2019) notes that imaging examinations are important for accurately assessing metastatic burden and evaluating treatment response in patients with metastatic prostate cancer. Whole-body magnetic resonance imaging (WB-MRI) breaks through the limitation of imaging a single body region: it covers whole-body anatomical and functional imaging, can qualitatively and quantitatively evaluate multifocal malignant lesions throughout the body, and helps to accurately assess prostate cancer metastatic burden and treatment response. WB-MRI has extremely high sensitivity and specificity for bone metastases, with efficacy superior to SPECT and comparable to PET-CT. Whole-body diffusion-weighted imaging (WB-DWI), also known as PET-like imaging, performs well in assessing prostate cancer metastatic burden and has excellent reproducibility; the apparent diffusion coefficient (ADC), the quantitative parameter of DWI, enables quantitative assessment of lesions. Compared with PET-CT, WB-MRI is inexpensive, non-invasive, radiation-free, and requires no contrast agent, and it has good potential for metastatic prostate cancer lesion detection and treatment response evaluation. The international prostate cancer metastasis reporting and data system (MET-RADS-P) guideline recommends WB-MRI as a routine examination for metastatic prostate cancer. However, WB-MRI is not widely used in clinical practice. The long scan time (usually 45-60 minutes) is the main problem restricting the adoption of WB-MRI: a long scan may cause pain and intolerance in prostate cancer patients with metastatic bone pain or in people who cannot lie still for a long time, may reduce MRI scanner utilization and delay diagnosis, and may introduce motion artifacts that distort the images. Achieving fast WB-MRI would allow more patients to use and benefit from this examination and would avoid the radiation associated with CT and PET/CT. Therefore, realizing fast WB-MRI imaging is of great significance.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning, which solves the problems in the prior art.
(II) technical scheme
In order to achieve the purpose, the invention provides the following technical scheme: a whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer based on deep learning comprises the following steps:
S1, retrospective data collection: WB-MRI data are retrospectively collected to construct a training set and a validation set; 200 WB-MRI examinations are retrospectively included according to the enrollment criteria and randomly divided into a training set of 180 cases and a validation set of 20 cases; the training set data are used to train the neural network model, and the validation set data are used to adjust model parameters and optimize the model;
s2, image preprocessing: accelerating scanning of WB-MRI is realized by adopting two modes of T2WI image preprocessing and DWI image preprocessing, so that an undersampled (or downsampled) input image and a fully sampled target image required by deep network training are obtained;
S3, training, validation and optimization of the deep learning-based WB-MRI fast imaging model: a deep cascaded convolutional neural network (DC-CNN) and a residual dense network (RDN) are used as the base networks and combined with transfer learning; the retrospective data are used to learn the mapping relationship, i.e. the network parameter weights, between the input image (undersampled or downsampled image) and the target image (fully sampled image), so as to obtain a deep learning-based WB-MRI fast imaging model, as shown in FIG. 1;
s4, testing of a WB-MRI rapid imaging model based on deep learning: prospective collection of 50 suspected metastatic prostate cancer patients and 10 healthy volunteers for testing of the constructed deep learning model;
S5, application of the deep learning-based WB-MRI fast imaging model to metastatic prostate cancer assessment: the 50 prospectively enrolled patients with pathologically confirmed prostate cancer and suspected metastases all undergo deep learning accelerated WB-MRI_DL scanning and conventional fully sampled WB-MRI_F scanning, to explore the application of the deep learning-based WB-MRI fast imaging model in metastatic prostate cancer assessment.
Preferably, in step S2, the T2WI image preprocessing includes:
(1) Simulated k-space undersampling;
(2) Coronal reconstruction;
(3) Data normalization;
(4) Data augmentation.
Preferably, in step S2, the DWI image preprocessing includes:
(1) Simulated image downsampling;
(2) Coronal reconstruction;
(3) Data normalization;
(4) Data augmentation.
Preferably, in step S2, the retrospectively collected T2WI images undergo simulated accelerated scanning using both k-space undersampling and image downsampling, while the DWI images are accelerated using image downsampling only.
Preferably, in step S3, the DC-CNN is used to learn the mapping relationship between the undersampled image and the fully sampled image, and the RDN is used to learn the mapping relationship between the downsampled (low-resolution) image and the full-resolution image.
Preferably, the step S4 specifically includes:
(1) Image quality and ADC parameter evaluation: the 50 prospectively enrolled patients with metastatic prostate cancer undergo both actual undersampled (or downsampled) WB-MRI scanning and conventional fully sampled scanning, giving the deep learning accelerated image WB-MRI_DL and the fully sampled image WB-MRI_F; differences in image quality and ADC values between WB-MRI_DL and WB-MRI_F are compared;
(2) Evaluation of model stability and repeatability: 10 prospectively enrolled healthy volunteers undergo two deep learning accelerated WB-MRI scans, and the repeatability of the WB-MRI_DL images reconstructed by the deep learning-based fast imaging model and the repeatability of the ADC values are evaluated.
Preferably, the step S5 specifically includes:
(1) Evaluation of diagnostic efficacy: the diagnostic efficacy of WB-MRI_DL and WB-MRI_F for prostate cancer metastases is compared;
(2) Evaluation of treatment response of metastases: the 50 patients with suspected metastatic prostate cancer are followed up; patients who receive treatment during follow-up return within 1-3 months after treatment for a deep learning accelerated WB-MRI_DL_after scan and a conventional WB-MRI_F_after scan; two physicians independently evaluate the depiction of metastases on the accelerated and conventional images, and changes in metastatic lesions between the pre-treatment WB-MRI_DL and post-treatment WB-MRI_DL_after images are compared.
Preferably, the changes in metastases compared between the pre-treatment WB-MRI_DL and post-treatment WB-MRI_DL_after images include lesion size, number, signal intensity, ADC values of major lesions, etc.
(III) advantageous effects
The invention provides a whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer based on deep learning, which has the following beneficial effects:
the invention has shown great potential in the aspects of accelerating imaging speed and improving imaging quality by applying the deep learning technology to WB-MRI rapid imaging. The project takes knowledge learned in undersampling reconstruction (DC-CNN network) of cardiac MRI and super-resolution reconstruction (RDN network) of natural images as a premise by introducing transfer learning, and is applied to WB-MRI rapid imaging, so that the calculated amount and the training time can be reduced, and the WB-MRI image reconstruction effect can be improved. By means of the under-sampling reconstruction and super-resolution reconstruction technology based on deep learning, a robust and accurate WB-MRI rapid imaging model is planned to be established. The scanning time and the reconstruction time are both very short by combining off-line training and on-line testing.
Drawings
FIG. 1 is a flowchart of WB-MRI rapid imaging based on deep learning according to the present invention;
FIG. 2 is a schematic diagram of the present invention simulating undersampling of k-space;
FIG. 3 is a schematic diagram of a down-sampling of a simulated image according to the present invention;
FIG. 4 is a schematic diagram of the reconstruction of a WB-MRI coronal view in accordance with the present invention;
FIG. 5 is a network structure diagram of the DC-CNN in the present invention;
FIG. 6 is a diagram of the RDN network architecture of the present invention;
FIG. 7 is a schematic diagram of a technical route of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 7, the present invention provides a technical solution: a whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning comprises the following steps:
s1, retrospective data collection
Patients who underwent WB-MRI on the Siemens 3.0T Skyra scanner at our hospital from February 2014 to the present are retrospectively collected. The following patients are excluded: (1) incomplete WB-MRI examination (missing sequences or incomplete coverage of scan regions); (2) WB-DWI with obvious motion artifacts, susceptibility artifacts, or other factors affecting lesion diagnosis. A total of about 200 patients (mainly prostate cancer patients) are enrolled and randomly divided into a training set of 180 cases and a validation set of 20 cases. The training set data are used to train the neural network model, and the validation set data are used to adjust model parameters and optimize the model.
The hospital's WB-MRI data were all acquired on a Siemens (Germany) 3.0T Skyra scanner. The patient lies supine, head first, using one 32-channel head-neck coil, three 18-channel surface phased-array coils, and the built-in spine coil; the head and neck, chest, abdomen, pelvis, and thighs are scanned in 5 stations, covering the vertex of the skull to the mid-thigh. The WB-MRI sequences include T1WI, T2WI, and DWI. The scan parameters are as follows: (1) coronal T1WI: VIBE sequence, TR = 4.58 ms, TE = 1.23 ms, 1 excitation, 40 slices, slice thickness 5 mm, slice gap 0 mm, matrix 384 × 307, FOV 450 mm × 360 mm; (2) coronal T2WI: turbo spin echo (TSE) sequence with turbo inversion recovery magnitude (TIRM) fat suppression, TR = 3000 ms, TE = 257 ms, 2 excitations, 40 slices, slice thickness 5 mm, slice gap 0 mm, matrix 384 × 307, FOV 450 mm × 360 mm; (3) axial DWI: single-shot echo planar imaging (SS-EPI) sequence, b values of 50 and 900 s/mm², TR = 8720 ms, TE = 60 ms, 5 excitations, 50 slices, slice thickness 5 mm, slice gap 0 mm, matrix 192 × 192, FOV 430 mm × 360 mm.
S2, preprocessing the image
Since T1WI uses the VIBE sequence, its scan time is short and no accelerated scanning is needed; the invention therefore performs accelerated reconstruction only for T2WI and DWI. At present there are two main ways to accelerate MRI sequences with deep learning: (1) k-space undersampling: k-space is undersampled by sparse sampling to save scan time, and the undersampled image is reconstructed by a deep learning model into an image approximating the fully sampled one, thereby accelerating the scan. This method can be applied to TSE-T2WI sequences, which use parallel-line k-space filling trajectories, but not to SS-EPI DWI sequences, because an SS-EPI sequence fills all of k-space with a single excitation. (2) Image downsampling: scan time is saved by acquiring low-resolution images, which are reconstructed by a deep learning model into high-resolution images, thereby accelerating the scan. This method is applicable to both TSE-T2WI and SS-EPI DWI sequences. In this project, for the WB-T2WI sequence both methods are used to achieve fast scanning, and the final choice is determined by the quality of the reconstructed images; for the WB-DWI sequence, the second method, image downsampling, is used to achieve fast scanning.
T2WI image preprocessing:
The project uses two acceleration methods to realize fast WB-T2WI scanning; accordingly, the retrospectively collected WB-T2WI data undergo simulated accelerated scanning by both simulated k-space undersampling and simulated image downsampling.
(1) Simulating k-space undersampling:
The retrospectively collected MRI images are fully sampled, so simulated k-space undersampling must be applied to them to mimic an accelerated MRI scan. In this project the method is used for WB-T2WI (as shown in FIG. 2), with the following steps:
(1) The fft2 function is used to apply the Fourier transform (FT) to the retrospective T2WI image, converting it from the spatial domain to the frequency domain; that is, the fully sampled image Ref(x, y) is Fourier transformed to obtain the fully sampled k-space data S_r(k_x, k_y), expressed as S_r(k_x, k_y) = FT(Ref(x, y));
(2) The project uses four undersampling schemes (one-dimensional low-frequency, one-dimensional uniform, one-dimensional random variable-density, and two-dimensional random variable-density undersampling) to undersample the k-space data, and the undersampling trajectory is finally chosen according to the image reconstruction quality of the deep learning model. The fully sampled k-space data are denoted S_r(k_x, k_y), where k_x is the position along the frequency-encoding direction and k_y the position along the phase-encoding direction. The undersampling template mask(k_x, k_y) takes the value 1 at points to be sampled and 0 at points not sampled. The undersampling template retains 1/4 of the data, simulating 4x acceleration. The acquired undersampled k-space data are denoted S_u(k_x, k_y) and are obtained by element-wise multiplication of the undersampling template with the fully sampled k-space data matrix, expressed as S_u(k_x, k_y) = S_r(k_x, k_y) .* mask(k_x, k_y);
(3) Zero-filled reconstruction: in the undersampled data S_u(k_x, k_y), points in k-space that were not acquired are set to 0; the inverse Fourier transform (IFT) is then applied to the k-space data using Numpy, yielding the zero-filled undersampled image, denoted Input(x, y), with Input(x, y) = IFT(S_u(k_x, k_y)). This gives a pair of training data: the fully sampled image Ref(x, y) and the simulated undersampled image Input(x, y). Input(x, y) serves as the network input and Ref(x, y) as the network reference image.
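A minimal NumPy sketch of the simulated undersampling and zero-filled reconstruction described above. The 1D random mask with a fully sampled centre used here is only an illustrative assumption standing in for the four candidate sampling trajectories, and the array shapes are placeholders.

```python
import numpy as np

def simulate_undersampling(ref, acceleration=4, seed=0):
    """Simulate k-space undersampling of a fully sampled 2D image Ref(x, y).

    Returns the zero-filled undersampled image Input(x, y) and the sampling mask.
    """
    ny, nx = ref.shape
    rng = np.random.default_rng(seed)

    # S_r(kx, ky) = FT(Ref(x, y)): fully sampled k-space
    k_full = np.fft.fftshift(np.fft.fft2(ref))

    # 1D mask along the phase-encoding direction, keeping 1/acceleration of the
    # lines; the centre of k-space is always retained (illustrative choice).
    mask_1d = np.zeros(ny, dtype=np.float32)
    centre = ny // 2
    mask_1d[centre - ny // 16: centre + ny // 16] = 1
    n_keep = max(ny // acceleration - int(mask_1d.sum()), 0)
    candidates = np.where(mask_1d == 0)[0]
    mask_1d[rng.choice(candidates, size=n_keep, replace=False)] = 1
    mask = np.repeat(mask_1d[:, None], nx, axis=1)

    # S_u(kx, ky) = S_r(kx, ky) .* mask(kx, ky)
    k_under = k_full * mask

    # Zero-filled reconstruction: Input(x, y) = IFT(S_u(kx, ky))
    zero_filled = np.abs(np.fft.ifft2(np.fft.ifftshift(k_under)))
    return zero_filled.astype(np.float32), mask

# Example: ref is one fully sampled coronal T2WI slice as a 2D float array
# input_img, mask = simulate_undersampling(ref, acceleration=4)
```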
(2) Simulating down-sampling of an image:
The retrospectively collected MRI images are relatively high-resolution images, so simulated downsampling must be applied to them to mimic an accelerated MRI scan; in this project the method is used for WB-T2WI and WB-DWI (as shown in FIG. 3). Gaussian blur with standard deviation σ = 0.1 is first applied to the image, and the degradation strategy of bicubic interpolation is then used to downsample the retrospective WB-T2WI and WB-DWI by a factor of 4, yielding the low-resolution image Input(x, y) required for the experiment. The bicubic interpolation is implemented with the Numpy software package. In this way a pair of training data is again obtained: the fully sampled high-resolution image Ref(x, y) and the downsampled low-resolution image Input(x, y). Input(x, y) serves as the network input and Ref(x, y) as the network reference image.
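A minimal sketch of this degradation step using SciPy; the cubic spline zoom (order=3) is used here as a stand-in for the bicubic interpolation described above, which is an assumption of this sketch rather than the project's exact implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def simulate_downsampling(ref, factor=4, sigma=0.1):
    """Simulate a low-resolution acquisition from a high-resolution image Ref(x, y):
    Gaussian blur (sigma = 0.1) followed by 4x downsampling."""
    blurred = gaussian_filter(ref.astype(np.float32), sigma=sigma)
    low_res = zoom(blurred, 1.0 / factor, order=3)   # Input(x, y), 4x smaller grid
    return low_res

# Example usage (hypothetical array "ref" holding one WB-T2WI or WB-DWI slice):
# low = simulate_downsampling(ref)
# Some pipelines upsample back to the original grid before feeding the network:
# upsampled_input = zoom(low, 4, order=3)
```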
(3) Data normalization:
and (3) carrying out module value [0,1] normalization processing on each voxel of the input image by adopting a Numpy software package, wherein the normalized image is the input of training data.
(4) Data augmentation:
Deep learning relies on large amounts of data, which are not readily available for medical images, so data augmentation plays a very important role in improving network performance and preventing overfitting. In this study each training-set image is rotated by small angles of 10°, 20°, and 30° in turn, yielding four times the original amount of data. The validation-set and test-set data are not augmented.
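A minimal sketch of the [0, 1] normalization and the small-angle rotation augmentation; scipy.ndimage.rotate and simple min-max scaling are assumptions of this sketch, not necessarily the project's exact implementation.

```python
import numpy as np
from scipy.ndimage import rotate

def normalize01(img):
    """Normalize image intensities to the range [0, 1] voxel-wise."""
    img = img.astype(np.float32)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-8)

def augment_with_rotations(img, angles=(10, 20, 30)):
    """Return the original image plus small-angle rotated copies (4x the data)."""
    out = [img]
    for angle in angles:
        out.append(rotate(img, angle, reshape=False, order=1, mode='nearest'))
    return out

# Applied to training-set images only; validation and test data stay unaugmented.
# train_samples = [normalize01(x) for img in train_images
#                  for x in augment_with_rotations(img)]
```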
DWI image pre-processing
A single excitation of the SS-EPI sequence fills all of k-space, so sparse k-space undersampling is difficult to realize in an actual sequence scan. In this project, simulated accelerated scanning of WB-DWI is therefore performed with image downsampling, in the same way as for T2WI. The DWI image preprocessing comprises: (1) simulated image downsampling; (2) coronal reconstruction: the original WB-DWI images are acquired axially, while the coronal plane is best for reading, so to obtain a better reconstruction the original axial images are reconstructed into coronal images so that the reconstruction network outputs coronal images directly. The axial WB-DWI is scanned in 5 stations: head and neck, chest, abdomen, pelvis, and thighs. Coronal reconstruction is performed for the high-resolution WB-DWI and the simulated low-resolution WB-DWI of each station; each station's axial WB-DWI images are reconstructed into coronal images of 5 mm slice thickness using vtk-Python. The WB-DWI images with b values of 50 and 900 s/mm² are reconstructed into coronal images (as shown in FIG. 4), giving a high-resolution image Ref(x, y) and a low-resolution image Input(x, y) consistent with the coronal T2WI. Input(x, y) serves as the network input and Ref(x, y) as the network reference image. (3) Data normalization; (4) data augmentation. The simulated image downsampling, data normalization, and data augmentation methods are the same as in the T2WI image preprocessing. After simulated 4-fold downsampling of the DWI (FIG. 4), the downsampled low-resolution image shows obvious jagged edges and mosaic artifacts on magnification, with poor detail; the RDN network can reconstruct a high-resolution MRI image from the low-resolution MRI image.
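The coronal reconstruction itself is performed with vtk-Python in the project; the sketch below only illustrates the idea with NumPy/SciPy, assuming an axial volume stored as a (slice, row, column) array with known voxel spacing. It is a simplification under those assumptions, not the actual vtk pipeline.

```python
import numpy as np
from scipy.ndimage import zoom

def axial_to_coronal(volume, spacing, coronal_thickness=5.0):
    """Reslice an axial volume (z, y, x) into coronal slices of a given thickness.

    spacing = (sz, sy, sx) in mm. The volume is first resampled so the
    through-plane (z) spacing matches the in-plane row spacing, the axes are
    then permuted so the anterior-posterior axis becomes the slice direction,
    and finally the rows are averaged into ~5 mm coronal slabs.
    """
    sz, sy, sx = spacing
    iso = zoom(volume.astype(np.float32), (sz / sy, 1.0, 1.0), order=1)
    coronal = np.transpose(iso, (1, 0, 2))            # (y, z, x): coronal stack
    slab = max(int(round(coronal_thickness / sy)), 1)
    n = (coronal.shape[0] // slab) * slab
    coronal = coronal[:n].reshape(-1, slab, *coronal.shape[1:]).mean(axis=1)
    return coronal
```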
S3, training, validation and optimization of the deep learning-based WB-MRI fast imaging model
The 200 included WB-MRI cases are randomly divided into a training set of 180 cases and a validation set of 20 cases. The training set data are used to train the neural network model, and the validation set data are used to adjust model parameters and optimize the model. During model training, the pixel-wise loss between the fully sampled target image and the reconstructed image is computed so that the structural details of the undersampled (or downsampled) image can be recovered; the mean squared error loss (ADCloss) between the ADC maps generated from the fully sampled target image and from the reconstructed image is also computed, to ensure that the model can provide an accurate estimate of the ADC map. The acceleration factor and k-space undersampling scheme finally adopted in the project are determined from the reconstructed images of the validation set.
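A minimal PyTorch sketch of the combined training loss described above: a pixel-wise loss between the reconstruction and the fully sampled target, plus an ADCloss between ADC maps computed from the two b-value channels. The two-channel layout, the use of MSE as the pixel-wise loss, and the weighting factor lam are assumptions for illustration only.

```python
import torch

B1, B2 = 50.0, 900.0     # b-values (s/mm^2) used in this study
EPS = 1e-6

def adc_map(dwi):
    """dwi: tensor (N, 2, H, W) holding the b=50 and b=900 images.
    ADC = ln(S_b1 / S_b2) / (b2 - b1), computed voxel-wise."""
    s_b1 = dwi[:, 0:1].clamp(min=EPS)
    s_b2 = dwi[:, 1:2].clamp(min=EPS)
    return torch.log(s_b1 / s_b2) / (B2 - B1)

def training_loss(recon, target, lam=1.0):
    """Pixel-wise MSE between reconstruction and fully sampled target,
    plus ADCloss between the ADC maps generated from each."""
    pixel_loss = torch.nn.functional.mse_loss(recon, target)
    adc_loss = torch.nn.functional.mse_loss(adc_map(recon), adc_map(target))
    return pixel_loss + lam * adc_loss
```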
(1) Deep learning workstation:
and two workstations provided with RTX2080Ti x2 display cards, 128G memories, CPU Intel i9-9900k and Win10 (64-bit) systems. The entire training was done under PyTorch framework based on Python 3.7.
(2) Network selection:
the deep concatenated convolutional neural network (DV-CNN) (as shown in fig. 5) and the residual error dense network (RDN) (as shown in fig. 6) are used for the transfer learning. The DC-CNN is an MRI accelerated deep neural network based on k-space undersampling. The RDN is a new super-resolution reconstruction network that generates high-resolution images from low-resolution images. And the WB-T2WI respectively adopts a DC-CNN network and an RDN network to realize accelerated scanning, and the WB-DWI adopts the RDN network to realize accelerated scanning.
(3) Model training, verification and optimization:
(1) Transfer learning: pretrained DC-CNN and RDN weights are loaded as the initial parameters of the model, which reduces computation and training time, effectively prevents overfitting, speeds up network convergence, and improves the WB-MRI image reconstruction effect.
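A minimal PyTorch sketch of this transfer-learning initialization. TinyRDN is a placeholder stand-in for the published DC-CNN / RDN architectures, and the checkpoint path is hypothetical; only the weight-loading pattern is the point of the example.

```python
import torch
import torch.nn as nn

class TinyRDN(nn.Module):
    """Placeholder module; the project uses the published RDN / DC-CNN networks."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 3, padding=1),
        )
    def forward(self, x):
        return self.body(x) + x   # residual reconstruction

model = TinyRDN()

# Load pretrained weights (path is hypothetical) as initial parameters;
# strict=False keeps layers whose shapes match and skips the rest.
# pretrained = torch.load("rdn_pretrained.pth", map_location="cpu")
# model.load_state_dict(pretrained, strict=False)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# Fine-tune on the WB-MRI training set as in step S3.
```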
(2) Model training: the deep neural network model is trained with the training set data; each training pair consists of a fully sampled target image Ref(x, y) and a simulated undersampled (or downsampled) image Input(x, y), and a preliminary neural network model is obtained after training. The initially set network parameters are shown in Table 1 and are adjusted during project execution according to the model reconstruction results.
Table 1. Initial network parameters (provided as a figure in the original application).
(3) Model validation and optimization: the trained neural network model is tuned and optimized with the validation set; the optimal network parameters are tested and saved during training. The quality of the reconstructed images is evaluated both subjectively and objectively.
Subjective quality evaluation: the reconstructed image and the fully sampled image are compared by human visual perception, mainly in terms of overall sharpness and contrast, richness of detail in local regions, and degree of sharpening.
Objective quality evaluation: metrics such as mean squared error (MSE), peak signal-to-noise ratio (PSNR), and structural similarity (SSIM) are used to measure the reconstruction quality. PSNR and MSE emphasize the overall error level of the image, while SSIM to some extent emphasizes local detail and texture. Mathematically, MSE and PSNR are highly correlated; they compare the images pixel-wise and then average, ultimately reflecting the overall error level of the image. SSIM measures the similarity between the deep learning accelerated WB-MRI_DL and the conventional WB-MRI_F in terms of luminance, contrast, and structure.
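A minimal sketch of these objective metrics using NumPy and scikit-image, assuming images already normalized to [0, 1] as in the preprocessing step (the data_range value reflects that assumption).

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def objective_metrics(recon, ref, data_range=1.0):
    """MSE / PSNR / SSIM between a reconstructed image and the fully sampled
    reference; images are assumed normalized to [0, 1]."""
    mse = float(np.mean((recon - ref) ** 2))
    psnr = peak_signal_noise_ratio(ref, recon, data_range=data_range)
    ssim = structural_similarity(ref, recon, data_range=data_range)
    return {"MSE": mse, "PSNR": psnr, "SSIM": ssim}
```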
Quantitative evaluation of ADC: to ensure that the quantitative parameters of the deep learning accelerated WB-DWI images are consistent with the fully sampled ones, the difference between the quantitative parameters of the deep learning accelerated WB-DWI and the fully sampled WB-DWI is quantitatively evaluated with the mean squared error loss of the ADC values (ADCloss). The ADC of the accelerated image WB-DWI_DL and the conventional image WB-DWI_F is calculated with the single-exponential model ADC = [ln(S_b1 / S_b2)] / (b_2 - b_1), where b_1 and b_2 are the corresponding b values and S_b1 and S_b2 are the signal intensities at those b values; in this study b_1 = 50 s/mm² and b_2 = 900 s/mm².
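A minimal NumPy sketch of the single-exponential ADC computation and an ROI-based measurement; the ROI mask and the epsilon clamp are illustrative assumptions.

```python
import numpy as np

B1, B2 = 50.0, 900.0   # b-values in s/mm^2

def adc_map(s_b50, s_b900, eps=1e-6):
    """Single-exponential model: ADC = ln(S_b1 / S_b2) / (b2 - b1), voxel-wise."""
    return np.log(np.maximum(s_b50, eps) / np.maximum(s_b900, eps)) / (B2 - B1)

def mean_adc_in_roi(s_b50, s_b900, roi_mask):
    """Mean ADC (mm^2/s) within a drawn ROI; roi_mask is a boolean array."""
    return float(adc_map(s_b50, s_b900)[roi_mask].mean())

# ADCloss between accelerated and fully sampled WB-DWI, as used above:
# adc_loss = np.mean((adc_map(dl_b50, dl_b900) - adc_map(f_b50, f_b900)) ** 2)
```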
S4, testing of the deep learning-based WB-MRI fast imaging model
To ensure the robustness and generalization capability of the constructed deep learning model, it is tested with prospective data. K-space undersampling of the WB-T2WI sequence is realized by reducing the k-space sampling matrix, and acquisition of the low-resolution WB-T2WI and WB-DWI images can be performed by a full-time MRI technologist.
(1) Prospective data collection
From the date on which training of the deep learning-based WB-MRI fast imaging model is completed, 50 patients from our hospital with pathologically confirmed prostate cancer and suspected metastases (bone metastases, lymph node metastases, or visceral metastases such as lung, liver, or brain) are prospectively enrolled, excluding patients with other malignancies. All patients undergo deep learning accelerated WB-MRI_DL scanning and conventional fully sampled WB-MRI_F scanning. A conventional WB-MRI scan takes about 40 minutes, whereas the deep learning 4-fold accelerated WB-DWI_DL and WB-T2WI_DL sequences are scanned within 5 minutes, so the prospective scanning protocol described above adds about 15 minutes to the conventional WB-MRI scan. The following data are excluded: (1) patients who fail to complete both the deep learning accelerated WB-MRI_DL scan and the conventional WB-MRI_F scan; (2) WB-MRI images with obvious motion artifacts, susceptibility artifacts, or other factors affecting lesion diagnosis. In addition, 10 prospectively enrolled volunteers undergo two deep learning accelerated WB-MRI_DL scans, with an interval of 24 hours between the two scans.
(2) Image quality assessment
The 50 prospectively enrolled patients with suspected metastatic prostate cancer undergo both deep learning accelerated WB-MRI_DL scanning and conventional fully sampled WB-MRI_F scanning, and the image quality of WB-MRI_DL and WB-MRI_F is compared using subjective and objective quality evaluation criteria. Because high-b-value DWI images predominate in routine clinical diagnosis, WB-T2WI and b = 900 s/mm² WB-DWI images are evaluated in this project. Two physicians each perform subjective and objective quality evaluation (see the model validation section for the criteria) of the deep learning accelerated WB-MRI_DL and conventional WB-MRI_F images, and the agreement between the accelerated and conventional images is evaluated with the Kappa test (k value). Kappa criteria: 0-0.2, poor agreement; 0.21-0.40, fair agreement; 0.41-0.60, moderate agreement; 0.61-0.80, strong agreement; 0.81-1, very strong agreement.
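A minimal sketch of the agreement analysis with Cohen's kappa via scikit-learn; the example score arrays are hypothetical and stand in for one reader's subjective quality ratings of the two image sets.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical subjective quality scores given by one reader to the
# accelerated WB-MRI_DL and conventional WB-MRI_F images.
scores_dl = [4, 5, 4, 3, 5, 4, 4, 5]
scores_f  = [5, 5, 4, 4, 5, 4, 5, 5]

kappa = cohen_kappa_score(scores_dl, scores_f)
if kappa <= 0.20:
    level = "poor"
elif kappa <= 0.40:
    level = "fair"
elif kappa <= 0.60:
    level = "moderate"
elif kappa <= 0.80:
    level = "strong"
else:
    level = "very strong"
print(f"kappa = {kappa:.2f} ({level} agreement)")
```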
(3) ADC parameter estimation
Using the Numpy software package, the ADC of the accelerated image WB-DWI_DL and the conventional image WB-DWI_F is calculated according to the single-exponential model ADC = [ln(SI_1 / SI_2)] / (b_2 - b_1), where b_1 and b_2 are 50 and 900 s/mm², respectively, and SI_1 and SI_2 are the signal intensities at the corresponding b values. The mean squared error loss (ADCloss) between the ADC maps generated from the fully sampled target image and from the model-reconstructed image is compared pixel by pixel. In the 50 prospectively enrolled patients with suspected metastatic prostate cancer, ADC parameter maps are generated from WB-DWI_DL and WB-DWI_F respectively, ROIs are drawn over suspicious lesion regions, ADC values are measured on WB-DWI_DL and WB-DWI_F respectively, and the agreement of the ADC values between WB-DWI_DL and WB-DWI_F is evaluated with the Kappa test.
(4) Stability and repeatability evaluation of the deep learning-based WB-MRI fast imaging model
The stability and repeatability of the images is a prerequisite for their clinical application. The 10 prospectively enrolled healthy volunteers who undergo two deep learning accelerated WB-MRI_DL scans are used for the repeatability assessment. (1) Image quality repeatability: the first WB-MRI_DL1 and the second WB-MRI_DL2 are evaluated using the subjective and objective quality evaluation criteria, and the repeatability of the objective quality metrics (MSE, PSNR, SSIM) of the two accelerated scans is evaluated with the intraclass correlation coefficient (ICC). (2) ADC value repeatability: in each healthy volunteer, 20 ROIs are drawn at matched locations on the ADC maps generated from WB-DWI_DL1 and WB-DWI_DL2, and the differences between WB-DWI_DL1-ADC and WB-DWI_DL2-ADC are compared. The repeatability of the ADC values of the two accelerated scans is evaluated with the ICC. The ICC ranges from 0 to 1: < 0.4 indicates poor repeatability, 0.4-0.75 moderate repeatability, and > 0.75 good repeatability.
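A minimal sketch of the ICC-based repeatability analysis. The pingouin package is an assumption here (any ICC implementation could be substituted), and the long-format scan-rescan ADC values are hypothetical.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one ADC measurement per ROI ("roi")
# from each of the two accelerated scans ("scan").
df = pd.DataFrame({
    "roi":  [1, 1, 2, 2, 3, 3, 4, 4],
    "scan": ["DL1", "DL2"] * 4,
    "adc":  [0.85e-3, 0.88e-3, 1.10e-3, 1.07e-3, 0.95e-3, 0.97e-3, 1.30e-3, 1.28e-3],
})

icc = pg.intraclass_corr(data=df, targets="roi", raters="scan", ratings="adc")
print(icc[["Type", "ICC"]])
# < 0.4 poor, 0.4-0.75 moderate, > 0.75 good repeatability
```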
S5, application of WB-MRI rapid imaging model based on deep learning in metastatic prostate cancer assessment
The 50 prospectively enrolled patients with suspected metastatic prostate cancer are followed up for more than 6 months; patients who receive treatment during follow-up return within 1-3 months after treatment for a deep learning accelerated WB-MRI_DL_after scan and a conventional WB-MRI_F_after scan, to explore the application of the deep learning-based WB-MRI fast imaging model in metastatic prostate cancer assessment.
Diagnostic criteria for metastases: metastatic sites include bone metastases, lymph node metastases, and visceral metastases such as lung, liver, and brain. A lesion is diagnosed as a metastasis if (1) it is pathologically confirmed as a metastasis, (2) it enlarges during follow-up, or (3) it shrinks after treatment. A lesion is diagnosed as non-metastatic if its size does not change during follow-up or if it is unchanged before and after treatment.
(1) Evaluation of diagnostic efficacy: the diagnostic efficacy of WB-MRI_DL for prostate cancer metastases is evaluated. Two radiologists (with 5 and 7 years of prostate MRI diagnostic experience, respectively) independently assess the first WB-MRI_DL and WB-MRI_F of each enrolled patient, judging whether metastases are present from the head and neck, chest, abdomen, and pelvis to the mid-thigh using a 5-point visual scoring system (see the article PMID: 18539889) for the likelihood of metastasis in each patient: 1, certainly no metastasis; 2, probably no metastasis; 3, equivocal (possibly present or not); 4, probably metastasis; 5, certainly metastasis. The location and number of metastases are recorded, and the final diagnosis is determined by joint discussion between the two physicians. Inter-observer agreement for WB-MRI_DL and WB-MRI_F is evaluated. Receiver operating characteristic (ROC) curves are used to compare the diagnostic efficacy of WB-MRI_DL and WB-MRI_F for prostate cancer metastases, and differences in sensitivity, specificity, and accuracy between the two scan images are compared with the paired chi-square test (McNemar's test).
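A minimal sketch of this diagnostic-efficacy comparison: ROC analysis of the 5-point scores and McNemar's paired test on dichotomized calls. The example arrays, the score threshold of 4 for a positive call, and the use of scikit-learn and statsmodels are all assumptions for illustration.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical per-patient data: ground truth (1 = metastasis) and the
# 5-point visual scores given on WB-MRI_DL and WB-MRI_F.
truth    = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
score_dl = np.array([5, 4, 2, 5, 1, 3, 4, 2, 5, 1])
score_f  = np.array([5, 5, 1, 4, 2, 2, 5, 1, 4, 1])

auc_dl = roc_auc_score(truth, score_dl)
auc_f  = roc_auc_score(truth, score_f)

# Dichotomize (score >= 4 called positive) and build the paired 2x2 table
# of correct / incorrect calls for McNemar's test.
correct_dl = (score_dl >= 4) == truth.astype(bool)
correct_f  = (score_f  >= 4) == truth.astype(bool)
table = np.array([
    [np.sum( correct_dl &  correct_f), np.sum( correct_dl & ~correct_f)],
    [np.sum(~correct_dl &  correct_f), np.sum(~correct_dl & ~correct_f)],
])
result = mcnemar(table, exact=True)
print(f"AUC_DL = {auc_dl:.2f}, AUC_F = {auc_f:.2f}, McNemar p = {result.pvalue:.3f}")
```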
(2) Evaluation of treatment response of metastases: the 50 patients with suspected metastatic prostate cancer are followed up for more than 6 months; patients who receive treatment during follow-up return within 1-3 months after treatment for a deep learning accelerated WB-MRI_DL_after scan and a conventional WB-MRI_F_after scan. Paired t-tests and paired chi-square tests are used to compare changes in metastases (lesion size, number, signal intensity, ADC values of major lesions, etc.) between the pre-treatment WB-MRI_DL and post-treatment WB-MRI_DL_after images, with the pre-treatment WB-MRI_F and post-treatment WB-MRI_F_after as the reference. Two physicians independently evaluate the depiction of metastases on the accelerated and conventional images, and inter-observer agreement is assessed with the Kappa test.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. A whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer based on deep learning is characterized by comprising the following steps:
S1, retrospective data collection: WB-MRI data are retrospectively collected to construct a training set and a validation set; 200 WB-MRI examinations are retrospectively included according to the enrollment criteria and randomly divided into a training set of 180 cases and a validation set of 20 cases; the training set data are used to train the neural network model, and the validation set data are used to adjust model parameters and optimize the model;
s2, image preprocessing: accelerating scanning of WB-MRI is realized by adopting two modes of T2WI image preprocessing and DWI image preprocessing, so that an undersampled (or downsampled) input image and a fully sampled target image required by deep network training are obtained;
S3, training, validation and optimization of the deep learning-based WB-MRI fast imaging model: a deep cascaded convolutional neural network (DC-CNN) and a residual dense network (RDN) are used as the base networks and combined with transfer learning; the retrospective data are used to learn the mapping relationship, i.e. the network parameter weights, between the input image (undersampled or downsampled image) and the target image (fully sampled image), so as to obtain a deep learning-based WB-MRI fast imaging model;
s4, testing of a WB-MRI rapid imaging model based on deep learning: prospective collection of 50 suspected metastatic prostate cancer patients and 10 healthy volunteers for testing of the constructed deep learning model;
S5, application of the deep learning-based WB-MRI fast imaging model to metastatic prostate cancer assessment: the 50 prospectively enrolled patients with pathologically confirmed prostate cancer and suspected metastases all undergo deep learning accelerated WB-MRI_DL scanning and conventional fully sampled WB-MRI_F scanning, to explore the application of the deep learning-based WB-MRI fast imaging model in metastatic prostate cancer assessment.
2. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 1, wherein: in step S2, the T2WI image preprocessing includes:
(1) Simulated k-space undersampling;
(2) Coronal reconstruction;
(3) Data normalization;
(4) Data augmentation.
3. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 1, wherein: in step S2, the DWI image preprocessing includes:
(1) Simulated image downsampling;
(2) Coronal reconstruction;
(3) Data normalization;
(4) Data augmentation.
4. The whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer based on deep learning of claim 1, wherein: in step S2, the retrospectively collected T2WI images undergo simulated accelerated scanning using both k-space undersampling and image downsampling, while the DWI images are accelerated using image downsampling only.
5. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 1, wherein: in step S3, the DC-CNN is used to learn the mapping relationship between the undersampled image and the fully sampled image, and the RDN is used to learn the mapping relationship between the downsampled (low-resolution) image and the full-resolution image.
6. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 1, wherein: the step S4 specifically includes:
(1) Image quality and ADC parameter evaluation: the 50 prospectively enrolled patients with metastatic prostate cancer undergo both actual undersampled (or downsampled) WB-MRI scanning and conventional fully sampled scanning, giving the deep learning accelerated image WB-MRI_DL and the fully sampled image WB-MRI_F; differences in image quality and ADC values between WB-MRI_DL and WB-MRI_F are compared;
(2) Evaluation of model stability and repeatability: 10 prospectively enrolled healthy volunteers undergo two deep learning accelerated WB-MRI scans, and the repeatability of the WB-MRI_DL images reconstructed by the deep learning-based fast imaging model and the repeatability of the ADC values are evaluated.
7. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 1, wherein: the step S5 specifically includes:
(1) Evaluation of diagnostic efficacy: the diagnostic efficacy of WB-MRI_DL and WB-MRI_F for prostate cancer metastases is compared;
(2) Evaluation of treatment response of metastases: the 50 patients with suspected metastatic prostate cancer are followed up; patients who receive treatment during follow-up return within 1-3 months after treatment for a deep learning accelerated WB-MRI_DL_after scan and a conventional WB-MRI_F_after scan; two physicians independently evaluate the depiction of metastases on the accelerated and conventional images, and changes in metastatic lesions between the pre-treatment WB-MRI_DL and post-treatment WB-MRI_DL_after images are compared.
8. The deep learning-based whole-body magnetic resonance fast imaging method for assessing metastatic prostate cancer according to claim 7, wherein: the changes in metastases compared between the pre-treatment WB-MRI_DL and post-treatment WB-MRI_DL_after images include lesion size, number, signal intensity, ADC values of major lesions, etc.
CN202211008134.9A 2022-08-22 2022-08-22 Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning Withdrawn CN115345954A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211008134.9A CN115345954A (en) 2022-08-22 2022-08-22 Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211008134.9A CN115345954A (en) 2022-08-22 2022-08-22 Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning

Publications (1)

Publication Number Publication Date
CN115345954A true CN115345954A (en) 2022-11-15

Family

ID=83954927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211008134.9A Withdrawn CN115345954A (en) 2022-08-22 2022-08-22 Whole-body magnetic resonance fast imaging method for evaluating metastatic prostate cancer based on deep learning

Country Status (1)

Country Link
CN (1) CN115345954A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20221115