CN111583179B - Lung nodule deep learning classification method based on floating cutting - Google Patents

Lung nodule deep learning classification method based on floating cutting Download PDF

Info

Publication number
CN111583179B
CN111583179B CN202010255922.2A CN202010255922A CN111583179B CN 111583179 B CN111583179 B CN 111583179B CN 202010255922 A CN202010255922 A CN 202010255922A CN 111583179 B CN111583179 B CN 111583179B
Authority
CN
China
Prior art keywords
image
model
nodule
sequence
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202010255922.2A
Other languages
Chinese (zh)
Other versions
CN111583179A (en
Inventor
高峰
张仕瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN202010255922.2A priority Critical patent/CN111583179B/en
Publication of CN111583179A publication Critical patent/CN111583179A/en
Application granted granted Critical
Publication of CN111583179B publication Critical patent/CN111583179B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30061Lung
    • G06T2207/30064Lung nodule

Abstract

The invention discloses a lung nodule deep learning classification method based on floating cutting, which comprises the steps of establishing a CT image lung nodule classification neural network model, and setting a lung nodule CT image sequence as a two-dimensional lung nodule image sequence at X, Y axis obtained by cutting a lung nodule along the Z axis direction; setting an image containing a three-dimensional center position of a lung nodule in a lung nodule CT image sequence as a center image; screening images of known nodule classifications as training, verification and test samples; determining a center image in the image sequence and a nodule center position and a nodule length on the center image; during training, floating cutting is carried out on the training sample sequence along the positive and negative directions of the Z axis by adopting the changed cutting interval, so as to obtain a plurality of groups of dynamic training sample sequences; training the model by adopting a dynamic training sample sequence; and during testing, sequentially inputting the upper and lower images at the nodule into the model to obtain a predicted value, and obtaining a comprehensive predicted value according to the weight of the predicted value. The invention can learn three-dimensional image information by two-dimensional input and improve the performance of the model.

Description

Lung nodule deep learning classification method based on floating cutting
Technical Field
The invention relates to the field of medical image processing, in particular to a lung nodule deep learning classification method based on floating cutting.
Background
At present, due to the aggravation of air pollution, the increase of smoking population, the acceleration of life rhythm, the higher working pressure and the like, lung cancer becomes one of the cancers with the highest morbidity and mortality in the global scope. The 5-year survival rate of a patient diagnosed with lung cancer is only 15.6%, and the resection operation performed by error diagnosis causes additional burden to the body of the patient, and the patient with weak constitution can not even bear the operation process, so that the improvement of the early diagnosis accuracy rate of the patient is very important. Since early lung nodules are small in volume and difficult to detect with chest radiographs with low resolution, Computerized Tomography (CT) is the most common method for detecting lung diseases at present, and physicians diagnose lung diseases by observing CT images of the lungs. However, the medical features of lung nodules are mostly obtained by clinical statistical induction, and hundreds of CT images are acquired from each patient, resulting in some subjective factors of the physician and a large workload.
In recent years, research on computer-aided diagnosis (CAD) methods based on medical image processing and pattern recognition has been conducted at home and abroad. Quantitative objective analysis is formed through CAD to assist a doctor in making a diagnosis decision, which can improve the diagnosis accuracy to a certain extent, but the traditional method needs to manually extract the texture feature or the shape feature of the lung nodule, and although each feature value has corresponding definition, the feature values have strong subjectivity and still depend on the experience of researchers seriously.
Since Geoffrey Hinton proposed a Deep Belief Network (DBN), a deep learning method that does not require manual setting of extraction features has caused a surge in academia. There are a lot of researchers studying the application of deep learning methods in the classification of benign and malignant pulmonary nodules. Some use specific images in the CT sequence selected by the physician and some use 3D images as model input. In the former method, it is difficult to ensure that the selected image contains the most nodule information, possibly due to physician subjectivity or irregularity of nodule shape, and the nodule important information does not necessarily exist in only one image in the sequence of images. In the latter method, the model input is a three-dimensional matrix, so that the model has more parameters needing to be trained, and the resource consumption of calculation, storage and the like is large. And because the number of usable images which are acquired from a patient and accurately marked with information such as node positions, lengths, types and the like by doctors is limited, the premise that a deep learning model needs a large number of samples in the training process is difficult to meet.
In summary, how to use limited lung CT image samples to effectively train a deep learning model is one of the technical problems that needs to be solved urgently at present.
Disclosure of Invention
The invention provides a lung nodule deep learning classification method based on floating cutting, which improves the prediction precision and solves the technical problems in the prior art.
The technical scheme adopted by the invention for solving the technical problems in the prior art is as follows: a lung nodule deep learning classification method based on floating cutting is characterized in that a CT image lung nodule classification neural network model is established, and a lung nodule CT image sequence is set as a two-dimensional lung nodule image sequence which is obtained by cutting lung nodules along the Z-axis direction and is located on X, Y axes; setting an image containing a three-dimensional center position of a lung nodule in a lung nodule CT image sequence as a center image; collecting a plurality of groups of lung nodule CT image sequences classified by known nodules, determining a central image of each group of lung nodule CT image sequences and a nodule central position and nodule length on the central image, dividing the lung nodule CT image sequences into three parts, wherein one part is used as a training sample set, one part is used as a test sample set, and the central image in the other part is used as a verification sample set; during training, floating cutting is carried out on the lung nodule CT image sequence in the training sample set along the positive and negative directions of the Z axis by adopting the changed cutting interval, so as to obtain a plurality of groups of dynamic training sample sequences; training the model by adopting a dynamic training sample sequence; after each group of dynamic training sample sequences is used for training the model, a verification sample is input into the model to preliminarily evaluate the performance of the model; when the result of the preliminary evaluation does not reach the set index, continuing training the model; when the result of the preliminary evaluation reaches a set index, testing the model by adopting a test sample sequence, and evaluating the performance of the model; when the evaluation result does not reach the set index, continuing training and primarily evaluating the model; and when the evaluation result reaches the set index, ending the training.
Furthermore, when the training sample sequence is subjected to floating cutting along the positive and negative directions of the Z axis by adopting a dynamic cutting interval, the floating amplitude is set to limit the number of floating cutting layers, so that each layer of image after floating cutting is in the nodule area; the floatable amplitude is determined by the CT image cutting interlayer spacing, the nodule length in the central image and a control factor, wherein the control factor is a limiting coefficient, and the value range is 0.6-1; in the model training, the control factors are initially set, and are gradually adjusted and determined according to the model test result.
Further, the method for training and testing the CT image pulmonary nodule classification neural network model comprises the following steps:
step 1, setting performance indexes and initial control factors of a model;
step 2, determining floatable amplitude;
step 3, randomly selecting a cutting interval within the floatable range to perform floating cutting on the training sample sequence to obtain a dynamic training sample sequence;
step 4, training the model by using a group of dynamic training sample sequences;
step 5, inputting the verification sample into the trained model, and performing primary evaluation on the performance of the model;
step 6, judging whether the performance of the model reaches a set index or not according to the preliminary evaluation result; if the model performance reaches the set index, performing step 7; if the model performance does not reach the set index, repeating the steps 3 to 5;
step 7, testing the verified model by using the test sample sequence, and evaluating the performance of the model;
step 8, judging whether the performance of the model reaches a set index or not according to the evaluation result; if the model performance reaches the set index, ending the training; and if the model performance does not reach the set index, adjusting the control factor, and repeating the step 2 to the step 7.
Further, the original cutting interlayer spacing of the CT image is set to be s, the nodule length of the central image is set to be l, the control factor is v, R 'is floatable amplitude, and the expression of R' is as follows: r' ═ INT (ν R), where R is an intermediate variable, expressed as:
Figure BDA0002437309560000031
further, a group of lung nodule CT image sequences in the test sample set is used as a group of test sample sequences; marking the sequence ordinal number of each group of test sample sequences, and setting corresponding weight of a predicted value corresponding to the sequence ordinal number; and sequentially inputting a group of test sample sequences to the trained CT image lung nodule classification neural network model according to ordinal number, and combining the model output result of each layer of image with the weight of the predicted value to obtain the comprehensive predicted value of the group of test sample sequences in the nodule region.
Further, setting the sequence ordinal number of a middle image in a group of test sample sequences as 0, marking the test sample with the sequence ordinal number along the positive direction of the Z axis by taking the middle image as an origin, wherein the sequence ordinal number is a positive value, and marking the test sample with the sequence ordinal number along the negative direction of the Z axis, and the sequence ordinal number is a negative value; setting the image sequence ordinal number of the test sample as d, and setting the weight of a predicted value of the image sequence ordinal number d of the corresponding test sample as beta; the calculation of β is shown below:
Figure BDA0002437309560000032
wherein R' is floatable amplitude.
Further, a comprehensive predicted value of a group of test sample sequences in the nodule region is set as P, and the calculation method is shown as the following formula:
Figure BDA0002437309560000033
in the formula: n is a radical of P The number of images in the starting and ending sequence of the test sample sequence; beta is a i Weighting a predicted value of an ith layer image in the test sample sequence; p i R' is a floatable amplitude and is a predicted value of an ith layer image.
The invention has the advantages and positive effects that: and in the process of training the model, a two-dimensional image obtained by floating cutting is used as model input, and the model is trained by each sample of the dynamic adjustment learning rate. In the process of applying the deep learning method aiming at the pulmonary nodule classification of the lung CT image, model training is dynamically carried out to achieve the aim of indirectly increasing training samples. In the training process, the model parameters are less, larger calculation and storage resources are not needed, and meanwhile, the classification model can train and learn the integral three-dimensional image information of the lung nodule, so that the model performance is improved. When the model is tested, the comprehensive predicted value of the test sample sequence in the nodule region is obtained by combining the test sample whole sequence image of the nodule region and the corresponding predicted value weight, so that the predicted value of the model does not only depend on the expert marked intermediate image, and the subjective influence is reduced. The invention can also be applied to other classified neural network models which take the two-dimensional images as model input to classify the three-dimensional images.
Drawings
FIG. 1 is a flow chart of the operation of the present invention.
Fig. 2 is a schematic diagram of a group of training sample sequences performing floating cutting along the positive and negative directions of the Z-axis.
Figure 3 is a schematic representation of a set of two-dimensional lung nodule image sequences.
In the figure: i is the central image, d is the image ordinal, R is the number of unilateral lung nodule crossings, and R' is the floatable amplitude.
Detailed Description
For a further understanding of the contents, features and effects of the invention, reference will now be made to the following examples, which are to be read in connection with the accompanying drawings, wherein:
referring to fig. 1 to 3, a method for classifying lung nodules by deep learning based on floating cutting is disclosed, which includes establishing a lung nodule classification neural network model of a CT image, and setting a lung nodule CT image sequence as a two-dimensional lung nodule image sequence at X, Y axis obtained by cutting lung nodules along a Z-axis direction; setting an image containing a three-dimensional center position of a lung nodule in a lung nodule CT image sequence as a center image I; the method comprises the steps of collecting a plurality of groups of lung nodule CT image sequences of known nodule classification, determining a central image I of each group of lung nodule CT image sequences and a nodule central position and nodule length on the central image I, dividing the lung nodule CT image sequences into three parts, using a part of lung nodule CT image sequences as a training sample set, using a part of lung nodule CT image sequences as a test sample set, and using a central image I in a part of lung nodule CT image sequences as a verification sample set.
During training, floating cutting is carried out on the training sample sequence along the positive and negative directions of the Z axis by adopting the changed cutting interval, so as to obtain a plurality of groups of dynamic training sample sequences; training a lung nodule classification neural network model of the CT image by adopting a dynamic training sample sequence; each layer of image in each group of dynamic training sample sequence can be intercepted for the nodule region marked by the doctor, so that the nodule is positioned in the intercepted image region. And after the image is cut, corresponding to the input size of the CT image pulmonary nodule classification neural network model, adjusting the size of the image to be suitable for being input into the CT image pulmonary nodule classification neural network model, and inputting the adjusted image into the CT image pulmonary nodule classification neural network model for training.
And inputting the test sample sequence into the trained model for testing, and evaluating the model. And (4) carrying out image interception on the test sample sequence by adopting the same method as the training sample. And adjusting the image to be the input size of the CT image pulmonary nodule classification neural network model, and inputting the input size to the CT image pulmonary nodule classification neural network model for testing. And inputting the processed test sample image sequence into the trained model for testing.
And intercepting the image of the verification sample for the nodule area marked by the doctor, so that the nodule is positioned in the intercepted image area. The image can be intercepted and processed by referring to the method, and after the model is trained by using a group of dynamic training sample sequences, the performance of the model is preliminarily evaluated by inputting a verification sample into the model.
Before training, the performance indexes of the model are set, and the performance indexes of the model can comprise accuracy, sensitivity and specificity which can be obtained by a confusion matrix, an ROC curve, an area under the ROC curve and the like. And evaluating whether the performance of the model meets the expected model performance by using the set index of the model performance.
When the result of the preliminary evaluation does not reach the set index, continuing training the model; when the result of the preliminary evaluation reaches a set index, testing the model by adopting a test sample sequence, and evaluating the performance of the model; when the evaluation result does not reach the set index, continuing training and primarily evaluating the model; and when the evaluation result reaches the set index, finishing the training.
The lung nodule CT image sequence in the test sample set can be subjected to image interception and processing by adopting the same method as the dynamic training sample sequence. And then inputting the CT image sequence in the processed test sample set into the verified CT image lung nodule classification neural network model, and evaluating the model performance according to the output result of the CT image lung nodule classification neural network model.
Furthermore, when the training sample sequence is subjected to floating cutting along the positive and negative directions of the Z axis by adopting a dynamic cutting interval, the floating amplitude can be set to limit the number of layers of floating cutting, so that each layer of image after floating cutting is in the nodule area; the floatable amplitude is the range of the cuttable layer number; the floatable amplitude can be determined by the CT image cutting interlayer distance, the nodule length in the central image I and a control factor, and the control factor is set for preventing the image of the region outside the nodule obtained by cutting in the floating cutting process. Wherein the control factor is a limiting coefficient, and the value range can be 0.6-1; in the training of the CT image pulmonary nodule classification neural network model, the control factors can be initially set, and the control factors can be gradually adjusted and determined according to the test result of the CT image pulmonary nodule classification neural network model.
The floating cut is a process in which the center of a nodule marked by a physician is used as an origin, the amplitude obtained by the CT image layer spacing, the nodule length and a control factor is used as a range, and a CT image is randomly taken in the range and used as model input.
Further, the method for training and testing the lung nodule classification neural network model of the CT image can comprise the following steps:
step 1, setting performance indexes and initial control factors of a model;
step 2, determining floatable amplitude;
step 3, randomly selecting a cutting interval within the floatable range to perform floating cutting on the training sample sequence to obtain a dynamic training sample sequence;
step 4, training the model by using a group of dynamic training sample sequences; the learning rate of the training samples may be calculated prior to training.
Step 5, inputting the verification sample into the trained model, and performing primary evaluation on the performance of the model;
step 6, judging whether the performance of the model reaches a set index or not according to the preliminary evaluation result; if the model performance reaches the set index, performing step 7; if the model performance does not reach the set index, repeating the steps 3 to 5;
step 7, testing the verified model by using the test sample sequence, and evaluating the performance of the model;
when the evaluation is performed, the weighted average of the model output values of a plurality of test samples corresponding to one nodule can be used as the comprehensive predicted value.
For example, the image ordinal number of each layer of image in each group of test sample sequence can be labeled, and corresponding predicted value weight can be set corresponding to the image ordinal number; a group of test sample sequences can be sequentially input into the trained CT image lung nodule classification neural network model according to the image ordinal number, and the model output result of each layer of image can be combined with the weight of a predicted value to obtain the comprehensive predicted value of the group of test sample sequences in a nodule region.
Step 8, judging whether the performance of the model reaches a set index or not according to the evaluation result; if the model performance reaches the set index, ending the training; and if the model performance does not reach the set index, adjusting the control factor, and repeating the step 2 to the step 7.
Further, the original cutting layer interval of the CT image can be set to s, the nodule length of the central image I is set to l, the control factor v, R 'is the floatable amplitude, and the expression of R' can be: r' ═ INT (ν R), where R can be an intermediate variable, which can be expressed as the theoretical number of unilaterally penetrating layers of the lung nodule, expressed as:
Figure BDA0002437309560000061
further, a set of lung nodule CT image sequences in the test sample set may be taken as a set of test sample sequences; the image ordinal number of each layer of image in each group of test sample sequence can be marked, and corresponding predicted value weight can be set corresponding to the image ordinal number; a group of test sample sequences can be sequentially input into the trained CT image lung nodule classification neural network model according to the image ordinal number, and the model output result of each layer of image can be combined with the weight of a predicted value to obtain the comprehensive predicted value of the group of test sample sequences in a nodule region. The composite predictor is an output weighted value of a group of images of the test sample sequence, and can be a weighted average value.
Further, the image ordinal number of the test sample can be set as d, the image ordinal number d of the intermediate image in the group of test sample sequences can be set as 0, the intermediate image can be used as an origin, the test sample is labeled with the image ordinal number d along the positive direction of the Z axis, the image ordinal number d is a positive value, the test sample is labeled with the image ordinal number d along the negative direction of the Z axis, and the image ordinal number d is a negative value; the image ordinal number d numbering method comprises the following steps: numbering image ordinal numbers d in a group of test sample sequences, wherein the image ordinal number d of an intermediate image is 0, namely the label d is 0, and numbering is carried out in sequence from the intermediate image along the positive direction of a Z axis, and the numbering is the image ordinal number d; at this time, d is a positive value and is +1, +2, + 3., + n in sequence; sequentially numbering from the middle image along the Z-axis negative direction, wherein the numbering is an image ordinal number d; in this case d is negative and in turn-1, -2, -3.
Setting the image ordinal number of the test sample as d, and setting the weight of a predicted value of the image ordinal number d corresponding to the test sample as beta; the method of calculating β is shown by the following formula:
Figure BDA0002437309560000071
wherein R' is floatable amplitude.
Further, a comprehensive predicted value of a set of test sample sequences in the nodule region may be set as P, and the calculation method may be as follows:
Figure BDA0002437309560000072
in the formula: n is a radical of P The number of images in the starting and ending sequence of the test sample sequence; beta is a i Weighting the predicted value of the ith layer image in the test sample sequence; p i R' is a floatable amplitude and is a predicted value of an ith layer image.
The working process and working principle of the present invention are further described below by taking a preferred embodiment of the present invention as an example:
a lung nodule deep learning classification method based on floating cutting is a method for improving the classification performance of a deep learning model by dynamically carrying out model training to indirectly increase training samples in the process of applying a deep learning method to the lung nodule classification of a lung CT image. The method is mainly suitable for a Convolutional Neural Network (CNN) or a deep learning model taking image feature extraction as a primary task. Referring to fig. 3, the present invention mainly depends on the increase and decrease direction (left and right direction in fig. 3) of the lung nodules from the middle position marked by the physician in the lung CT image along with the sequence, and the length, area, image characteristics, etc. of the lung nodules are gradually reduced and disappeared, so that the learning rate of each sample obtained by floating segmentation is inversely proportional to the degree of the lung nodules deviating from the middle position.
Referring to fig. 1, the work flow of the present invention mainly includes the following steps:
establishing a pulmonary nodule classification neural network model of a CT image, wherein the input size of the model is w m ×h m And the number of model output nodes is 1 or 2.
A, lung CT image sample data preprocessing for training and testing a model:
step A-1, a lung CT image with well-known nodule classification and image interlamellar spacing s is screened from the existing case data.
And step A-2, outputting the node number to be 1 or 2 according to the classification model to be trained, for example, setting a class label to be [0] or [0,1] for a benign node or cystic node, and setting a class label to be [1] or [1,0] for a malignant node or a real node. And labeling related information of each nodule in the CT image by an experienced radiology expert, wherein the information comprises: in the CT image sequence, a central image I where the three-dimensional center of the nodule is located, the position (x, y) where the center of the nodule is located in the central image I, the length l of the nodule in the central image I, the image cutting size w multiplied by h and the like.
And step A-3, setting a control factor v belonging to [0.6, 1] for preventing the image of the region outside the nodule from being obtained by cutting during floating cutting, wherein the value can be adjusted according to the performance of the model.
Step B, training a CT image pulmonary nodule classification neural network model:
step B-1, calculating the number of layers penetrating through one side of the lung nodule according to the length l of the lung nodule and the layer spacing s of the CT image
Figure BDA0002437309560000082
And setting R 'as floatable amplitude, and calculating by using a control factor v to obtain the floatable amplitude R' ═ INT (ν R) when the training model is obtained.
Step B-2, as shown in figure 3,
Figure BDA0002437309560000083
the image number indicates the direction (the position of the image I in the step a-2, which is the nodule-labeled intermediate layer, is set to "d" equal to 0). Randomly taking values in each nodule region d, cutting the nodule region d from the position (x, y) in the corresponding image to obtain a size w multiplied by h image, and normalizing the pixel value to [0, 1%]Range, then adjust image length and width to model input size w m ×h m As the round training sample.
Learning rate η of each sample (S) For the reduction value of the current model learning rate, the reduction coefficient is the inverse ratio of the image ordinal number d and the ratio of each floatable amplitude R', and the calculation formula is shown as the following formula:
Figure BDA0002437309560000081
in the formula, η is the model learning rate.
And step B-3, training the deep learning model by the samples of the dynamically adjusted learning rates obtained in the step B-2, namely inputting training samples into the model, calculating loss function values through a model loss function and corresponding class labels, updating model weights through a gradient descent method and the like, and adjusting the model learning rates according to an optimization algorithm set in the model and the like. And if the model training is finished, finishing the training of the model, otherwise returning to the step B-1 to continue the training.
And step C, testing the lung nodule classification neural network model of the CT image.
Step C-1, aiming at the lung nodule CT image of the test sample, marking the starting and ending numbers of the image sequence of the nodule region in the CT image and the position (x) of the center of the nodule in the intermediate image by a radiology expert P ,y P ) Image cutting size w P ×h P
Step C-2, as shown in fig. 2, let the image ordinal number of the test sample be d, and the sequence middle position be d ═ 0. Setting the weight of a predicted value of the image ordinal number d corresponding to the test sample as beta; the calculation of β is shown below:
Figure BDA0002437309560000091
in the formula, R' is floatable amplitude.
Step C-3, position (x) in each image of sequence start and stop numbers P ,y P ) Cutting to obtain w P ×h P Image post-normalization to [0,1] pixel values]Range, then adjust image length and width to model input size w m ×h m And then sequentially passing through the trained CT image lung nodule classification neural network model, calculating by combining with the weight of the predicted value to obtain a final comprehensive predicted value aiming at the nodule region, and setting P as the comprehensive predicted value of the test sample sequence in the nodule region, wherein the calculation method can be shown as the following formula:
Figure BDA0002437309560000092
in the formula: n is a radical of P The number of images in the starting and ending sequence of the test sample sequence; beta is a i Weighting the predicted value of the ith layer image in the test sample sequence; p is i R' is a floatable amplitude and is a predicted value of an ith layer image.
And D, adjusting a control factor according to the classification performance of the model in the step C-3.
And D-1, if the model performance is in accordance with the expectation, the whole process is ended, otherwise, the step A-3 is returned to adjust the control factors. The above-mentioned embodiments are only for illustrating the technical ideas and features of the present invention, and the purpose thereof is to enable those skilled in the art to understand the contents of the present invention and to carry out the same, and the present invention shall not be limited to the embodiments, i.e. the equivalent changes or modifications made within the spirit of the present invention shall fall within the scope of the present invention.

Claims (5)

1. A lung nodule deep learning classification method based on floating cutting is characterized in that a CT image lung nodule classification neural network model is established, and a lung nodule CT image sequence is set as a two-dimensional lung nodule image sequence which is obtained by cutting a lung nodule along the Z-axis direction and is in an X, Y axis; setting an image containing a three-dimensional center position of a lung nodule in a lung nodule CT image sequence as a center image; collecting a plurality of groups of lung nodule CT image sequences classified by known nodules, determining a central image of each group of lung nodule CT image sequences and a nodule central position and nodule length on the central image, dividing the lung nodule CT image sequences into three parts, wherein one part is used as a training sample set, one part is used as a test sample set, and the central image in the other part is used as a verification sample set; during training, floating cutting is carried out on the lung nodule CT image sequence in the training sample set along the positive and negative directions of the Z axis by adopting dynamic cutting intervals to obtain a plurality of groups of dynamic training sample sequences; training the model by adopting a dynamic training sample sequence; after each group of dynamic training sample sequences is used for training the model, a verification sample is input into the model to preliminarily evaluate the performance of the model; when the result of the preliminary evaluation does not reach the set index, continuing training the model; when the result of the preliminary evaluation reaches a set index, testing the model by adopting a test sample sequence, and evaluating the performance of the model; when the evaluation result does not reach the set index, continuing training and primarily evaluating the model; when the evaluation result reaches the set index, ending the training; taking a group of lung nodule CT image sequences in the test sample set as a group of test sample sequences; marking the sequence ordinal number of each group of test sample sequences, and setting corresponding weight of a predicted value corresponding to the sequence ordinal number; sequentially inputting a group of test sample sequences to the trained CT image lung nodule classification neural network model according to ordinal number, and combining the model output result of each layer of image with the weight of a predicted value to obtain the comprehensive predicted value of the group of test sample sequences in a nodule region;
setting the sequence ordinal number of a middle image in a group of test sample sequences as 0, taking the middle image as an origin, marking the test sample with the sequence ordinal number along the positive direction of a Z axis, wherein the sequence ordinal number is a positive value, marking the test sample with the sequence ordinal number along the negative direction of the Z axis, and the sequence ordinal number is a negative value; setting the image sequence ordinal number in the test sample sequence as d, and setting the weight of a predicted value corresponding to the image sequence ordinal number d in the test sample sequence as beta; the method of calculating β is shown by the following formula:
Figure FDA0003771888620000011
in the formula, R' is floatable amplitude; the floatable amplitude is used for limiting the number of floating cutting layers, so that each layer of images after floating cutting are in the nodule area.
2. The lung nodule deep learning classification method based on floating cutting according to claim 1, wherein when the training sample sequence is subjected to floating cutting along the positive and negative directions of the Z axis by adopting a dynamic cutting interval, the floatable amplitude is determined by the CT image cutting interval, the nodule length in the central image and a control factor, wherein the control factor is a limiting coefficient, and the value range is 0.6-1; in the model training, the control factors are initially set, and are gradually adjusted and determined according to the model test result.
3. The lung nodule deep learning classification method based on the floating cutting as claimed in claim 2, wherein the method for training and testing the CT image lung nodule classification neural network model comprises the following steps:
step 1, setting performance indexes and initial control factors of a model;
step 2, determining floatable amplitude;
step 3, randomly selecting a cutting interval within the floatable range to perform floating cutting on the training sample sequence to obtain a dynamic training sample sequence;
step 4, training the model by using a group of dynamic training sample sequences;
step 5, inputting the verification sample into the trained model, and performing primary evaluation on the performance of the model;
step 6, judging whether the performance of the model reaches a set index or not according to the preliminary evaluation result; if the model performance reaches the set index, performing step 7; if the model performance does not reach the set index, repeating the steps 3 to 5;
step 7, testing the verified model by using the test sample sequence, and evaluating the performance of the model;
step 8, judging whether the performance of the model reaches a set index or not according to the evaluation result; if the model performance reaches the set index, ending the training; and if the model performance does not reach the set index, adjusting the control factor, and repeating the step 2 to the step 7.
4. The method for classifying pulmonary nodule deep learning based on floating cutting as claimed in claim 2, wherein the original cutting layer spacing of the CT image is s, the nodule length of the central image is l, the control factor is v, R 'is floatable amplitude, and the expression of R' is: r' ═ INT (ν R), where R is an intermediate variable, expressed as:
Figure FDA0003771888620000021
5. the method for classifying pulmonary nodule deep learning based on floating segmentation according to claim 2, wherein the comprehensive predicted value of a group of test sample sequences in a nodule region is P, and the calculation method is as follows:
Figure FDA0003771888620000022
in the formula: n is a radical of P The number of images in the starting and ending sequence of the test sample sequence; beta is a i Weighting the predicted value of the ith layer image in the test sample sequence; p i R' is a floatable amplitude and is a predicted value of an ith layer image.
CN202010255922.2A 2020-04-02 2020-04-02 Lung nodule deep learning classification method based on floating cutting Expired - Fee Related CN111583179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010255922.2A CN111583179B (en) 2020-04-02 2020-04-02 Lung nodule deep learning classification method based on floating cutting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010255922.2A CN111583179B (en) 2020-04-02 2020-04-02 Lung nodule deep learning classification method based on floating cutting

Publications (2)

Publication Number Publication Date
CN111583179A CN111583179A (en) 2020-08-25
CN111583179B true CN111583179B (en) 2022-09-16

Family

ID=72122506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010255922.2A Expired - Fee Related CN111583179B (en) 2020-04-02 2020-04-02 Lung nodule deep learning classification method based on floating cutting

Country Status (1)

Country Link
CN (1) CN111583179B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112185523B (en) * 2020-09-30 2023-09-08 南京大学 Diabetic retinopathy classification method based on multi-scale convolutional neural network
CN113656954A (en) * 2021-08-10 2021-11-16 北京首钢自动化信息技术有限公司 Cutting map generation method and device
CN113888532A (en) * 2021-11-09 2022-01-04 推想医疗科技股份有限公司 Medical image analysis method and device based on flat scanning CT data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242839A (en) * 2018-08-29 2019-01-18 上海市肺科医院 A kind of good pernicious classification method of CT images Lung neoplasm based on new neural network model
CN109523525A (en) * 2018-11-07 2019-03-26 广州大学 Malign lung nodules recognition methods, device, equipment and the storage medium of image co-registration
CN109685768A (en) * 2018-11-28 2019-04-26 心医国际数字医疗系统(大连)有限公司 Lung neoplasm automatic testing method and system based on lung CT sequence

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109003260B (en) * 2018-06-28 2021-02-09 深圳视见医疗科技有限公司 CT image pulmonary nodule detection method, device and equipment and readable storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242839A (en) * 2018-08-29 2019-01-18 上海市肺科医院 A kind of good pernicious classification method of CT images Lung neoplasm based on new neural network model
CN109523525A (en) * 2018-11-07 2019-03-26 广州大学 Malign lung nodules recognition methods, device, equipment and the storage medium of image co-registration
CN109685768A (en) * 2018-11-28 2019-04-26 心医国际数字医疗系统(大连)有限公司 Lung neoplasm automatic testing method and system based on lung CT sequence

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Lung segmentation based on a deep learning approach for dynamic chest radiography;Yuki Kitahara et al;《SPIE Medical Imaging》;20191231;全文 *
工业CT切片序列任意方向剖面的绘制方法;段黎明 等;《强激光与粒子束》;20131130;全文 *

Also Published As

Publication number Publication date
CN111583179A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111583179B (en) Lung nodule deep learning classification method based on floating cutting
CN108052977B (en) Mammary gland molybdenum target image deep learning classification method based on lightweight neural network
CN109598727B (en) CT image lung parenchyma three-dimensional semantic segmentation method based on deep neural network
CN107016665B (en) CT pulmonary nodule detection method based on deep convolutional neural network
CN108389201B (en) Lung nodule benign and malignant classification method based on 3D convolutional neural network and deep learning
CN109447940B (en) Convolutional neural network training method, ultrasonic image identification and positioning method and system
CN107103187B (en) Lung nodule detection grading and management method and system based on deep learning
CN108921851B (en) Medical CT image segmentation method based on 3D countermeasure network
CN101763644B (en) Pulmonary nodule three-dimensional segmentation and feature extraction method and system thereof
CN108257135A (en) The assistant diagnosis system of medical image features is understood based on deep learning method
CN104933709B (en) Random walk CT lung tissue image automatic segmentation methods based on prior information
CN109584254A (en) A kind of heart left ventricle's dividing method based on the full convolutional neural networks of deep layer
Zhong et al. Cancer image classification based on DenseNet model
KR101144964B1 (en) System for Detection of Interstitial Lung Diseases and Method Therefor
CN104102839B (en) A kind of Alzheimer disease cortex automatic classification method based on multiple dimensioned grid surface shape facility
CN108304887A (en) Naive Bayesian data processing system and method based on the synthesis of minority class sample
CN112365464A (en) GAN-based medical image lesion area weak supervision positioning method
CN113706434B (en) Post-processing method for chest enhancement CT image based on deep learning
CN111767952A (en) Interpretable classification method for benign and malignant pulmonary nodules
CN110379509A (en) A kind of Breast Nodules aided diagnosis method and system based on DSSD
CN116664931A (en) Knee osteoarthritis grading method based on quantum-to-classical migration learning
CN114359629A (en) Pneumonia X chest radiography classification and identification method based on deep migration learning
CN114332572B (en) Method for extracting breast lesion ultrasonic image multi-scale fusion characteristic parameters based on saliency map-guided hierarchical dense characteristic fusion network
CN115910362A (en) Atopic dermatitis characteristic prediction method based on enhanced particle swarm optimization
CN112071430B (en) Intelligent pathological index prediction system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220916