CN111415333B - Mammary gland X-ray image antisymmetric generation analysis model training method and device - Google Patents
- Publication number: CN111415333B (application number CN202010147836.XA)
- Authority
- CN
- China
- Prior art keywords
- mammary gland
- bilateral
- image
- original
- model
- Prior art date
- Legal status: Active (assumed; not a legal conclusion)
Classifications
- G06T7/0012 — Biomedical image inspection
- G06F18/214 — Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06N3/045 — Neural networks; combinations of networks
- G06N3/08 — Neural networks; learning methods
- G06T2207/10116 — X-ray image
- G06T2207/20081 — Training; learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30068 — Mammography; breast
- G06V2201/03 — Recognition of patterns in medical or anatomical images
Abstract
The embodiment of the application provides a method and device for training an antisymmetric-generation analysis model for mammography X-ray images, addressing the low efficiency and low accuracy of existing mammography analysis approaches. The training method comprises the following steps: extracting the original bilateral breast image features of bilateral mammography X-ray images with a pre-trained feature extraction model, where the bilateral images correspond to the left and right breasts, respectively; inputting the original bilateral breast image features into a neural network model to obtain an antisymmetric analysis result whose feature information is opposite to that of the original features; calculating the loss function value between the generated antisymmetric analysis result and the original bilateral breast image features; and adjusting the parameters of the neural network model based on the loss function value.
Description
Technical Field
The application relates to the technical field of image analysis, and in particular to a method, device, electronic equipment and computer-readable storage medium for training an antisymmetric-generation analysis model for mammography X-ray images.
Background
Breast cancer is currently the most frequently occurring cancer among women worldwide, and breast X-ray imaging is the most important means of early screening for it. Because tissues absorb X-rays to different degrees, the radiation that passes through the body varies in intensity, so the resulting image carries information about the density distribution of each region of the body. The differing fluorescence or exposure produced on a fluorescent screen or photographic film (after developing and fixing) appears as shadows of different densities. By reading this contrast of light and shade, combined with clinical manifestations, test results and pathological diagnosis, one can judge whether a given region of the body is normal.
Most existing analysis methods for mammography images rely on labelled lesion regions: a neural network learns from the annotated regions to classify lesions as benign or malignant. Such labelling, however, requires a large number of experienced physicians, so annotations are very difficult to obtain and model training is inefficient. Training a model from pathology results alone raises a different difficulty: correctly locating the lesion region. In many cases lesions look very much like dense glands and are easily confused with normal glandular tissue. Existing methods either extract local features from a single image or impose broad constraints to localize the lesion, but the accuracy of these approaches is low.
Disclosure of Invention
In view of the above, embodiments of the application provide a method and device for training an antisymmetric-generation analysis model for mammography X-ray images, addressing the low efficiency and low accuracy of existing mammography analysis approaches.
According to one aspect of the present application, an embodiment provides a method for training a mammography X-ray image antisymmetric-generation analysis model, comprising: extracting the original bilateral breast image features of bilateral mammography X-ray images with a pre-trained feature extraction model, where the bilateral images correspond to the left and right breasts, respectively; inputting the original bilateral breast image features into a neural network model to obtain an antisymmetric analysis result whose feature information is opposite to that of the original features; calculating the loss function value between the generated antisymmetric analysis result and the original bilateral breast image features; and adjusting the parameters of the neural network model based on the loss function value.
In one embodiment of the present application, the method further comprises: inputting the antisymmetric analysis result and the original bilateral breast image features into the neural network model for weakly supervised learning.
In an embodiment of the present application, before extracting original bilateral breast image features of the bilateral breast X-ray images based on the pre-trained feature extraction model, respectively, the method further comprises: preprocessing the bilateral mammography images.
In one embodiment of the present application, preprocessing the bilateral mammography images includes: performing gray-scale normalization on the bilateral mammography images; compressing the bilateral mammography images by binarization; extracting the bilateral breast-region images from the bilateral mammography images; and applying a morphological opening to each breast-region image to align the two breast-region images.
In an embodiment of the application, the feature extraction model includes one of the following model classes: ResNet, AlexNet, DenseNet.
In an embodiment of the application, the neural network model comprises one or more combinations of the following layers: convolutional layer, pooling layer and fully-connected layer.
In an embodiment of the present application, the loss function on which the loss function value is based includes one or more combinations of the following functions: contrastive loss, triplet loss, N-pair loss, Chamfer loss and earth mover's distance loss.
According to one aspect of the present application, an embodiment provides an apparatus for training a mammography X-ray image antisymmetric-generation analysis model, comprising: an extraction module configured to extract, with a pre-trained feature extraction model, the original bilateral breast image features of bilateral mammography X-ray images corresponding to the left and right breasts, respectively; an analysis module configured to input the original bilateral breast image features into a neural network model to obtain an antisymmetric analysis result whose feature information is opposite to that of the original features; a calculation module configured to calculate the loss function value between the generated antisymmetric analysis result and the original bilateral breast image features; and an adjustment module configured to adjust parameters of the neural network model based on the loss function value.
In one embodiment of the present application, the apparatus further comprises: a weakly supervised learning module configured to input the antisymmetric analysis result and the original bilateral breast image features into the neural network model for weakly supervised learning.
In one embodiment of the present application, the apparatus further comprises: the preprocessing module is configured to preprocess the double-sided mammary gland X-ray images before original double-sided mammary gland image features of the double-sided mammary gland X-ray images are respectively extracted based on a pre-trained feature extraction model.
In an embodiment of the present application, the preprocessing module includes: a normalization unit configured to perform gray scale normalization processing on the bilateral mammary gland X-ray images; a compression unit configured to compress the bilateral mammography images in a binarized manner; an area acquisition unit configured to acquire a bilateral breast area image in the bilateral breast X-ray images; and an opening operation unit configured to perform an opening operation on the double-sided breast region images, respectively, to align the double-sided breast region images.
In an embodiment of the application, the feature extraction model includes one of the following model classes: ResNet, AlexNet, DenseNet.
In an embodiment of the application, the neural network model comprises one or more combinations of the following layers: convolutional layer, pooling layer and fully-connected layer.
In an embodiment of the present application, the loss function on which the loss function value is based includes one or more combinations of the following functions: contrastive loss, triplet loss, N-pair loss, Chamfer loss and earth mover's distance loss.
According to another aspect of the present application, an embodiment of the present application provides an electronic device, including: a processor; a memory; and computer program instructions stored in the memory, which when executed by the processor, cause the processor to perform the mammography X-ray image antisymmetric generation analysis model training method as set forth in any one of the above.
According to another aspect of the application, an embodiment of the application provides a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform a mammography X-ray image antisymmetric-generation analysis model training method as described in any of the above.
According to another aspect of the application, an embodiment of the application provides a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform a method of training a breast X-ray image antisymmetric generation analysis model as described in any one of the above.
According to the training method, device, electronic equipment and computer-readable storage medium provided by the embodiments of the application, the original bilateral breast image features are extracted and fed into a neural network model, which generates an antisymmetric analysis result whose feature information is opposite to that of the original features. Training against this generated result, and adjusting the network's parameters based on the loss between it and the original bilateral features, effectively combines the information in the two sides of the mammography images, requires no manual labelling, and markedly improves analysis accuracy after training. Moreover, every operation in the training process is differentiable, so an end-to-end, single-stage learning scheme can be adopted, which ensures both the training efficiency of the model and the accuracy of the final analysis result.
Drawings
Fig. 1 is a flow chart of a mammography X-ray image antisymmetric-generation analysis model training method according to an embodiment of the application.
Fig. 2 is a schematic diagram of a double sided mammography X-ray image according to one embodiment of the present application.
Fig. 3 is a schematic flow chart of the preprocessing of bilateral mammography X-ray images in a mammography X-ray image antisymmetric-generation analysis model training method according to an embodiment of the application.
Fig. 4 is a schematic structural diagram of a breast X-ray image antisymmetric generation analysis model training device according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a breast X-ray image antisymmetric generation analysis model training device according to another embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the application.
Detailed Description
The following describes the embodiments of the application clearly and completely with reference to the accompanying drawings. The described embodiments are plainly only some, not all, of the possible embodiments. All other embodiments obtained by those skilled in the art from these embodiments without inventive effort fall within the scope of the application.
Fig. 1 is a flow chart of a mammography X-ray image antisymmetric-generation analysis model training method according to an embodiment of the application. As shown in Fig. 1, the method comprises the following steps:
step 101: the original bilateral mammary gland image features of the bilateral mammary gland X-ray images are respectively extracted based on a pre-trained feature extraction model, wherein the bilateral mammary gland X-ray images respectively correspond to left breast and right breast.
Mammography images (also called breast molybdenum-target images) are generally acquired and read in bilateral pairs. As shown in Fig. 2, the bilateral images correspond to the left and right breasts, i.e. they comprise an image of the left breast and an image of the right breast. By projection view, the bilateral images can be further divided into four views: left craniocaudal (LCC), right craniocaudal (RCC), left mediolateral oblique (LMLO) and right mediolateral oblique (RMLO).
The feature extraction model is a neural network, built through a pre-training process, that extracts image features from an input image. In one embodiment of the application, the feature extraction model may be one of the following neural network families: ResNet, AlexNet, DenseNet, etc. It should be understood, however, that the specific type of feature extraction model may be chosen to suit the application scenario; the application does not strictly limit it.
Step 102: the original bilateral mammary gland image features are input into a neural network model to obtain an antisymmetric analysis result opposite to the feature information of the original bilateral mammary gland image features.
The neural network model may be a deep classification network comprising one or more combinations of convolutional, pooling and fully-connected layers; the application likewise does not strictly limit its specific type and structure. The antisymmetric analysis result and the original bilateral breast image features can correspond, respectively, to the two analysis outcomes of a lesion being present or absent. Using both the original features and the generated antisymmetric result in subsequent training lets the neural network exploit the information of both outcomes effectively.
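The convolution/pooling/fully-connected structure described above can be sketched as follows; every layer size here is an illustrative assumption, since the patent does not fix an architecture.

```python
# Sketch of a generating network built from the layer types named in the
# text: convolutional, pooling and fully-connected layers. Sizes are
# assumptions for illustration.
import torch
import torch.nn as nn

class AntisymmetricGenerator(nn.Module):
    def __init__(self, in_ch=64, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, 32, kernel_size=3, padding=1),  # convolutional layer
            nn.ReLU(),
            nn.MaxPool2d(2),                                 # pooling layer
        )
        self.fc = nn.Linear(32 * 8 * 8, feat_dim)            # fully-connected layer

    def forward(self, x):
        h = self.conv(x)
        return self.fc(h.flatten(1))

gen = AntisymmetricGenerator()
bilateral_features = torch.randn(2, 64, 16, 16)  # dummy left/right feature maps
anti = gen(bilateral_features)                   # generated antisymmetric result
```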
Step 103: and calculating the loss function value between the generated anti-symmetry analysis result and the image characteristics of the original bilateral mammary glands.
The loss function on which this value is based may include one or more combinations of the following: contrastive loss, triplet loss, N-pair loss, Chamfer loss and earth mover's distance (Earth Mover's) loss. It should be understood, however, that the choice of loss function may be adjusted to the requirements of the actual application scenario; the application does not strictly limit its specific kind.
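One of the candidate losses named above, the contrastive loss, can be sketched in a few lines; the margin and the toy feature vectors are illustrative assumptions.

```python
# Contrastive loss sketch between an original feature and a generated
# antisymmetric feature. label=1 pulls a pair together, label=0 pushes
# it apart up to the margin.
import numpy as np

def contrastive_loss(f1, f2, label, margin=1.0):
    d = np.linalg.norm(f1 - f2)                      # Euclidean feature distance
    return label * d**2 + (1 - label) * max(margin - d, 0.0)**2

orig = np.array([1.0, 0.0])                          # toy original feature
anti = np.array([0.0, 1.0])                          # toy "opposite" feature
loss_dissimilar = contrastive_loss(orig, anti, label=0, margin=2.0)
loss_similar = contrastive_loss(orig, orig, label=1)
```

A matching pair at zero distance incurs zero loss, while a dissimilar pair closer than the margin is penalized, which is the behaviour the training step needs.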
Step 104: and adjusting parameters of the neural network model based on the loss function value.
Adjusting the neural network model against the loss function optimizes the network. With a large number of bilateral mammography images as samples, the model reaches a preset regression accuracy after a finite number of iterations of the above steps, and thereby acquires the ability to output high-accuracy analysis results for input mammography images.
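The iterate-and-adjust loop of steps 101-104 can be sketched as below; the linear "generator", the toy objective and the hyperparameters are all stand-in assumptions.

```python
# Parameter-update sketch: compute a loss between the generated result
# and the original features, backpropagate, and let the optimizer adjust
# the generator's parameters.
import torch

gen = torch.nn.Linear(8, 8)                      # stand-in generating network
opt = torch.optim.SGD(gen.parameters(), lr=0.1)

orig_feats = torch.randn(4, 8)                   # dummy original bilateral features
for _ in range(5):                               # a few illustrative iterations
    anti = gen(orig_feats)
    # toy "antisymmetric" objective: push generated features away from
    # the originals, up to a margin of 1.0
    loss = torch.relu(1.0 - (anti - orig_feats).pow(2).sum(dim=1)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()                                   # adjust parameters (step 104)
```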
In an embodiment of the application, to further improve the efficiency and accuracy of model training, the antisymmetric analysis result and the original bilateral breast image features can additionally be input into the neural network model for weakly supervised learning. The details of the weakly supervised learning process are not repeated here.
Therefore, according to the mammography X-ray image antisymmetric-generation analysis model training method of this embodiment, the original bilateral breast image features are extracted and fed into a neural network model, which generates an antisymmetric analysis result whose feature information is opposite to that of the original features. Training against this generated result, and adjusting the network's parameters based on the loss between it and the original bilateral features, effectively combines the information in the two sides of the mammography images, requires no manual labelling, and markedly improves the analysis accuracy of the trained model. Moreover, every operation in the training process is differentiable, so an end-to-end, single-stage learning scheme can be adopted, ensuring both the training efficiency of the model and the accuracy of the final analysis result.
In an embodiment of the present application, to further improve the accuracy of model training, the bilateral mammography images may be preprocessed before their original bilateral image features are extracted with the pre-trained feature extraction model.
Fig. 3 is a schematic flow chart of the preprocessing of bilateral mammography X-ray images in a mammography X-ray image antisymmetric-generation analysis model training method according to an embodiment of the application. As shown in Fig. 3, the preprocessing comprises the following steps:
step 301: and carrying out gray scale normalization treatment on the bilateral mammary X-ray images.
The gray scale distribution of the double-sided mammary gland X-ray image can be kept consistent by carrying out gray scale normalization processing.
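Gray-scale normalization can be as simple as a min-max rescale; mapping into [0, 1] is one common convention and an assumption here, not a requirement of the patent.

```python
# Min-max gray-scale normalization sketch for a mammography image.
import numpy as np

def normalize_gray(img):
    img = img.astype(np.float32)
    lo, hi = img.min(), img.max()
    # guard against a constant image to avoid division by zero
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

left = np.array([[0, 128], [255, 64]], dtype=np.uint8)  # dummy image
norm = normalize_gray(left)
```

Applying the same mapping to both sides keeps the gray-scale distributions of the left and right images consistent, as the step above requires.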
Step 302: the bilateral mammograms are compressed in a binarized manner.
The binarization method is used for processing the double-sided mammary gland X-ray image, so that the volume of image data can be greatly reduced, and a foundation is laid for obtaining a mammary gland region from the double-sided mammary gland X-ray image subsequently.
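A thresholding sketch of the binarization step: a fixed threshold separating breast tissue from the dark background is an assumption for illustration (Otsu's method would be a common adaptive alternative).

```python
# Binarization sketch: one bit per pixel instead of a full gray value.
import numpy as np

img = np.array([[0, 5, 200],
                [180, 3, 220]], dtype=np.uint8)  # dummy gray image
threshold = 20                                    # assumed fixed threshold
mask = (img > threshold).astype(np.uint8)         # 1 = candidate breast pixel
```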
Step 303: a bilateral breast region image in a bilateral breast X-ray image is acquired.
For example, a Flood filtering algorithm may be used to extract images of bilateral breast areas from the bilateral breast X-ray images. However, it should be understood that other edge extraction algorithms may be used to obtain images of bilateral breast areas, as the application is not limited in this regard.
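One way to realize the region-extraction step is to keep the largest connected component of the binarized mask; here scipy's connected-component labelling stands in for the flood-fill mentioned above, which is an implementation assumption.

```python
# Breast-region extraction sketch: largest connected component of the
# binarized mask.
import numpy as np
from scipy import ndimage

mask = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 1],
                 [0, 0, 0, 0]], dtype=np.uint8)   # dummy binarized image
labels, n = ndimage.label(mask)                   # label connected components
sizes = ndimage.sum(mask, labels, range(1, n + 1))
breast = (labels == (np.argmax(sizes) + 1)).astype(np.uint8)
```

The small isolated component (e.g. an artifact or label marker) is discarded; only the dominant region, assumed to be the breast, survives.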
Step 304: and respectively performing an opening operation on the bilateral mammary gland region images to align the bilateral mammary gland region images.
An opening operation is performed to remove the excess muscle portion, leaving only the region of interest, thereby maximally aligning the left and right breast. The size of the nucleus used in this opening procedure also needs to be dynamically adjusted to the size of the mammary gland to prevent the smaller mammary gland from being excessively eroded.
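The dynamic-kernel idea above can be sketched by scaling the structuring element with the region's extent; the 1/4-of-√area scaling factor is purely an illustrative assumption.

```python
# Morphological-opening sketch with a structuring element scaled to the
# breast-region size, so smaller regions are not over-eroded.
import numpy as np
from scipy import ndimage

def open_region(region):
    # structuring-element size scales with the region's extent (assumed rule)
    k = max(1, int(np.sqrt(region.sum()) / 4))
    structure = np.ones((k, k), dtype=bool)
    return ndimage.binary_opening(region.astype(bool), structure=structure)

region = np.zeros((20, 20), dtype=np.uint8)
region[2:18, 2:18] = 1        # dummy 16x16 breast region
region[0, 0] = 1              # small spur standing in for muscle residue
opened = open_region(region)  # spur removed, main region preserved
```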
Fig. 4 is a schematic structural diagram of a mammography X-ray image antisymmetric-generation analysis model training device according to an embodiment of the present application. As shown in Fig. 4, the training apparatus 40 comprises:
the extraction module 401 is configured to extract original bilateral breast image features of bilateral breast X-ray images based on a pre-trained feature extraction model, wherein the bilateral breast X-ray images respectively correspond to left breast and right breast;
an analysis module 402 configured to input the original bilateral breast image features into a neural network model to obtain an antisymmetric analysis result opposite to the feature information of the original bilateral breast image features;
a calculation module 403 configured to calculate a loss function value between the generated antisymmetric analysis result and the original bilateral breast image features; and
an adjustment module 404 configured to adjust parameters of the neural network model based on the loss function values.
In one embodiment of the present application, as shown in fig. 5, the apparatus further comprises:
the weak supervision learning module 405 is configured to input the antisymmetric analysis result and the original bilateral mammary gland image features into the neural network model for weak supervision learning.
In one embodiment of the present application, as shown in fig. 5, the apparatus further comprises:
the preprocessing module 406 is configured to preprocess the double-sided mammogram before extracting original double-sided mammogram features of the double-sided mammogram, respectively, based on the pre-trained feature extraction model.
In one embodiment of the present application, as shown in fig. 5, the preprocessing module 406 includes:
a normalization unit 4061 configured to perform gray scale normalization processing on the double-sided breast X-ray images;
a compression unit 4062 configured to compress the bilateral mammography X-ray image in a binarized manner;
a region acquisition unit 4063 configured to acquire a bilateral breast region image in the bilateral breast X-ray images; and
the opening operation unit 4064 is configured to perform an opening operation on the double-sided breast region images, respectively, to align the double-sided breast region images.
In an embodiment of the present application, the feature extraction model may employ one of the following neural network model categories: ResNet, AlexNet, DenseNet, etc.
In one embodiment of the application, the neural network model includes one or more combinations of the following layers: convolution layer, pooling layer and full connection layer.
In one embodiment of the present application, the loss function on which the loss function value is based includes one or more combinations of the following functions: contrastive loss, triplet loss, N-pair loss, Chamfer loss and earth mover's distance loss.
The specific functions and operations of the various modules in the mammography X-ray image antisymmetric generation analysis model training apparatus 40 described above have been described in detail in the mammography X-ray image antisymmetric generation analysis model training method described above with reference to fig. 1-3. Therefore, a repetitive description thereof will be omitted herein.
It should be noted that the mammography X-ray image antisymmetric generation analysis model training apparatus 40 according to the embodiments of the present application may be integrated into the electronic device 60 as a software module and/or a hardware module; in other words, the electronic device 60 may include the training apparatus 40. For example, the training apparatus 40 may be a software module in the operating system of the electronic device 60, or an application program developed for the electronic device 60; of course, the training apparatus 40 may also be one of the many hardware modules of the electronic device 60.
In another embodiment of the present application, the mammography X-ray image antisymmetric generation analysis model training apparatus 40 and the electronic device 60 may also be separate devices (e.g., servers), and the training apparatus 40 may be connected to the electronic device 60 via a wired and/or wireless network and exchange interaction information in an agreed data format.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the application. As shown in fig. 6, the electronic device 60 includes: one or more processors 601 and memory 602; and computer program instructions stored in the memory 602, which when executed by the processor 601, cause the processor 601 to perform the mammography X-ray image antisymmetric generation analysis model training method of any one of the embodiments described above.
The processor 601 may be a central processing unit (CPU) or another form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device 60 to perform desired functions.
The memory 602 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 601 may execute the program instructions to implement the steps of the mammography X-ray image antisymmetric generation analysis model training method of the various embodiments of the present application described above and/or other desired functions. Information such as light intensity, compensation light intensity, and the position of a filter may also be stored in the computer-readable storage medium.
In one example, the electronic device 60 may further include: input device 603 and output device 604, which are interconnected by a bus system and/or other form of connection mechanism (not shown in fig. 6).
For example, where the electronic device is a robot on an industrial production line, the input device 603 may be a camera for capturing the position of a part to be machined. Where the electronic device is a stand-alone device, the input device 603 may be a communication network connector for receiving the acquired input signal from an external removable device. The input device 603 may also include, for example, a keyboard, a mouse, a microphone, and the like.
The output device 604 may output various information to the outside, and may include, for example, a display, a speaker, a printer, and a communication network and a remote output apparatus connected thereto, and the like.
Of course, only some of the components of the electronic device 60 that are relevant to the present application are shown in fig. 6, with components such as buses, input/output interfaces, etc. omitted for simplicity. In addition, the electronic device 60 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the mammography X-ray image antisymmetric generation analysis model training method of any one of the embodiments described above.
The computer program product may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the steps in the mammography X-ray image antisymmetric generation analysis model training method according to the various embodiments of the present application described in the "exemplary mammography X-ray image antisymmetric generation analysis model training method" section of this specification.
A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present application are merely examples and not intended to be limiting, and these advantages, benefits, effects, etc. are not to be considered as essential to the various embodiments of the present application. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, as the application is not necessarily limited to practice with the above described specific details.
The block diagrams of the devices, apparatuses, and systems referred to in the present application are only illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including", "comprising", and "having" are open-ended words meaning "including but not limited to" and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or" unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to".
It is also noted that in the apparatus, devices, and methods of the present application, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be regarded as equivalent solutions of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.
The foregoing description of the preferred embodiments of the application is not intended to be limiting; any modifications, equivalents, and alternatives falling within the spirit and principles of the application shall be included within its scope of protection.
Claims (10)
1. A mammary gland X-ray image antisymmetric generation analysis model training method, characterized by comprising:
respectively extracting original bilateral mammary gland image features of bilateral mammary X-ray images based on a pre-trained feature extraction model, wherein the bilateral mammary X-ray images correspond to a left breast and a right breast, respectively;
inputting the original bilateral mammary gland image features into a neural network model to obtain an antisymmetric analysis result whose feature information is opposite to that of the original bilateral mammary gland image features;
calculating a loss function value between the generated antisymmetric analysis result and the original bilateral mammary gland image features; and
adjusting parameters of the neural network model based on the loss function value.
2. The method as recited in claim 1, further comprising:
inputting the antisymmetric analysis result and the original bilateral mammary gland image features into the neural network model for weakly supervised learning.
3. The method of claim 1, further comprising, before the original bilateral mammary gland image features of the bilateral mammary X-ray images are respectively extracted based on the pre-trained feature extraction model:
preprocessing the bilateral mammary X-ray images.
4. The method of claim 3, wherein the preprocessing of the bilateral mammary X-ray images comprises:
performing gray-scale normalization on the bilateral mammary X-ray images;
compressing the bilateral mammary X-ray images in a binarized manner;
acquiring bilateral mammary gland region images from the bilateral mammary X-ray images; and
performing an opening operation on the bilateral mammary gland region images, respectively, to align the bilateral mammary gland region images.
5. The method of claim 1, wherein the feature extraction model comprises one of the following model classes: ResNet, AlexNet, DenseNet.
6. The method of claim 1, wherein the neural network model comprises one of, or a combination of, the following layers: a convolutional layer, a pooling layer, and a fully connected layer.
7. The method of claim 1, wherein the loss function on which the loss function value is based comprises one or a combination of the following: contrastive loss, triplet loss, N-pair loss, Chamfer loss, and Earth Mover's Distance loss.
8. An apparatus for training an analysis model for the antisymmetric generation of a mammography X-ray image, comprising:
the extraction module is configured to respectively extract original bilateral mammary gland image features of bilateral mammary X-ray images based on a pre-trained feature extraction model, wherein the bilateral mammary X-ray images correspond to a left breast and a right breast, respectively;
the analysis module is configured to input the original bilateral mammary gland image features into a neural network model to obtain an antisymmetric analysis result whose feature information is opposite to that of the original bilateral mammary gland image features;
the calculation module is configured to calculate a loss function value between the generated antisymmetric analysis result and the original bilateral mammary gland image features; and
the adjustment module is configured to adjust parameters of the neural network model based on the loss function value.
9. An electronic device, comprising:
a processor; and
a memory in which computer program instructions are stored which, when executed by the processor, cause the processor to perform the method of any one of claims 1 to 7.
10. A computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010147836.XA CN111415333B (en) | 2020-03-05 | 2020-03-05 | Mammary gland X-ray image antisymmetric generation analysis model training method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010147836.XA CN111415333B (en) | 2020-03-05 | 2020-03-05 | Mammary gland X-ray image antisymmetric generation analysis model training method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111415333A CN111415333A (en) | 2020-07-14 |
CN111415333B true CN111415333B (en) | 2023-12-01 |
Family
ID=71490905
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010147836.XA Active CN111415333B (en) | 2020-03-05 | 2020-03-05 | Mammary gland X-ray image antisymmetric generation analysis model training method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111415333B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111915576A (en) * | 2020-07-15 | 2020-11-10 | 杭州深睿博联科技有限公司 | Cyclic residual breast X-ray benign and malignant diagnosis learning method and device |
CN113421633A (en) * | 2021-06-25 | 2021-09-21 | 上海联影智能医疗科技有限公司 | Feature classification method, computer device, and storage medium |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106295678A (en) * | 2016-07-27 | 2017-01-04 | 北京旷视科技有限公司 | Neural metwork training and construction method and device and object detection method and device |
CN107123107A (en) * | 2017-03-24 | 2017-09-01 | 广东工业大学 | Cloth defect inspection method based on neutral net deep learning |
CN108229298A (en) * | 2017-09-30 | 2018-06-29 | 北京市商汤科技开发有限公司 | The training of neural network and face identification method and device, equipment, storage medium |
CN108765423A (en) * | 2018-06-20 | 2018-11-06 | 北京七鑫易维信息技术有限公司 | A kind of convolutional neural networks training method and device |
CN109426858A (en) * | 2017-08-29 | 2019-03-05 | 京东方科技集团股份有限公司 | Neural network, training method, image processing method and image processing apparatus |
CN109447088A (en) * | 2018-10-16 | 2019-03-08 | 杭州依图医疗技术有限公司 | A kind of method and device of breast image identification |
CN109753938A (en) * | 2019-01-10 | 2019-05-14 | 京东方科技集团股份有限公司 | Image-recognizing method and equipment and the training method of application, neural network |
CN109886971A (en) * | 2019-01-24 | 2019-06-14 | 西安交通大学 | A kind of image partition method and system based on convolutional neural networks |
CN110069985A (en) * | 2019-03-12 | 2019-07-30 | 北京三快在线科技有限公司 | Aiming spot detection method based on image, device, electronic equipment |
CN110084216A (en) * | 2019-05-06 | 2019-08-02 | 苏州科达科技股份有限公司 | Human face recognition model training and face identification method, system, equipment and medium |
WO2019200745A1 (en) * | 2018-04-20 | 2019-10-24 | 平安科技(深圳)有限公司 | Mri lesion position detection method, device, computer apparatus, and storage medium |
CN110674866A (en) * | 2019-09-23 | 2020-01-10 | 兰州理工大学 | Method for detecting X-ray breast lesion images by using transfer learning characteristic pyramid network |
CN110827335A (en) * | 2019-11-01 | 2020-02-21 | 北京推想科技有限公司 | Mammary gland image registration method and device |
CN110832596A (en) * | 2017-10-16 | 2020-02-21 | 因美纳有限公司 | Deep convolutional neural network training method based on deep learning |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10497257B2 (en) * | 2017-08-31 | 2019-12-03 | Nec Corporation | Parking lot surveillance with viewpoint invariant object recognition by synthesization and domain adaptation |
- 2020-03-05: CN202010147836.XA granted as CN111415333B (status: Active)
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106295678A (en) * | 2016-07-27 | 2017-01-04 | 北京旷视科技有限公司 | Neural metwork training and construction method and device and object detection method and device |
CN107123107A (en) * | 2017-03-24 | 2017-09-01 | 广东工业大学 | Cloth defect inspection method based on neutral net deep learning |
CN109426858A (en) * | 2017-08-29 | 2019-03-05 | 京东方科技集团股份有限公司 | Neural network, training method, image processing method and image processing apparatus |
CN108229298A (en) * | 2017-09-30 | 2018-06-29 | 北京市商汤科技开发有限公司 | The training of neural network and face identification method and device, equipment, storage medium |
CN110832596A (en) * | 2017-10-16 | 2020-02-21 | 因美纳有限公司 | Deep convolutional neural network training method based on deep learning |
WO2019200745A1 (en) * | 2018-04-20 | 2019-10-24 | 平安科技(深圳)有限公司 | Mri lesion position detection method, device, computer apparatus, and storage medium |
CN108765423A (en) * | 2018-06-20 | 2018-11-06 | 北京七鑫易维信息技术有限公司 | A kind of convolutional neural networks training method and device |
CN109447088A (en) * | 2018-10-16 | 2019-03-08 | 杭州依图医疗技术有限公司 | A kind of method and device of breast image identification |
CN109753938A (en) * | 2019-01-10 | 2019-05-14 | 京东方科技集团股份有限公司 | Image-recognizing method and equipment and the training method of application, neural network |
CN109886971A (en) * | 2019-01-24 | 2019-06-14 | 西安交通大学 | A kind of image partition method and system based on convolutional neural networks |
CN110069985A (en) * | 2019-03-12 | 2019-07-30 | 北京三快在线科技有限公司 | Aiming spot detection method based on image, device, electronic equipment |
CN110084216A (en) * | 2019-05-06 | 2019-08-02 | 苏州科达科技股份有限公司 | Human face recognition model training and face identification method, system, equipment and medium |
CN110674866A (en) * | 2019-09-23 | 2020-01-10 | 兰州理工大学 | Method for detecting X-ray breast lesion images by using transfer learning characteristic pyramid network |
CN110827335A (en) * | 2019-11-01 | 2020-02-21 | 北京推想科技有限公司 | Mammary gland image registration method and device |
Non-Patent Citations (2)
Title |
---|
Du Yanlian et al., "Solving the Eigenvalues of Real Antisymmetric Matrices with a Differential Iteration Algorithm", 2011 International Conference on Future Computer Science and Application, 2011, full text. *
Xu Jin et al., "Mathematical Theory of Discrete Neural Networks (1): Network State Diagrams", Journal of Xidian University, vol. 26, no. 3, 1999, full text. *
Also Published As
Publication number | Publication date |
---|---|
CN111415333A (en) | 2020-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110827335B (en) | Mammary gland image registration method and device | |
CN111415333B (en) | Mammary gland X-ray image antisymmetric generation analysis model training method and device | |
CN110992376A (en) | CT image-based rib segmentation method, device, medium and electronic equipment | |
CN111166362B (en) | Medical image display method and device, storage medium and electronic equipment | |
CN111325743A (en) | Mammary gland X-ray image analysis method and device based on combined signs | |
CN113191392A (en) | Breast cancer image information bottleneck multi-task classification and segmentation method and system | |
Akpinar et al. | Chest X-ray abnormality detection based on SqueezeNet | |
JP2002065653A (en) | Radiography apparatus, method and program relating to control with respect to the same apparatus | |
JP5048233B2 (en) | Method and system for anatomical shape detection in a CAD system | |
CN111429406B (en) | Mammary gland X-ray image lesion detection method and device combining multi-view reasoning | |
CN114565557A (en) | Contrast enhancement energy spectrum photography classification method and device based on coordinate attention | |
Harrison et al. | State-of-the-art of breast cancer diagnosis in medical images via convolutional neural networks (cnns) | |
CN111415741B (en) | Mammary gland X-ray image classification model training method based on implicit apparent learning | |
CN117523204A (en) | Liver tumor image segmentation method and device oriented to medical scene and readable storage medium | |
Bhatia et al. | A proposed quantitative approach to classify brain MRI | |
CN111414939B (en) | Training method and device for spine fracture area analysis model | |
WO2022033598A1 (en) | Breast x-ray radiography acquisition method and apparatus, and computer device and storage medium | |
Qian et al. | Knowledge-based automatic detection of multitype lung nodules from multidetector CT studies | |
Roy Medhi | Lung Cancer Classification from Histologic Images using Capsule Networks | |
CN111415332B (en) | Mammary gland X-ray image linkage method and device | |
CN111401417B (en) | Training method and device for spine fracture area analysis model | |
JP2006239270A (en) | Abnormal shadow detector, abnormal shadow detection method, and its program | |
Okamoto et al. | Detection of Hepatocellular Carcinoma in CT Images Using Deep Learning | |
CN113239978B (en) | Method and device for correlation of medical image preprocessing model and analysis model | |
CN116721143B (en) | Depth information processing device and method for 3D medical image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |