WO2020118618A1 - Mammary gland mass image recognition method and device

Info

Publication number
WO2020118618A1
WO2020118618A1 · PCT/CN2018/120885
Authority
WO
WIPO (PCT)
Prior art keywords
breast
magnetic resonance
feature map
resonance image
mass
Prior art date
Application number
PCT/CN2018/120885
Other languages
French (fr)
Chinese (zh)
Inventor
李程
王珊珊
郑海荣
刘新
梁栋
Original Assignee
深圳先进技术研究院
Priority date
Filing date
Publication date
Application filed by 深圳先进技术研究院
Priority to PCT/CN2018/120885
Publication of WO2020118618A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features

Definitions

  • This specification belongs to the technical field of image processing, and in particular relates to a method and device for recognizing breast mass images.
  • Breast cancer is the cancer with the highest incidence among women.
  • The cure rate of early breast cancer is much higher than that of advanced breast cancer.
  • Early detection, early diagnosis, and early treatment are the keys to reducing breast cancer mortality.
  • Imaging examinations including mammography, ultrasound, and magnetic resonance imaging are common techniques for early breast cancer screening.
  • Breast cancer screening can be performed using image examination results to identify breast masses.
  • Breast masses vary greatly in size and shape across patients, which poses a great challenge to automatic segmentation methods; since mass segmentation is the first step for all subsequent breast cancer diagnosis and treatment, automatic segmentation of breast masses is of great significance to computer-aided diagnosis systems.
  • Existing breast cancer computer-aided diagnosis systems mainly rely on manually extracted features to locate or classify breast masses.
  • Manual feature extraction depends on the professional and empirical knowledge of the researcher, often has certain limitations and subjectivity, and can greatly affect the results.
  • Moreover, automatic segmentation methods for breast masses in the prior art usually only give the location of a mass, with no information on its shape or size. Therefore, there is an urgent need in the art for a technical solution that can accurately segment a breast mass region.
  • The purpose of this specification is to provide a method and device for breast mass image recognition, realizing automatic identification of breast masses and improving the accuracy of breast mass identification.
  • The embodiments of the present specification provide a method for recognizing breast mass images, including:
  • The mass recognition model is a deep fully convolutional neural network model.
  • The encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction.
  • The decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The basic convolution module in the U-shaped convolutional neural network model is used to extract features using the following formula:
  • y1 = φ(W13 * φ(W12 * φ(W11 * x1 + b11) + b12) + b13)    (1)
  • where y1 represents the extracted feature map, x1 represents the input breast magnetic resonance image, φ represents the first activation function, W11, W12, and W13 represent the weights corresponding to different convolutional layers, and b11, b12, and b13 represent the offset parameters corresponding to different convolutional layers.
  • The method further includes: using the deep fully convolutional neural network model to adjust the extracted feature map using the following formula:
  • y2 = φ(W22 * φ(W21 * x2 + b21) + b22)    (2)
  • where y2 represents the adjusted feature map, x2 represents the extracted feature map, φ represents the first activation function, W21 and W22 represent the weights corresponding to different convolutional layers, and b21 and b22 represent the offset parameters corresponding to different convolutional layers.
  • the method further includes:
  • Feature fusion is performed according to the weight value corresponding to each feature map.
  • Assigning different weight values to the extracted feature maps includes:
  • average-pooling the extracted feature maps, passing the average-pooled feature maps through fully connected layers, and activating them with a second activation function to obtain the weight value corresponding to each feature map.
  • Assigning different weight values to the extracted feature maps includes: using the following formulas to obtain the weight value corresponding to each feature map:
  • Zc = (1 / (H × M)) Σ(i=1..H) Σ(j=1..M) xc(i, j)    (3)
  • S = σ(W32 · (W31 · Z + b31) + b32)    (4)
  • where Zc represents the average-pooling result corresponding to the c-th feature map, H represents the height of the c-th feature map, M represents the width of the c-th feature map, xc represents the input c-th feature map, S represents the weight value vector corresponding to the feature maps, Z represents the pooled result vector after the feature maps are average-pooled, σ represents the second activation function, W31 and W32 represent the weights corresponding to different fully connected layers, and b31 and b32 represent the offset parameters corresponding to different fully connected layers.
  • the breast mass identification model is constructed using the following method:
  • The sample data includes: breast magnetic resonance images and the mass markers in the breast magnetic resonance images;
  • Establish the breast mass identification model; use the breast magnetic resonance images in the sample data as the input data of the breast mass identification model and the corresponding mass markers in the breast magnetic resonance images as the output data of the breast mass identification model; train the breast mass identification model until it meets the preset requirements.
  • the method further includes:
  • this specification provides a breast mass image recognition device, including:
  • the image acquisition module is used to acquire the breast magnetic resonance image to be identified
  • the lump identification module is used to input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass recognition result in the breast magnetic resonance image to be identified;
  • The mass recognition model is a deep fully convolutional neural network model.
  • The encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction.
  • The decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The basic convolution module in the U-shaped convolutional neural network model uses the following formula for feature extraction:
  • y1 = φ(W13 * φ(W12 * φ(W11 * x1 + b11) + b12) + b13)    (1)
  • where y1 represents the extracted feature map, x1 represents the input breast magnetic resonance image, φ represents the first activation function, W11, W12, and W13 represent the weights corresponding to different convolutional layers, and b11, b12, and b13 represent the offset parameters corresponding to different convolutional layers.
  • The deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:
  • y2 = φ(W22 * φ(W21 * x2 + b21) + b22)    (2)
  • where y2 represents the adjusted feature map, x2 represents the extracted feature map, φ represents the first activation function, W21 and W22 represent the weights corresponding to different convolutional layers, and b21 and b22 represent the offset parameters corresponding to different convolutional layers.
  • the deep fully convolutional neural network model further includes a channel attention module for:
  • Feature fusion is performed according to the weight value corresponding to each feature map.
  • the channel attention module is specifically used to:
  • average-pool the extracted feature maps, pass the average-pooled feature maps through fully connected layers, and activate them with a second activation function to obtain the weight value corresponding to each feature map.
  • The channel attention module is specifically used to obtain the weight value corresponding to each feature map using the following formulas:
  • Zc = (1 / (H × M)) Σ(i=1..H) Σ(j=1..M) xc(i, j)    (3)
  • S = σ(W32 · (W31 · Z + b31) + b32)    (4)
  • where Zc represents the average-pooling result corresponding to the c-th feature map, H represents the height of the c-th feature map, M represents the width of the c-th feature map, xc represents the input c-th feature map, S represents the weight value vector corresponding to the feature maps, Z represents the pooled result vector after the feature maps are average-pooled, σ represents the second activation function, W31 and W32 represent the weights corresponding to different fully connected layers, and b31 and b32 represent the offset parameters corresponding to different fully connected layers.
  • the device further includes a model building module for building the breast mass identification model using the following method:
  • The sample data includes: breast magnetic resonance images and the mass markers in the breast magnetic resonance images;
  • the breast mass identification model includes multiple model parameters
  • The breast magnetic resonance images in the sample data are used as the input data of the breast mass identification model, and the corresponding mass markers in the breast magnetic resonance images are used as the output data of the breast mass identification model, to adjust the model parameters of the breast mass identification model until the breast mass identification model meets the preset requirements.
  • the device further includes a model adjustment module for:
  • This specification provides a breast mass image recognition processing device, including: at least one processor and a memory storing processor-executable instructions; when the processor executes the instructions, the breast mass image recognition method in the embodiments of this specification is implemented.
  • The present specification provides a breast mass image recognition system, including at least one processor and a memory storing processor-executable instructions; when the processor executes the instructions, the breast mass image recognition method in the embodiments of this specification is implemented.
  • The breast mass image recognition method, device, processing device, and system provided in this specification are based on deep learning and combine the U-shaped convolutional neural network model with the dense convolutional neural network model to construct a breast mass identification model with an asymmetric encoding-decoding structure. The breast magnetic resonance image to be recognized is then input into the constructed breast mass recognition model to obtain the breast mass recognition result. Automatic identification of breast masses is thus realized without manual visual inspection, which improves the accuracy of breast mass identification.
  • The breast mass image recognition method provided by the embodiments of this specification can better retain the beneficial influence of useful features on the segmentation results and weaken the role of useless features.
  • The segmentation of breast masses in magnetic resonance images is thereby greatly improved, and no subsequent network model is required to further optimize the segmentation results, which reduces computation cost, speeds up the image analysis process, and better assists doctors in real-time image diagnosis.
  • FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of this specification
  • FIG. 2 is a schematic diagram of a network architecture of a breast mass identification model in an embodiment of this specification
  • FIG. 3 is a schematic diagram of a module structure of an embodiment of a breast mass image recognition device provided in this specification
  • FIG. 4 is a schematic structural diagram of a breast mass image recognition device according to another embodiment of the present specification.
  • FIG. 5 is a schematic structural diagram of a breast mass image recognition device according to yet another embodiment of the present specification.
  • FIG. 6 is a block diagram of the hardware structure of a breast mass identification server using an embodiment of the present application.
  • The breast mass image recognition method in this specification can be applied to a client or a server.
  • the client can be a smart phone, tablet computer, smart wearable device (smart watch, virtual reality glasses, virtual reality helmet, etc.), smart vehicle equipment and other electronic equipment.
  • FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of the present specification.
  • the overall process of the breast mass image recognition method provided in an embodiment of the present specification may include:
  • Step 102 Acquire a breast magnetic resonance image to be identified.
  • Magnetic resonance examination is currently a relatively common medical examination method.
  • a user's breast magnetic resonance image can be acquired, and the breast mass can be identified based on the acquired breast magnetic resonance image to be identified.
  • Step 104 Input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass identification result in the breast magnetic resonance image to be identified;
  • the mass recognition model uses a deep full convolutional neural network model
  • the encoding process of the deep full convolutional neural network model uses the basic convolution module in the U-shaped convolutional neural network model for feature extraction.
  • the decoding process of the convolutional neural network model uses dense connections to unify the feature maps to be fused and then fused.
  • a breast mass identification model can be constructed based on deep learning methods.
  • The breast magnetic resonance image data of existing breast cancer patients can be used for model training to learn a functional mapping from the input magnetic resonance image to the output breast mass segmentation result, thereby constructing the breast mass identification model.
  • FIG. 2 is a schematic diagram of a network architecture of a breast mass identification model in an embodiment of the present specification.
  • The breast mass identification model in the embodiments of the present specification may be a deep fully convolutional neural network model that combines the U-shaped convolutional neural network model (i.e., U-Net) with the dense convolutional neural network model (DenseNet).
  • The basic convolution module in the U-shaped convolutional neural network model can be used for feature extraction.
  • The dense connections in the dense convolutional neural network model can be used to unify the sizes of the feature maps to be fused before feature fusion.
  • U-Net can be understood as a variant of the convolutional neural network whose structure resembles the letter U, hence the name U-Net.
  • The entire U-Net neural network is composed of two main parts: a contraction path and an expansion path.
  • The contraction path is mainly used to capture context information in the image, while the symmetric expansion path is used to precisely localize the regions that need to be segmented.
  • DenseNet can be understood as a convolutional neural network with dense connections: there is a direct connection between any two layers, that is, the input of each layer is the union of the outputs of all previous layers, and the feature map learned by each layer is passed directly to all subsequent layers as input. Dense connections can alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, greatly reduce the number of parameters, and improve the accuracy of image recognition.
  • The basic convolution modules on the left in FIG. 2 can use basic convolution modules similar to those of the U-shaped convolutional neural network model for feature extraction, and the downward arrows between the basic convolution modules can indicate maximum pooling.
  • The channel attention modules on the right side in FIG. 2 can be understood as the decoding process, and the upward arrows of the channel attention modules can represent bilinear interpolation; the decoding process performs feature fusion together with the feature maps from the encoding process of the same layer.
  • the connecting line with arrows on the right side of the channel attention module in FIG. 2 may represent a dense connection.
  • The network modules of the decoding process of the deep fully convolutional neural network model may be densely connected, that is, the channel attention modules are densely connected to one another.
  • The channel attention module can unify the sizes of the feature maps extracted from the encoding process of the same layer and the feature maps from other channel attention modules, and then perform feature fusion.
  • For example, the uppermost channel attention module in FIG. 2 can unify the sizes of the feature map obtained by the encoding process on its left, the feature maps output by the three channel attention modules below it, and the feature map output by the lowermost feature adaptation module, and then merge the feature maps of uniform size.
  • The feature map obtained by the encoding process on the left, the feature maps output by the three channel attention modules below it, and the feature map output by the lowermost feature adaptation module can be understood as the feature maps to be fused by the uppermost channel attention module.
  • In the embodiments of this specification, dense connections are added to the decoding process. These dense connections separately upsample feature maps with different degrees of abstraction to a uniform size and then merge them directly, so feature maps are reused. Feature maps with a high degree of abstraction better guide the classification results, while feature maps with a low degree of abstraction better retain location information, so the segmentation results are greatly improved.
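  • The dense-connection fusion described above can be sketched as follows (a minimal illustration in plain Python, where nearest-neighbour upsampling stands in for the bilinear interpolation used in the model, and lists of 2-D maps stand in for tensors):

```python
def upsample(feature_map, factor):
    """Nearest-neighbour upsampling of a 2-D map by an integer factor
    (a stand-in for the bilinear interpolation of the decoding path)."""
    out = []
    for row in feature_map:
        wide = [v for v in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

def fuse(feature_maps, target_size):
    """Bring feature maps with different degrees of abstraction to a uniform
    target_size x target_size, then stack them as channels for direct merging,
    the way a dense connection reuses earlier feature maps."""
    channels = []
    for fm in feature_maps:
        factor = target_size // len(fm)
        channels.append(upsample(fm, factor) if factor > 1 else fm)
    return channels
```

For example, fusing a 1×1 map from a deep layer with a 2×2 map from a shallower layer yields two 2×2 channels that can be merged directly.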
  • the breast magnetic resonance image to be identified is input into the constructed breast mass identification model, and the breast mass identification result of the breast magnetic resonance image to be identified is obtained using the breast mass identification model.
  • The model can identify whether there is a mass in the breast magnetic resonance image to be identified and, if so, can also identify the region where the mass is located, as well as the shape and size of the breast mass.
  • The bright spot in the output image of the rightmost breast mass recognition model in FIG. 2 can represent the segmented mass, that is, the region of the mass in the breast magnetic resonance image input on the left; a doctor can make a further diagnosis and treatment plan based on the identified mass region, or use it for other medical research.
  • The breast mass image recognition method provided in the embodiments of the present specification is based on deep learning and combines the U-shaped convolutional neural network model with the dense convolutional neural network model to construct a breast mass identification model with an asymmetric encoding-decoding structure. The breast magnetic resonance image to be recognized is then input into the constructed breast mass recognition model to obtain the breast mass recognition result. Automatic identification of breast masses is thus realized without manual visual inspection, which improves the accuracy of breast mass identification.
  • The breast mass image recognition method provided by the embodiments of this specification can better retain the beneficial influence of useful features on the segmentation results and weaken the role of useless features. The segmentation of breast masses in magnetic resonance images by the network model is greatly improved, no further network model is needed to optimize the segmentation results, computation cost is reduced, the image analysis process is sped up, and doctors are better assisted in real-time image diagnosis.
  • In the embodiments of this specification, the basic convolution module in the U-shaped convolutional neural network model can be used to extract features using the following formula:
  • y1 = φ(W13 * φ(W12 * φ(W11 * x1 + b11) + b12) + b13)    (1)
  • where y1 represents the extracted feature map, that is, the output of the basic convolution module; x1 represents the input breast magnetic resonance image, that is, the input of the basic convolution module; φ represents the first activation function (which can be a linear rectification function); W11, W12, and W13 represent the weights corresponding to different convolutional layers; and b11, b12, and b13 represent the offset parameters corresponding to different convolutional layers.
  • the basic convolution module in the figure may include the function of the above formula (1), and use the above formula (1) for feature extraction.
  • The basic convolution module can perform multiple convolution operations; that is, the basic convolution module can include multiple convolutional layers, and different convolutional layers have different weights W11, W12, W13 and offset parameters b11, b12, b13.
  • In some embodiments, the basic convolution module may include three convolutional layers; other numbers of convolutional layers may be set according to actual needs, and formula (1) may be adaptively adjusted or modified according to the number of convolutional layers, which is not specifically limited in the embodiments of this specification.
  • the basic convolution modules can be connected by maximum pooling.
  • Maximum pooling can enlarge the receptive field and, to a certain extent, achieve invariance to translation of the input image. After the feature map passes through each maximum pooling layer, its resolution is halved and its number of channels is doubled.
  • the embodiments of this specification use a basic convolution module similar to U-Net for feature extraction, which can reduce the amount of sample data and improve the efficiency of data processing.
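  • The encoder computation of formula (1), together with the maximum pooling between basic convolution modules, can be sketched as follows (a simplified single-channel illustration with hand-written kernels; a real implementation would use a deep learning framework with learned multi-channel weights):

```python
def conv2d(img, kernel, bias):
    """'Same'-padded 2-D convolution of one single-channel image (list of rows)."""
    h, w = len(img), len(img[0])
    k = len(kernel)
    pad = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = bias
            for di in range(k):
                for dj in range(k):
                    ii, jj = i + di - pad, j + dj - pad
                    if 0 <= ii < h and 0 <= jj < w:
                        s += kernel[di][dj] * img[ii][jj]
            out[i][j] = s
    return out

def relu(img):
    """The first activation function, taken here as linear rectification."""
    return [[max(0.0, v) for v in row] for row in img]

def basic_conv_module(x1, weights, biases):
    """Formula (1): three stacked convolutional layers, each followed by the
    first activation: y1 = relu(W13 * relu(W12 * relu(W11*x1 + b11) + b12) + b13)."""
    y = x1
    for W, b in zip(weights, biases):
        y = relu(conv2d(y, W, b))
    return y

def max_pool2(img):
    """2x2 maximum pooling between encoder modules: halves the resolution."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[max(img[2 * i][2 * j], img[2 * i][2 * j + 1],
                 img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1])
             for j in range(w)] for i in range(h)]
```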
  • The method may further include: using the deep fully convolutional neural network model to adjust the extracted feature map using the following formula:
  • y2 = φ(W22 * φ(W21 * x2 + b21) + b22)    (2)
  • where y2 represents the adjusted feature map, that is, the output of the feature adaptation module; x2 represents the extracted feature map, that is, the input of the feature adaptation module; φ represents the first activation function (which may be a linear rectification function); W21 and W22 represent the weights corresponding to different convolutional layers; and b21 and b22 represent the offset parameters corresponding to different convolutional layers.
  • The deep fully convolutional neural network model in the embodiments of this specification can also include a feature adaptation module, which can adjust the feature maps generated by the basic convolution modules in the encoding process to enhance the subsequent feature fusion effect.
  • The feature adaptation module may include the function of the above formula (2) and use the above formula (2) to adjust the feature map.
  • Like the basic convolution module, the feature adaptation module may include multiple convolutional layers, with different convolutional layers corresponding to different weights W21, W22 and offset parameters b21, b22.
  • In some embodiments, the feature adaptation module may include two convolutional layers; other numbers of convolutional layers may be set according to actual needs, and the above formula (2) may be adaptively adjusted or modified according to the number of convolutional layers, which is not specifically limited in the embodiments of this specification.
  • the feature maps with different levels of abstraction generated in the encoding process can be fused with the feature maps in the decoding process in a cascade manner after passing through the feature adaptation module.
  • the embodiment of the present specification uses the feature adaptation module in the breast mass recognition model to adjust the feature map, improve the effect of feature fusion, and further improve the accuracy of breast mass identification.
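  • The adjustment of formula (2) can be sketched per pixel as follows (treating each of the two convolutional layers as a 1×1 convolution across channels is an assumption made for compactness; the patent does not fix the kernel size):

```python
def relu(vec):
    """The first activation function, taken here as linear rectification."""
    return [max(0.0, x) for x in vec]

def conv1x1(vec, W, b):
    """A 1x1 convolution at one pixel: a matrix-vector product over channels."""
    return [sum(W[o][i] * vec[i] for i in range(len(vec))) + b[o]
            for o in range(len(W))]

def feature_adapt(pixel_channels, W21, b21, W22, b22):
    """Formula (2): y2 = relu(W22 * relu(W21 * x2 + b21) + b22),
    applied to one pixel's channel vector before feature fusion."""
    return relu(conv1x1(relu(conv1x1(pixel_channels, W21, b21)), W22, b22))
```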
  • the method further includes:
  • Feature fusion is performed according to the weight value corresponding to each feature map.
  • The embodiments of this specification can assign different weight values to different feature maps, for example: setting the weight values of feature maps carrying more information relatively high, and setting the weight values of feature maps carrying little or no useful information relatively low.
  • Feature fusion is then performed according to the weight value corresponding to each feature map; for example, the weight value corresponding to each feature map can be multiplied into the feature map to be fused, increasing the influence of important feature maps on the mass recognition result and reducing the influence of unimportant feature maps on the mass recognition result.
  • Specifically, the extracted feature maps can be average-pooled; for example, the feature maps adjusted by the feature adaptation module are average-pooled, passed through two fully connected layers, and finally activated by the second activation function, such as the Sigmoid function, to generate the weight of each feature map; each weight value is then multiplied back into the corresponding feature map to be fused.
  • Average pooling can mean averaging all the values in the local receptive field.
  • the Sigmoid function is a common S-shaped function in biology, and it can also be called an S-shaped growth curve.
  • Specifically, the weight values can be obtained using the following formulas:
  • Zc = (1 / (H × M)) Σ(i=1..H) Σ(j=1..M) xc(i, j)    (3)
  • S = σ(W32 · (W31 · Z + b31) + b32)    (4)
  • where Zc represents the average-pooling result corresponding to the c-th feature map; H represents the height of the c-th feature map; M represents the width of the c-th feature map; i takes values from 1 to H and j takes values from 1 to M; xc represents the c-th feature map input to the channel attention module; S represents the weight value vector corresponding to the feature maps (which can include the weight values corresponding to multiple feature maps); Z represents the pooled result vector after the feature maps are pooled (which may include the pooling results corresponding to multiple feature maps, such as Zc); σ represents the second activation function (which may be a Sigmoid function); W31 and W32 represent the weights corresponding to different fully connected layers; and b31 and b32 represent the offset parameters corresponding to different fully connected layers.
  • In the embodiments of this specification, the product of the weight value corresponding to each feature map and that feature map can be used as the output for feature fusion:
  • y3 = Sc · xc    (5)
  • where y3 represents the output of the channel attention module, Sc represents the weight value corresponding to the c-th feature map, and xc represents the c-th feature map input to the channel attention module.
  • the channel attention module is used to assign different feature maps with different weight values and then perform feature fusion to realize the screening of the feature maps, which can improve the influence of important feature maps and further improve the accuracy of the mass recognition results.
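  • The channel attention computation described above can be sketched as follows (a minimal plain-Python illustration; the weight matrices are placeholders, not trained values):

```python
import math

# Sketch of the channel attention module: global average pooling per channel,
# two fully connected layers followed by the second activation (Sigmoid), then
# rescaling of each feature map by its weight. Lists stand in for tensors.

def avg_pool(feature_map):
    """Formula (3): Zc is the mean of all H*M values of the c-th feature map."""
    h, m = len(feature_map), len(feature_map[0])
    return sum(sum(row) for row in feature_map) / (h * m)

def sigmoid(vec):
    """The second activation function."""
    return [1.0 / (1.0 + math.exp(-x)) for x in vec]

def dense(vec, W, b):
    """A fully connected layer: matrix-vector product plus offsets."""
    return [sum(W[o][i] * vec[i] for i in range(len(vec))) + b[o]
            for o in range(len(W))]

def channel_attention(feature_maps, W31, b31, W32, b32):
    """Formula (4): S = sigmoid(W32 (W31 Z + b31) + b32); formula (5): y3c = Sc * xc."""
    Z = [avg_pool(fm) for fm in feature_maps]         # one pooled value per channel
    S = sigmoid(dense(dense(Z, W31, b31), W32, b32))  # one weight per channel
    return [[[S[c] * v for v in row] for row in fm]   # multiply weights back in
            for c, fm in enumerate(feature_maps)]
```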
  • a breast mass identification model can be constructed in the following manner:
  • The sample data includes: breast magnetic resonance images and the mass markers in the breast magnetic resonance images;
  • Establish the breast mass identification model; use the breast magnetic resonance images in the sample data as the input data of the breast mass identification model and the corresponding mass markers in the breast magnetic resonance images as the output data of the breast mass identification model; train the breast mass identification model until it meets the preset requirements.
  • the breast magnetic resonance image of a historical user may be acquired as sample data, and the sample data may be a breast magnetic resonance image of a user who has been diagnosed with breast cancer.
  • the sample data may also include the acquired mass markers in the breast magnetic resonance image as training labels.
  • the specific number of sample data may be selected according to actual needs, which is not specifically limited in the embodiments of this specification.
  • In the embodiments of this specification, the acquired breast magnetic resonance images can be normalized, that is, the pixels of the breast magnetic resonance images in the sample data are processed in a unified manner, which facilitates subsequent model training. The normalized breast magnetic resonance images are then labeled with masses; labeling can be done by a professional doctor or based on the user's diagnosis, marking the location, size, etc. of each mass.
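  • The normalization step can be sketched as follows (min-max scaling of pixel intensities to [0, 1] is used here as one common choice; the embodiments do not fix the exact scheme):

```python
def normalize(image):
    """Rescale the pixel values of one image to the range [0, 1] so that all
    sample images are processed in a unified manner before training."""
    flat = [v for row in image for v in row]
    lo, hi = min(flat), max(flat)
    scale = (hi - lo) or 1.0  # guard against division by zero on flat images
    return [[(v - lo) / scale for v in row] for row in image]
```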
  • a breast mass identification model may be constructed, such as: constructing a network architecture of the breast mass identification model, etc.
  • The lump recognition model may include basic convolution modules, feature adaptation modules, and channel attention modules, where the basic convolution modules and feature adaptation modules can be understood as the encoding process, and the channel attention modules can be understood as the decoding process.
  • the breast mass identification model may also include multiple model parameters, such as: the size of the convolution kernel and the number of convolution layers.
  • The breast magnetic resonance images in the sample data can be used as the input data of the breast mass identification model, and the corresponding mass markers in the breast magnetic resonance images can be used as the output data of the breast mass identification model.
  • Model training is performed on the breast mass recognition model until it meets the preset requirements, such as the model output accuracy meeting the requirements or the number of training iterations reaching a preset value, at which point model training is concluded.
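  • The train-until-requirement loop described above can be sketched as follows (`train_step` and `evaluate` are hypothetical placeholders for the real optimizer pass and validation routine; the accuracy threshold and epoch budget are illustrative preset requirements):

```python
def train_until(train_step, evaluate, max_epochs=100, target_acc=0.95):
    """Run training passes until the model output accuracy meets the preset
    requirement or the training budget is exhausted; return the epochs used."""
    for epoch in range(1, max_epochs + 1):
        train_step()                  # one optimization pass over the sample data
        if evaluate() >= target_acc:  # preset requirement met: conclude training
            return epoch
    return max_epochs
```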
  • the embodiment of the present specification uses deep learning training to construct a breast mass identification model, which can realize the automatic identification of breast masses, without the need for manual identification, and improves the accuracy of breast mass identification.
  • the breast mass identification in the embodiment of this specification mainly includes two stages of model training and prediction.
  • In the training stage, existing breast magnetic resonance data can be used to optimize the designed deep fully convolutional network and learn its parameters.
  • the trained network model is used to analyze new data that has not been seen during training, that is, the breast magnetic resonance image to be identified.
  • The imaging parameters of the breast magnetic resonance image to be identified (such as main field strength and relaxation times T1 and T2) can be compared with those of the breast magnetic resonance images in the training-stage sample data to check whether the distributions of the two image data sets are consistent, that is, whether image contrast, signal-to-noise ratio, etc. are consistent.
  • if the image data distributions are consistent, the breast mass identification model constructed by training can be used directly for mass identification.
  • otherwise, a small amount of new data, that is, breast magnetic resonance images to be identified (or images with the same parameters as the breast magnetic resonance image to be identified), can be used to quickly fine-tune the parameters of the constructed breast mass identification model.
  • the adjusted breast mass identification model is then used to identify masses in the breast magnetic resonance image to be identified; adjusting the breast mass identification model through continuous updates improves the applicability of the model and further improves the accuracy of the mass identification results.
  • the embodiments of this specification propose an asymmetric encoder-decoder backbone structure, design a new feature fusion and screening mechanism, and introduce dense connections.
  • the segmentation method in the embodiment of the present specification can better retain the beneficial influence of useful features on the segmentation result, and weaken the role of useless features.
  • the network's segmentation of magnetic resonance breast masses can be greatly improved, and no subsequent network is required to further refine the segmentation result, which reduces computation cost, speeds up image analysis, and better assists the doctor in real-time image diagnosis.
  • the breast mass image recognition method in the embodiments of this specification is not limited to identifying breast masses; it may also be used in other image recognition processes, such as identifying other lesion areas (for example, brain tumors). Magnetic resonance images of other body parts can be used to train a corresponding recognition model and thus automatically recognize other lesion areas.
  • one or more embodiments of the present specification further provide a breast mass image recognition device.
  • the device may include a system (including a distributed system), software (applications), modules, components, servers, clients, etc. that use the method described in the embodiments of this specification, combined with the necessary hardware.
  • the devices in one or more embodiments of this specification are described in the following embodiments. Since the device solves the problem in a manner similar to the method, the implementation of specific devices may refer to the implementation of the foregoing method, and repeated details are not described again.
  • the term "unit" or "module" may refer to a combination of software and/or hardware that achieves a predetermined function.
  • although the devices described in the following embodiments are preferably implemented in software, implementation in hardware or in a combination of software and hardware is also possible and conceivable.
  • FIG. 3 is a schematic diagram of a module structure of an embodiment of the breast mass image recognition device provided in this specification.
  • the breast mass image recognition device provided in this specification includes: an image acquisition module 31 and a mass identification module 32, where:
  • the image acquisition module 31 can be used to acquire a breast magnetic resonance image to be identified
  • the mass identification module 32 may be used to input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass identification result in the breast magnetic resonance image to be identified;
  • the mass recognition model uses a deep full convolutional neural network model
  • the encoding process of the deep full convolutional neural network model uses the basic convolution module in the U-shaped convolutional neural network model for feature extraction.
  • the decoding process of the deep full convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused and then fuses them.
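The size-unification-then-fusion step of the dense decoder connections can be sketched as follows; nearest-neighbour upsampling and channel concatenation are assumed stand-ins for illustration, since the specification does not fix the resizing or fusion operator:

```python
import numpy as np

def upsample_nn(x, factor):
    """Nearest-neighbour upsampling of a 2-D feature map by an integer factor."""
    return np.repeat(np.repeat(x, factor, axis=0), factor, axis=1)

def dense_fuse(feature_maps, target_size):
    """Unify feature maps of different resolutions to target_size x target_size,
    then fuse them by stacking along a channel axis. Assumes square maps whose
    sizes divide target_size."""
    resized = [upsample_nn(m, target_size // m.shape[0]) for m in feature_maps]
    return np.stack(resized, axis=0)  # (num_maps, target_size, target_size)

maps = [np.ones((4, 4)), np.full((8, 8), 2.0), np.full((16, 16), 3.0)]
fused = dense_fuse(maps, 16)
print(fused.shape)  # (3, 16, 16)
```

In a real network the fused stack would then pass through further convolutions; the point here is only that maps from different decoder depths are brought to a common size before fusion.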
  • the breast mass image recognition device provided in the embodiments of this specification combines the U-shaped convolutional neural network model with the dense convolutional neural network model, based on deep learning, to construct an asymmetric encoding-decoding breast mass identification model. Inputting the breast magnetic resonance image to be recognized into the constructed breast mass recognition model then yields the breast mass recognition result; automatic identification of the breast mass is realized without naked-eye inspection, which improves the accuracy of the breast mass identification results.
  • the breast mass image recognition device provided by the embodiments of this specification can better retain the beneficial effects of useful features on the segmentation results and weaken the role of useless features. It can greatly improve the network's segmentation of magnetic resonance breast masses without further network optimization of the segmentation results, reducing computation cost, speeding up image analysis, and better assisting doctors in real-time image diagnosis.
  • the basic convolution module in the U-shaped convolutional neural network model performs feature extraction using the following formula:

    y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)

    where y1 represents the extracted feature map, x1 represents the input breast magnetic resonance image, δ represents the first activation function, W11, W12, W13 represent the weights of the different convolutional layers, and b11, b12, b13 represent the offset parameters of the corresponding convolutional layers.
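A minimal numerical sketch of the basic convolution module y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13), taking δ as ReLU and using single-channel, valid-mode 3×3 convolutions for compactness; the channel counts and padding of the actual network are not fixed by the text:

```python
import numpy as np

def conv2d(x, w, b):
    """Valid-mode 2-D cross-correlation of a single-channel map with one kernel."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w) + b
    return out

def relu(x):
    return np.maximum(x, 0.0)

def basic_conv_module(x1, layers):
    """y1 = relu(W13 * relu(W12 * relu(W11 * x1 + b11) + b12) + b13)."""
    y = x1
    for W, b in layers:  # three convolution + activation stages
        y = relu(conv2d(y, W, b))
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))
layers = [(0.1 * rng.standard_normal((3, 3)), 0.0) for _ in range(3)]
y = basic_conv_module(x, layers)
print(y.shape)  # three valid 3x3 convolutions shrink 16 -> 10 per side: (10, 10)
```

Because every stage ends in ReLU, the output map is non-negative everywhere.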
  • the breast mass image recognition device provided by the embodiment of the present specification uses a basic convolution module similar to U-Net for feature extraction, which can reduce the amount of sample data and improve the efficiency of data processing.
  • the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:

    y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)

    where y2 represents the adjusted feature map, x2 represents the extracted feature map, δ represents the first activation function, W21 and W22 represent the weights of the different convolutional layers, and b21 and b22 represent the offset parameters of the corresponding convolutional layers.
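The feature adaptation formula y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2) is a residual-style adjustment. The sketch below takes δ as ReLU and uses 1×1 kernels (per-pixel scaling) so that the map size stays fixed and the skip addition with x2 is well defined; this is an illustrative simplification of the real convolutions:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def feature_adapt(x2, W21, b21, W22, b22):
    """y2 = relu(W22 * relu(W21 * x2 + b21) + b22 + x2).
    1x1 kernels (scalars here) keep the map size fixed, so the residual
    addition with x2 is shape-compatible; real kernels would need
    size-preserving padding instead."""
    inner = relu(W21 * x2 + b21)          # first conv + first activation
    return relu(W22 * inner + b22 + x2)   # second conv, skip addition, activation

x2 = np.array([[1.0, -2.0],
               [3.0, -4.0]])
y2 = feature_adapt(x2, W21=1.0, b21=0.0, W22=1.0, b22=0.0)
print(y2)  # positive entries are doubled by the skip path, negatives clipped to 0
```

With identity weights, positive inputs pass through both the conv path and the skip path (so 1 becomes 2, 3 becomes 6), while negative inputs are suppressed by the final ReLU.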
  • the embodiments of this specification use the feature adaptation module in the breast mass recognition model to adjust the feature maps, which improves the effect of feature fusion and further improves the accuracy of breast mass identification.
  • the deep fully convolutional neural network model further includes a channel attention module for: assigning different weight values to the extracted feature maps, and performing feature fusion according to the weight value corresponding to each feature map.
  • the channel attention module is specifically used to: perform average pooling on the extracted feature maps, connect the average-pooled feature maps with a fully connected layer, and activate with a second activation function to obtain the weight values corresponding to the feature maps.
  • the channel attention module is specifically used to obtain the weight value corresponding to each feature map using the following formulas:

    Z_c = (1/(H·M)) Σ_{i=1}^{H} Σ_{j=1}^{M} x_c(i, j)

    S = σ(W32 * δ(W31 * Z + b31) + b32)

    where Z_c represents the average-pooling result of the c-th feature map, H represents the height of the c-th feature map, M represents the width of the c-th feature map, x_c represents the c-th input feature map, S represents the vector of weight values corresponding to the feature maps, Z represents the vector of pooling results of the feature maps, σ represents the second activation function, W31 and W32 represent the weights of the different fully connected layers, and b31 and b32 represent the offset parameters of the corresponding fully connected layers.
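The channel attention module can be sketched in squeeze-and-excitation style as below; note that the inner ReLU between the two fully connected layers is an assumption borrowed from the common SE design, since the text only names the second (sigmoid) activation explicitly:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feats, W31, b31, W32, b32):
    """feats: (C, H, M) stack of feature maps.
    Z_c = (1/(H*M)) * sum_ij feats[c, i, j]          (average pooling / squeeze)
    S   = sigmoid(W32 @ relu(W31 @ Z + b31) + b32)   (per-channel weight values)
    Returns the feature maps rescaled channel-wise by their weights."""
    Z = feats.mean(axis=(1, 2))                      # (C,)
    S = sigmoid(W32 @ relu(W31 @ Z + b31) + b32)     # (C,)
    return feats * S[:, None, None]                  # weighted maps, ready to fuse

feats = np.stack([np.full((4, 4), 2.0), np.full((4, 4), 6.0)])
I = np.eye(2)
out = channel_attention(feats, W31=I, b31=np.zeros(2),
                        W32=np.zeros((2, 2)), b32=np.zeros(2))
print(out[0, 0, 0], out[1, 0, 0])  # 1.0 3.0 (zero excitation weights give sigmoid(0)=0.5)
```

With trained (non-zero) W32, the module learns to give important channels weights near 1 and unimportant channels weights near 0, realizing the feature-map screening described above.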
  • the channel attention module is used to assign different weight values to different feature maps before performing feature fusion, realizing screening of the feature maps; this can increase the influence of important feature maps and further improve the accuracy of the mass recognition results.
  • FIG. 4 is a schematic structural diagram of a breast mass image recognition device in another embodiment of this specification. As shown in FIG. 4, on the basis of the above embodiment, the device further includes a model building module 41 for constructing the breast mass identification model by:
  • acquiring multiple sample data, where the sample data include a breast magnetic resonance image and the mass markers in the breast magnetic resonance image;
  • establishing the breast mass identification model, using the breast magnetic resonance images in the sample data as input data of the model and the corresponding mass markers in the breast magnetic resonance images as output data of the model, and training the breast mass identification model until it meets the preset requirements.
  • the breast mass identification model is constructed by using deep learning training, which can realize the automatic identification of the breast mass without manual identification, and improves the accuracy of the breast mass identification.
  • FIG. 5 is a schematic structural diagram of a breast mass image recognition device in another embodiment of this specification. As shown in FIG. 5, on the basis of the above embodiment, the device further includes a model adjustment module 51 for:
  • adjusting the breast mass identification model through continuous updates, so as to improve the applicability of the model and further improve the accuracy of the mass identification results.
  • An embodiment of this specification also provides a breast mass image recognition processing device, including at least one processor and a memory for storing processor-executable instructions; when executing the instructions, the processor implements the breast mass image recognition method of the above embodiments, for example:
  • the mass recognition model uses a deep full convolutional neural network model
  • the encoding process of the deep full convolutional neural network model uses the basic convolution module in the U-shaped convolutional neural network model for feature extraction.
  • the decoding process of the deep full convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused and then fuses them.
  • the storage medium may include a physical device for storing information, usually after the information is digitized and then stored in a medium using electrical, magnetic, or optical means.
  • the storage medium may include: devices that use electrical energy to store information, such as various types of memory (RAM, ROM, etc.); devices that use magnetic energy to store information, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, bubble memories, and USB flash drives; and devices that use optical means to store information, such as CDs or DVDs.
  • there may also be other forms of storage media, such as quantum memory, graphene memory, and so on.
  • the breast mass identification system provided in this specification can be a separate breast mass identification system, or it can be applied to various data analysis and processing systems.
  • the system may include any breast mass image recognition device in the above embodiments.
  • the system may be a separate server, or it may include a server cluster, a system (including a distributed system), software (an application), an actually operating device, a logic gate device, a quantum computer, etc. that uses one or more of the methods or devices of one or more embodiments of this specification, combined with the necessary implementation hardware.
  • the system may include at least one processor and a memory storing computer-executable instructions; when the processor executes the instructions, the steps of the method in any one or more of the above embodiments are implemented.
  • FIG. 6 is a block diagram of a hardware structure of a breast mass identification server using an embodiment of the present application.
  • the server 10 may include one or more processors 100 (only one is shown in the figure; the processor 100 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 200 for storing data, and a transmission module 300 for communication functions.
  • the server 10 may also include more or fewer components than those shown in FIG. 6; for example, it may also include other processing hardware, such as a database, a multi-level cache, or a GPU, or it may have a configuration different from that shown in FIG. 6.
  • the memory 200 may be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the breast mass image recognition method in the embodiments of this specification; the processor 100 executes the software programs and modules stored in the memory 200 to perform various functional applications and data processing.
  • the memory 200 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 200 may further include memories remotely provided with respect to the processor 100, and these remote memories may be connected to a computer terminal through a network. Examples of the above network include but are not limited to the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • the transmission module 300 is used to receive or send data via a network.
  • the specific example of the network described above may include a wireless network provided by a communication provider of computer terminals.
  • the transmission module 300 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices through the base station to communicate with the Internet.
  • the transmission module 300 may be a radio frequency (RF) module, which is used to communicate with the Internet in a wireless manner.
  • the method or apparatus described in the above embodiments provided in this specification can implement business logic through a computer program and be recorded on a storage medium, and the storage medium can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
  • the breast mass image recognition method or device provided by the embodiments of this specification can be implemented by a processor executing corresponding program instructions in a computer, for example, implemented on a PC using the C++ language under a Windows operating system, implemented under a Linux system or other systems, implemented in smart terminals using the Android or iOS system programming languages, or implemented using the processing logic of a quantum computer.
  • the embodiments of this specification are not limited to what must comply with industry communication standards, standard computer data processing and data storage rules, or what is described in one or more embodiments of this specification.
  • Some industry standards, implementations described in a custom manner, or embodiments based on slightly modified implementations can also achieve the same, equivalent, similar, or predictable implementation effects as the foregoing embodiments. Examples obtained by applying these modified or deformed data acquisition, storage, judgment, and processing methods can still fall within the scope of optional implementations of the examples in this specification.
  • the improvement of a technology can be clearly distinguished as an improvement in hardware (for example, the improvement of circuit structures such as diodes, transistors, and switches) or an improvement in software (an improvement of a method flow).
  • the improvement of many methods and processes can be regarded as a direct improvement of the hardware circuit structure.
  • Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit; therefore, it cannot be said that the improvement of a method flow cannot be realized by hardware physical modules.
  • for example, a programmable logic device (PLD), such as a field programmable gate array (FPGA), is an integrated circuit whose logic function is determined by the user programming the device; such programming is mostly implemented with a hardware description language (HDL), of which there are many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence (a specific programming language), CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language).
  • the controller may be implemented in any suitable manner; for example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller.
  • Examples of controllers include but are not limited to the following microcontrollers: ARC625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320, the memory controller can also be implemented as part of the control logic of the memory.
  • in addition to implementing the controller in the form of pure computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for implementing various functions can also be regarded as structures within the hardware component; or even the devices for implementing various functions can be regarded both as software modules of an implementation method and as structures within the hardware component.
  • the system, device, module or unit explained in the above embodiments may be specifically implemented by a computer chip or entity, or implemented by a product with a certain function.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop computer, an in-vehicle human-machine interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
  • the functions are divided into various modules and described separately.
  • the functions of each module may be implemented in one or more pieces of software and/or hardware, or the modules that achieve the same function may be implemented by a combination of multiple sub-modules or sub-units, and so on.
  • the device embodiments described above are only schematic.
  • the division into units is only a division of logical functions; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions can also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-permanent memory, random access memory (RAM) and/or non-volatile memory in computer-readable media, such as read only memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
  • Computer-readable media, including permanent and non-permanent, removable and non-removable media, can store information by any method or technology.
  • the information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage, graphene storage, or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
  • one or more embodiments of this specification may be provided as a method, a system, or a computer program product. Therefore, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • One or more embodiments of this specification can also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network.
  • program modules may be located in local and remote computer storage media including storage devices.

Abstract

A mammary gland mass image recognition method and device. The method comprises: acquiring a mammary gland magnetic resonance image to be recognized (102); inputting said image into a constructed mammary gland mass identification model to obtain a mass identification result in said image; wherein the mass recognition model uses a deep full convolutional neural network model; a coding process of the deep full convolutional neural network model uses a basic convolutional module in a U-shaped convolutional neural network model to perform feature extraction, and a decoding process of the deep full convolutional neural network model uses dense connection to fuse feature maps to be fused after performing size unification (104). According to the method, automatic recognition of the mammary gland mass is implemented, and the recognition accuracy of the mammary gland mass is improved.

Description

Breast mass image recognition method and device

Technical field

This specification belongs to the technical field of image processing, and in particular relates to a breast mass image recognition method and device.

Background

Breast cancer is the cancer with the highest incidence among women. The cure rate of early breast cancer is much higher than that of advanced breast cancer, so early detection, early diagnosis, and early treatment are the keys to reducing breast cancer mortality. Imaging examinations, including mammography, ultrasound, and magnetic resonance imaging, are common techniques for early breast cancer screening, and breast masses identified in the examination images can be used for screening. Breast masses vary greatly in size and shape between patients, which poses a great challenge to automatic segmentation methods, yet mass segmentation is the first step of all subsequent breast cancer diagnosis and treatment; automatic segmentation of breast masses is therefore of great significance to computer-aided diagnosis systems.

In the prior art, breast cancer computer-aided diagnosis systems mainly use manually extracted features to locate or classify breast masses. Manual feature extraction depends on the professional and empirical knowledge of the researcher, often has certain limitations and subjectivity, and the results are greatly affected. Existing automatic segmentation methods for breast masses usually only give the location of the mass, without information on its shape and size. Therefore, there is an urgent need in the art for a technical solution that can accurately segment breast mass regions.
Summary of the invention

The purpose of this specification is to provide a breast mass image recognition method and device that realize automatic identification of breast masses and improve the accuracy of breast mass identification.

In one aspect, an embodiment of this specification provides a breast mass image recognition method, including:

acquiring a breast magnetic resonance image to be identified;

inputting the breast magnetic resonance image to be identified into a constructed breast mass identification model to obtain a mass identification result in the breast magnetic resonance image to be identified;

wherein the mass identification model uses a deep fully convolutional neural network model, the encoding process of the deep fully convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused and then fuses them.
Further, in another embodiment of the method, the basic convolution module in the U-shaped convolutional neural network model performs feature extraction using the following formula:

y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)

where y1 represents the extracted feature map, x1 represents the input breast magnetic resonance image, δ represents the first activation function, W11, W12, W13 represent the weights of the different convolutional layers, and b11, b12, b13 represent the offset parameters of the corresponding convolutional layers.

Further, in another embodiment of the method, the method further includes: using the deep fully convolutional neural network model to adjust the extracted feature map with the following formula:

y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)

where y2 represents the adjusted feature map, x2 represents the extracted feature map, δ represents the first activation function, W21 and W22 represent the weights of the different convolutional layers, and b21 and b22 represent the offset parameters of the corresponding convolutional layers.
Further, in another embodiment of the method, the method further includes:

using the deep fully convolutional neural network model to assign different weight values to the extracted feature maps;

performing feature fusion according to the weight value corresponding to each feature map.

Further, in another embodiment of the method, assigning different weight values to the extracted feature maps includes:

performing average pooling on the extracted feature maps;

connecting the average-pooled feature maps with a fully connected layer and activating with a second activation function to obtain the weight values corresponding to the feature maps.
进一步地,所述方法的另一个实施例中,所述将提取出的特征图赋予不同的权重值,包括:采用下述公式获得所述特征图对应的权重值:Further, in another embodiment of the method, the assigning the extracted feature map with different weight values includes: using the following formula to obtain the weight value corresponding to the feature map:
Zc = (1/(H×M)) Σ_{i=1}^{H} Σ_{j=1}^{M} xc(i, j)

S = σ(W32*δ(W31*Z+b31)+b32)
In the above formulas, Zc denotes the average-pooling result corresponding to the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the weight value vector corresponding to the feature maps, Z denotes the pooling result vector of the feature maps, σ denotes the second activation function, W31 and W32 denote the weights of the different fully connected layers, and b31 and b32 denote the bias parameters of the different fully connected layers.
Further, in another embodiment of the method, the breast mass identification model is constructed as follows:
acquiring multiple pieces of sample data, where the sample data includes breast magnetic resonance images and mass labels in the breast magnetic resonance images;
establishing the breast mass identification model, taking the breast magnetic resonance images in the sample data as the input data of the breast mass identification model and the mass labels in the corresponding breast magnetic resonance images as its output data, and training the breast mass identification model until it meets a preset requirement.
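As a concrete illustration of the training procedure just described, the following minimal sketch trains a stand-in model on (image, mass-label) pairs until a preset requirement is met. The scalar linear model, the mean-squared-error loss, and the loss threshold are all illustrative assumptions standing in for the full deep network and its actual training criterion.

```python
def train(model_w, samples, lr=0.1, epochs=200, tol=1e-4):
    """Train until the preset requirement (mean squared error below tol) is met."""
    for _ in range(epochs):
        loss, grad = 0.0, 0.0
        for x, y in samples:
            pred = model_w * x               # stand-in for the network's forward pass
            loss += (pred - y) ** 2 / len(samples)
            grad += 2.0 * (pred - y) * x / len(samples)
        if loss < tol:                       # preset requirement reached
            break
        model_w -= lr * grad                 # gradient-descent parameter update
    return model_w

# toy (image, mass label) pairs whose true mapping is y = 0.5 * x
w = train(0.0, [(1.0, 0.5), (2.0, 1.0), (4.0, 2.0)])
```

In the real model, the parameter update would of course run over all convolutional weights and biases rather than a single scalar.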
Further, in another embodiment of the method, the method further includes:
comparing the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be identified are inconsistent with those of the breast magnetic resonance images in the sample data, adjusting the breast mass identification model using the breast magnetic resonance image to be identified;
performing mass identification on the breast magnetic resonance image to be identified using the adjusted breast mass identification model.
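The adaptation step above can be sketched as follows. The dictionary-based model, the `finetune` helper, and the parameter format are hypothetical stand-ins for the actual model adjustment, which the text does not specify in detail.

```python
def identify_mass(model, image, sample_params, finetune):
    # Compare the image parameters of the image to be identified with those
    # of the training samples; adjust the model with the new image first
    # when they differ, then run mass identification.
    if image["params"] != sample_params:
        model = finetune(model, image)
    return model["predict"](image)

base_model = {"predict": lambda img: "mass region A"}

def finetune(model, image):
    # Illustrative stand-in for transfer-learning-style model adjustment.
    return {"predict": lambda img: "mass region A (adapted)"}

result = identify_mass(base_model,
                       image={"params": {"field_strength": "1.5T"}},
                       sample_params={"field_strength": "3T"},
                       finetune=finetune)
```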
In another aspect, this specification provides a breast mass image recognition device, including:
an image acquisition module configured to acquire a breast magnetic resonance image to be identified;
a mass identification module configured to input the breast magnetic resonance image to be identified into a constructed breast mass identification model to obtain a mass identification result for the breast magnetic resonance image to be identified;
where the mass identification model is a deep fully convolutional neural network model whose encoding process uses the basic convolution modules of a U-shaped convolutional neural network model for feature extraction, and whose decoding process uses dense connections to bring the feature maps to be fused to a uniform size before fusing them.
Further, in another embodiment of the device, the basic convolution modules of the U-shaped convolutional neural network model perform feature extraction according to the following formula:
y1=δ(W13*δ(W12*δ(W11*x1+b11)+b12)+b13)
In the above formula, y1 denotes the extracted feature map, x1 denotes the input breast magnetic resonance image, δ denotes the first activation function, W11, W12 and W13 denote the weights of the different convolutional layers, and b11, b12 and b13 denote the bias parameters of the different convolutional layers.
Further, in another embodiment of the device, the deep fully convolutional neural network model includes a feature adaptation module configured to adjust the extracted feature map according to the following formula:
y2=δ((W22*δ(W21*x2+b21)+b22)+x2)
In the above formula, y2 denotes the adjusted feature map, x2 denotes the extracted feature map, δ denotes the first activation function, W21 and W22 denote the weights of the different convolutional layers, and b21 and b22 denote the bias parameters of the different convolutional layers.
Further, in another embodiment of the device, the deep fully convolutional neural network model further includes a channel attention module configured to:
assign different weight values to the extracted feature maps using the deep fully convolutional neural network model;
perform feature fusion according to the weight value corresponding to each feature map.
Further, in another embodiment of the device, the channel attention module is specifically configured to:
perform average pooling on the extracted feature maps;
connect the average-pooled feature maps through fully connected layers and activate them with a second activation function to obtain the weight values corresponding to the feature maps.
Further, in another embodiment of the device, the channel attention module is specifically configured to obtain the weight value corresponding to each feature map according to the following formulas:
Zc = (1/(H×M)) Σ_{i=1}^{H} Σ_{j=1}^{M} xc(i, j)

S = σ(W32*δ(W31*Z+b31)+b32)
In the above formulas, Zc denotes the average-pooling result corresponding to the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the weight value vector corresponding to the feature maps, Z denotes the pooling result vector of the feature maps, σ denotes the second activation function, W31 and W32 denote the weights of the different fully connected layers, and b31 and b32 denote the bias parameters of the different fully connected layers.
Further, in another embodiment of the device, the device further includes a model building module configured to construct the breast mass identification model as follows:
acquiring multiple pieces of sample data, where the sample data includes breast magnetic resonance images and mass labels in the breast magnetic resonance images;
establishing the breast mass identification model, where the breast mass identification model includes multiple model parameters;
taking the breast magnetic resonance images in the sample data as the input data of the breast mass identification model and the mass labels in the corresponding breast magnetic resonance images as its output data, and adjusting the model parameters of the breast mass identification model until it meets a preset requirement.
Further, in another embodiment of the device, the device further includes a model adjustment module configured to:
compare the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be identified are inconsistent with those of the breast magnetic resonance images in the sample data, adjust the breast mass identification model using the breast magnetic resonance image to be identified;
perform mass identification on the breast magnetic resonance image to be identified using the adjusted breast mass identification model.
In still another aspect, this specification provides a breast mass image recognition processing device, including at least one processor and a memory storing processor-executable instructions, where the processor, when executing the instructions, implements the breast mass image recognition method of the embodiments of this specification.
In yet another aspect, this specification provides a breast mass image recognition system, including at least one processor and a memory storing processor-executable instructions, where the processor, when executing the instructions, implements the breast mass image recognition method of the embodiments of this specification.
The breast mass image recognition method, device, processing device, and system provided in this specification combine, based on deep learning, a U-shaped convolutional neural network model with a densely connected convolutional neural network model to construct an asymmetric encoder-decoder breast mass identification model. Inputting a breast magnetic resonance image to be identified into the constructed breast mass identification model yields the mass identification result for that image, realizing automatic identification of breast masses without manual visual inspection and improving the identification results. The method better preserves the beneficial influence of useful features on the segmentation result while weakening the effect of useless features. It greatly improves the model's segmentation of breast masses in magnetic resonance images, requires no subsequent network model to further refine the segmentation result, reduces computational cost, speeds up the image analysis process, and better assists doctors in real-time imaging diagnosis.
Brief Description of the Drawings
To explain the embodiments of this specification or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments recorded in this specification; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of this specification;
FIG. 2 is a schematic diagram of the network architecture of a breast mass identification model in an embodiment of this specification;
FIG. 3 is a schematic diagram of the module structure of an embodiment of the breast mass image recognition device provided in this specification;
FIG. 4 is a schematic structural diagram of a breast mass image recognition device in another embodiment of this specification;
FIG. 5 is a schematic structural diagram of a breast mass image recognition device in yet another embodiment of this specification;
FIG. 6 is a block diagram of the hardware structure of a breast mass identification server applying an embodiment of this application.
Detailed Description
To enable those skilled in the art to better understand the technical solutions in this specification, the technical solutions in the embodiments of this specification are described clearly and completely below with reference to the drawings in the embodiments of this specification. Obviously, the described embodiments are only some of the embodiments of this specification, not all of them. Based on the embodiments in this specification, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of this specification.
The breast mass image recognition method in this specification can be applied in a client or a server. The client can be an electronic device such as a smartphone, a tablet computer, a smart wearable device (a smart watch, virtual reality glasses, a virtual reality headset, etc.), or a smart in-vehicle device.
Specifically, FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of this specification. As shown in FIG. 1, the overall process of the breast mass image recognition method provided in an embodiment of this specification may include:
Step 102: acquire a breast magnetic resonance image to be identified.
Magnetic resonance examination is currently a common medical examination method. In the embodiments of this specification, a breast magnetic resonance image of a user can be acquired, and breast mass identification is performed based on the acquired breast magnetic resonance image to be identified.
Step 104: input the breast magnetic resonance image to be identified into a constructed breast mass identification model to obtain a mass identification result for the breast magnetic resonance image to be identified;
where the mass identification model is a deep fully convolutional neural network model whose encoding process uses the basic convolution modules of a U-shaped convolutional neural network model for feature extraction, and whose decoding process uses dense connections to bring the feature maps to be fused to a uniform size before fusing them.
In a specific implementation, a breast mass identification model can be constructed based on a deep learning method. For example, existing breast magnetic resonance image data of breast cancer patients can be used for model training to learn the functional mapping from an input magnetic resonance image to an output breast mass segmentation result. FIG. 2 is a schematic diagram of the network architecture of a breast mass identification model in an embodiment of this specification. As shown in FIG. 2, the breast mass identification model in the embodiments of this specification may be a deep fully convolutional neural network that combines a U-shaped convolutional neural network model (U-Net) with a densely connected convolutional neural network model (DenseNet). The encoding process uses the basic convolution modules of the U-shaped convolutional neural network model for feature extraction, while the decoding process uses the dense connections of the densely connected convolutional neural network model to bring the feature maps to be fused to a uniform size before performing feature fusion.
U-Net can be understood as a variant of the convolutional neural network; its structure resembles the letter U, hence the name U-Net. The network consists of two main parts: a contracting path and an expanding path. The contracting path mainly captures context information in the image, while the symmetric expanding path precisely localizes the parts of the image that need to be segmented. DenseNet can be understood as a convolutional neural network with dense connections: any two layers are directly connected, i.e., the input of each layer of the network is the union of the outputs of all preceding layers, and the feature maps learned by each layer are passed directly to all subsequent layers as input. Dense connections alleviate the vanishing gradient problem, strengthen feature propagation, encourage feature reuse, greatly reduce the number of parameters, and improve the accuracy of image recognition.
As shown in FIG. 2, the basic convolution modules on the left of FIG. 2 use modules similar to those of the U-shaped convolutional neural network model for feature extraction, and the downward arrows between them denote max pooling. The channel attention modules on the right of FIG. 2 implement the decoding process, and their upward arrows denote bilinear interpolation; during decoding, the feature maps of the encoding process at the same level are fused together. The arrowed lines to the right of the channel attention modules in FIG. 2 denote dense connections; in some embodiments of this specification, the network modules of the decoding process of the deep fully convolutional neural network model, i.e., the channel attention modules in FIG. 2, are densely connected. A channel attention module brings the feature maps extracted by the encoding process at the same level and the feature maps of the other channel attention modules to a uniform size and then performs feature fusion.
For example, the topmost channel attention module in FIG. 2 brings the feature map obtained from the encoding process on its left, the feature maps output by the three channel attention modules below it, and the feature map output by the bottommost feature adaptation module to a uniform size, and then fuses the resized feature maps. These feature maps can be understood as the feature maps to be fused by the topmost channel attention module.
The embodiments of this specification add dense connections to the decoding process. These dense connections upsample feature maps of different abstraction levels to a uniform size and then fuse them directly. The feature maps are thus reused: feature maps with a high degree of abstraction better guide the classification result, while feature maps with a low degree of abstraction better preserve location information, so the segmentation result is further improved.
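The densely connected decoding step described above can be sketched as follows. Nearest-neighbour upsampling stands in for the bilinear interpolation used in the model, and stacking stands in for concatenation along the channel axis; both simplifications are assumptions made for clarity.

```python
import numpy as np

def upsample(fmap, factor):
    # Nearest-neighbour upsampling via a Kronecker product; the model in the
    # text uses bilinear interpolation, simplified here.
    return np.kron(fmap, np.ones((factor, factor)))

def dense_fuse(feature_maps, target):
    # Bring feature maps of different abstraction levels (resolutions) to a
    # uniform size, then fuse them directly by stacking along a channel axis.
    resized = [upsample(f, target // f.shape[0]) for f in feature_maps]
    return np.stack(resized)

# three feature maps at decreasing resolution, as produced at different depths
maps = [np.full((8, 8), 1.0), np.full((4, 4), 2.0), np.full((2, 2), 3.0)]
fused = dense_fuse(maps, target=8)   # shape (3, 8, 8)
```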
After the breast mass identification model is constructed, the breast magnetic resonance image to be identified is input into the constructed model, and the model produces a mass identification result: for example, whether the image contains a mass and, if so, the region where the mass is located, or the shape and size of the breast mass. The bright area in the output image on the far right of FIG. 2 represents the segmented mass, i.e., the mass region in the breast magnetic resonance image input on the left. Doctors can use the identified mass region for further diagnosis and treatment, or for other medical research.
The breast mass image recognition method provided in the embodiments of this specification combines, based on deep learning, a U-shaped convolutional neural network model with a densely connected convolutional neural network model to construct an asymmetric encoder-decoder breast mass identification model. Inputting the breast magnetic resonance image to be identified into the constructed model yields its mass identification result, realizing automatic identification of breast masses without manual visual inspection and improving the identification results. The method better preserves the beneficial influence of useful features on the segmentation result while weakening the effect of useless features. It greatly improves the network model's segmentation of breast masses in magnetic resonance images, requires no subsequent network model to further refine the segmentation result, reduces computational cost, speeds up the image analysis process, and better assists doctors in real-time imaging diagnosis.
Based on the above embodiments, in some embodiments of this specification, the basic convolution modules of the U-shaped convolutional neural network model can perform feature extraction according to the following formula:
y1=δ(W13*δ(W12*δ(W11*x1+b11)+b12)+b13)      (1)
In the above formula, y1 denotes the extracted feature map, i.e., the output of the basic convolution module; x1 denotes the input breast magnetic resonance image, i.e., the input of the basic convolution module; δ denotes the first activation function (which may be a rectified linear unit); W11, W12 and W13 denote the weights of the different convolutional layers; and b11, b12 and b13 denote the bias parameters of the different convolutional layers.
In a specific implementation, as shown in FIG. 2, the basic convolution module in the figure may contain the function of formula (1) and use formula (1) for feature extraction. A basic convolution module can perform multiple convolution operations, i.e., it can contain multiple convolutional layers, and different convolutional layers correspond to different weights W11, W12, W13 and bias parameters b11, b12, b13. In the embodiments of this specification, the basic convolution module contains three convolutional layers; other numbers of convolutional layers can also be set according to actual needs, and formula (1) can be adaptively adjusted or modified according to the number of convolutional layers, which is not specifically limited in the embodiments of this specification.
Basic convolution modules can be connected by max pooling. Max pooling enlarges the receptive field and, to a certain extent, achieves translation invariance for the input image. After each max pooling layer, the resolution of the feature map is halved and the number of channels is doubled.
The embodiments of this specification use U-Net-like basic convolution modules for feature extraction, which can reduce the amount of sample data required and improve the efficiency of data processing.
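A minimal sketch of formula (1), assuming δ is a ReLU and, for clarity, replacing each convolution with a 1×1 convolution on a single channel (i.e., a scalar multiply); a real U-Net block would use 3×3 kernels over many channels.

```python
import numpy as np

def relu(x):
    # First activation function delta in formula (1), assumed to be a ReLU.
    return np.maximum(x, 0.0)

def basic_conv_block(x1, W, b):
    # Formula (1): three stacked convolutional layers, each followed by the
    # activation: y1 = relu(W13*relu(W12*relu(W11*x1+b11)+b12)+b13).
    y = x1
    for Wk, bk in zip(W, b):
        y = relu(Wk * y + bk)
    return y

x = np.array([[0.5, -1.0], [2.0, 0.0]])        # toy single-channel "image"
y = basic_conv_block(x, W=[1.0, 2.0, 0.5], b=[0.1, -0.2, 0.0])
```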
Based on the above embodiments, in an embodiment of this specification, the method may further include adjusting the extracted feature map with the deep fully convolutional neural network model according to the following formula:
y2=δ((W22*δ(W21*x2+b21)+b22)+x2)       (2)
In the above formula, y2 denotes the adjusted feature map, i.e., the output of the feature adaptation module; x2 denotes the extracted feature map, i.e., the input of the feature adaptation module; δ denotes the first activation function (which may be a rectified linear unit); W21 and W22 denote the weights of the different convolutional layers; and b21 and b22 denote the bias parameters of the different convolutional layers.
In a specific implementation, as shown in FIG. 2, the deep fully convolutional neural network model of the embodiments of this specification, i.e., the breast mass identification model, may also include a feature adaptation module, which adjusts the feature maps produced by the basic convolution modules during encoding to improve the subsequent feature fusion. The feature adaptation module may contain the function of formula (2) and use formula (2) to adjust the feature maps. The feature adaptation module may also contain multiple convolutional layers, with different convolutional layers corresponding to different weights W21, W22 and biases b21, b22. In the embodiments of this specification, the feature adaptation module contains two convolutional layers; other numbers of convolutional layers can also be set according to actual needs, and formula (2) can be adaptively adjusted or modified according to the number of convolutional layers, which is not specifically limited in the embodiments of this specification. The feature maps of different abstraction levels produced by the encoding process pass through the feature adaptation module and are then fused with the feature maps of the decoding process by concatenation.
The embodiments of this specification use the feature adaptation module in the breast mass identification model to adjust the feature maps, improving the effect of feature fusion and further improving the accuracy of breast mass identification.
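A minimal sketch of formula (2), again assuming δ is a ReLU and using scalar weights in place of real convolution kernels. Note the residual connection, which adds the input x2 back before the final activation.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def feature_adapt(x2, W21, W22, b21, b22):
    # Formula (2): two convolutional layers plus a skip connection:
    # y2 = relu((W22*relu(W21*x2+b21)+b22)+x2).
    inner = relu(W21 * x2 + b21)
    return relu(W22 * inner + b22 + x2)

x = np.array([1.0, -2.0, 0.5])   # toy feature map from the encoder
y = feature_adapt(x, W21=0.5, W22=2.0, b21=0.0, b22=0.1)
```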
Based on the above embodiments, in an embodiment of this specification, the method further includes:
assigning different weight values to the extracted feature maps using the deep fully convolutional neural network model;
performing feature fusion according to the weight value corresponding to each feature map.
In a specific implementation, the embodiments of this specification can assign different weight values to different feature maps; for example, feature maps carrying more information are given higher weight values, while feature maps carrying little or no useful information are given lower weight values. Feature fusion is then performed according to the weight value corresponding to each feature map; for example, the weight value corresponding to each feature map can be multiplied onto the feature map to be fused, increasing the influence of important feature maps on the mass identification result and decreasing the influence of unimportant feature maps on the mass identification result.
In an embodiment of this specification, the extracted feature maps can be average-pooled; for example, the feature maps adjusted by the feature adaptation module are average-pooled, passed through two fully connected layers, and finally activated with a second activation function, e.g., the Sigmoid function, to produce the weight of each feature map, and the weight value is then multiplied back onto the fused feature map. Average pooling means taking the mean of all values in a local receptive field. The Sigmoid function is an S-shaped function common in biology, also called the S-shaped growth curve.
In one embodiment of this specification, the weight value corresponding to each feature map may be obtained using the following formula (3):
Z_c = (1 / (H × M)) · Σ_{i=1..H} Σ_{j=1..M} x_c(i, j)

S = σ(W_32 · (W_31 · Z + b_31) + b_32)          (3)
In the above formula, Z_c denotes the pooling result of average-pooling the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, i ranges over 1 to H, j ranges over 1 to M, x_c denotes the c-th feature map input to the channel attention module, S denotes the vector of weight values of the feature maps (which may include the weight values of multiple feature maps), Z denotes the vector of pooling results of the feature maps (which may include the pooling results of multiple feature maps, such as Z_c), σ denotes the second activation function (which may be the Sigmoid function), W_31 and W_32 denote the weights of the different fully connected layers, and b_31 and b_32 denote the bias parameters of the different fully connected layers.
After the weight values of the different feature maps are obtained, the product of each feature map and its weight value may be taken as the output for feature fusion, for example:
y_3 = S_c · x_c          (4)
In the above formula, y_3 denotes the output of the channel attention module, S_c denotes the weight value of the c-th feature map, and x_c denotes the c-th feature map input to the channel attention module.
It should be noted that the formulas in the embodiments of this specification are merely schematic representations; they may be adjusted, transformed, or modified according to actual needs, and the embodiments of this specification are not specifically limited in this respect.
In the embodiments of this specification, the channel attention module assigns different weight values to different feature maps before feature fusion, thereby screening the feature maps; this can increase the influence of important feature maps and further improve the accuracy of the mass recognition result.
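The channel attention computation of formulas (3) and (4) can be sketched as follows. This is only an illustrative sketch: the small matrix sizes, the use of ReLU between the two fully connected layers, and the parameter values are assumptions not fixed by this specification.

```python
import math

def average_pool(feature_maps):
    # Pooling part of formula (3): Z_c is the mean over the H x M values
    # of the c-th feature map (the local receptive field).
    return [sum(sum(row) for row in fm) / (len(fm) * len(fm[0]))
            for fm in feature_maps]

def sigmoid(v):
    # The second activation function sigma.
    return 1.0 / (1.0 + math.exp(-v))

def matvec(W, x, b):
    # One fully connected layer: W x + b.
    return [sum(w * xi for w, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def channel_attention(feature_maps, W31, b31, W32, b32):
    # Formula (3): pooled descriptor Z, then two fully connected layers
    # and a Sigmoid to produce one weight S_c per feature map.
    Z = average_pool(feature_maps)
    hidden = [max(0.0, v) for v in matvec(W31, Z, b31)]  # ReLU (assumed)
    S = [sigmoid(v) for v in matvec(W32, hidden, b32)]
    # Formula (4): y_3 = S_c * x_c, each channel rescaled by its weight.
    return [[[S[c] * v for v in row] for row in fm]
            for c, fm in enumerate(feature_maps)]
```

For two 2×2 feature maps with identity fully connected weights, the first (informative) map keeps a weight near sigmoid(2.5) while the all-zero map is scaled by 0.5, illustrating how more informative channels retain more influence in the fusion.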
Based on the above embodiments, in the embodiments of this specification, the breast mass recognition model may be constructed in the following manner:
acquiring multiple pieces of sample data, where the sample data include breast magnetic resonance images and mass annotations in the breast magnetic resonance images; and
building the breast mass recognition model, taking the breast magnetic resonance images in the sample data as the input data of the breast mass recognition model and the mass annotations in the corresponding breast magnetic resonance images as its output data, and training the breast mass recognition model until the breast mass recognition model meets a preset requirement.
In a specific implementation, breast magnetic resonance images of historical users may be acquired as sample data; the sample data may be breast magnetic resonance images of users who have already been diagnosed with breast cancer. The sample data may also include the mass annotations in the acquired breast magnetic resonance images as training labels. The specific amount of sample data may be chosen according to actual needs and is not specifically limited in the embodiments of this specification. In one embodiment of this specification, after multiple breast magnetic resonance images are acquired, the acquired images may be normalized, that is, the pixels of the breast magnetic resonance images in the sample data are processed in a unified manner to facilitate subsequent model training. The normalized breast magnetic resonance images are then annotated with masses; the annotation may be performed by a professional doctor or derived from the user's diagnosis, and may mark the position, size, and other properties of the mass.
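The "unified processing of pixels" above can be sketched as follows; min-max scaling to [0, 1] is one common normalization choice and is an assumption here, since the specification does not fix the exact scheme.

```python
def normalize_image(pixels):
    """Scale the pixel values of one image to [0, 1] (min-max scaling).

    `pixels` is a 2-D list of intensity values; all sample images
    normalized this way share a unified intensity range before training.
    """
    flat = [v for row in pixels for v in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:                      # constant image: map everything to 0
        return [[0.0 for _ in row] for row in pixels]
    return [[(v - lo) / (hi - lo) for v in row] for row in pixels]
```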
After the sample data are prepared, the breast mass recognition model may be constructed, for example by building its network architecture; for the network architecture of the breast mass recognition model, reference may be made to the above embodiments, and details are not repeated here. As shown in FIG. 2, in some embodiments of this specification the mass recognition model may include a basic convolution module, a feature adaptation module, and a channel attention module, where the basic convolution module and the feature adaptation module may be understood as the encoding process and the channel attention module as the decoding process. The breast mass recognition model may also include multiple model parameters, such as the size of the convolution kernels and the number of convolutional layers. After the breast mass recognition model is constructed, the breast magnetic resonance images in the sample data may be taken as its input data and the mass annotations in the corresponding breast magnetic resonance images as its output data, and the model is trained until it meets a preset requirement, for example until the output accuracy meets the requirement or the number of training iterations reaches the requirement, at which point training can be considered complete.
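The train-until-a-preset-requirement-is-met logic above can be sketched as a simple control loop. The `train_step` and `evaluate` callables are hypothetical placeholders standing in for the real model's training pass and accuracy evaluation; the threshold and epoch budget are illustrative values.

```python
def train_until_ready(train_step, evaluate, target_accuracy=0.95, max_epochs=100):
    """Run training epochs until a preset requirement is met.

    `train_step` performs one pass over the sample data (breast MR images
    as input, mass annotations as target output); `evaluate` returns the
    current output accuracy. Training stops when the accuracy requirement
    is met or the training-count requirement (epoch budget) is reached.
    """
    for epoch in range(1, max_epochs + 1):
        train_step()
        accuracy = evaluate()
        if accuracy >= target_accuracy:
            return epoch, accuracy      # output-accuracy requirement met
    return max_epochs, accuracy         # training-count requirement met
```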
The embodiments of this specification use deep learning to train and construct a breast mass recognition model, which enables automatic recognition of breast masses without manual identification and improves the accuracy of breast mass recognition.
Breast mass recognition in the embodiments of this specification mainly includes two stages: model training and prediction. Based on the above embodiments, in one embodiment of this specification, in the training stage, existing breast magnetic resonance data may be used to optimize the designed deep fully convolutional network and learn its parameters; in the testing stage, the trained network model is used to analyze new data not seen during training, i.e., the breast magnetic resonance image to be identified. When performing mass recognition on the breast magnetic resonance image to be identified, its image parameters (e.g., main field strength, relaxation times T1 and T2) may be compared with those of the breast magnetic resonance images in the training-stage sample data to check whether the two sets of image data follow the same distribution, i.e., whether properties such as contrast and signal-to-noise ratio are consistent. If the breast magnetic resonance image to be identified and the sample data follow the same distribution, the trained breast mass recognition model can be used directly for mass recognition.
When the breast magnetic resonance image to be identified and the sample data do not follow the same distribution, a small amount of new data, i.e., the breast magnetic resonance image to be identified (or images whose parameters match it), may be used to quickly fine-tune the parameters of the constructed breast mass recognition model. The adjusted breast mass recognition model is then used to perform mass recognition on the breast magnetic resonance image to be identified. By continuously updating and adjusting the breast mass recognition model, the applicability of the model is improved and the accuracy of the mass recognition results is further improved.
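The decision between direct use and fine-tuning can be sketched as a simple parameter comparison. The parameter names (`field_strength`, `t1`, `t2`) and the relative tolerance are illustrative assumptions; the specification names main field strength and relaxation times T1/T2 as example parameters but does not fix a comparison rule.

```python
def needs_fine_tuning(new_params, training_params, tolerance=0.1):
    """Decide whether the trained model can be used directly.

    Compares acquisition parameters of the image to be identified against
    those of the training samples. Returns True when any parameter
    deviates beyond the relative tolerance, meaning the model should be
    fine-tuned on a small amount of matching data before recognition.
    """
    for key in ("field_strength", "t1", "t2"):
        ref = training_params[key]
        if abs(new_params[key] - ref) > tolerance * abs(ref):
            return True
    return False
```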
The embodiments of this specification propose an asymmetric encoder-decoder main framework, design a new feature fusion and screening mechanism, and introduce dense connections. The segmentation method in the embodiments of this specification can better retain the beneficial influence of useful features on the segmentation result while weakening the role of useless features. As a result, the network's performance on magnetic resonance breast mass segmentation can be greatly improved without requiring a subsequent network to further refine the segmentation result, which reduces the computational cost, speeds up the image analysis process, and can better assist doctors in real-time image diagnosis.
It should be noted that the breast mass image recognition method in the embodiments of this specification is not limited to identifying breast masses; it may also be used in other image recognition processes, such as identifying other lesion areas (e.g., brain tumors). Magnetic resonance images of other body parts may be used to train corresponding recognition models to achieve automatic recognition of other lesion areas.
The embodiments of the above method in this specification are described in a progressive manner; for the same or similar parts between the embodiments, reference may be made to each other, and each embodiment focuses on its differences from the other embodiments. For the relevant parts, refer to the description of the method embodiments.
Based on the breast mass image recognition method described above, one or more embodiments of this specification further provide a breast mass image recognition device. The device may include systems (including distributed systems), software (applications), modules, components, servers, clients, and the like that use the method described in the embodiments of this specification, combined with the necessary hardware. Based on the same innovative concept, the devices in the one or more embodiments provided by this specification are described in the following embodiments. Since the way the device solves the problem is similar to that of the method, for the implementation of the specific device reference may be made to the implementation of the foregoing method, and repeated parts are not described again. As used below, the term "unit" or "module" may refer to a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.
Specifically, FIG. 3 is a schematic diagram of the module structure of an embodiment of the breast mass image recognition device provided in this specification. As shown in FIG. 3, the breast mass image recognition device provided in this specification includes an image acquisition module 31 and a mass recognition module 32, where:
the image acquisition module 31 may be used to acquire a breast magnetic resonance image to be identified;
the mass recognition module 32 may be used to input the breast magnetic resonance image to be identified into a constructed breast mass recognition model to obtain the mass recognition result for the breast magnetic resonance image to be identified;
where the mass recognition model is a deep fully convolutional neural network model, the encoding process of the deep fully convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections to bring the feature maps to be fused to a uniform size before fusing them.
The breast mass image recognition device provided in the embodiments of this specification, based on deep learning, combines the U-shaped convolutional neural network model with the dense convolutional neural network model to construct an asymmetric encoder-decoder breast mass recognition model structure. The breast magnetic resonance image to be identified is then input into the constructed breast mass recognition model to obtain its mass recognition result, realizing automatic recognition of breast masses without manual visual identification and improving the breast mass recognition results. The breast mass image recognition method provided in this specification can better retain the beneficial influence of useful features on the segmentation result while weakening the role of useless features. It can greatly improve the network's performance on magnetic resonance breast mass segmentation without requiring a subsequent network to further refine the segmentation result, which reduces the computational cost, speeds up the image analysis process, and can better assist doctors in real-time image diagnosis.
Based on the above embodiments, the basic convolution module in the U-shaped convolutional neural network model performs feature extraction using the following formula:
y_1 = δ(W_13 * δ(W_12 * δ(W_11 * x_1 + b_11) + b_12) + b_13)
In the above formula, y_1 denotes the extracted feature map, x_1 denotes the input breast magnetic resonance image, δ denotes the first activation function, W_11, W_12, and W_13 denote the weights of the different convolutional layers, and b_11, b_12, and b_13 denote the bias parameters of the different convolutional layers.
The breast mass image recognition device provided in the embodiments of this specification uses a U-Net-like basic convolution module for feature extraction, which can reduce the amount of sample data required and improve the efficiency of data processing.
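The three stacked convolution-plus-activation steps of the formula above can be sketched in one dimension for brevity. Using ReLU for the first activation function δ and 1-D kernels is an illustrative assumption; the real module operates on 2-D images with learned kernels.

```python
def conv1d(x, w, b):
    # Valid convolution of a 1-D signal x with kernel w plus scalar bias b:
    # the 1-D analogue of one convolutional layer W * x + b.
    k = len(w)
    return [sum(w[j] * x[i + j] for j in range(k)) + b
            for i in range(len(x) - k + 1)]

def basic_conv_module(x1, layers):
    # layers: three (kernel, bias) pairs standing in for
    # (W_11, b_11), (W_12, b_12), (W_13, b_13); each convolution is
    # followed by the first activation function delta (ReLU here).
    y = x1
    for w, b in layers:
        y = [max(0.0, v) for v in conv1d(y, w, b)]
    return y
```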
Based on the above embodiments, the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature maps using the following formula:
y_2 = δ((W_22 * δ(W_21 * x_2 + b_21) + b_22) + x_2)
In the above formula, y_2 denotes the adjusted feature map, x_2 denotes the extracted feature map, δ denotes the first activation function, W_21 and W_22 denote the weights of the different convolutional layers, and b_21 and b_22 denote the bias parameters of the different convolutional layers.
The embodiments of this specification use the feature adaptation module in the breast mass recognition model to adjust the feature maps, which improves the effect of feature fusion and further improves the accuracy of breast mass recognition.
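The residual form of the feature adaptation formula above can be sketched as follows. Per-element scalar weights stand in for the convolutions and ReLU stands in for the first activation function δ; both simplifications are assumptions made for illustration.

```python
def feature_adaptation(x2, w1, b1, w2, b2):
    """y_2 = delta((W_22 * delta(W_21 * x_2 + b_21) + b_22) + x_2).

    x2 is a flattened feature map; w1, b1, w2, b2 are scalar stand-ins
    for the two convolutional layers. The input x2 is added back before
    the final activation (residual connection), so the module adjusts
    the feature map rather than replacing it.
    """
    inner = [max(0.0, w1 * v + b1) for v in x2]           # delta(W_21*x2 + b_21)
    adjusted = [w2 * v + b2 for v in inner]               # W_22*(...) + b_22
    return [max(0.0, a + x) for a, x in zip(adjusted, x2)]  # delta(... + x2)
```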
Based on the above embodiments, the deep fully convolutional neural network model further includes a channel attention module configured to:
assign different weight values to the extracted feature maps using the deep fully convolutional neural network model; and
perform feature fusion according to the weight value corresponding to each feature map.
In the embodiments of this specification, different feature maps are given different weight values, increasing the influence of important feature maps on the mass recognition result and reducing the influence of unimportant feature maps.
Based on the above embodiments, the channel attention module is specifically configured to:
average-pool the extracted feature maps; and
connect the average-pooled feature maps through fully connected layers and activate them with a second activation function to obtain the weight value corresponding to each feature map.
Based on the above embodiments, the channel attention module is specifically configured to obtain the weight value corresponding to each feature map using the following formula:
Z_c = (1 / (H × M)) · Σ_{i=1..H} Σ_{j=1..M} x_c(i, j)

S = σ(W_32 · (W_31 · Z + b_31) + b_32)
In the above formula, Z_c denotes the pooling result of average-pooling the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, x_c denotes the input c-th feature map, S denotes the vector of weight values of the feature maps, Z denotes the vector of pooling results of the feature maps, σ denotes the second activation function, W_31 and W_32 denote the weights of the different fully connected layers, and b_31 and b_32 denote the bias parameters of the different fully connected layers.
In the embodiments of this specification, the channel attention module assigns different weight values to different feature maps before feature fusion, thereby screening the feature maps; this can increase the influence of important feature maps and further improve the accuracy of the mass recognition result.
FIG. 4 is a schematic structural diagram of the breast mass image recognition device in a further embodiment of this specification. As shown in FIG. 4, based on the above embodiments, the device further includes a model construction module 41 configured to construct the breast mass recognition model in the following manner:
acquiring multiple pieces of sample data, where the sample data include breast magnetic resonance images and mass annotations in the breast magnetic resonance images; and
building the breast mass recognition model, taking the breast magnetic resonance images in the sample data as the input data of the breast mass recognition model and the mass annotations in the corresponding breast magnetic resonance images as its output data, and training the breast mass recognition model until the breast mass recognition model meets a preset requirement.
In the embodiments of this specification, the breast mass recognition model constructed by deep learning training enables automatic recognition of breast masses without manual identification and improves the accuracy of breast mass recognition.
FIG. 5 is a schematic structural diagram of the breast mass image recognition device in a further embodiment of this specification. As shown in FIG. 5, based on the above embodiments, the device further includes a model adjustment module 51 configured to:
compare the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be identified are inconsistent with those of the breast magnetic resonance images in the sample data, adjust the breast mass recognition model using the breast magnetic resonance image to be identified; and
perform mass recognition on the breast magnetic resonance image to be identified using the adjusted breast mass recognition model.
In the embodiments of this specification, the breast mass recognition model is continuously updated and adjusted, which improves the applicability of the model and further improves the accuracy of the mass recognition results.
It should be noted that the device described above may also include other implementations according to the description of the method embodiments. For specific implementations, reference may be made to the description of the related method embodiments, which are not repeated here.
An embodiment of this specification further provides a breast mass image recognition processing device, including at least one processor and a memory for storing processor-executable instructions, where the processor, when executing the instructions, implements the breast mass image recognition method of the above embodiments, for example:
acquiring a breast magnetic resonance image to be identified;
inputting the breast magnetic resonance image to be identified into a constructed breast mass recognition model to obtain the mass recognition result for the breast magnetic resonance image to be identified;
where the mass recognition model is a deep fully convolutional neural network model, the encoding process of the deep fully convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections to bring the feature maps to be fused to a uniform size before fusing them.
The storage medium may include a physical device for storing information; the information is typically digitized and then stored in a medium by electrical, magnetic, optical, or similar means. The storage medium may include: devices that store information using electrical energy, such as various kinds of memory, e.g., RAM and ROM; devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, bubble memories, and USB flash drives; and devices that store information optically, such as CDs or DVDs. Of course, there are other kinds of readable storage media, such as quantum memories and graphene memories.
It should be noted that the processing device described above may also include other implementations according to the description of the method embodiments. For specific implementations, reference may be made to the description of the related method embodiments, which are not repeated here.
The breast mass recognition system provided in this specification may be a standalone breast mass recognition system, or it may be applied in a variety of data analysis and processing systems. The system may include any of the breast mass image recognition devices of the above embodiments. The system may be a standalone server, or it may include a server cluster, a system (including a distributed system), software (an application), an actual operating device, a logic gate circuit device, a quantum computer, or other terminal devices that use one or more of the methods or one or more of the embodiment devices of this specification, combined with the necessary hardware. The detection system for checking difference data may include at least one processor and a memory storing computer-executable instructions, where the processor, when executing the instructions, implements the steps of the method in any one or more of the above embodiments.
The method embodiments provided in the embodiments of this specification may be executed in a mobile terminal, a computer terminal, a server, or a similar computing device. Taking execution on a server as an example, FIG. 6 is a block diagram of the hardware structure of a breast mass recognition server to which an embodiment of the present application is applied. As shown in FIG. 6, the server 10 may include one or more processors 100 (only one is shown in the figure; the processor 100 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA), a memory 200 for storing data, and a transmission module 300 for communication functions. A person of ordinary skill in the art will understand that the structure shown in FIG. 6 is merely illustrative and does not limit the structure of the above electronic device. For example, the server 10 may include more or fewer components than shown in FIG. 6, for example further processing hardware such as a database, a multi-level cache, or a GPU, or may have a configuration different from that shown in FIG. 6.
The memory 200 may be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the breast mass image recognition method in the embodiments of this specification; the processor 100 executes the software programs and modules stored in the memory 200 to perform various functional applications and data processing. The memory 200 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 200 may further include memories provided remotely from the processor 100, and these remote memories may be connected to the computer terminal through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission module 300 is used to receive or send data via a network. Specific examples of the above network may include a wireless network provided by the communication provider of the computer terminal. In one example, the transmission module 300 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In one example, the transmission module 300 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
上述对本说明书特定实施例进行了描述。其它实施例在所附权利要求书的范围内。在一些情况下,在权利要求书中记载的动作或步骤可以按照不同于实施例中的顺序来执行并且仍然可以实现期望的结果。另外,在附图中描绘的过程不一定要求示出的特定顺序或者连续顺序才能实现期望的结果。在某些实施方式中,多任务处理和并行处理也是 可以的或者可能是有利的。The foregoing describes specific embodiments of the present specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown or sequential order to achieve the desired results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
本说明书提供的上述实施例所述的方法或装置可以通过计算机程序实现业务逻辑并记录在存储介质上,所述的存储介质可以计算机读取并执行,实现本说明书实施例所描述方案的效果。The method or apparatus described in the above embodiments provided in this specification can implement business logic through a computer program and be recorded on a storage medium, and the storage medium can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
本说明书实施例提供的上述乳腺肿块图像识别方法或装置可以在计算机中由处理器执行相应的程序指令来实现，如使用windows操作系统的c++语言在PC端实现、linux系统实现，或其他例如使用android、iOS系统程序设计语言在智能终端实现，以及基于量子计算机的处理逻辑实现等。The breast mass image recognition method or apparatus provided by the embodiments of this specification may be implemented in a computer by a processor executing corresponding program instructions, for example, implemented on a PC in C++ under the Windows operating system, implemented under a Linux system, or otherwise implemented on a smart terminal using, for example, the Android or iOS programming languages, or implemented with quantum-computer-based processing logic, and so on.
需要说明的是说明书上述所述的装置、计算机存储介质、系统根据相关方法实施例的描述还可以包括其他的实施方式，具体的实现方式可以参照对应方法实施例的描述，在此不作一一赘述。It should be noted that, according to the descriptions of the related method embodiments, the apparatus, computer storage medium, and system described above in this specification may also include other implementation manners; for specific implementations, reference may be made to the descriptions of the corresponding method embodiments, which are not repeated here one by one.
本说明书中的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于硬件+程序类实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。The embodiments in this specification are described in a progressive manner. The same or similar parts between the embodiments can be referred to each other. Each embodiment focuses on the differences from other embodiments. In particular, for the hardware + program embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment.
本说明书实施例并不局限于必须是符合行业通信标准、标准计算机数据处理和数据存储规则或本说明书一个或多个实施例所描述的情况。某些行业标准或者使用自定义方式或实施例描述的实施基础上略加修改后的实施方案也可以实现上述实施例相同、等同或相近、或变形后可预料的实施效果。应用这些修改或变形后的数据获取、存储、判断、处理方式等获取的实施例,仍然可以属于本说明书实施例的可选实施方案范围之内。The embodiments of this specification are not limited to those that must comply with industry communication standards, standard computer data processing and data storage rules, or those described in one or more embodiments of this specification. Some industry standards or implementations described in a custom manner or embodiments based on slightly modified implementations can also achieve the same, equivalent, or similar, or predictable implementation effects of the foregoing embodiments. Examples obtained by applying these modified or deformed data acquisition, storage, judgment, processing methods, etc., can still fall within the scope of optional implementations of the examples in this specification.
在20世纪90年代,对于一个技术的改进可以很明显地区分是硬件上的改进(例如,对二极管、晶体管、开关等电路结构的改进)还是软件上的改进(对于方法流程的改进)。然而,随着技术的发展,当今的很多方法流程的改进已经可以视为硬件电路结构的直接改进。设计人员几乎都通过将改进的方法流程编程到硬件电路中来得到相应的硬件电路结构。因此,不能说一个方法流程的改进就不能用硬件实体模块来实现。例如,可编程逻辑器件(Programmable Logic Device,PLD)(例如现场可编程门阵列(Field Programmable Gate Array,FPGA))就是这样一种集成电路,其逻辑功能由用户对器件编程来确定。由设计人员自行编程来把一个数字系统“集成”在一片PLD上,而不需要请芯片制造厂商来设计和制作专用的集成电路芯片。而且,如今,取代手工地制作集成电路芯片,这种编程也多半改用“逻辑编译器(logic compiler)”软件来实现,它与程序开 发撰写时所用的软件编译器相类似,而要编译之前的原始代码也得用特定的编程语言来撰写,此称之为硬件描述语言(Hardware Description Language,HDL),而HDL也并非仅有一种,而是有许多种,如ABEL(Advanced Boolean Expression Language)、AHDL(Altera Hardware Description Language)、Confluence、CUPL(Cornell University Programming Language)、HDCal、JHDL(Java Hardware Description Language)、Lava、Lola、MyHDL、PALASM、RHDL(Ruby Hardware Description Language)等,目前最普遍使用的是VHDL(Very-High-Speed Integrated Circuit Hardware Description Language)与Verilog。本领域技术人员也应该清楚,只需要将方法流程用上述几种硬件描述语言稍作逻辑编程并编程到集成电路中,就可以很容易得到实现该逻辑方法流程的硬件电路。In the 1990s, the improvement of a technology can be clearly distinguished from the improvement in hardware (for example, the improvement of circuit structures such as diodes, transistors, and switches) or the improvement in software (the improvement of the process flow). However, with the development of technology, the improvement of many methods and processes can be regarded as a direct improvement of the hardware circuit structure. Designers almost get the corresponding hardware circuit structure by programming the improved method flow into the hardware circuit. Therefore, it cannot be said that the improvement of a method flow cannot be realized by hardware physical modules. For example, a programmable logic device (Programmable Logic Device, PLD) (such as a field programmable gate array (Field Programmable Gate Array, FPGA)) is such an integrated circuit, and its logic function is determined by the user programming the device. 
Designers program a digital system "onto" a single PLD by themselves, without asking a chip manufacturer to design and fabricate a dedicated integrated-circuit chip. Moreover, nowadays, instead of making integrated-circuit chips by hand, such programming is mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled must likewise be written in a specific programming language, called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most widely used. Those skilled in the art should also appreciate that a hardware circuit implementing a logical method flow can be easily obtained simply by programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
控制器可以按任何适当的方式实现，例如，控制器可以采取例如微处理器或处理器以及存储可由该(微)处理器执行的计算机可读程序代码(例如软件或固件)的计算机可读介质、逻辑门、开关、专用集成电路(Application Specific Integrated Circuit，ASIC)、可编程逻辑控制器和嵌入微控制器的形式，控制器的例子包括但不限于以下微控制器：ARC 625D、Atmel AT91SAM、Microchip PIC18F26K20以及Silicon Labs C8051F320，存储器控制器还可以被实现为存储器的控制逻辑的一部分。本领域技术人员也知道，除了以纯计算机可读程序代码方式实现控制器以外，完全可以通过将方法步骤进行逻辑编程来使得控制器以逻辑门、开关、专用集成电路、可编程逻辑控制器和嵌入微控制器等的形式来实现相同功能。因此这种控制器可以被认为是一种硬件部件，而对其内包括的用于实现各种功能的装置也可以视为硬件部件内的结构。或者甚至，可以将用于实现各种功能的装置视为既可以是实现方法的软件模块又可以是硬件部件内的结构。The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller in the form of pure computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller may be regarded as a hardware component, and the means included therein for implementing various functions may also be regarded as structures within the hardware component. Or even, the means for implementing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
上述实施例阐明的系统、装置、模块或单元,具体可以由计算机芯片或实体实现,或者由具有某种功能的产品来实现。一种典型的实现设备为计算机。具体的,计算机例如可以为个人计算机、膝上型计算机、车载人机交互设备、蜂窝电话、相机电话、智能电话、个人数字助理、媒体播放器、导航设备、电子邮件设备、游戏控制台、平板计算机、可穿戴设备或者这些设备中的任何设备的组合。The system, device, module or unit explained in the above embodiments may be specifically implemented by a computer chip or entity, or implemented by a product with a certain function. A typical implementation device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, an on-board human-machine interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet A computer, a wearable device, or any combination of these devices.
虽然本说明书一个或多个实施例提供了如实施例或流程图所述的方法操作步骤，但基于常规或者无创造性的手段可以包括更多或者更少的操作步骤。实施例中列举的步骤顺序仅仅为众多步骤执行顺序中的一种方式，不代表唯一的执行顺序。在实际中的装置或终端产品执行时，可以按照实施例或者附图所示的方法顺序执行或者并行执行(例如并行处理器或者多线程处理的环境，甚至为分布式数据处理环境)。术语"包括"、"包含"或者其任何其他变体意在涵盖非排他性的包含，从而使得包括一系列要素的过程、方法、产品或者设备不仅包括那些要素，而且还包括没有明确列出的其他要素，或者是还包括为这种过程、方法、产品或者设备所固有的要素。在没有更多限制的情况下，并不排除在包括所述要素的过程、方法、产品或者设备中还存在另外的相同或等同要素。第一，第二等词语用来表示名称，而并不表示任何特定的顺序。Although one or more embodiments of this specification provide method operation steps as described in the embodiments or flowcharts, more or fewer operation steps may be included based on conventional or non-inventive means. The order of steps listed in the embodiments is only one of many possible execution orders and does not represent the only execution order. When an actual apparatus or terminal product executes, the steps may be executed sequentially or in parallel according to the methods shown in the embodiments or drawings (for example, in a parallel-processor or multi-threaded environment, or even a distributed data processing environment). The terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, product, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, product, or device. Without further limitation, the presence of additional identical or equivalent elements in the process, method, product, or device including the stated elements is not excluded. Words such as "first" and "second" are used to denote names and do not denote any particular order.
为了描述的方便，描述以上装置时以功能分为各种模块分别描述。当然，在实施本说明书一个或多个实施例时可以把各模块的功能在同一个或多个软件和/或硬件中实现，也可以将实现同一功能的模块由多个子模块或子单元的组合实现等。以上所描述的装置实施例仅仅是示意性的，例如，所述单元的划分，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式，例如多个单元或组件可以结合或者可以集成到另一个系统，或一些特征可以忽略，或不执行。另一点，所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口，装置或单元的间接耦合或通信连接，可以是电性，机械或其它的形式。For convenience of description, the above apparatus is described with its functions divided into various modules. Of course, when implementing one or more embodiments of this specification, the functions of the modules may be implemented in one or more pieces of software and/or hardware, or the modules implementing the same function may be implemented by a combination of multiple submodules or subunits, and so on. The apparatus embodiments described above are merely illustrative. For example, the division into units is only a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
本发明是参照根据本发明实施例的方法、装置(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器，使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。The present invention is described with reference to flowcharts and/or block diagrams of the method, apparatus (system), and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中，使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品，该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operating steps are performed on the computer or other programmable device to produce computer-implemented processing, which is executed on the computer or other programmable device The instructions provide steps for implementing the functions specified in one block or multiple blocks of the flowchart one flow or multiple flows and/or block diagrams.
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。In a typical configuration, the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。The memory may include non-permanent memory, random access memory (RAM) and/or non-volatile memory in computer-readable media, such as read only memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体，可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括，但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储、石墨烯存储或其他磁性存储设备或任何其他非传输介质，可用于存储可以被计算设备访问的信息。按照本文中的界定，计算机可读介质不包括暂存电脑可读媒体(transitory media)，如调制的数据信号和载波。Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, modules of programs, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette tape, magnetic tape or magnetic disk storage, graphene storage, or any other magnetic storage device or non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
本领域技术人员应明白，本说明书一个或多个实施例可提供为方法、系统或计算机程序产品。因此，本说明书一个或多个实施例可采用完全硬件实施例、完全软件实施例或结合软件和硬件方面的实施例的形式。而且，本说明书一个或多个实施例可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。Those skilled in the art should understand that one or more embodiments of this specification may be provided as a method, a system, or a computer program product. Therefore, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
本说明书一个或多个实施例可以在由计算机执行的计算机可执行指令的一般上下文中描述,例如程序模块。一般地,程序模块包括执行特定任务或实现特定抽象数据类型的例程、程序、对象、组件、数据结构等等。也可以在分布式计算环境中实践本本说明书一个或多个实施例,在这些分布式计算环境中,由通过通信网络而被连接的远程处理设备来执行任务。在分布式计算环境中,程序模块可以位于包括存储设备在内的本地和远程计算机存储介质中。One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types. One or more embodiments of this specification can also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in local and remote computer storage media including storage devices.
本说明书中的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于系统实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本说明书的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结 构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。The embodiments in this specification are described in a progressive manner. The same or similar parts between the embodiments can be referred to each other. Each embodiment focuses on the differences from other embodiments. In particular, for the system embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and the relevant part can be referred to the description of the method embodiment. In the description of this specification, the description referring to the terms "one embodiment", "some embodiments", "examples", "specific examples", or "some examples" means specific features described in conjunction with the embodiment or examples , Structure, material or characteristic is included in at least one embodiment or example of this specification. In this specification, the schematic representation of the above terms does not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, without contradicting each other, those skilled in the art may combine and combine different embodiments or examples and features of the different embodiments or examples described in this specification.
以上所述仅为本说明书一个或多个实施例的实施例而已，并不用于限制本说明书一个或多个实施例。对于本领域技术人员来说，本说明书一个或多个实施例可以有各种更改和变化。凡在本说明书的精神和原理之内所作的任何修改、等同替换、改进等，均应包含在权利要求范围之内。The above descriptions are merely embodiments of one or more embodiments of this specification and are not intended to limit the one or more embodiments of this specification. For those skilled in the art, various modifications and changes may be made to one or more embodiments of this specification. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this specification shall be included within the scope of the claims.

Claims (18)

  1. 一种乳腺肿块图像识别方法,其特征在于,包括:A breast mass image recognition method, characterized in that it includes:
    获取待识别的乳腺磁共振图像;Obtain the breast magnetic resonance image to be identified;
    将所述待识别的乳腺磁共振图像输入到构建的乳腺肿块识别模型中,获得所述待识别的乳腺磁共振图像中的肿块识别结果;Input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass identification result in the breast magnetic resonance image to be identified;
    其中，所述肿块识别模型采用深度全卷积神经网络模型，所述深度全卷积神经网络模型的编码过程采用U型卷积神经网络模型中的基本卷积模块进行特征提取，所述深度全卷积神经网络模型的解码过程采用密集连接将待融合的特征图统一大小后再融合。wherein the mass identification model is a deep fully convolutional neural network model; the encoding process of the deep fully convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
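As an illustrative sketch only (not part of the claims), the decoder step "unify the sizes of the feature maps to be fused, then fuse" can be reduced to a toy example. The sketch assumes 1-D single-channel feature maps, nearest-neighbor upsampling as the resizing operation, and fusion by stacking the resized maps; the claim does not fix any of these choices.

```python
# Toy version of the claim-1 decoder fusion: resize all feature maps to a
# common size, then fuse. Assumptions (not from the patent): 1-D maps,
# nearest-neighbor upsampling, fusion = stacking the resized maps.

def upsample_nearest(fmap, target_len):
    """Nearest-neighbor resize of a 1-D feature map to target_len."""
    n = len(fmap)
    return [fmap[min(i * n // target_len, n - 1)] for i in range(target_len)]

def fuse(feature_maps):
    """Unify all maps to the largest size, then fuse (stack) them."""
    target = max(len(f) for f in feature_maps)
    return [upsample_nearest(f, target) for f in feature_maps]

fused = fuse([[1.0, 2.0], [3.0, 4.0, 5.0, 6.0]])
# fused == [[1.0, 1.0, 2.0, 2.0], [3.0, 4.0, 5.0, 6.0]]
```

A real implementation would operate on multi-channel 2-D tensors and typically concatenate along the channel axis before further convolutions.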
  2. 如权利要求1所述的方法,其特征在于,利用所述U型卷积神经网络模型中的基本卷积模块采用下述公式进行特征提取:The method according to claim 1, characterized in that the basic convolution module in the U-shaped convolutional neural network model is used for feature extraction using the following formula:
    y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
    上式中,y 1表示提取出的特征图,x 1表示输入的乳腺磁共振图像,δ表示第一激活函数,W 11、W 12、W 13表示不同卷积层对应的权重,b 11、b 12、b 13表示不同卷积层对应的偏置参数。 In the above formula, y 1 represents the extracted feature map, x 1 represents the input breast magnetic resonance image, δ represents the first activation function, W 11 , W 12 , W 13 represent the corresponding weights of different convolutional layers, b 11 , b 12 and b 13 represent offset parameters corresponding to different convolutional layers.
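For illustration only, the triple convolution-plus-activation structure of the above formula can be traced numerically. The sketch assumes the first activation function δ is a ReLU and reduces each convolution to a 1×1, single-channel case, so that "W * x" becomes elementwise scaling; a real implementation would use multi-channel 2-D convolutions.

```python
# Sketch of the basic convolution module of claim 2:
#   y1 = delta(W13 * delta(W12 * delta(W11 * x1 + b11) + b12) + b13)
# Assumptions: delta is ReLU; each convolution is 1x1 and single-channel.

def relu(v):
    return [max(0.0, e) for e in v]

def conv1x1(w, x, b):
    # Single-channel 1x1 convolution: scale each element and add the bias.
    return [w * e + b for e in x]

def basic_conv_module(x1, weights, biases):
    """Three stacked convolution + activation layers, as in the formula."""
    (w11, w12, w13), (b11, b12, b13) = weights, biases
    h = relu(conv1x1(w11, x1, b11))
    h = relu(conv1x1(w12, h, b12))
    return relu(conv1x1(w13, h, b13))

y1 = basic_conv_module([1.0, -2.0, 3.0], (0.5, 2.0, 1.0), (0.0, -1.0, 0.5))
# y1 == [0.5, 0.5, 2.5]
```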
  3. 如权利要求1所述的方法,其特征在于,所述方法还包括:利用所述深度全卷积神经网络模型采用下述公式将提取出的特征图进行调整:The method of claim 1, wherein the method further comprises: using the deep fully convolutional neural network model to adjust the extracted feature map using the following formula:
    y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
    上式中,y 2表示调整后的特征图,x 2表示提取出的特征图,δ表示第一激活函数,W 21、W 22表示不同卷积层对应的权重,b 21、b 22表示不同卷积层对应的偏置参数。 In the above formula, y 2 represents the adjusted feature map, x 2 represents the extracted feature map, δ represents the first activation function, W 21 and W 22 represent the weights corresponding to different convolutional layers, and b 21 and b 22 represent different The offset parameter corresponding to the convolutional layer.
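The formula of claim 3 is a residual (skip-connected) adjustment: the input feature map x2 is added back before the final activation. A minimal numerical sketch, under the same assumptions as before (δ is ReLU, convolutions reduced to single-channel scaling):

```python
# Sketch of the feature-adaptation step of claim 3:
#   y2 = delta((W22 * delta(W21 * x2 + b21) + b22) + x2)
# Assumptions: delta is ReLU; single-channel 1x1 convolutions.

def relu(v):
    return [max(0.0, e) for e in v]

def feature_adapt(x2, w21, b21, w22, b22):
    """Residual adjustment of an extracted feature map: the skip connection
    adds the input back before the final activation."""
    inner = relu([w21 * e + b21 for e in x2])
    return relu([w22 * h + b22 + e for h, e in zip(inner, x2)])

y2 = feature_adapt([1.0, -3.0], 2.0, 0.0, 0.5, 0.0)
# y2 == [2.0, 0.0]
```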
  4. 如权利要求1所述的方法,其特征在于,所述方法还包括:The method of claim 1, wherein the method further comprises:
    利用所述深度全卷积神经网络模型将提取出的特征图赋予不同的权重值;Using the deep fully convolutional neural network model to assign different weight values to the extracted feature map;
    根据各特征图对应的权重值进行特征融合。Feature fusion is performed according to the weight value corresponding to each feature map.
  5. 如权利要求4所述的方法,其特征在于,所述将提取出的特征图赋予不同的权重值,包括:The method according to claim 4, wherein the assigning the extracted feature maps with different weight values includes:
    对提取出的特征图进行平均池化;Perform average pooling on the extracted feature maps;
    将平均池化后的特征图利用全连接层进行连接,并采用第二激活函数进行激活,获得所述特征图对应的权重值。The averaged pooled feature map is connected using a fully connected layer, and is activated using a second activation function to obtain a weight value corresponding to the feature map.
  6. 如权利要求5所述的方法,其特征在于,所述将提取出的特征图赋予不同的权重值,包括:采用下述公式获得所述特征图对应的权重值:The method according to claim 5, wherein the assigning the extracted feature maps with different weight values includes: using the following formula to obtain the weight values corresponding to the feature maps:
    Zc = (1/(H×M))·∑(h=1..H)∑(m=1..M) xc(h,m)
    S = σ(W32·(W31·Z+b31)+b32)
    上式中，Zc表示对第c个特征图进行平均池化所得的池化结果，H表示第c个特征图的高，M表示第c个特征图的宽，xc表示输入的第c个特征图，S表示特征图对应的权重值向量，Z表示特征图池化后的池化结果向量，σ表示所述第二激活函数，W31、W32表示不同全连接层对应的权重，b31、b32表示不同全连接层对应的偏置参数。In the above formulas, Zc denotes the pooling result obtained by average-pooling the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the vector of weight values corresponding to the feature maps, Z denotes the vector of pooling results of the feature maps, σ denotes the second activation function, W31 and W32 denote the weights corresponding to different fully connected layers, and b31 and b32 denote the bias parameters corresponding to different fully connected layers.
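The channel-weighting of claims 5 and 6 follows a squeeze-and-excitation pattern: global average pooling per channel, fully connected layers, then the second activation function. A toy sketch, assuming σ is a sigmoid and, for brevity, per-channel (diagonal) fully connected weights; a full implementation would use dense weight matrices, and standard squeeze-and-excitation blocks additionally place a nonlinearity between the two fully connected layers:

```python
import math

def channel_weights(feature_maps, w31, b31, w32, b32):
    """Per-channel attention weights S: average-pool each H x M map to Zc,
    pass the pooled vector through two (diagonal) FC layers, then apply a
    sigmoid as the assumed second activation function."""
    # Squeeze: Zc = mean of the c-th H x M feature map.
    z = [sum(sum(row) for row in fmap) / (len(fmap) * len(fmap[0]))
         for fmap in feature_maps]
    # Excite: two fully connected layers, then sigmoid -> weight vector S.
    return [1.0 / (1.0 + math.exp(-(w32 * (w31 * zc + b31) + b32)))
            for zc in z]

fmaps = [[[1.0, 1.0], [1.0, 1.0]],   # channel 0: mean 1.0
         [[0.0, 0.0], [0.0, 0.0]]]   # channel 1: mean 0.0
s = channel_weights(fmaps, 1.0, 0.0, 1.0, 0.0)
# s[0] == sigmoid(1.0), s[1] == sigmoid(0.0) == 0.5
```

The resulting weights would then scale the corresponding feature maps before the fusion described in claim 4.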
  7. 如权利要求1所述的方法,其特征在于,所述乳腺肿块识别模型采用下述方法构建:The method according to claim 1, wherein the breast mass identification model is constructed using the following method:
    获取多个样本数据，所述样本数据包括：乳腺磁共振图像和所述乳腺磁共振图像中的肿块标记；acquiring multiple pieces of sample data, the sample data including: breast magnetic resonance images and mass annotations in the breast magnetic resonance images;
    建立所述乳腺肿块识别模型,Establish the breast mass identification model,
    将所述样本数据中的乳腺磁共振图像作为所述乳腺肿块识别模型的输入数据，将对应的所述乳腺磁共振图像中的肿块标记作为所述乳腺肿块识别模型的输出数据，对所述乳腺肿块识别模型进行训练，直至所述乳腺肿块识别模型达到预设要求。using the breast magnetic resonance images in the sample data as input data of the breast mass identification model and the corresponding mass annotations in the breast magnetic resonance images as output data of the breast mass identification model, and training the breast mass identification model until the breast mass identification model meets a preset requirement.
  8. 如权利要求7所述的方法,其特征在于,所述方法还包括:The method of claim 7, wherein the method further comprises:
    将所述待识别的乳腺磁共振图像与所述样本数据中的乳腺磁共振图像的图像参数进行对比;Comparing the image parameters of the breast magnetic resonance image to be identified with the breast magnetic resonance image in the sample data;
    若所述待识别的乳腺磁共振图像与所述样本数据中的乳腺磁共振图像的图像参数不一致,则利用所述待识别的乳腺磁共振图像对所述乳腺肿块识别模型进行调整;If the image parameters of the breast magnetic resonance image to be identified and the breast magnetic resonance image in the sample data are inconsistent, use the breast magnetic resonance image to be identified to adjust the breast mass recognition model;
    利用调整后的乳腺肿块识别模型对所述待识别的乳腺磁共振图像进行肿块识别。Using the adjusted breast mass identification model, mass identification of the breast magnetic resonance image to be identified is performed.
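The image-parameter comparison of claim 8 amounts to a simple gating decision before inference: if the incoming image's acquisition parameters differ from those of the training samples, the model is first adjusted (fine-tuned). The parameter names below are illustrative assumptions only, since the claim does not enumerate which image parameters are compared:

```python
def needs_adaptation(image_params, sample_params):
    """Return True when the incoming image's acquisition parameters differ
    from those of the training samples, in which case the mass identification
    model would be adjusted before recognition (claim 8).
    The parameter keys are hypothetical, not taken from the patent."""
    keys = ("field_strength", "in_plane_resolution", "sequence")
    return any(image_params.get(k) != sample_params.get(k) for k in keys)

sample = {"field_strength": 3.0, "in_plane_resolution": 1.0, "sequence": "T1"}
new_img = {"field_strength": 1.5, "in_plane_resolution": 1.0, "sequence": "T1"}
needs_adaptation(new_img, sample)  # -> True: fine-tune before inference
```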
  9. 一种乳腺肿块图像识别装置,其特征在于,包括:A breast mass image recognition device, characterized in that it includes:
    图像获取模块,用于获取待识别的乳腺磁共振图像;The image acquisition module is used to acquire the breast magnetic resonance image to be identified;
    肿块识别模块，用于将所述待识别的乳腺磁共振图像输入到构建的乳腺肿块识别模型中，获得所述待识别的乳腺磁共振图像中的肿块识别结果；a mass identification module, configured to input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain a mass identification result in the breast magnetic resonance image to be identified;
    其中，所述肿块识别模型采用深度全卷积神经网络模型，所述深度全卷积神经网络模型的编码过程采用U型卷积神经网络模型中的基本卷积模块进行特征提取，所述深度全卷积神经网络模型的解码过程采用密集连接将待融合的特征图统一大小后再融合。wherein the mass identification model is a deep fully convolutional neural network model; the encoding process of the deep fully convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  10. 如权利要求9所述的装置,其特征在于,所述U型卷积神经网络模型中的基本卷积模块采用下述公式进行特征提取:The device according to claim 9, wherein the basic convolution module in the U-shaped convolutional neural network model uses the following formula for feature extraction:
    y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
    上式中,y 1表示提取出的特征图,x 1表示输入的乳腺磁共振图像,δ表示第一激活函数,W 11、W 12、W 13表示不同卷积层对应的权重,b 11、b 12、b 13表示不同卷积层对应的偏置参数。 In the above formula, y 1 represents the extracted feature map, x 1 represents the input breast magnetic resonance image, δ represents the first activation function, W 11 , W 12 , W 13 represent the corresponding weights of different convolutional layers, b 11 , b 12 and b 13 represent offset parameters corresponding to different convolutional layers.
  11. 如权利要求9所述的装置,其特征在于,所述深度全卷积神经网络模型中包括特征适应模块,用于采用下述公式将提取出的特征图进行调整:The device according to claim 9, wherein the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:
    y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
    上式中,y 2表示调整后的特征图,x 2表示提取出的特征图,δ表示第一激活函数,W 21、W 22表示不同卷积层对应的权重,b 21、b 22表示不同卷积层对应的偏置参数。 In the above formula, y 2 represents the adjusted feature map, x 2 represents the extracted feature map, δ represents the first activation function, W 21 and W 22 represent the weights corresponding to different convolutional layers, and b 21 and b 22 represent different The offset parameter corresponding to the convolutional layer.
  12. 如权利要求9所述的装置,其特征在于,所述深度全卷积神经网络模型中还包括通道注意力模块用于:The apparatus of claim 9, wherein the deep fully convolutional neural network model further includes a channel attention module for:
    利用所述深度全卷积神经网络模型将提取出的特征图赋予不同的权重值;Using the deep fully convolutional neural network model to assign different weight values to the extracted feature map;
    根据各特征图对应的权重值进行特征融合。Feature fusion is performed according to the weight value corresponding to each feature map.
  13. 如权利要求12所述的装置,其特征在于,所述通道注意力模块具体用于:The device of claim 12, wherein the channel attention module is specifically used to:
    对提取出的特征图进行平均池化;Perform average pooling on the extracted feature maps;
    将平均池化后的特征图利用全连接层进行连接,并采用第二激活函数进行激活,获得所述特征图对应的权重值。The averaged pooled feature map is connected using a fully connected layer, and is activated using a second activation function to obtain a weight value corresponding to the feature map.
  14. 如权利要求13所述的装置,其特征在于,所述通道注意力模块具体用于采用下述公式获得所述特征图对应的权重值:The apparatus according to claim 13, wherein the channel attention module is specifically configured to obtain the weight value corresponding to the feature map using the following formula:
    Zc = (1/(H×M))·∑(h=1..H)∑(m=1..M) xc(h,m)
    S = σ(W32·(W31·Z+b31)+b32)
    上式中，Zc表示对第c个特征图进行平均池化所得的池化结果，H表示第c个特征图的高，M表示第c个特征图的宽，xc表示输入的第c个特征图，S表示特征图对应的权重值向量，Z表示特征图池化后的池化结果向量，σ表示所述第二激活函数，W31、W32表示不同全连接层对应的权重，b31、b32表示不同全连接层对应的偏置参数。In the above formulas, Zc denotes the pooling result obtained by average-pooling the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the vector of weight values corresponding to the feature maps, Z denotes the vector of pooling results of the feature maps, σ denotes the second activation function, W31 and W32 denote the weights corresponding to different fully connected layers, and b31 and b32 denote the bias parameters corresponding to different fully connected layers.
  15. 如权利要求9所述的装置,其特征在于,所述装置还包括模型构建模块用于采用下述方法构建所述乳腺肿块识别模型:The device according to claim 9, characterized in that the device further comprises a model building module for building the breast mass identification model by the following method:
    获取多个样本数据，所述样本数据包括：乳腺磁共振图像和所述乳腺磁共振图像中的肿块标记；acquiring multiple pieces of sample data, the sample data including: breast magnetic resonance images and mass annotations in the breast magnetic resonance images;
    建立所述乳腺肿块识别模型，将所述样本数据中的乳腺磁共振图像作为所述乳腺肿块识别模型的输入数据，将对应的所述乳腺磁共振图像中的肿块标记作为所述乳腺肿块识别模型的输出数据，对所述乳腺肿块识别模型进行训练，直至所述乳腺肿块识别模型达到预设要求。establishing the breast mass identification model; using the breast magnetic resonance images in the sample data as input data of the breast mass identification model and the corresponding mass annotations in the breast magnetic resonance images as output data of the breast mass identification model; and training the breast mass identification model until the breast mass identification model meets a preset requirement.
  16. The apparatus according to claim 15, wherein the apparatus further comprises a model adjustment module configured to:
    compare image parameters of the breast magnetic resonance image to be recognized with those of the breast magnetic resonance images in the sample data;
    if the image parameters of the breast magnetic resonance image to be recognized are inconsistent with those of the breast magnetic resonance images in the sample data, adjust the breast mass recognition model using the breast magnetic resonance image to be recognized;
    perform mass recognition on the breast magnetic resonance image to be recognized using the adjusted breast mass recognition model.
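The adjustment step above amounts to a parameter check followed by conditional fine-tuning before recognition. A hypothetical sketch of that control flow follows; the specific parameters compared (resolution, field strength, sequence) are illustrative assumptions, not taken from the claims.

```python
def needs_adjustment(train_params, image_params):
    """Return True if any acquisition parameter of the incoming image differs
    from the training data (parameter names here are illustrative)."""
    keys = ("resolution", "field_strength", "sequence")
    return any(train_params.get(k) != image_params.get(k) for k in keys)

# Example: the new image was acquired at a different resolution.
train_params = {"resolution": (256, 256), "field_strength": 3.0, "sequence": "T1"}
new_params   = {"resolution": (512, 512), "field_strength": 3.0, "sequence": "T1"}

if needs_adjustment(train_params, new_params):
    action = "fine-tune model on the new image, then recognize"
else:
    action = "recognize directly"
print(action)  # prints: fine-tune model on the new image, then recognize
```

Gating the fine-tuning on a parameter mismatch avoids unnecessary retraining when the incoming image already matches the training distribution.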
  17. A breast mass image recognition processing device, comprising: at least one processor and a memory for storing processor-executable instructions, wherein the processor, when executing the instructions, implements the method according to any one of claims 1-8.
  18. A breast mass image recognition system, comprising: at least one processor and a memory for storing processor-executable instructions, wherein the processor, when executing the instructions, implements the method according to any one of claims 1-8.
PCT/CN2018/120885 2018-12-13 2018-12-13 Mammary gland mass image recognition method and device WO2020118618A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/120885 WO2020118618A1 (en) 2018-12-13 2018-12-13 Mammary gland mass image recognition method and device

Publications (1)

Publication Number Publication Date
WO2020118618A1 true WO2020118618A1 (en) 2020-06-18

Family

ID=71075281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/120885 WO2020118618A1 (en) 2018-12-13 2018-12-13 Mammary gland mass image recognition method and device

Country Status (1)

Country Link
WO (1) WO2020118618A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657239A (en) * 2017-09-30 2018-02-02 清华大学深圳研究生院 Palmprint image gender classification method and device, computer installation and readable storage medium storing program for executing
CN107945181A (en) * 2017-12-30 2018-04-20 北京羽医甘蓝信息技术有限公司 Treating method and apparatus for breast cancer Lymph Node Metastasis pathological image
CN108154105A (en) * 2017-12-21 2018-06-12 深圳先进技术研究院 Aquatic organism detects and recognition methods, device, server and terminal device
CN108364025A (en) * 2018-02-11 2018-08-03 广州市碳码科技有限责任公司 Gastroscope image-recognizing method, device, equipment and medium based on deep learning

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112053363B (en) * 2020-08-19 2023-12-15 苏州超云生命智能产业研究院有限公司 Retina blood vessel segmentation method, retina blood vessel segmentation device and model construction method
CN112053363A (en) * 2020-08-19 2020-12-08 苏州超云生命智能产业研究院有限公司 Retinal vessel segmentation method and device and model construction method
CN112164028A (en) * 2020-09-02 2021-01-01 陈燕铭 Pituitary adenoma magnetic resonance image positioning diagnosis method and device based on artificial intelligence
CN112990359A (en) * 2021-04-19 2021-06-18 深圳市深光粟科技有限公司 Image data processing method and device, computer and storage medium
CN112990359B (en) * 2021-04-19 2024-01-26 深圳市深光粟科技有限公司 Image data processing method, device, computer and storage medium
CN113887499A (en) * 2021-10-21 2022-01-04 清华大学 Sand dune image recognition model, creation method thereof and sand dune image recognition method
CN113887499B (en) * 2021-10-21 2022-11-18 清华大学 Sand dune image recognition model, creation method thereof and sand dune image recognition method
CN114419064A (en) * 2022-01-10 2022-04-29 陕西师范大学 Mammary gland duct region image segmentation method based on RN-DoubleU-Net network
CN114419064B (en) * 2022-01-10 2024-04-05 陕西师范大学 Mammary gland area image segmentation method based on RN-DoubleU-Net network
CN116310406A (en) * 2023-05-22 2023-06-23 浙江之科云创数字科技有限公司 Image detection method and device, storage medium and electronic equipment
CN116309585B (en) * 2023-05-22 2023-08-22 山东大学 Method and system for identifying breast ultrasound image target area based on multitask learning
CN116310406B (en) * 2023-05-22 2023-08-11 浙江之科云创数字科技有限公司 Image detection method and device, storage medium and electronic equipment
CN116309585A (en) * 2023-05-22 2023-06-23 山东大学 Method and system for identifying breast ultrasound image target area based on multitask learning

Similar Documents

Publication Publication Date Title
WO2020118618A1 (en) Mammary gland mass image recognition method and device
WO2020215984A1 (en) Medical image detection method based on deep learning, and related device
Wang et al. Deep attentional features for prostate segmentation in ultrasound
CN110475505B (en) Automatic segmentation using full convolution network
WO2020133636A1 (en) Method and system for intelligent envelope detection and warning in prostate surgery
WO2022001623A1 (en) Image processing method and apparatus based on artificial intelligence, and device and storage medium
CN109685077A (en) A kind of breast lump image-recognizing method and device
JP2020516428A (en) Evaluation of density in mammography
Jiao et al. Burn image segmentation based on Mask Regions with Convolutional Neural Network deep learning framework: more accurate and more convenient
Eslami et al. Automatic vocal tract landmark localization from midsagittal MRI data
McCullough et al. Convolutional neural network models for automatic preoperative severity assessment in unilateral cleft lip
Ding et al. A multi-scale channel attention network for prostate segmentation
Ai et al. ResCaps: an improved capsule network and its application in ultrasonic image classification of thyroid papillary carcinoma
Banerjee et al. A semi-automated approach to improve the efficiency of medical imaging segmentation for haptic rendering
Škardová et al. Mechanical and imaging models-based image registration
Tang et al. Lesion segmentation and RECIST diameter prediction via click-driven attention and dual-path connection
CN111127400A (en) Method and device for detecting breast lesions
Choi et al. Automatic initialization active contour model for the segmentation of the chest wall on chest CT
Kurzendorfer et al. Random forest based left ventricle segmentation in LGE-MRI
Gonçalves et al. Deep aesthetic assessment of breast cancer surgery outcomes
WO2020118614A1 (en) Image identification method and device for patches on head and neck
Adegun et al. Fully convolutional encoder-decoder architecture (FCEDA) for skin lesions segmentation
Hossain et al. The segmentation of nuclei from histopathology images with synthetic data
Xu et al. MD-TransUNet: TransUNet with Multi-attention and Dilated Convolution for Brain Stroke Lesion Segmentation
CN116977385A (en) Image registration method, device and equipment based on unsupervised learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18943287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/11/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18943287

Country of ref document: EP

Kind code of ref document: A1