WO2020118618A1 - Method and device for breast mass image recognition - Google Patents

Method and device for breast mass image recognition

Info

Publication number
WO2020118618A1
Authority
WO
WIPO (PCT)
Prior art keywords
breast
magnetic resonance
feature map
resonance image
mass
Prior art date
Application number
PCT/CN2018/120885
Other languages
English (en)
Chinese (zh)
Inventor
李程
王珊珊
郑海荣
刘新
梁栋
Original Assignee
深圳先进技术研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳先进技术研究院
Priority to PCT/CN2018/120885
Publication of WO2020118618A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features

Definitions

  • This specification belongs to the technical field of image processing, and in particular relates to a method and device for recognizing breast mass images.
  • Breast cancer is the cancer with the highest incidence among women.
  • The cure rate of early breast cancer is much higher than that of advanced breast cancer.
  • Early detection, early diagnosis, and early treatment are therefore key to reducing breast cancer mortality.
  • Imaging examinations, including mammography, ultrasound, and magnetic resonance imaging, are common techniques for early breast cancer screening.
  • Breast cancer screening can use these imaging results to identify breast masses.
  • Breast masses vary greatly in size and shape across patients, which poses a great challenge to automatic segmentation methods. Because mass segmentation is the first step of all subsequent breast cancer diagnosis and treatment, the automatic segmentation of breast masses is of great significance to computer-aided diagnosis systems.
  • Existing breast cancer computer-aided diagnosis systems mainly rely on manually extracted features to locate or classify breast masses.
  • Manual feature extraction depends on the professional and empirical knowledge of the researcher; it often has limitations and subjectivity, and the results are strongly affected by it.
  • In addition, the automatic breast mass segmentation methods in the prior art usually only give the location of a mass, without information on its shape and size. Therefore, there is an urgent need in the art for a technical solution that can accurately segment a breast mass region.
  • The purpose of this specification is to provide a method and device for breast mass image recognition, which realize the automatic identification of breast masses and improve the accuracy of breast mass identification.
  • The embodiments of the present specification provide a method for recognizing breast mass images, including: acquiring a breast magnetic resonance image to be identified, and inputting it into a constructed breast mass identification model to obtain a mass recognition result;
  • wherein the mass recognition model uses a deep fully convolutional neural network model;
  • the encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction;
  • and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The basic convolution module in the U-shaped convolutional neural network model extracts features using the following formula:
  • $y_1 = \phi(W_{13} * \phi(W_{12} * \phi(W_{11} * x_1 + b_{11}) + b_{12}) + b_{13})$  (1)
  • where $y_1$ represents the extracted feature map; $x_1$ represents the input breast magnetic resonance image; $\phi$ represents the first activation function; $W_{11}$, $W_{12}$, and $W_{13}$ represent the weights of the different convolutional layers; and $b_{11}$, $b_{12}$, and $b_{13}$ represent the offset parameters of the corresponding convolutional layers.
  • The method further includes: using the deep fully convolutional neural network model to adjust the extracted feature map with the following formula:
  • $y_2 = \phi(W_{22} * \phi(W_{21} * x_2 + b_{21}) + b_{22})$  (2)
  • where $y_2$ represents the adjusted feature map; $x_2$ represents the extracted feature map; $\phi$ represents the first activation function; $W_{21}$ and $W_{22}$ represent the weights of the different convolutional layers; and $b_{21}$ and $b_{22}$ represent the offset parameters of the corresponding convolutional layers.
  • The method further includes: assigning different weight values to the extracted feature maps; and performing feature fusion according to the weight value corresponding to each feature map.
  • Assigning the extracted feature maps different weight values includes: average-pooling the extracted feature maps;
  • the average-pooled feature maps are then connected through a fully connected layer and activated with a second activation function to obtain the weight value corresponding to each feature map.
  • Assigning the extracted feature maps different weight values includes obtaining the weight values with the following formulas:
  • $Z_c = \frac{1}{H \times M} \sum_{i=1}^{H} \sum_{j=1}^{M} x_c(i, j)$  (3)
  • $S = \sigma(W_{32}(W_{31} Z + b_{31}) + b_{32})$  (4)
  • where $Z_c$ represents the pooling result of the $c$-th feature map after average pooling; $H$ represents the height of the $c$-th feature map; $M$ represents the width of the $c$-th feature map; $x_c$ represents the $c$-th input feature map; $S$ represents the weight value vector corresponding to the feature maps; $Z$ represents the pooling result vector after the feature maps are pooled; $\sigma$ represents the second activation function; $W_{31}$ and $W_{32}$ represent the weights of the different fully connected layers; and $b_{31}$ and $b_{32}$ represent the offset parameters of the corresponding fully connected layers.
  • The breast mass identification model is constructed using the following method:
  • acquire sample data, where the sample data includes breast magnetic resonance images and the mass markers in those breast magnetic resonance images;
  • establish the breast mass identification model; use the breast magnetic resonance images in the sample data as the input data of the breast mass identification model, and use the corresponding mass markers in the breast magnetic resonance images as its output data; and train the breast mass identification model until it meets the preset requirements.
  • The method further includes: adjusting the constructed breast mass identification model using newly acquired breast magnetic resonance image data, as described in the model adjustment below.
  • This specification also provides a breast mass image recognition device, including:
  • an image acquisition module, used to acquire the breast magnetic resonance image to be identified; and
  • a mass identification module, used to input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass recognition result in that image;
  • wherein the mass recognition model uses a deep fully convolutional neural network model;
  • the encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction;
  • and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The basic convolution module in the U-shaped convolutional neural network model uses the following formula for feature extraction:
  • $y_1 = \phi(W_{13} * \phi(W_{12} * \phi(W_{11} * x_1 + b_{11}) + b_{12}) + b_{13})$  (1)
  • where $y_1$ represents the extracted feature map; $x_1$ represents the input breast magnetic resonance image; $\phi$ represents the first activation function; $W_{11}$, $W_{12}$, and $W_{13}$ represent the weights of the different convolutional layers; and $b_{11}$, $b_{12}$, and $b_{13}$ represent the offset parameters of the corresponding convolutional layers.
  • The deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:
  • $y_2 = \phi(W_{22} * \phi(W_{21} * x_2 + b_{21}) + b_{22})$  (2)
  • where $y_2$ represents the adjusted feature map; $x_2$ represents the extracted feature map; $\phi$ represents the first activation function; $W_{21}$ and $W_{22}$ represent the weights of the different convolutional layers; and $b_{21}$ and $b_{22}$ represent the offset parameters of the corresponding convolutional layers.
  • The deep fully convolutional neural network model further includes a channel attention module for: assigning different weight values to the extracted feature maps; and performing feature fusion according to the weight value corresponding to each feature map.
  • Specifically, the channel attention module is used to: average-pool the extracted feature maps;
  • the average-pooled feature maps are then connected through a fully connected layer and activated with a second activation function to obtain the weight value corresponding to each feature map.
  • The channel attention module is specifically used to obtain the weight value corresponding to each feature map using the following formulas:
  • $Z_c = \frac{1}{H \times M} \sum_{i=1}^{H} \sum_{j=1}^{M} x_c(i, j)$  (3)
  • $S = \sigma(W_{32}(W_{31} Z + b_{31}) + b_{32})$  (4)
  • where $Z_c$ represents the pooling result of the $c$-th feature map after average pooling; $H$ represents the height of the $c$-th feature map; $M$ represents the width of the $c$-th feature map; $x_c$ represents the $c$-th input feature map; $S$ represents the weight value vector corresponding to the feature maps; $Z$ represents the pooling result vector after the feature maps are pooled; $\sigma$ represents the second activation function; $W_{31}$ and $W_{32}$ represent the weights of the different fully connected layers; and $b_{31}$ and $b_{32}$ represent the offset parameters of the corresponding fully connected layers.
  • The device further includes a model building module for building the breast mass identification model using the following method:
  • acquire sample data, where the sample data includes breast magnetic resonance images and the mass markers in those breast magnetic resonance images;
  • the breast mass identification model includes multiple model parameters;
  • the breast magnetic resonance images in the sample data are used as the input data of the breast mass identification model, and the corresponding mass markers in the breast magnetic resonance images are used as its output data, to adjust the model parameters of the breast mass identification model until it meets the preset requirements.
  • The device further includes a model adjustment module for adjusting the constructed breast mass identification model using newly acquired breast magnetic resonance image data.
  • This specification also provides a breast mass image recognition processing device, including at least one processor and a memory for storing processor-executable instructions; when the processor executes the instructions, the breast mass image recognition method in the embodiments of this specification is implemented.
  • The present specification also provides a breast mass image recognition system, including at least one processor and a memory for storing processor-executable instructions; when the processor executes the instructions, the breast mass image recognition method in the embodiments of this specification is implemented.
  • The breast mass image recognition method, device, processing device, and system provided in this specification are based on deep learning: they combine the U-shaped convolutional neural network model with the dense convolutional neural network model to construct a breast mass identification model with an asymmetric encoding-decoding structure. Inputting the breast magnetic resonance image to be recognized into the constructed model then directly yields the mass recognition result, so breast masses are identified automatically, without manual visual inspection, which improves the accuracy of breast mass identification.
  • The breast mass image recognition method provided by the embodiments of this specification better retains the beneficial influence of useful features on the segmentation result and weakens the role of useless features.
  • The model's segmentation of breast masses in magnetic resonance images is thus greatly improved, and no subsequent network model is needed to further optimize the segmentation result, which reduces computation cost, speeds up image analysis, and better assists doctors in real-time image diagnosis.
  • FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of this specification
  • FIG. 2 is a schematic diagram of a network architecture of a breast mass identification model in an embodiment of this specification
  • FIG. 3 is a schematic diagram of a module structure of an embodiment of a breast mass image recognition device provided in this specification
  • FIG. 4 is a schematic structural diagram of a breast mass image recognition device according to another embodiment of the present specification.
  • FIG. 5 is a schematic structural diagram of a breast mass image recognition device according to yet another embodiment of the present specification.
  • FIG. 6 is a block diagram of the hardware structure of a breast mass identification server using an embodiment of the present application.
  • The breast mass image recognition method in this specification can be applied to a client or a server.
  • The client can be a smart phone, a tablet computer, a smart wearable device (smart watch, virtual reality glasses, virtual reality helmet, etc.), a smart vehicle device, or other electronic equipment.
  • FIG. 1 is a schematic flowchart of a breast mass image recognition method in an embodiment of the present specification.
  • the overall process of the breast mass image recognition method provided in an embodiment of the present specification may include:
  • Step 102: Acquire a breast magnetic resonance image to be identified.
  • Magnetic resonance examination is currently a relatively common medical examination method.
  • A user's breast magnetic resonance image can be acquired, and the breast mass can be identified based on the acquired image.
  • Step 104: Input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass identification result in that image;
  • wherein the mass recognition model uses a deep fully convolutional neural network model;
  • the encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction;
  • and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • In the embodiments of this specification, a breast mass identification model can be constructed based on deep learning methods.
  • Existing breast magnetic resonance image data from breast cancer patients can be used for model training, learning a functional mapping from an input magnetic resonance image to the output breast mass segmentation result, thereby constructing the breast mass identification model.
  • FIG. 2 is a schematic diagram of the network architecture of the breast mass identification model in an embodiment of the present specification.
  • The breast mass identification model in the embodiments of the present specification may be a deep fully convolutional neural network model that combines the U-shaped convolutional neural network model (U-Net) with the dense convolutional neural network model (DenseNet).
  • In the encoding process, the basic convolution module of the U-shaped convolutional neural network model can be used for feature extraction.
  • In the decoding process, the dense connections of the dense convolutional neural network model can be used to unify the sizes of the feature maps to be fused before feature fusion.
  • U-Net can be understood as a variant of the convolutional neural network whose structure resembles the letter U, hence the name U-Net.
  • The U-Net neural network consists mainly of two parts: a contraction path and an expansion path.
  • The contraction path mainly captures the context information in the image, while the matching expansion path localizes the regions to be segmented.
  • DenseNet can be understood as a convolutional neural network with dense connections: there is a direct connection between any two layers, i.e., the input of each layer is the union of the outputs of all previous layers, and the feature map learned by each layer is passed directly to all subsequent layers as input. Dense connections alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, greatly reduce the number of parameters, and improve the accuracy of image recognition. A minimal sketch of this connectivity pattern is shown below.
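  • The following sketch is for illustration only; PyTorch, the class name DenseBlock, and the growth-rate and layer-count values are assumptions, not taken from this specification.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense connectivity: each layer receives the concatenation of the
    block input and the outputs of all preceding layers (DenseNet-style)."""
    def __init__(self, in_channels: int, growth_rate: int = 16, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
                nn.ReLU(inplace=True)))
            channels += growth_rate  # the next layer sees all previous outputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # the input of each layer is the union (concatenation) of all previous outputs
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)
```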
  • The basic convolution modules on the left in FIG. 2 can use a basic convolution module similar to that of the U-shaped convolutional neural network model for feature extraction, and the downward arrows between basic convolution modules indicate maximum pooling.
  • The channel attention modules on the right side of FIG. 2 can be understood as the decoding process, and the upward arrows indicate bilinear interpolation; in the decoding process, the feature maps from the encoding process at the same level are fused together.
  • The arrowed connecting lines on the right side of the channel attention modules in FIG. 2 represent dense connections.
  • That is, the network modules of the decoding process of the deep fully convolutional neural network model are densely connected: the channel attention modules are densely interconnected.
  • Each channel attention module can unify the sizes of the feature maps extracted by the encoding process at the same level and the feature maps output by other channel attention modules, and then perform feature fusion.
  • For example, the uppermost channel attention module in FIG. 2 can unify the sizes of the feature map obtained by the encoding process on the left, the feature maps output by the three channel attention modules below it, and the feature map output by the lowest feature adaptation module, and then merge these uniformly sized feature maps.
  • These feature maps, i.e., the encoder feature map on the left, the feature maps output by the three channel attention modules below, and the feature map output by the lowest feature adaptation module, can be understood as the feature maps to be fused by the uppermost channel attention module.
  • In the embodiments of this specification, dense connections are added to the decoding process. These dense connections separately upsample feature maps with different degrees of abstraction to a uniform size and then merge them directly, so the feature maps are reused; the fusion step is sketched below. Feature maps with a high degree of abstraction better guide the classification result, while feature maps with a low degree of abstraction better retain location information, so the segmentation result is greatly improved.
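  • A sketch of this fusion step, assuming PyTorch, bilinear upsampling via F.interpolate, and the hypothetical helper name fuse_to_uniform_size; the specification itself only states that feature maps are upsampled to a uniform size and merged.

```python
import torch
import torch.nn.functional as F

def fuse_to_uniform_size(feature_maps, target_hw):
    """Upsample feature maps of different resolutions to one target
    (height, width) by bilinear interpolation, then concatenate them
    along the channel axis so later convolutions can fuse them."""
    resized = [F.interpolate(f, size=target_hw, mode="bilinear", align_corners=False)
               for f in feature_maps]
    return torch.cat(resized, dim=1)

# Hypothetical usage: fuse an encoder feature map, two decoder feature maps,
# and a feature-adaptation output at a common 128 x 128 resolution.
# fused = fuse_to_uniform_size([enc_feat, dec1, dec2, adapt_feat], (128, 128))
```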
  • The breast magnetic resonance image to be identified is input into the constructed breast mass identification model, and the breast mass identification result is obtained from the model.
  • The model can identify whether there is a mass in the breast magnetic resonance image to be identified and, if so, also identify the region where the mass is located, as well as the shape and size of the mass.
  • The bright spot in the output image of the breast mass recognition model at the far right of FIG. 2 represents the segmented mass, i.e., the region of the mass in the breast magnetic resonance image input on the left; a doctor can make further diagnosis and treatment decisions based on the identified mass region, or use it for other medical research.
  • The breast mass image recognition method provided in the embodiments of the present specification combines, based on deep learning, the U-shaped convolutional neural network model with the dense convolutional neural network model to construct a breast mass identification model with an asymmetric encoding-decoding structure. Inputting the breast magnetic resonance image to be recognized into the constructed model directly yields the mass recognition result, so breast masses are identified automatically, without manual visual inspection, which improves the accuracy of breast mass identification.
  • The method better retains the beneficial influence of useful features on the segmentation result and weakens the role of useless features. It greatly improves the network model's segmentation of breast masses in magnetic resonance images without requiring a further network model to optimize the segmentation result, which reduces computation cost, speeds up image analysis, and better assists doctors in real-time image diagnosis.
  • In the embodiments of this specification, the basic convolution module in the U-shaped convolutional neural network model can extract features using the following formula:
  • $y_1 = \phi(W_{13} * \phi(W_{12} * \phi(W_{11} * x_1 + b_{11}) + b_{12}) + b_{13})$  (1)
  • where $y_1$ represents the extracted feature map, i.e., the output of the basic convolution module; $x_1$ represents the input breast magnetic resonance image, i.e., the input of the basic convolution module; $\phi$ represents the first activation function (which can be a linear rectification function); $W_{11}$, $W_{12}$, and $W_{13}$ represent the weights of the different convolutional layers; and $b_{11}$, $b_{12}$, and $b_{13}$ represent the offset parameters of the corresponding convolutional layers.
  • As shown in FIG. 2, each basic convolution module in the figure can implement the above formula (1) and use it for feature extraction.
  • The basic convolution module can perform multiple convolution operations; that is, it can include multiple convolutional layers, with different convolutional layers having different weights $W_{11}$, $W_{12}$, $W_{13}$ and offset parameters $b_{11}$, $b_{12}$, $b_{13}$.
  • The basic convolution module may include three convolutional layers, and other numbers of convolutional layers may be set according to actual needs; formula (1) may be adjusted or modified accordingly, which is not specifically limited in the embodiments of this specification.
  • The basic convolution modules can be connected by maximum pooling.
  • Maximum pooling increases the receptive field and, to a certain extent, achieves invariance to translation of the input image. After each maximum pooling layer, the resolution of the feature map is halved and the number of channels is doubled.
  • The embodiments of this specification use a basic convolution module similar to that of U-Net for feature extraction, which can reduce the amount of sample data required and improve the efficiency of data processing. A sketch of one such module follows.
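  • A minimal sketch of such a basic convolution module, assuming three 3x3 convolutional layers with ReLU as the first activation function, per formula (1); PyTorch and the class name BasicConvModule are illustrative choices, not taken from the specification.

```python
import torch.nn as nn

class BasicConvModule(nn.Module):
    """Three stacked conv layers, each followed by the first activation
    function (ReLU assumed), i.e. formula (1):
    y1 = phi(W13 * phi(W12 * phi(W11 * x1 + b11) + b12) + b13)."""
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True))
        # Between modules, 2x2 max pooling halves the resolution; the channel
        # doubling described in the text is done by the next module's convolutions.
        self.pool = nn.MaxPool2d(kernel_size=2)

    def forward(self, x):
        y = self.block(x)       # feature map handed to the decoder (skip path)
        return y, self.pool(y)  # pooled map handed to the next encoder module
```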
  • The method may further include: using the deep fully convolutional neural network model to adjust the extracted feature map with the following formula:
  • $y_2 = \phi(W_{22} * \phi(W_{21} * x_2 + b_{21}) + b_{22})$  (2)
  • where $y_2$ represents the adjusted feature map, i.e., the output of the feature adaptation module; $x_2$ represents the extracted feature map, i.e., the input of the feature adaptation module; $\phi$ represents the first activation function (which may be a linear rectification function); $W_{21}$ and $W_{22}$ represent the weights of the different convolutional layers; and $b_{21}$ and $b_{22}$ represent the offset parameters of the corresponding convolutional layers.
  • The deep fully convolutional neural network model in the embodiments of this specification can also include a feature adaptation module, which adjusts the feature maps generated by the basic convolution modules in the encoding process to enhance the subsequent feature fusion.
  • The feature adaptation module may implement the above formula (2) and use it to adjust the feature map.
  • As shown in FIG. 2, the feature adaptation module may also include multiple convolutional layers, with different convolutional layers having different weights $W_{21}$, $W_{22}$ and offset parameters $b_{21}$, $b_{22}$.
  • The feature adaptation module may include two convolutional layers, and other numbers of convolutional layers may be set according to actual needs.
  • Formula (2) may be adjusted or modified accordingly depending on the number of convolutional layers,
  • which is not specifically limited in the embodiments of this specification.
  • The feature maps with different levels of abstraction generated in the encoding process can, after passing through the feature adaptation module, be fused with the feature maps in the decoding process in a cascaded manner.
  • The embodiments of this specification use the feature adaptation module in the breast mass recognition model to adjust the feature maps (see the sketch below), which improves the effect of feature fusion and further improves the accuracy of breast mass identification.
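  • A minimal sketch of the feature adaptation module under the same assumptions (PyTorch, two 3x3 convolutional layers with ReLU, per formula (2); the class name is illustrative).

```python
import torch.nn as nn

class FeatureAdaptation(nn.Module):
    """Two conv layers implementing formula (2):
    y2 = phi(W22 * phi(W21 * x2 + b21) + b22)."""
    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True))

    def forward(self, x):
        return self.block(x)  # adjusted feature map for the decoder fusion
```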
  • In the embodiments of this specification, the method further includes: assigning different weight values to the extracted feature maps; and
  • performing feature fusion according to the weight value corresponding to each feature map.
  • The embodiments of this specification can assign different weight values to different feature maps, for example, setting higher weight values for feature maps carrying more information and
  • lower weight values for feature maps carrying less or useless information.
  • When performing feature fusion according to these weight values, for example, the weight value corresponding to each feature map can be multiplied with the feature map to be fused, which increases the influence of important feature maps on the mass recognition result and reduces the influence of unimportant feature maps on it.
  • The extracted feature maps can be average-pooled; for example, the feature maps adjusted by the feature adaptation module are average-pooled, passed through two fully connected layers, and finally activated by a second activation function, such as
  • the Sigmoid function, to generate the weight of each feature map; each weight value is then multiplied back onto the feature map to be fused.
  • Average pooling means averaging all values in the local receptive field.
  • The Sigmoid function is a common S-shaped function in biology, also called an S-shaped growth curve.
  • The weight values can be computed as follows:
  • $Z_c = \frac{1}{H \times M} \sum_{i=1}^{H} \sum_{j=1}^{M} x_c(i, j)$  (3)
  • $S = \sigma(W_{32}(W_{31} Z + b_{31}) + b_{32})$  (4)
  • where $Z_c$ represents the pooling result of the $c$-th feature map after average pooling; $H$ represents the height of the $c$-th feature map; $M$ represents the width of the $c$-th feature map; $i$ ranges over 1 to $H$ and $j$ over 1 to $M$; $x_c$ represents the $c$-th feature map input to the channel attention module; $S$ represents the weight value vector corresponding to the feature maps (it can include the weight values corresponding to multiple feature maps); $Z$ represents the pooling result vector after the feature maps are pooled (it may include the pooling results corresponding to multiple feature maps, such as $Z_c$); $\sigma$ represents the second activation function (which may be a Sigmoid function); $W_{31}$ and $W_{32}$ represent the weights of the different fully connected layers; and $b_{31}$ and $b_{32}$ represent the offset parameters of the corresponding fully connected layers.
  • The product of the weight value corresponding to each feature map and the feature map itself can then be used as the output for feature fusion:
  • $y_3 = S_c \cdot x_c$  (5)
  • where $y_3$ represents the output of the channel attention module; $S_c$ represents the weight value corresponding to the $c$-th feature map;
  • and $x_c$ represents the $c$-th feature map input to the channel attention module.
  • The channel attention module assigns different weight values to different feature maps before feature fusion, realizing a screening of the feature maps; this increases the influence of important feature maps and further improves the accuracy of the mass recognition result. A sketch of such a module follows.
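  • A minimal squeeze-and-excitation-style sketch of the channel attention weighting (formulas (3)-(5)); PyTorch, the class name, the channel-reduction ratio, and the ReLU between the two fully connected layers are assumptions, not stated in the text.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Global average pooling, two fully connected layers, Sigmoid,
    then each per-channel weight S_c is multiplied back onto its
    feature map x_c (formulas (3)-(5))."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # Z_c: mean over H x M
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),                   # assumed intermediate activation
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid())                            # second activation function

    def forward(self, x):
        b, c, _, _ = x.shape
        s = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)  # weight vector S
        return x * s                                 # y3 = S_c * x_c
```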
  • In the embodiments of this specification, a breast mass identification model can be constructed in the following manner:
  • acquire sample data, where the sample data includes breast magnetic resonance images and the mass markers in those breast magnetic resonance images;
  • establish the breast mass identification model; use the breast magnetic resonance images in the sample data as the input data of the breast mass identification model, and use the corresponding mass markers in the breast magnetic resonance images as its output data; and train the breast mass identification model until it meets the preset requirements.
  • The breast magnetic resonance images of historical users may be acquired as sample data; these may be breast magnetic resonance images of users who have been diagnosed with breast cancer.
  • The sample data may also include the mass markers in the breast magnetic resonance images as training labels.
  • The specific amount of sample data may be chosen according to actual needs, which is not specifically limited in the embodiments of this specification.
  • The acquired breast magnetic resonance images can be normalized, i.e., the pixels of the breast magnetic resonance images in the sample data are processed uniformly, which facilitates subsequent model training (one possible normalization is sketched below). The normalized breast magnetic resonance images are then labeled with masses; the labeling can be done by a professional doctor or based on the user's diagnosis, marking the location, size, and other attributes of each mass.
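  • One possible normalization, sketched here as zero-mean/unit-variance scaling; the specification only says the pixels are processed in a unified manner, so the exact scheme is an assumption.

```python
import numpy as np

def normalize_mri(volume: np.ndarray) -> np.ndarray:
    """Scale a breast MR image to zero mean and unit variance so that
    samples acquired with different settings share a comparable range."""
    volume = volume.astype(np.float32)
    return (volume - volume.mean()) / (volume.std() + 1e-8)
```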
  • After the sample data is acquired, a breast mass identification model may be constructed, for example, by building the network architecture of the breast mass identification model.
  • The mass recognition model may include basic convolution modules, feature adaptation modules, and channel attention modules, where the basic convolution modules and feature adaptation modules can be understood as the encoding process and the channel attention modules as the decoding process.
  • The breast mass identification model may also include multiple model parameters, such as the size of the convolution kernels and the number of convolutional layers.
  • The breast magnetic resonance images in the sample data can be used as the input data of the breast mass identification model, and the corresponding mass markers in the breast magnetic resonance images as its output data.
  • The recognition model is then trained until it meets the preset requirements, for example, until the model's output accuracy or the number of training iterations meets the requirements, at which point training is concluded; a minimal training loop is sketched below.
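  • A minimal supervised training loop under common assumptions (PyTorch, Adam optimizer, binary cross-entropy on the mass mask, a fixed epoch count as the "preset requirement"); none of these specifics are mandated by the specification.

```python
import torch
import torch.nn as nn

def train(model, loader, epochs=50, lr=1e-4, device="cuda"):
    """Each sample pairs a breast MR image (input) with its mass
    marker/mask (target); training stops after a fixed number of epochs."""
    model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.BCEWithLogitsLoss()  # per-pixel mass vs. background
    for _ in range(epochs):
        for image, mask in loader:
            image, mask = image.to(device), mask.to(device)
            optimizer.zero_grad()
            loss = criterion(model(image), mask)
            loss.backward()
            optimizer.step()
    return model
```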
  • The embodiments of the present specification use deep learning training to construct a breast mass identification model, which realizes the automatic identification of breast masses without manual identification and improves the accuracy of breast mass identification.
  • Breast mass identification in the embodiments of this specification mainly includes two stages: model training and prediction.
  • In the training stage, existing breast magnetic resonance data can be used to optimize the designed
  • deep fully convolutional network and learn its parameters.
  • In the prediction stage, the trained network model is used to analyze new data not seen during training, i.e., the breast magnetic resonance image to be identified.
  • The image parameters of the breast magnetic resonance image to be identified (such as main field strength and relaxation times T1 and T2) can be compared with those of the breast magnetic resonance images in the training-stage sample data to check whether the two data distributions are consistent, i.e., whether image contrast, signal-to-noise ratio, and so on are consistent.
  • If they are consistent, the trained breast mass identification model can be used directly for mass identification.
  • If they are not, a small amount of new data, i.e., breast magnetic resonance images to be identified (or images with the same parameters as the image to be identified), can be used to quickly fine-tune the parameters of the constructed breast mass identification model.
  • The adjusted breast mass identification model is then used to identify the mass in the breast magnetic resonance image to be identified. Adjusting the breast mass identification model through continuous updates improves the applicability of the model and further improves the accuracy of the mass identification results.
  • The embodiments of the present specification thus propose an asymmetric encoder-decoder main framework, design a new feature fusion and screening mechanism, and introduce dense connections.
  • The segmentation method in the embodiments of the present specification better retains the beneficial influence of useful features on the segmentation result and weakens the role of useless features.
  • The network's segmentation of breast masses in magnetic resonance images is thereby greatly improved, and no subsequent network is needed to further optimize the segmentation result, which reduces computation cost, speeds up image analysis, and better assists doctors in real-time image diagnosis.
  • The breast mass image recognition method in the embodiments of the present specification is not limited to identifying breast masses; it can also be used in other image recognition tasks, such as identifying other lesion areas (for example, brain tumors). Magnetic resonance images of other body parts can be used to train a corresponding recognition model that automatically recognizes other lesion areas.
  • one or more embodiments of the present specification further provide a breast mass image recognition device.
  • The device may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc. that use the method described in the embodiments of this specification, combined with the necessary hardware.
  • The devices in one or more embodiments of this specification are described in the following embodiments. Since the solutions by which the devices solve the problem are similar to the method, the implementation of the specific devices in the embodiments of this specification may refer to the implementation of the foregoing method, and repeated details are not described again.
  • The terms "unit" and "module" used below may refer to a combination of software and/or hardware that achieves a predetermined function.
  • Although the devices described in the following embodiments are preferably implemented in software, implementations in hardware or in a combination of software and hardware are also possible and conceived.
  • FIG. 3 is a schematic diagram of a module structure of an embodiment of the breast mass image recognition device provided in this specification.
  • The breast mass image recognition device provided in this specification includes an image acquisition module 31 and a mass identification module 32, where:
  • the image acquisition module 31 can be used to acquire a breast magnetic resonance image to be identified
  • the mass identification module 32 may be used to input the breast magnetic resonance image to be identified into the constructed breast mass identification model to obtain the mass identification result in that image;
  • wherein the mass recognition model uses a deep fully convolutional neural network model;
  • the encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction;
  • and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The breast mass image recognition device provided in the embodiments of the present specification combines, based on deep learning, the U-shaped convolutional neural network model with the dense convolutional neural network model to construct a breast mass identification model with an asymmetric encoding-decoding structure. Inputting the breast magnetic resonance image to be recognized into the constructed model directly yields the mass recognition result, so breast masses are identified automatically, without manual visual inspection, which improves the accuracy of breast mass identification.
  • The device better retains the beneficial influence of useful features on the segmentation result and weakens the role of useless features. It greatly improves the network's segmentation of breast masses in magnetic resonance images without requiring a further network to optimize the segmentation result, which reduces computation cost, speeds up image analysis, and better assists doctors in real-time image diagnosis.
  • The basic convolution module in the U-shaped convolutional neural network model uses the following formula for feature extraction:
  • $y_1 = \phi(W_{13} * \phi(W_{12} * \phi(W_{11} * x_1 + b_{11}) + b_{12}) + b_{13})$  (1)
  • where $y_1$ represents the extracted feature map; $x_1$ represents the input breast magnetic resonance image; $\phi$ represents the first activation function; $W_{11}$, $W_{12}$, and $W_{13}$ represent the weights of the different convolutional layers; and $b_{11}$, $b_{12}$, and $b_{13}$ represent the offset parameters of the corresponding convolutional layers.
  • the breast mass image recognition device provided by the embodiment of the present specification uses a basic convolution module similar to U-Net for feature extraction, which can reduce the amount of sample data and improve the efficiency of data processing.
  • The deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:
  • $y_2 = \phi(W_{22} * \phi(W_{21} * x_2 + b_{21}) + b_{22})$  (2)
  • where $y_2$ represents the adjusted feature map; $x_2$ represents the extracted feature map; $\phi$ represents the first activation function; $W_{21}$ and $W_{22}$ represent the weights of the different convolutional layers; and $b_{21}$ and $b_{22}$ represent the offset parameters of the corresponding convolutional layers.
  • the embodiment of the present specification uses the feature adaptation module in the breast mass recognition model to adjust the feature map, improve the effect of feature fusion, and further improve the accuracy of breast mass identification.
  • The deep fully convolutional neural network model further includes a channel attention module for: assigning different weight values to the extracted feature maps; and performing feature fusion according to the weight value corresponding to each feature map.
  • Specifically, the channel attention module is used to: average-pool the extracted feature maps;
  • the average-pooled feature maps are then connected through a fully connected layer and activated with a second activation function to obtain the weight value corresponding to each feature map.
  • The channel attention module is specifically used to obtain the weight value corresponding to each feature map using the following formulas:
  • $Z_c = \frac{1}{H \times M} \sum_{i=1}^{H} \sum_{j=1}^{M} x_c(i, j)$  (3)
  • $S = \sigma(W_{32}(W_{31} Z + b_{31}) + b_{32})$  (4)
  • where $Z_c$ represents the pooling result of the $c$-th feature map after average pooling; $H$ represents the height of the $c$-th feature map; $M$ represents the width of the $c$-th feature map; $x_c$ represents the $c$-th input feature map; $S$ represents the weight value vector corresponding to the feature maps; $Z$ represents the pooling result vector after the feature maps are pooled; $\sigma$ represents the second activation function; $W_{31}$ and $W_{32}$ represent the weights of the different fully connected layers; and $b_{31}$ and $b_{32}$ represent the offset parameters of the corresponding fully connected layers.
  • The channel attention module assigns different weight values to different feature maps before feature fusion, realizing a screening of the feature maps; this increases the influence of important feature maps and further improves the accuracy of the mass recognition result.
  • FIG. 4 is a schematic structural diagram of the breast mass image recognition device in another embodiment of the present specification. As shown in FIG. 4, on the basis of the above embodiment, the device further includes a model building module 41 for constructing the breast mass identification model as follows:
  • acquire sample data, where the sample data includes breast magnetic resonance images and the mass markers in those breast magnetic resonance images;
  • establish the breast mass identification model; use the breast magnetic resonance images in the sample data as the input data of the breast mass identification model, and use the corresponding mass markers in the breast magnetic resonance images as its output data; and train the breast mass identification model until it meets the preset requirements.
  • the breast mass identification model is constructed by using deep learning training, which can realize the automatic identification of the breast mass without manual identification, and improves the accuracy of the breast mass identification.
  • FIG. 5 is a schematic structural diagram of the breast mass image recognition device in yet another embodiment of the present specification. As shown in FIG. 5, on the basis of the above embodiment, the device further includes a model adjustment module 51.
  • The breast mass identification model is adjusted through continuous updates, which improves the applicability of the model and further improves the accuracy of the mass identification results.
  • An embodiment of the present specification also provides a breast mass image recognition processing device, including at least one processor and a memory for storing processor-executable instructions; when the processor executes the instructions, the breast mass image recognition method of the above embodiments is implemented, for example:
  • the mass recognition model uses a deep fully convolutional neural network model;
  • the encoding process of the deep fully convolutional neural network model uses the basic convolution module of the U-shaped convolutional neural network model for feature extraction;
  • and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
  • The storage medium may include a physical device for storing information; usually, the information is digitized and then stored using electrical, magnetic, or optical means.
  • The storage medium may include: devices that use electrical energy to store information, such as various types of memory (RAM, ROM, etc.); devices that use magnetic energy to store information, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, bubble memories, and USB flash drives; and devices that store information optically, such as CDs or DVDs.
  • Of course, there are also other forms of readable storage media, such as quantum memory, graphene memory, and so on.
  • The breast mass identification system provided in this specification can be a standalone breast mass identification system, or it can be applied to various data analysis and processing systems.
  • The system may include any breast mass image recognition device of the above embodiments.
  • The system may be a standalone server, or it may include server clusters, systems (including distributed systems), software (applications), terminal devices that actually operate, logic gate devices, quantum computers, etc. that use one or more methods or devices of one or more embodiments of this specification, combined with the necessary implementation hardware.
  • The system may include at least one processor and a memory storing computer-executable instructions; when the processor executes the instructions, the steps of the method in any one or more of the above embodiments are implemented.
  • FIG. 6 is a block diagram of a hardware structure of a breast mass identification server using an embodiment of the present application.
  • The server 10 may include one or more processors 100 (only one is shown in the figure; the processor 100 may include but is not limited to a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 200 for storing data, and a transmission module 300 for communication functions.
  • The server 10 may also include more or fewer components than those shown in FIG. 6; for example, it may include other processing hardware such as a database, a multi-level cache, or a GPU, or it may have a configuration different from that shown in FIG. 6.
  • The memory 200 may be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the breast mass image recognition method in the embodiments of the present specification; the processor 100 executes the software programs and modules stored in the memory 200 to perform various functional applications and data processing.
  • the memory 200 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • The memory 200 may further include memories remotely located with respect to the processor 100; these remote memories may be connected to a computer terminal through a network. Examples of such networks include but are not limited to the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission module 300 is used to receive or send data via a network.
  • A specific example of the above network may include a wireless network provided by the communication provider of the computer terminal.
  • In one example, the transmission module 300 includes a network interface controller (NIC), which can connect to other network devices through a base station so as to communicate with the Internet.
  • In one example, the transmission module 300 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
  • The method or apparatus described in the above embodiments of this specification can implement its business logic through a computer program recorded on a storage medium; the storage medium can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
  • The above-mentioned breast mass image recognition method or device provided by the embodiments of the present specification can be implemented by a processor executing the corresponding program instructions on a computer, for example, implemented on a PC using the C++ language on a Windows operating system, on a Linux system, or otherwise, for example, implemented on smart terminals using Android or iOS programming languages, or implemented using processing logic based on quantum computers.
  • The embodiments of this specification are not limited to what must comply with industry communication standards, standard computer data processing and data storage rules, or the descriptions of one or more embodiments of this specification.
  • Implementations slightly modified from some industry standards, or from embodiments described in a custom manner, can also achieve the same, equivalent, similar, or predictable effects as the foregoing embodiments. Examples obtained by applying these modified or varied data acquisition, storage, judgment, and processing methods can still fall within the scope of the optional implementations of the embodiments of this specification.
  • The improvement of a technology can be clearly distinguished as an improvement in hardware (for example, improvements to circuit structures such as diodes, transistors, and switches) or an improvement in software (an improvement to a method flow).
  • However, with the development of technology, the improvement of many method flows today can be regarded as a direct improvement of the hardware circuit structure.
  • Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that the improvement of a method flow cannot be realized with hardware entity modules.
  • For example, a programmable logic device (PLD), such as a field-programmable gate array (FPGA), can be programmed by the designer to implement a method flow in hardware. The programming is written in a specific programming language called a hardware description language (HDL), of which there is not just one but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language).
  • The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller.
  • Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the control logic of the memory.
  • In addition to implementing the controller in the form of pure computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included within it for implementing various functions can also be regarded as structures within the hardware component. Or even, the devices for implementing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
  • the system, device, module or unit explained in the above embodiments may be specifically implemented by a computer chip or entity, or implemented by a product with a certain function.
  • a typical implementation device is a computer.
  • The computer may be, for example, a personal computer, a laptop computer, an on-board human-machine interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
  • For convenience of description, the above device is described by dividing its functions into various modules.
  • Of course, when implementing one or more embodiments of this specification, the functions of the modules may be implemented in one or more pieces of software and/or hardware, or the modules that achieve the same function may be implemented by a combination of multiple submodules or subunits, etc.
  • the device embodiments described above are only schematic.
  • The division of units is only a division by logical function; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus,
  • and the instruction apparatus implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing,
  • and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • The memory may include non-permanent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology.
  • the information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage, graphene storage, or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
  • One or more embodiments of this specification may be provided as a method, a system, or a computer program product. Therefore, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • One or more embodiments of this specification can also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network.
  • In a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.

Abstract

The invention relates to a method and a device for recognizing breast mass images. The method comprises: acquiring a breast magnetic resonance image to be recognized (102); and inputting the breast magnetic resonance image into a constructed breast mass recognition model to obtain a mass recognition result for the image, wherein the mass recognition model uses a deep full convolutional neural network model, the encoding process of the deep full convolutional neural network model uses the basic convolution module of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep full convolutional neural network model uses dense connections to fuse the feature maps to be fused after size unification (104). The method realizes automatic recognition of breast masses and improves the accuracy of breast mass recognition.
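A patent abstract gives no source code, so the following is only a minimal PyTorch sketch of the architecture the abstract describes: an encoder built from the basic U-Net double-convolution module, and a decoder that unifies the sizes of all encoder feature maps by upsampling and then fuses them through a dense concatenation. The module names (BasicConvBlock, DenseFusionSegNet), the channel widths, and the number of encoder stages are illustrative assumptions, not details taken from the application.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicConvBlock(nn.Module):
    # The basic U-Net convolution module: two 3x3 convolutions,
    # each followed by batch normalization and ReLU.
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class DenseFusionSegNet(nn.Module):
    # Encoder-decoder: U-Net-style encoder stages, then dense fusion of
    # all encoder feature maps after their sizes have been unified.
    def __init__(self, in_ch=1, num_classes=2, widths=(32, 64, 128, 256)):
        super().__init__()
        self.encoders = nn.ModuleList()
        prev = in_ch
        for w in widths:
            self.encoders.append(BasicConvBlock(prev, w))
            prev = w
        self.pool = nn.MaxPool2d(2)
        # Dense connection: every size-unified encoder map is concatenated.
        self.fuse = BasicConvBlock(sum(widths), widths[0])
        self.head = nn.Conv2d(widths[0], num_classes, kernel_size=1)

    def forward(self, x):
        target_size = x.shape[-2:]  # unify all maps to the input resolution
        features = []
        for i, enc in enumerate(self.encoders):
            if i > 0:
                x = self.pool(x)  # downsample between encoder stages
            x = enc(x)
            # Size unification: upsample each scale to the common size.
            features.append(
                F.interpolate(x, size=target_size, mode="bilinear",
                              align_corners=False))
        fused = torch.cat(features, dim=1)  # dense fusion by concatenation
        return self.head(self.fuse(fused))  # per-pixel mass/background scores

if __name__ == "__main__":
    model = DenseFusionSegNet()
    mri_slice = torch.randn(1, 1, 128, 128)  # dummy single-channel MR slice
    print(model(mri_slice).shape)  # torch.Size([1, 2, 128, 128])

Concatenation followed by a fusing convolution is one common reading of "dense connection"; the application's decoder could equally fuse progressively, stage by stage.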
PCT/CN2018/120885 2018-12-13 2018-12-13 Method and device for recognizing breast mass images WO2020118618A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/120885 WO2020118618A1 (fr) Method and device for recognizing breast mass images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/120885 WO2020118618A1 (fr) Method and device for recognizing breast mass images

Publications (1)

Publication Number Publication Date
WO2020118618A1 (fr) 2020-06-18

Family

ID=71075281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/120885 WO2020118618A1 (fr) Method and device for recognizing breast mass images

Country Status (1)

Country Link
WO (1) WO2020118618A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657239A (zh) * 2017-09-30 2018-02-02 清华大学深圳研究生院 Palmprint image gender classification method and device, computer device, and readable storage medium
CN108154105A (zh) * 2017-12-21 2018-06-12 深圳先进技术研究院 Underwater organism detection and recognition method and device, server, and terminal device
CN107945181A (zh) * 2017-12-30 2018-04-20 北京羽医甘蓝信息技术有限公司 Processing method and device for pathological images of breast cancer lymphatic metastasis
CN108364025A (zh) * 2018-02-11 2018-08-03 广州市碳码科技有限责任公司 Deep learning-based gastroscope image recognition method, device, equipment, and medium

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112053363B (zh) * 2020-08-19 2023-12-15 苏州超云生命智能产业研究院有限公司 Retinal blood vessel segmentation method and device, and model construction method
CN112053363A (zh) * 2020-08-19 2020-12-08 苏州超云生命智能产业研究院有限公司 Retinal blood vessel segmentation method and device, and model construction method
CN112164028A (zh) * 2020-09-02 2021-01-01 陈燕铭 Artificial intelligence-based method and device for localization and diagnosis of pituitary adenomas in magnetic resonance images
CN112990359A (zh) * 2021-04-19 2021-06-18 深圳市深光粟科技有限公司 Image data processing method and device, computer, and storage medium
CN112990359B (zh) * 2021-04-19 2024-01-26 深圳市深光粟科技有限公司 Image data processing method and device, computer, and storage medium
CN113887499A (zh) * 2021-10-21 2022-01-04 清华大学 Dune image recognition model, method for creating same, and dune image recognition method
CN113887499B (zh) * 2021-10-21 2022-11-18 清华大学 Dune image recognition model, method for creating same, and dune image recognition method
CN114419064A (zh) * 2022-01-10 2022-04-29 陕西师范大学 Breast duct region image segmentation method based on the RN-DoubleU-Net network
CN114419064B (zh) * 2022-01-10 2024-04-05 陕西师范大学 Breast duct region image segmentation method based on the RN-DoubleU-Net network
CN116309585A (zh) * 2023-05-22 2023-06-23 山东大学 Method and system for recognizing target regions in breast ultrasound images based on multi-task learning
CN116309585B (zh) * 2023-05-22 2023-08-22 山东大学 Method and system for recognizing target regions in breast ultrasound images based on multi-task learning
CN116310406B (zh) * 2023-05-22 2023-08-11 浙江之科云创数字科技有限公司 Image detection method and device, storage medium, and electronic device
CN116310406A (zh) * 2023-05-22 2023-06-23 浙江之科云创数字科技有限公司 Image detection method and device, storage medium, and electronic device

Similar Documents

Publication Publication Date Title
WO2020118618A1 (fr) Method and device for recognizing breast mass images
WO2020215984A1 (fr) Deep learning-based medical image detection method and related device
Wang et al. Deep attentional features for prostate segmentation in ultrasound
CN110475505B (zh) Automatic segmentation using fully convolutional networks
WO2020133636A1 (fr) Method and system for intelligent capsule detection and warning in prostate surgery
WO2022001623A1 (fr) Artificial intelligence-based image processing method and apparatus, device, and storage medium
CN109685077A (zh) Breast mass image recognition method and device
JP2020516428A (ja) Density evaluation in mammography
Moawad et al. Artificial intelligence in diagnostic radiology: where do we stand, challenges, and opportunities
Eslami et al. Automatic vocal tract landmark localization from midsagittal MRI data
Qi et al. Automatic lacunae localization in placental ultrasound images via layer aggregation
McCullough et al. Convolutional neural network models for automatic preoperative severity assessment in unilateral cleft lip
Ding et al. A multi-scale channel attention network for prostate segmentation
Ai et al. ResCaps: an improved capsule network and its application in ultrasonic image classification of thyroid papillary carcinoma
Škardová et al. Mechanical and imaging models-based image registration
Rama et al. Image pre-processing: enhance the performance of medical image classification using various data augmentation technique
CN111127400A (zh) Breast lesion detection method and device
Choi et al. Automatic initialization active contour model for the segmentation of the chest wall on chest CT
Kurzendorfer et al. Random forest based left ventricle segmentation in LGE-MRI
Gonçalves et al. Deep aesthetic assessment of breast cancer surgery outcomes
WO2020118614A1 (fr) Image identification method and device for head and neck lesions
Adegun et al. Fully convolutional encoder-decoder architecture (FCEDA) for skin lesions segmentation
Hossain et al. The segmentation of nuclei from histopathology images with synthetic data
Sreelekshmi et al. A Review on Multimodal Medical Image Fusion
Xu et al. MD-TransUNet: TransUNet with Multi-attention and Dilated Convolution for Brain Stroke Lesion Segmentation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18943287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/11/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18943287

Country of ref document: EP

Kind code of ref document: A1