EP3005291A1 - Automated aorta detection in a cta volume - Google Patents

Automated aorta detection in a cta volume

Info

Publication number
EP3005291A1
Authority
EP
European Patent Office
Prior art keywords
voxels
class
cluster
clusters
volume
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP14726005.3A
Other languages
German (de)
French (fr)
Other versions
EP3005291B1 (en)
Inventor
Asma Ouji
Koen Vergote
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AGFA HEALTHCARE
Agfa HealthCare NV
Original Assignee
AGFA HEALTHCARE
Agfa HealthCare NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AGFA HEALTHCARE, Agfa HealthCare NV filed Critical AGFA HEALTHCARE
Priority to EP14726005.3A priority Critical patent/EP3005291B1/en
Publication of EP3005291A1 publication Critical patent/EP3005291A1/en
Application granted granted Critical
Publication of EP3005291B1 publication Critical patent/EP3005291B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/24323Tree-organised classifiers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/08Volume rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30008Bone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images
    • G06V2201/033Recognition of patterns in medical or anatomical images of skeletal patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Computer Graphics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • External Artificial Organs (AREA)

Abstract

A method for detecting the main body vessels (e.g. the aorta) in a medical volume by refining the result of a bone removal algorithm.

Description

Automated aorta detection in a CTA volume
[DESCRIPTION]
FIELD OF THE INVENTION
The present invention relates to a computer-implemented method of automated vessel detection in medical images, such as computed tomography angiography (CTA) images.
BACKGROUND OF THE INVENTION
In a radiocontrast medical imaging setting, a patient is administered a contrast agent to increase the radiodensity of some lumen in the body. In a reconstruction of angiographic X-ray projections, the vessel tree will therefore have a density similar to that of bony tissue. As such, when displaying only the high-intensity voxels of the volume, the radiologist is presented with an image containing only the vessel tree and bone. As bone might visually obstruct certain parts of the vessel tree, a significant speed-up of the diagnosis can be achieved by removing the skeletal structures from the view. This task can be broken up into a segmentation task and a classification task. During segmentation, the image data is broken up into regions that contain image elements likely to be of the same type (i.e. bone or vessel). Based on some quantitative or qualitative features of the regions, a classification scheme or user then determines whether a particular region should be considered osseous or vascular tissue.
Bone removal algorithms do not detect the vessel structure perfectly; there are always some fragments that need to be cleaned up.
It is an aspect of this invention to provide a method to detect the vessel structure in a volume image, such as a CTA image, in an optimal way.
SUMMARY OF THE INVENTION
The above-mentioned aspect is achieved by the method set out in claim 1. Specific features for preferred embodiments of the invention are set out in the dependent claims.
The present invention is applicable to a 2D image represented by a digital pixel representation as well as to a 3D volume represented by a voxel representation. Where a 2D image is mentioned, it is understood to be interchangeable with a 3D volume and vice versa.
The present invention can be implemented as a computer program product adapted to carry out all aspects of the method of the present invention when run on a computer. The invention also comprises a computer readable medium comprising computer executable program code adapted to carry out the steps of the method of the present invention.
Further advantages and embodiments of the present invention will become apparent from the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows the input 3D volume, the result of the bone removal algorithm applied to this input 3D volume and the result of the aorta detection algorithm.
Figure 2 is a flow chart illustrating the different steps of the method of the present invention,
Figure 3 is a flow chart illustrating the bone segmentation part of the present invention,
Figure 4 shows a classified hierarchical breakdown of part of a volume.
DETAILED DESCRIPTION OF THE INVENTION
In this detailed description the method of the present invention is explained with regard to the detection of the aorta in a computed tomography angiography image (CTA image). CTA volume density is expressed in Hounsfield units (HU).
The aorta is the largest artery in the body, originating from the left ventricle of the heart and extending down to the abdomen, where it bifurcates into two smaller arteries.
Hence, considering an abdominal CT scanner generated volume, the aorta corresponds to the largest component in the vessel tree.
Although the invention has been designed for the detection of the aorta in a CTA volume, it will be clear that the method can also be used for other applications.
For example, if the X-ray reconstructed image is not an abdominal image, the method of the invention can be applied to detect the largest vessel instead of the aorta.
Other applications include refining the bone removal results to obtain an appropriate input for tracking the vessels to detect vascular diseases, etc.
The proposed method encompasses two major segmentation steps: Bone removal and Aorta detection.
The overall algorithm is illustrated in figure 2.
The bone removal steps are illustrated in figure 3.
Bone Removal
Bone removal methods are known in the art and include, for example, interactively controlled thresholding methods such as described in Alyassin, A.M., Avinash, G.B.: "Semiautomatic bone removal technique from CT angiography data", Medical Imaging, Proc. SPIE 4322 (2001) 1273-1283. Other methods are based on the watershed technique, such as described in Grau, V., Mewes, A.U.J., Alcañiz, M., Kikinis, R., Warfield, S.K.: "Improved watershed transform for medical image segmentation using prior information", IEEE Trans Med Imaging 23(4) (2004) 447-458. An example of region growing based bone removal is the one proposed by M. Fiebich: "Automatic bone segmentation technique for CT angiographic studies", J. Comput. Assist. Tomogr., vol. 23, no. 1, p. 155, 1999.
Taking into consideration the computational complexity of thresholding methods in general with respect to that of watershed based methods, and the relative ease with which they can be parallelized, a threshold based segmenter is preferred in the context of the present invention.
A watershed based segmenting algorithm (illustrated in figure 3) as described below is preferably used in the method of the present invention.
The method in general comprises a segmentation stage and a classifying step.
The segmentation stage consists of an iterative process of thresholding and cluster analysis.
Iterative thresholding:
The threshold operations are performed iteratively, with an increasing threshold value each time: the mask of voxels that remain after each threshold operation is fed into the new threshold operation, at each stage reducing the computational cost as the number of voxels decreases. The masks rendered by each of the threshold operations are analyzed to find clusters of adjacent voxels. During this analysis, a number of qualitative features is calculated for each cluster. The method of the present invention starts with an initial threshold operation at 180 Hounsfield units. The output is a binary mask in which only the voxels with intensity higher than 180 HU are set to 1. Due to the sparsity of this mask, it is stored in memory as a run-length encoded mask. This first mask forms the input to the iterative process of cluster analysis and thresholding:
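As an illustration of this first step, the Python sketch below applies the initial 180 HU threshold and stores the resulting sparse mask as per-row runs. This is a minimal sketch under stated assumptions, not the patent's implementation; the NumPy-based layout and the run format (z, y, x_start, x_end) are illustrative choices.

    import numpy as np

    def threshold_mask(volume_hu, level=180):
        # Binary mask of all voxels whose intensity exceeds the given HU level.
        return volume_hu > level

    def run_length_encode(mask):
        # Encode a 3D boolean mask as runs of consecutive 'on' voxels along x.
        # Each run is stored as (z, y, x_start, x_end); for a sparse mask this
        # needs far less memory than the dense boolean volume.
        runs = []
        for z in range(mask.shape[0]):
            for y in range(mask.shape[1]):
                row = mask[z, y].astype(np.int8)
                edges = np.diff(np.concatenate(([0], row, [0])))
                starts = np.flatnonzero(edges == 1)
                ends = np.flatnonzero(edges == -1)
                runs.extend((z, y, int(s), int(e)) for s, e in zip(starts, ends))
        return runs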
Cluster analysis: A cluster is defined as a group of voxels in which each voxel is adjacent to at least one of the other voxels in the group. At this stage adjacency is defined in the 6-neighborhood sense, but the cluster generator can be configured to use e.g. a 26-neighborhood of voxels.
Clusters are created by labelling runs in the run-length encoded mask. A run is labelled using an integer label and this label is propagated to all of its adjacent runs. This is achieved in a forward sweep followed by a pruning operation in which previously established corresponding labels are replaced by one unique label. One cluster is generated for each unique label in the mask. During analysis both intensity-based features (such as variance, maximum value, average value and histogram data) and morphological features (such as volume, compactness, center of gravity, porosity and principal components) can be computed for each cluster. A cluster is therefore characterised by a combination of an integer label and a series of features computed on the voxels of runs carrying that label. To reduce the number of clusters that need to be stored, clusters smaller than 500 mm³ are removed from the run-length mask before it is passed to the next threshold operation. The parameter that controls the increase of the threshold value between consecutive thresholds is in the described example set to 20 HU. By using the previous mask as input to the next threshold operation, the number of voxels that need to be visited during the threshold operation is reduced to the number of voxels in the mask.
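The sketch below shows one way to realise this cluster analysis step. It uses SciPy's dense connected-component labelling with a 6-neighborhood structuring element as a stand-in for the run-based labelling described above, so it is only a simplified illustration; the feature set, the 500 mm³ size limit and the 20 HU step are taken from the text, everything else is an assumption.

    import numpy as np
    from scipy import ndimage

    SIX_CONN = ndimage.generate_binary_structure(3, 1)  # 6-neighborhood adjacency
    MIN_CLUSTER_MM3 = 500.0
    THRESHOLD_STEP_HU = 20

    def analyse_clusters(volume_hu, mask, voxel_volume_mm3):
        # Label 6-connected clusters in the mask and compute per-cluster features.
        labels, n = ndimage.label(mask, structure=SIX_CONN)
        features = {}
        for lab in range(1, n + 1):
            voxels = volume_hu[labels == lab]
            features[lab] = {
                "volume_mm3": voxels.size * voxel_volume_mm3,
                "mean_hu": float(voxels.mean()),
                "max_hu": float(voxels.max()),
                "variance": float(voxels.var()),
            }
        return labels, features

    def next_mask(volume_hu, labels, features, current_threshold_hu):
        # Discard clusters below 500 mm3, then re-threshold the surviving voxels
        # at the next, 20 HU higher, level.
        kept_labels = [lab for lab, f in features.items()
                       if f["volume_mm3"] >= MIN_CLUSTER_MM3]
        keep = np.isin(labels, kept_labels)
        return keep & (volume_hu > current_threshold_hu + THRESHOLD_STEP_HU)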
The process of cluster generation and thresholding is continued until no clusters meet the minimum size requirement of 500 mm³ any more, or until a threshold level of 700 HU is reached. The algorithm can be configured to omit the minimum size requirement. This allows the cluster analysis step to be performed after the iterative thresholding.
Cluster hierarchy:
Since in the described embodiment thresholding is performed with a monotonically increasing threshold value, clusters will fall apart into smaller clusters. This is exactly the envisioned effect to provide segmentation between bone and vascular regions. To trace these break-up events in the mask, relations need to be established between the clusters computed at successive threshold levels. The tracing of the break-up events allows assigning classes to clusters and propagating these to lower threshold clusters until a break-up event marks the joining of two distinct classes. Relationships between a higher and a lower threshold value mask are established by linking all clusters of the mask with the higher threshold value to the ones in the mask with the lower threshold value. For each cluster a direct 'ancestor' is established by taking an arbitrary voxel position of the cluster and looking up the label corresponding to this position in the lower threshold value mask. Each ancestor cluster maintains a list of its 'successor' clusters and each successor retains its direct ancestor. Establishing the hierarchy also makes it possible to compute differential features describing the evolution of cluster features with respect to changing threshold levels.
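A minimal sketch of this linking step is given below, assuming the labelled volumes of two consecutive threshold levels are available as integer arrays (0 = background); the dictionary-based bookkeeping is an illustrative choice, not the patent's data structure.

    import numpy as np

    def link_levels(labels_low, labels_high):
        # Link each cluster at the higher threshold to its direct ancestor at the
        # lower threshold by looking up one of its voxel positions in the
        # lower-threshold label volume.
        ancestors = {}    # high-level label -> direct ancestor (low-level label)
        successors = {}   # low-level label  -> list of successor labels
        for lab in np.unique(labels_high):
            if lab == 0:
                continue
            # Any voxel of the cluster will do: clusters only shrink with an
            # increasing threshold, so the position is covered by the ancestor.
            z, y, x = np.argwhere(labels_high == lab)[0]
            parent = int(labels_low[z, y, x])
            ancestors[int(lab)] = parent
            successors.setdefault(parent, []).append(int(lab))
        return ancestors, successors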
Building the cluster hierarchy can also be performed incrementally as part of the cluster analysis step, as depicted in figure 4.
Classifier
To determine whether a computed cluster is part of osseous or vascular tissue the algorithm needs to be able to differentiate between these cluster classes based on their features. A learning algorithm can be used to train such a classifier based on manually labelled training data.
Classification
As mentioned earlier, some clusters are classified directly whereas others are assigned a class through propagation. Clusters are only classified directly if they have no successors any more. All other clusters in the hierarchy are ancestors of these 'leaves' and will be assigned a class based on propagation rules:
If all the successors of the cluster are of the same class, that cluster receives the same classification as its successors.
In all other cases the cluster receives the 'mixed' class attribute.
The highest clusters in the hierarchy (i.e. those generated on the lowest threshold level) that did not receive the mixed classification are the 'top ancestral clusters'. The class propagation scheme is implemented recursively, ensuring clusters are visited only once during classification.
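These propagation rules can be summarised in a few lines of Python; the sketch below assumes a 'successors' map from ancestor label to child labels and a 'leaf_class' map holding the direct classification of each leaf (names are illustrative).

    def propagate_classes(successors, leaf_class):
        # Leaves keep their direct classification; an ancestor receives the common
        # class of its successors, or 'mixed' when the successors disagree.
        cache = {}

        def classify(cluster):
            if cluster in cache:
                return cache[cluster]          # each cluster is visited only once
            children = successors.get(cluster, [])
            if not children:                   # leaf cluster: direct classification
                cls = leaf_class[cluster]
            else:
                child_classes = {classify(c) for c in children}
                cls = child_classes.pop() if len(child_classes) == 1 else "mixed"
            cache[cluster] = cls
            return cls

        for cluster in list(leaf_class) + list(successors):
            classify(cluster)
        return cache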
Each cluster also contains accumulators to keep track of the number of leaves each class has among its successors. This optionally allows the use of a voting system: a direct classification of a leaf cluster can be overruled if there are sufficient indications that the direct classification was erroneous. As an example, consider a vessel tree in which one of the bifurcations is calcified. A calcification cluster has a higher probability of being misclassified since calcification characteristics are widely diverse and, as such, their features can be hard to discriminate from those of osseous clusters. Such a single misclassification in a vessel tree is likely to be corrected by a voting mechanism that overrules a 10 to 1 minority.
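One plausible reading of this voting mechanism is sketched below; the per-class leaf counts and the 10-to-1 ratio check follow the text, while the function signature is an assumption.

    def vote_override(direct_class, leaf_counts, ratio=10):
        # leaf_counts: per-class leaf counts accumulated over the surrounding
        # subtree, e.g. {"vessel": 20, "bone": 1}.
        majority = max(leaf_counts, key=leaf_counts.get)
        minority_count = max(leaf_counts.get(direct_class, 0), 1)
        if majority != direct_class and leaf_counts[majority] >= ratio * minority_count:
            return majority              # overrule the direct leaf classification
        return direct_class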
The combination of the used segmentation and classification scheme yields several advantages with respect to watershed methods. Not only is the number of items that need to be classified several orders of magnitude smaller (typically 5·10⁵ versus 150 for a 512×512×770 dataset), which is good for performance reasons, but since the clusters typically have a larger extent and a larger volume, the computed features are more robust to noise and to down-sampling of the volume by reducing the number of slices. The described implementation is configured to down-sample the volume on which the algorithm is performed to slices with a minimal thickness of 2 mm. The process of iterative thresholding in combination with a classifier trained to classify only the leaves of the cluster hierarchy also effectively solves the problem of the overlapping density values of trabecular bone and vessel tissue. Since the trabecular bone is typically first thresholded away, leaving only cortical bone, the classifier is never forced to label low density leaves as bone.
Training
The classifier used by the algorithm is a decision tree trained on a manually labelled training set of leaf clusters coming from a mixture of CT-scanners. The data was labelled by generating and visualizing the cluster hierarchy for each dataset. Selecting a cluster from the hierarchy would highlight the corresponding voxels in the CT scan. The selected cluster and all of its successors would then be labeled as a certain class by keystroke.
The labeled data is then fed into a learning algorithm that generates a decision tree using cross validation. To maintain generality the learner is forced to have at least 6 training instances per generated classifier leaf.
The learner is configured to discern the valuable from the useless cluster features and selects only the valuable features to train on. The cluster features the classifier is trained on are both features computed during the segmentation stage (cluster average, variance, maximum and skewness), and a differential feature named 'minimum relative volume' (MRV). The MRV of a cluster is the minimum of the volume ratios encountered when tracing from its root ancestral cluster to itself, in which the volume ratio is defined as the ratio between the volume of the direct ancestor and the sum of the volumes of its direct successors. Calcifications and vascular clusters typically have a very low MRV, due to a sudden volume reduction above a certain threshold. The volumes of osseous clusters typically reduce much more slowly with respect to increasing threshold values, typically resulting in MRV values in the range 0.75 to 0.90.
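The MRV feature can be sketched as follows, reusing the ancestor/successor maps from the hierarchy sketch above. So that a sudden volume reduction yields a low MRV and slowly shrinking osseous clusters land in the 0.75-0.90 range quoted in the text, the ratio is computed here as the successors' summed volume over the ancestor's volume; that orientation, like the function names, is an interpretation of the description rather than a confirmed detail.

    def minimum_relative_volume(cluster, ancestors, successors, volume_mm3):
        # Trace the chain root ancestral cluster -> ... -> cluster.
        chain = [cluster]
        while chain[-1] in ancestors:
            chain.append(ancestors[chain[-1]])
        chain.reverse()

        mrv = 1.0
        for parent in chain[:-1]:
            children = successors.get(parent, [])
            if not children:
                continue
            # Relative volume that survives the step to the next threshold level.
            ratio = sum(volume_mm3[c] for c in children) / volume_mm3[parent]
            mrv = min(mrv, ratio)
        return mrv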
Post-processing
The output of the described embodiment of the method of the present invention so far consists of 26 run-length encoded masks (each corresponding to a threshold level) and a hierarchy of linked and classified clusters. A preliminary bone mask can be found by merging all the osseous 'top ancestral clusters'. A top ancestral cluster is a non-mixed class cluster at the highest possible level of the hierarchy. As such, top ancestral clusters are always located at the threshold level of a break-up event.
Since voxels are lost from the mask at each threshold operation, the top clusters do not include all voxels. These lost voxels can be added to the bone mask again by some form of post-processing. The algorithm can be configured to use two methods: morphological dilation or distance transform-based assignment. During distance transform-based assignment, voxels present in the initial threshold mask, but not in the preliminary bone or vessel mask, are assigned to a cluster based on their distance to the nearest bone or vascular cluster. The class of the voxel is determined by looking up the distance of the voxel to the bone mask and to the vessel mask. The voxel is assigned to the class to which the distance is smallest. This is achieved by generating two distance transforms of the initial threshold mask using the vessel and bone masks respectively as source volumes.
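A minimal sketch of the distance transform-based assignment, using SciPy's Euclidean distance transform, is given below; passing the voxel spacing as 'sampling' keeps the distances in millimetres. The function name and the boolean-mask interface are illustrative assumptions.

    import numpy as np
    from scipy import ndimage

    def assign_lost_voxels(initial_mask, bone_mask, vessel_mask, spacing_mm):
        # Distance of every voxel to the nearest bone / vessel voxel: the EDT of
        # the complement measures the distance to the nearest 'True' source voxel.
        dist_bone = ndimage.distance_transform_edt(~bone_mask, sampling=spacing_mm)
        dist_vessel = ndimage.distance_transform_edt(~vessel_mask, sampling=spacing_mm)

        # Voxels of the initial threshold mask in neither preliminary mask.
        unassigned = initial_mask & ~bone_mask & ~vessel_mask

        bone_out = bone_mask | (unassigned & (dist_bone <= dist_vessel))
        vessel_out = vessel_mask | (unassigned & (dist_vessel < dist_bone))
        return bone_out, vessel_out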
Aorta Detection
The resulting first binary mask, either with or without the post-processing being applied, is used in the next steps.
The original voxel representation of the medical image is subjected to a low-thresholding operation so as to yield a second binary mask. In a preferred embodiment the low threshold is set at 156 HU (Hounsfield units) because it has been experimentally determined that this value leads to very good results. However, the user can set a different value by applying a window/leveling operation to the volume data.
Next, the first and second binary masks are pixel-wise subtracted, yielding a third binary mask. This third mask forms the input to the process of cluster analysis.
Cluster analysis:
Clusters are computed using a connected component extraction process similar to the one used in the bone removal step to build the watershed tree. A cluster is defined as a group of voxels in which each voxel is adjacent to at least one of the other voxels in the group. At this stage adjacency is defined in the 6-neighborhood sense, but the cluster generator can be configured to use e.g. a 26-neighborhood of voxels.
Clusters are created by labelling runs in the run-length encoded mask. A run is labelled using an integer label and this label is propagated to all of its adjacent runs. This is achieved in a forward sweep followed by a pruning operation in which previously established corresponding labels are replaced by one unique label. One cluster is generated for each unique label in the mask.
Analysis is based on a set of features that can be computed for each cluster. A cluster is therefore characterised by a combination of an integer label and a series of features computed on the voxels of runs carrying that label.
Examples of such features are the number of voxels within the cluster and the shape of the cluster.
Aorta selection
Next, the largest connected components, being the connected components with the largest number of voxels, are retained and constitute the vessel to be selected.
Other components are considered as not part of the vessel and can be removed.
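The selection of the largest component can be sketched as follows, again using SciPy labelling with a 6-neighborhood as a stand-in for the run-based cluster extraction; keeping a single largest component follows the abdominal aorta use case described above.

    import numpy as np
    from scipy import ndimage

    def select_largest_component(third_mask):
        # Keep only the largest 6-connected component of the third binary mask;
        # in an abdominal CTA volume this component is expected to be the aorta.
        structure = ndimage.generate_binary_structure(3, 1)
        labels, n = ndimage.label(third_mask, structure=structure)
        if n == 0:
            return np.zeros_like(third_mask, dtype=bool)
        sizes = np.bincount(labels.ravel())
        sizes[0] = 0                      # ignore the background label
        return labels == sizes.argmax()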

Claims

[CLAIMS]
1. A method for detecting a main vessel in a medical 3D volume represented by a digital voxel representation comprising the steps of
- applying a segmentation algorithm to said volume resulting in a first binary mask with one of a first and a second class value assigned to each voxel of said volume,
- applying a thresholding operation to said volume to obtain a second binary mask,
- subtracting said second binary mask from said first binary mask to generate a third binary mask,
- extracting connected components by propagating labels to all adjacent voxels,
- computing features for each connected component,
- preserving connected components on the basis of the results of said features and identifying the preserved component as said vessel.
2. A method according to claim 1 wherein said segmentation algorithm comprises the steps of
- subjecting said digital voxel representation to an iterative thresholding operation until a stopping criterion is reached,
- finding clusters of adjacent voxels by analyzing the results of each of the steps of said iterative thresholding operation,
- building a hierarchical representation of said volume by establishing relations between clusters found in the results of each of the steps of said iterative thresholding operation,
- assigning a type class to a leaf cluster of said hierarchical representation,
- propagating said class towards the top of said hierarchical representation using propagation rules,
- generating a mask marking the locations of voxels of a specific class by merging the locations of voxels contained in clusters that received that class through propagation.
3. A method according to claim 1 wherein said first class is assigned to voxels of osseous tissue and said second class is assigned to voxels of vascular tissue.
4. A method according to claim 1 wherein said 3D volume is obtained by a CTA procedure and said vessel is the aorta.
5. A method according to claim 1 wherein said features are at least one of the number of voxels in the connected component and the shape of said component.
6. A method according to claim 2 wherein said mask marks the locations of voxels that were contained in the top ancestral clusters.
7. A method according to claim 2 wherein said stopping criterion is met when no clusters are generated anymore that fulfill a minimum size requirement.
8. A method according to claim 2 wherein said stopping criterion is reached when said threshold is below a given limit value.
9. A method according to claim 2 wherein a class to be assigned to a cluster is determined on the basis of the results of an analysis of values of qualitative and/or quantitative features determined for said cluster.
10. A method according to claim 2 wherein a class to be assigned to a leaf is decided upon by means of a trained classifier.
11. A method according to claim 2 wherein a first type of post-processing is applied, comprising adding voxels to said mask to restore voxels that were lost during said thresholding operation.
12. A method according to claim 11 wherein said post-processing operation comprises a distance transform-based assignment process.
13. A computer program product adapted to carry out the method of any of the preceding claims when run on a computer.
14. A computer readable medium comprising computer executable program code adapted to carry out the steps of any of claims 1 - 12.
EP14726005.3A 2013-06-05 2014-05-26 Automated aorta detection in a cta volume Active EP3005291B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14726005.3A EP3005291B1 (en) 2013-06-05 2014-05-26 Automated aorta detection in a cta volume

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP13170543.6A EP2811458A1 (en) 2013-06-05 2013-06-05 Automated aorta detection in a CTA volume
PCT/EP2014/060823 WO2014195170A1 (en) 2013-06-05 2014-05-26 Automated aorta detection in a cta volume
EP14726005.3A EP3005291B1 (en) 2013-06-05 2014-05-26 Automated aorta detection in a cta volume

Publications (2)

Publication Number Publication Date
EP3005291A1 true EP3005291A1 (en) 2016-04-13
EP3005291B1 EP3005291B1 (en) 2019-12-18

Family

ID=48613421

Family Applications (2)

Application Number Title Priority Date Filing Date
EP13170543.6A Withdrawn EP2811458A1 (en) 2013-06-05 2013-06-05 Automated aorta detection in a CTA volume
EP14726005.3A Active EP3005291B1 (en) 2013-06-05 2014-05-26 Automated aorta detection in a cta volume

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP13170543.6A Withdrawn EP2811458A1 (en) 2013-06-05 2013-06-05 Automated aorta detection in a CTA volume

Country Status (5)

Country Link
US (1) US9691174B2 (en)
EP (2) EP2811458A1 (en)
CN (1) CN105264569B (en)
BR (1) BR112015030526A2 (en)
WO (1) WO2014195170A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2713337B1 (en) * 2012-10-01 2015-09-09 Agfa Healthcare Method of analyzing an image
JP6145874B2 (en) * 2013-07-23 2017-06-14 富士フイルム株式会社 Radiation image processing apparatus and method
US10037603B2 (en) * 2015-05-04 2018-07-31 Siemens Healthcare Gmbh Method and system for whole body bone removal and vascular visualization in medical image data
EP3142069B1 (en) 2015-09-10 2020-05-13 Agfa HealthCare Method, apparatus and system for analyzing medical images of blood vessels
CN106682636B (en) 2016-12-31 2020-10-16 上海联影医疗科技有限公司 Blood vessel extraction method and system
EP3642743B1 (en) * 2017-06-19 2021-11-17 Viz.ai, Inc. A method and system for computer-aided triage
KR101930644B1 (en) * 2017-09-15 2018-12-18 한국과학기술원 Method and apparatus for fully automated segmenation of a joint using the patient-specific optimal thresholding and watershed algorithm
CN108573494B (en) * 2018-04-28 2021-06-15 上海联影医疗科技股份有限公司 Tubular structure extraction method and device
CN113362271B (en) * 2020-03-06 2022-09-09 深圳睿心智能医疗科技有限公司 Blood vessel three-dimensional image segmentation method and device, computer equipment and storage medium
CN115272206B (en) * 2022-07-18 2023-07-04 深圳市医未医疗科技有限公司 Medical image processing method, medical image processing device, computer equipment and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7123760B2 (en) * 2002-11-21 2006-10-17 General Electric Company Method and apparatus for removing obstructing structures in CT imaging
US7471814B2 (en) * 2002-11-27 2008-12-30 The Board Of Trustees Of The Leland Stanford Junior University Curved-slab maximum intensity projections
US7676257B2 (en) * 2003-11-25 2010-03-09 General Electric Company Method and apparatus for segmenting structure in CT angiography
US7532748B2 (en) * 2004-11-24 2009-05-12 General Electric Company Methods and apparatus for selecting and/or labeling vessel branches
US8244015B2 (en) * 2006-11-22 2012-08-14 General Electric Company Methods and apparatus for detecting aneurysm in vasculatures
US7953262B2 (en) * 2007-02-05 2011-05-31 General Electric Company Vascular image extraction and labeling system and method
EP2051205B1 (en) * 2007-10-17 2012-03-14 Deutsches Krebsforschungszentrum Method, computer program and workstation for removing undesirable objects for a digital medical image
US9025840B2 (en) * 2010-03-12 2015-05-05 Koninklijke Philips N.V. Motion visualisation in angiographic images
CN102842136B (en) * 2012-07-19 2015-08-05 湘潭大学 A kind of optic disk projective iteration method of comprehensive vascular distribution and optic disk appearance characteristics

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014195170A1 *

Also Published As

Publication number Publication date
US9691174B2 (en) 2017-06-27
CN105264569A (en) 2016-01-20
EP3005291B1 (en) 2019-12-18
BR112015030526A2 (en) 2017-07-25
EP2811458A1 (en) 2014-12-10
US20160093096A1 (en) 2016-03-31
CN105264569B (en) 2019-06-04
WO2014195170A1 (en) 2014-12-11

Similar Documents

Publication Publication Date Title
US9691174B2 (en) Automated aorta detection in a CTA volume
Jacobs et al. Automatic detection of subsolid pulmonary nodules in thoracic computed tomography images
Saad et al. Image segmentation for lung region in chest X-ray images using edge detection and morphology
US9311570B2 (en) Method of, and apparatus for, segmentation of structures in medical images
Tripathy et al. Unified preprocessing and enhancement technique for mammogram images
Liang et al. Computer aided detection of pulmonary embolism with tobogganing and mutiple instance classification in CT pulmonary angiography
Bouma et al. Automatic detection of pulmonary embolism in CTA images
Sert et al. Ensemble of convolutional neural networks for classification of breast microcalcification from mammograms
Wu et al. Center-sensitive and boundary-aware tooth instance segmentation and classification from cone-beam CT
CN111815599A (en) Image processing method, device, equipment and storage medium
Magdy et al. Automatic classification of normal and cancer lung CT images using multiscale AM‐FM features
WO2012077130A1 (en) Method and system to detect the microcalcifications in x-ray images using nonlinear energy operator.
Van Dongen et al. Automatic segmentation of pulmonary vasculature in thoracic CT scans with local thresholding and airway wall removal
Chen et al. A lung dense deep convolution neural network for robust lung parenchyma segmentation
CN112074841A (en) System and method for automatically detecting and segmenting vertebral bodies in 3D images
Maitra et al. Accurate breast contour detection algorithms in digital mammogram
Suiyuan et al. Pulmonary nodules 3D detection on serial CT scans
US9406124B2 (en) Method of analyzing an image
Amer et al. A CAD system for the early detection of lung nodules using computed tomography scan images.
Amer et al. A computer-aided early detection system of pulmonary nodules in CT scan images
Giordano et al. Automatic skeletal bone age assessment by integrating EMROI and CROI processing
Lee et al. Automated identification of lung nodules
EP3066642B1 (en) Method for removing a support of an object from volume data
Aggarwala et al. Detection of ground glass nodules in human lungs using lungs CT scans images
Beck et al. Validation and detection of vessel landmarks by using anatomical knowledge

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160105

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180831

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602014058546

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06T0007000000

Ipc: G06T0007110000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/136 20170101ALI20190710BHEP

Ipc: G06T 15/08 20110101ALI20190710BHEP

Ipc: G06K 9/62 20060101ALI20190710BHEP

Ipc: G06T 7/187 20170101ALI20190710BHEP

Ipc: G06T 7/11 20170101AFI20190710BHEP

Ipc: G06T 5/50 20060101ALI20190710BHEP

Ipc: G06K 9/46 20060101ALI20190710BHEP

INTG Intention to grant announced

Effective date: 20190802

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014058546

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1215432

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200115

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200318

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200318

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200319

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200513

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200418

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014058546

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1215432

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191218

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

26N No opposition filed

Effective date: 20200921

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191218

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240409

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240429

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240409

Year of fee payment: 11