EP3989828B1 - Automatic analysis of coronary angiography (Analyse automatique de coronarographie) - Google Patents

Automatic analysis of coronary angiography

Info

Publication number
EP3989828B1
EP3989828B1 (application EP20734240.3A)
Authority
EP
European Patent Office
Prior art keywords
acquisition
image data
vessel
training
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP20734240.3A
Other languages
German (de)
English (en)
Other versions
EP3989828A1 (fr)
Inventor
Christian Haase
Dirk Schäfer
Michael Grass
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Publication of EP3989828A1
Application granted
Publication of EP3989828B1
Legal status: Active

Classifications

    • A61B6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • G06T7/0012 Biomedical image inspection
    • A61B6/463 Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B6/481 Diagnostic techniques involving the use of contrast agents
    • A61B6/504 Radiation diagnosis specially adapted for diagnosis of blood vessels, e.g. by angiography
    • A61B6/5217 Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B6/545 Control of apparatus involving automatic set-up of acquisition parameters
    • A61M5/007 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way, for contrast media
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N3/045 Neural networks; Combinations of networks
    • G06N3/08 Neural networks; Learning methods
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/5294 Image processing involving using additional data, e.g. patient information, image labeling, acquisition parameters
    • A61B6/541 Control involving acquisition triggered by a physiological signal
    • G06T2200/24 Graphical user interfaces [GUIs]
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10116 X-ray image
    • G06T2207/10132 Ultrasound image
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G06T2207/30096 Tumor; Lesion
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular
    • G06T2207/30104 Vascular flow; Blood flow; Perfusion

Definitions

  • The present invention relates to a method for analyzing diagnostic image data, in particular X-ray angiographic image data, to a corresponding apparatus and to a respective computer program.
  • More particularly, the present invention relates to an improved method and apparatus that make it possible to automatically derive quantitative feature information from diagnostic image data acquired using pre-defined acquisition settings, and to use the thus derived quantitative feature information to adjust the acquisition settings accordingly, thereby improving the quality of the acquisition process.
  • Patent application US20180042566A1 discloses a system for acquiring a series of angiographic images and identifying the anatomical structures represented in the series of images using a machine learnt classifier. Additional series of images that would yield the optimal visualization of the structure of interest may be suggested.
  • Coronary angiography is typically performed by injecting a contrast agent into the blood vessels and subsequently irradiating the contrast agent-filled coronary vessels with X-ray radiation to acquire a sequence of angiographic images in which these vessels and, hence, the coronary vasculature are clearly visible.
  • The number and orientation of these angiographic image sequences, the contrast agent dose and the respective analysis of the image data may vary from one patient to another, making an objective analysis that is comparable across patients very difficult.
  • Pre-defined acquisition settings are therefore used to acquire the angiographic image sequences.
  • Using these pre-defined acquisition settings reduces the variability in the acquired data, since certain acquisition settings are then known for each patient.
  • One such acquisition approach is the Xper Swing acquisition in which the angiographic image data is acquired at different orientations along a predefined repeatable trajectory with a pre-defined dose of contrast agent.
  • An Xper Swing acquisition hereby provides the angiographic image data to be analyzed for evaluation of a particular coronary artery as a single image sequence.
  • However, the quality of the angiographic image data acquired using Xper Swing still varies, because the optimization of certain acquisition settings is patient-dependent, because of the inter-patient variability of the anatomy, and because of the (remaining) variability in the acquisition settings.
  • Moreover, automation of the data analysis is challenging: it would require complex calculations that take account of all variabilities that may occur in the data for the different patients.
  • This is addressed by a method for analyzing diagnostic image data, comprising the steps of: receiving, at a trained classifying device, diagnostic image data comprising a plurality of acquisition images of a vessel of interest, the diagnostic image data having been acquired using a pre-defined acquisition method; classifying the diagnostic image data to extract at least one quantitative feature of the vessel of interest from at least one acquisition image of the plurality of acquisition images; outputting the at least one quantitative feature of the vessel of interest, associated with the at least one acquisition image, while the acquisition of the diagnostic image data is still in progress; and adjusting one or more adjustable image acquisition settings based on the at least one quantitative feature to optimize the acquisition of the diagnostic image data.
  • The object is thus achieved by a method which employs a trained classifying device, such as a convolutional neural network, to automatically analyze diagnostic image data already during acquisition of said diagnostic image data, in order to adjust, during said ongoing acquisition, a set of adjustable acquisition settings, such as certain acquisition parameters, for optimizing the data acquisition for particular vessel properties, i.e. for particular patients.
  • Diagnostic image data may hereby refer to a set of acquisition images representing a patient's vasculature.
  • Vasculature may refer to a vessel tree or a single vessel, and may particularly refer to one or more vessels of interest and/or segments thereof.
  • A vessel of interest may hereby refer to a vessel of the patient which shall be assessed, with respect to potential lesions and/or other diseases, using the diagnostic image data.
  • The acquisition images of the diagnostic image data may each represent a vessel of interest of the coronary vasculature; the diagnostic image data may particularly comprise one or a plurality of acquisition images of said one or more vessels of interest.
  • An acquisition image may typically be understood as a single image acquired for the vessel of interest, whereby multiple acquisition images may be included in the diagnostic image data.
  • The plurality of acquisition images may particularly be acquired by a medical imaging modality, such as computed tomography (CT), ultrasound (US) imaging or magnetic resonance (MR) imaging.
  • The medical imaging modality may particularly correspond to X-ray angiography, even more particularly to X-ray angiography performed with a set of pre-defined acquisition settings, such as a pre-defined imaging trajectory and a pre-defined dose of contrast agent to be used.
  • The medical imaging modality may be gated.
  • Gated medical imaging modalities may typically employ a gated reconstruction, in which the acquisition of the acquisition images is performed in parallel with the acquisition of data providing information about the cardiac cycle, such as electrocardiogram (ECG) or photoplethysmographic (PPG) data.
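As a rough illustration of such gating (the frame times, R-peak times and phase window below are assumed values, not taken from the patent), one might keep only those acquisition frames whose cardiac phase falls near a chosen target phase of the R-R interval:

```python
# Illustrative ECG gating sketch: select frames near a target cardiac phase.
def gate_frames(frame_times, r_peaks, target_phase=0.75, window=0.1):
    """frame_times and r_peaks in seconds; phase is the position in the R-R interval."""
    kept = []
    for t in frame_times:
        # find the R-R interval containing this frame
        for r0, r1 in zip(r_peaks, r_peaks[1:]):
            if r0 <= t < r1:
                phase = (t - r0) / (r1 - r0)
                if abs(phase - target_phase) <= window:
                    kept.append(t)
                break
    return kept
```

For example, with R-peaks at 0 s, 1 s and 2 s, frames at 0.75 s and 1.7 s fall inside the window around phase 0.75 and would be retained.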
  • The diagnostic image data is received at a trained classifying device.
  • The term classifying device may particularly refer to a classifier or a classifying unit integrated into a respective apparatus for analyzing diagnostic image data; it may also refer to a classifier provided separately from the apparatus.
  • The classifying device may particularly be implemented as a convolutional neural network (CNN).
  • The classifying device is a trained classifying device. That is, the classifying device has previously been trained using a training dataset indicative of the correlation between the diagnostic image data and one or more quantitative features, such as vessel length, vessel location, lesion severity or the like. Specifically, the training is performed using a training dataset including diagnostic image data comparable to the data to be classified, annotated with the respective quantitative features.
  • The annotation may have been obtained by manual annotation of the diagnostic image data by a clinical expert, or may be an inherently known ground truth in case simulated training datasets are used.
  • The weights and parameters of the classifying device are then optimized in the training process such that, for an input of a training dataset, the resulting neural network output is numerically close to the corresponding annotated feature values. That is, the optimization minimizes, over all training datasets, the difference between the neural network output and the annotated feature values.
  • The comparison of the neural network output and the annotated feature values may hereby be realized by various suitable metrics, e.g. the L2 norm or a generalized Dice loss. In some examples, the optimization may use an Adam optimizer.
  • An exemplary network structure for such a task may be an encoder-decoder neural network architecture.
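The optimization described above can be sketched with a toy stand-in for the neural network: a two-parameter linear model fitted by plain gradient descent on the L2 difference between model output and annotated feature values. All data and hyperparameters here are illustrative, not from the patent.

```python
# Toy illustration (not the patent's CNN): minimize the L2 difference
# between a model's output and annotated feature values.
def train(samples, targets, lr=0.01, epochs=200):
    """Fit y = w*x + b to (image-derived feature, annotation) pairs."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(samples, targets):
            err = (w * x + b) - y          # model output minus annotation
            grad_w += 2 * err * x
            grad_b += 2 * err
        w -= lr * grad_w / len(samples)
        b -= lr * grad_b / len(samples)
    return w, b

def l2_loss(samples, targets, w, b):
    return sum(((w * x + b) - y) ** 2 for x, y in zip(samples, targets))

# Hypothetical annotated data: per-image feature value vs. expert annotation.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # underlying relation y = 2x + 1
w, b = train(xs, ys)
```

A real implementation would replace the linear model with the encoder-decoder CNN and the hand-written update with an optimizer such as Adam, but the objective (driving the output numerically close to the annotations over all training datasets) is the same.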
  • Once trained, the classifying device is used to classify the diagnostic image data in order to extract at least one quantitative feature from it. That is, based on the training, the classifying device is enabled to derive, for one or more of the acquisition images in the diagnostic image data, a value for at least one quantitative feature of the vessel of interest. In some embodiments, a corresponding value for one particular quantitative feature may be derived per acquisition image. Thus, a plurality of values for a particular quantitative feature may be derived for a plurality of acquisition images.
  • The quantitative features may hereby particularly correspond to features such as vessel length, vessel location, vessel diameter, lesion severity, myocardial blush values, or visibility score values for the lesions and/or the vessels in the individual acquisition images, i.e. features that may be derived on a per-image basis.
  • The quantitative features may alternatively or additionally include values indicative of the fluid dynamics through the vessel of interest, such as fractional flow reserve (FFR) values, instantaneous wave-free ratio (iFR) values or coronary flow reserve (CFR) values.
  • These parameters may be derived from a fluid dynamics model capable of modeling the fluid dynamics through the vessel of interest, as described for example in international applications WO 2016/087396, WO 2020/053099A1 and WO 2019/101630A1.
  • Alternatively, the classifying device may be enabled to implicitly learn the fluid properties of the vessel(s) of interest and, thus, the fluid parameters related thereto, without having to simulate or model the fluid flow through the vessel(s) of interest. This makes it possible to avoid using a fluid dynamics model and instead obtain the fluid parameters directly from the trained classifying device.
  • The quantitative features may also correspond to features related to the diagnostic image data as a whole, such as a completeness score indicating whether sufficient angular information for a vessel of interest is available to obtain a reliable analysis, a reference deviation index indicating whether the visible vasculature is similar to a patient-averaged reference, or an obstruction score indicating whether a future tomographic reconstruction will likely show strong artefacts if the current trajectory is continued.
  • The obstruction score may hereby particularly be used in the case where implants or specific external devices are in the field of view. That is, in case a particular trajectory results in a device obstructing the field of view in future projections of the planned trajectory, it may be beneficial to change the trajectory to avoid such obstruction.
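One simple, hypothetical way to quantify a completeness score (the patent does not prescribe a formula) is the fraction of a target angular range already covered by acquired projections:

```python
# Illustrative completeness score: fraction of a target angular range that
# the already-acquired projection angles cover.
def completeness_score(acquired_angles_deg, target_range=(0.0, 120.0), bin_width=5.0):
    """Bin the target range; score = fraction of bins hit by at least one projection."""
    lo, hi = target_range
    n_bins = int((hi - lo) / bin_width)
    hit = [False] * n_bins
    for a in acquired_angles_deg:
        if lo <= a < hi:
            hit[int((a - lo) / bin_width)] = True
    return sum(hit) / n_bins
```

With a projection every 5 degrees over the full 120-degree range the score is 1.0; covering only the first half yields 0.5. The target range, bin width and binning scheme are all assumptions for illustration.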
  • The at least one quantitative feature is then output in association with the corresponding acquisition image. That is, the value for the quantitative feature that has been derived on the basis of an acquisition image is associated with said acquisition image and then output for further evaluation and/or further processing. This output is particularly performed while the image acquisition by the medical imaging modality is still in progress.
  • Based on the output, a computation unit or other processing device then evaluates the at least one quantitative feature, respectively its values, in association with the respective acquisition images, in order to determine whether the currently used acquisition settings render sufficient image quality.
  • The computation unit may hereby particularly use quantitative features such as visibility scores, completeness scores or the like. If the evaluation shows that the current acquisition settings do not produce sufficient acquisition images, one or more of the adjustable acquisition settings are adjusted. The adjustment may hereby be performed automatically, in particular on the basis of the previous classification.
  • Adjustable acquisition settings may hereby particularly refer to those acquisition settings that are not pre-defined by the medical imaging modality used. Accordingly, in the present context, a distinction is made between pre-defined acquisition settings, which shall not be changed, i.e. remain the same in order to reduce variability, and adjustable acquisition settings, which may be changed in accordance with the individual requirements of each patient.
  • The method may further be implemented to perform out-of-distribution detection. That is, the method may be implemented to determine whether the diagnostic image data that is input into the computation unit or other processing device lies within the distribution that would be expected based on the training of the classifying device. This makes it possible to detect diagnostic image data that cannot be related to the kind of diagnostic data the classifying device has been trained with.
  • In such a case, an indication may be output to a user that the acquired diagnostic image data cannot be properly evaluated because it is not related to the kind of diagnostic image data that is expected to be evaluated.
  • This indication may be a simple warning that the diagnostic image data cannot be evaluated or can only be evaluated improperly.
  • The indication may comprise a suggestion to perform a new or additional diagnostic image data acquisition.
  • The method may, alternatively or additionally, be implemented to perform an evaluation of the diagnostic image data nonetheless, whereby the output of the evaluation is provided with correspondingly large error bars.
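A minimal sketch of such an out-of-distribution check, assuming a simple z-score test on a scalar summary feature (the patent does not specify the detection method):

```python
# Illustrative OOD check: flag incoming data whose summary feature lies far
# outside the statistics of the training distribution.
def fit_stats(training_values):
    """Mean and standard deviation of a feature over the training data."""
    n = len(training_values)
    mean = sum(training_values) / n
    var = sum((v - mean) ** 2 for v in training_values) / n
    return mean, var ** 0.5

def is_out_of_distribution(value, mean, std, z_threshold=3.0):
    """True if the |z-score| of the incoming value exceeds the threshold."""
    return abs(value - mean) > z_threshold * std
```

In practice the check would operate on richer image statistics or on the network's own uncertainty estimates, but the principle is the same: data that cannot be related to the training distribution triggers the warning described above.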
  • In some embodiments, adjusting the one or more adjustable image acquisition settings comprises prematurely terminating the acquisition of the diagnostic image data if it is determined that an already acquired portion of the diagnostic image data fulfils at least one pre-defined reliability criterion.
  • The adjustment of the adjustable acquisition settings may particularly comprise terminating the acquisition prior to its planned end if it is determined that sufficient diagnostic information has already been obtained. That is, the acquired diagnostic image data is divided into two or more subsets, whereby a first subset is evaluated while a second subset is being acquired.
  • The size of each subset may largely depend on the given medical imaging modality and the acquisition quality.
  • In some embodiments, a single acquisition image may form a subset; in other embodiments, several acquisition images may form a subset of the diagnostic image data.
  • The first subset is evaluated to determine whether the diagnostic information derived therefrom meets a pre-defined reliability criterion, i.e. whether enough angular information is present to already provide a reliable assessment of the vessel of interest.
  • The reliability criterion may particularly be quantified in terms of a completeness score. That is, a threshold may be determined for the completeness score, and as soon as the completeness score exceeds the threshold, it is determined that sufficient angular information is available for a reliable diagnosis.
  • The reliability criterion may include further scores and/or criteria.
  • The acquisition setting to be adjusted may in particular be the acquisition ending time. Even more particularly, the acquisition ending time may be set, for example by means of a termination signal, such that the acquisition is immediately terminated. By terminating the acquisition as soon as sufficient information is available, the radiation dose the patient is subjected to can be kept as low as possible. If, on the other hand, it is determined that the reliability criterion is not met, i.e. that sufficient information is not yet available, the measurement continues, i.e. no adjustment of the adjustable acquisition settings is performed. This feedback loop allowing for an adjustment of the acquisition time may be repeated frequently until the acquisition is stopped due to sufficient information being available.
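The termination feedback loop described above can be sketched as follows; the threshold value and the per-subset scoring are illustrative assumptions, not values from the patent:

```python
# Sketch of the early-termination feedback loop: after each newly acquired
# subset, a completeness score is evaluated, and the acquisition terminates
# early once the score meets the reliability threshold.
def run_acquisition(score_per_subset, threshold=0.9):
    """Return the number of subsets acquired before termination."""
    acquired = 0
    for score in score_per_subset:      # scores as they become available
        acquired += 1
        if score >= threshold:          # reliability criterion met:
            break                       # send the termination signal
    return acquired
```

With per-subset scores 0.2, 0.5, 0.95, 0.99 and a threshold of 0.9, the acquisition would stop after the third subset, sparing the patient the remaining radiation dose.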
  • In some embodiments, adjusting the one or more adjustable acquisition settings comprises adjusting the image acquisition trajectory to improve visibility of the vessel of interest in the diagnostic image data. In some modifications, adjusting the one or more adjustable acquisition settings comprises adjusting the contrast agent injection rate into the vessel of interest during image acquisition.
  • That is, the adjustment of the adjustable image acquisition settings may, additionally or alternatively, comprise adjusting the imaging trajectory used for image acquisition.
  • For this purpose, a visibility score for a vessel of interest and/or a lesion therein is determined for the first subset of diagnostic image data.
  • The visibility score may be compared to a pre-set reference or threshold value, whereby the visibility is considered sufficient in case the score is above said value (or below said value, depending on the convention) and considered poor otherwise.
  • If the visibility score shows that the visibility is not sufficient, the adjustment of the adjustable imaging settings may particularly comprise an adjustment of the image acquisition trajectory used to acquire the acquisition images. This improves image quality, so fewer acquisition images are needed to obtain sufficient diagnostic information, which effectively reduces the radiation dose delivered to the patient.
  • The adjusted trajectory also avoids that a diagnosis must be made on images with non-ideal visibility.
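The comparison logic, including the above-or-below-threshold convention, might be sketched as follows (names, the threshold value and the returned actions are hypothetical):

```python
# Illustrative decision step: compare a per-subset visibility score against
# a threshold and decide whether the acquisition trajectory should change.
def trajectory_action(visibility_score, threshold=0.7, higher_is_better=True):
    """Return the action suggested by the visibility evaluation."""
    if higher_is_better:
        sufficient = visibility_score >= threshold
    else:
        sufficient = visibility_score <= threshold
    return "keep_trajectory" if sufficient else "adjust_trajectory"
```

The `higher_is_better` flag captures the document's note that, depending on how the score is defined, sufficiency may correspond to values above or below the reference value.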
  • Adjusting the one or more adjustable acquisition settings may also encompass adjusting the contrast agent injection rate into the vessel of interest. That is, the contrast of the vessel of interest may be determined for the first subset of diagnostic image data using the classifying device. By reviewing the contrast, it may be determined whether sufficient contrast agent has been injected into the vessel of interest. Hereby, the required amount of contrast agent may vary from patient to patient, since a patient having narrower vessels may need less contrast agent than a patient with wide vessels in order to achieve similar visibility. Thus, based on the contrast of the vessel of interest, it may be evaluated whether there is enough contrast agent in the vessel of interest and, as such, whether the contrast injection rate is sufficient or should be adjusted because too little or too much contrast agent is currently being injected.
  • In some embodiments, adjusting the adjustable acquisition settings thus comprises adjusting the contrast agent injection rate based on the properties of the vessel of interest.
  • In this way, the contrast agent dose delivered to each patient may be optimized.
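As one hypothetical realization (the patent does not specify a control law), the injection rate could be nudged proportionally toward a target vessel contrast measured on the latest subset; all names, units and limits below are assumptions:

```python
# Illustrative proportional adjustment of the contrast agent injection rate.
def adjust_injection_rate(current_rate, measured_contrast, target_contrast,
                          gain=0.5, min_rate=0.5, max_rate=6.0):
    """Return a new injection rate (ml/s), clamped to an assumed safe range."""
    error = target_contrast - measured_contrast           # too little contrast -> positive
    new_rate = current_rate + gain * error * current_rate / max(target_contrast, 1e-9)
    return min(max(new_rate, min_rate), max_rate)
```

A patient with narrow vessels whose measured contrast already exceeds the target would see the rate reduced, while weak contrast raises it, up to the clamp; this mirrors the per-patient dose optimization described above.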
  • In some embodiments, the method further comprises obtaining training image data of the vessel of interest according to the pre-defined acquisition method and extracting the at least one quantitative feature from the training image data, generating at least one training dataset for the classifying device, the training dataset comprising the training image data associated with the at least one quantitative feature, and training the classifying device using the at least one training dataset.
  • the classifying device may be trained using respective training datasets.
  • these training datasets may be derived on the basis of training image data.
  • the term training image data may hereby particularly refer to a plurality of training images having been acquired in a clinical environment, i.e. to measurement data, or to a plurality of training images having been generated by simulation.
  • one or more quantitative features may be extracted from the individual training images as well as from the training image data as a whole. Whether individual images or the data as a whole is used hereby depends on the respective quantitative feature.
  • the feature extraction may hereby be performed manually by one or more users, automatically by a respective algorithm or may correspond to the quantitative features being readily available from the simulation of the data.
  • a respective training dataset is then generated. That is, the quantitative feature values are associated with the respective training images and/or the training image data in order to derive the correlation between the quantitative feature values and the respective image data.
  • the thus generated training dataset may then be used to train the classifying device.
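By way of a non-limiting sketch, the association of training images with their extracted quantitative feature values described above may look as follows; the image placeholders and the specific feature names are illustrative assumptions only, not prescribed by this disclosure:

```python
def build_training_dataset(training_images, feature_values):
    """Associate each training image with its extracted quantitative
    feature values to form a training dataset.

    In practice the feature values could be pixel masks, vessel
    diameters, visibility scores, etc.; strings and dicts are used
    here purely as placeholders.
    """
    if len(training_images) != len(feature_values):
        raise ValueError("one feature record is required per training image")
    # Each dataset entry correlates one image with its feature values.
    return list(zip(training_images, feature_values))

# Hypothetical example with two training images and two feature records:
dataset = build_training_dataset(
    ["img_0", "img_1"],  # placeholder image data
    [{"vessel_label": "LAD", "diameter_mm": 2.8},
     {"vessel_label": "LCX", "diameter_mm": 3.1}],
)
```

The resulting list of (image, features) pairs is what would then be provided to the classifying device for training.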
  • the training image data comprises simulated training image data generated by simulating an image acquisition according to the pre-defined acquisition method, wherein the simulating comprises the steps of obtaining at least one three-dimensional geometric model of the vessel of interest, obtaining at least one two-dimensional background image for the vessel of interest, and simulating a contrast agent fluid dynamic through the patient's vasculature based on at least one contrast agent fluid parameter.
  • the simulating further comprises obtaining deformation, translation and rotation data, and augmenting the simulated training image data based on said deformation, translation and rotation data.
  • the generating the at least one training dataset further comprises the steps of receiving additional patient data, and adjusting the at least one training dataset in accordance with the additional patient data.
  • the training image data is generated by means of a simulation.
  • at least one three-dimensional geometric model of a patient's vasculature including the vessel of interest is obtained.
  • the geometric model may be obtained from a medical image which may have been acquired by any medical imaging modality capable of acquiring three-dimensional medical images.
  • the medical imaging modality may correspond to the medical imaging modality for which the live adaptation shall be performed.
  • the medical imaging modality may be a different imaging modality.
  • the geometrical model may also be purely virtual, and defined by common anatomical knowledge. Further, at least one two-dimensional background image of the vasculature of the patient including the vessel of interest is acquired. By means of the background image, it is possible to distinguish the background and the vasculature in the medical images in order to properly perform a vessel identification of the vessels in the vasculature.
  • the background image may also provide a realistic appearance to the simulated data.
  • the two-dimensional background image may hereby have been obtained from an actual clinical acquisition and/or it may have been constructed from a forward projection of a three-dimensional medical image and/or it may be a virtual image designed to mimic typical background seen in the diagnostic data that is to be simulated.
  • the three-dimensional medical image and/or the two-dimensional background image may be used to generate a fluid dynamics model representative of the fluid dynamics through the patient's vasculature.
  • the fluid dynamics model may particularly comprise a lumped parameter model.
  • the term lumped parameter model may particularly refer to a model in which the fluid dynamics of the vessels are approximated by a topology of discrete entities. A vasculature, such as a vessel tree, may hereby be represented by a topology of resistor elements, each having a particular resistance.
  • the outlet at a distal end of the vessel is also represented by a particular resistor element.
  • This resistor element is then connected to ground such as to represent the termination of the vessel.
  • respective resistor elements may be connected to the series of resistor elements representing the vessel of interest, such as to represent the outflow from the vessel of interest at certain bifurcations. These resistor elements may typically also be connected to ground.
  • lumped parameter models reduce the number of dimensions compared to other approaches such as Navier-Stokes or the like. Accordingly, using a lumped parameter model may allow for a simplified calculation of the fluid dynamics inside the vessels and may ultimately result in reduced processing time.
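As a non-limiting sketch of such a lumped parameter model, the hydraulic analogue of Ohm's law may be evaluated over a small resistor topology; all resistance and pressure values below are illustrative assumptions, not taken from this disclosure:

```python
def series(resistances):
    """Total resistance of vessel segments connected in series."""
    return sum(resistances)

def parallel(r1, r2):
    """Combined resistance of two outflow paths, e.g. at a bifurcation."""
    return (r1 * r2) / (r1 + r2)

# Hypothetical lumped-parameter model of a single vessel of interest:
# three segments in series, one side branch at the first bifurcation,
# and an outlet resistor representing the distal termination to ground.
segment_r = [2.0, 3.0, 1.5]   # segment resistances (arbitrary units)
outlet_r = 10.0               # distal outlet resistor ("termination")
branch_r = 8.0                # side-branch outflow resistor

# Downstream of the first segment, flow splits between the side branch
# and the remaining segments plus the outlet resistor.
downstream = parallel(branch_r, series(segment_r[1:]) + outlet_r)
total_r = segment_r[0] + downstream

aortic_pressure = 100.0            # driving pressure (arbitrary units)
flow = aortic_pressure / total_r   # hydraulic analogue of Ohm's law
```

Because the whole vasculature reduces to a handful of scalar resistances, such a model is far cheaper to evaluate than a full Navier-Stokes simulation.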
  • the employing of such a lumped parameter model is described for example in international application WO 2016/087396 .
  • the thus generated fluid dynamics model may then be employed to simulate a contrast agent fluid flow through the patient's vasculature and, in particular, through the vessel or vessels of interest. This makes it possible to generate training image data representative of the vasculature and the corresponding fluid dynamics through it. In some embodiments, in order to augment the training image data, deformation, translation and rotation data may be added to the simulation as additional information. The thus generated training image data may then be provided to the classifying device for training.
  • the at least one quantitative feature comprises one or more of: a vessel label of a vessel in the patient's vasculature and/or a vessel length of a vessel in the patient's vasculature and/or a severity of a lesion in a vessel in the patient's vasculature and/or a vessel diameter of a vessel in the patient's vasculature and/or a visibility score for a lesion and/or a vessel in the patient's vasculature and/or a completeness score for the at least one of the plurality of acquisition images and/or a myocardial blush value.
  • additional patient information such as ECG data, aortic pressure value or historical data for a particular patient may also be added to the training datasets and/or the classification. This may have the further benefit that additional patient anomalies such as strongly elevated aortic pressure may be detected as, in those cases, the injection of the contrast agent may have to be adjusted as well.
  • the outputting the at least one quantitative feature for further evaluation comprises the steps of displaying the at least one quantitative feature to a user and/or outputting the at least one quantitative feature in a pre-defined format for automatic reporting to a reporting entity.
  • the user may input additional data in response to the outputting, whereby the additional data may further be used to train the classifying device and/or to evaluate the diagnostic image data.
  • an apparatus for analyzing diagnostic image data comprising a trained classifying device configured to receive diagnostic image data comprising a plurality of acquisition images of a vessel of interest, the diagnostic image data having been acquired using a pre-defined acquisition method, classify the diagnostic image data to extract at least one quantitative feature of the vessel of interest from at least one acquisition image of the plurality of acquisition images, and output the at least one quantitative feature of the vessel of interest associated with the at least one acquisition image while the acquisition of the diagnostic image data is still in progress, and a computation unit configured to adjust one or more adjustable image acquisition settings based on the at least one quantitative feature to optimize the acquisition of the diagnostic image data.
  • the apparatus further comprises an input unit configured to obtain training image data of the vessel of interest according to the pre-defined acquisition method, a training dataset generation unit configured to extract the at least one quantitative feature of the vessel of interest from the training image data and to generate at least one training dataset for the classifying device, the training dataset comprising the training image data associated with the at least one quantitative feature, and to provide the at least one training dataset to the classifying device for training.
  • the apparatus may also comprise a display unit configured to generate a graphical representation of at least one acquisition image of the plurality of acquisition images and/or the at least one quantitative feature, and a user interface configured to receive user inputs in response to the graphical representation.
  • a computer program for controlling an apparatus according to the invention is provided, which, when executed by a processing unit, is adapted to perform the method steps according to the invention.
  • a computer-readable medium is provided having stored thereon the above-cited computer program.
  • Fig. 1 represents schematically an exemplary embodiment of an apparatus 1 for analyzing diagnostic image data.
  • the apparatus 1 comprises an input unit 100, a training dataset generation unit 200, a classifying unit 300, a computation unit 400 and a display unit 500. Further, the classifying unit 300 and the computation unit 400 are communicatively coupled to a medical imaging modality 2 in a feedback loop 600.
  • Input unit 100 is configured to receive training image data 10 of a patient's vasculature.
  • the training image data 10 may particularly correspond to or comprise image data that has been previously acquired using a pre-defined acquisition method, i.e. an acquisition method performed with one or more pre-defined (known) acquisition settings, such as known contrast agent dose and acquisition trajectory.
  • the training image data 10 may particularly correspond to clinical data that has been acquired by means of X-ray angiography using a C-arm. That is, in the specific embodiment of Fig. 1 , the training image data 10 has been derived from actual measurement data.
  • the training image data 10 may also have been generated using a simulation or the like.
  • for simulated training image data 10, three-dimensional medical images, usually acquired using the CT and/or MR imaging modality, may be acquired, used to generate a three-dimensional model of the vessel of interest and combined with two-dimensional background data showing cardiac images without any contrast agent filling of the arteries.
  • based thereon, training image data and a corresponding training dataset are generated.
  • deformations, translations and rotations may be added to the three-dimensional representation of the vessel of interest and the two-dimensional background projection to achieve data augmentation.
  • the full range of the acquisition trajectory is then typically covered by a cardiac motion model.
  • Training dataset generation unit 200 is configured to extract one or more quantitative features of the patient's vasculature, and, in particular, the vessel of interest, from the training image data 10.
  • these quantitative features may particularly relate to vessel labels, vessel numbers, vessel location and/or the vessel length of the vessels in the vasculature, the severity of a lesion or multiple lesions in one or more vessels of interest, a myocardial blush value, a vessel diameter of the vessel of interest, a visibility score for a lesion in the vessel of interest for each individual training image of the training image data, a completeness score indicating whether sufficient angular information for a given vessel is available to allow for a reliable analysis, a reference deviation index indicating whether the visible vasculature is similar to a reference, or the like.
  • the training dataset generation unit 200 is configured to generate at least one training dataset comprising the training image data 10 and the respective pre-defined features that are associated with one or more of the training images in the training image data 10.
  • the training dataset generation unit 200 thus obtains a correlation between the training image data 10 and the extracted pre-defined features and generates a corresponding dataset comprising the correlated information.
  • This corresponding dataset is then provided, as a training dataset 20, to the classifying unit - or classifying device - 300.
  • Classifying unit 300 comprises an input port 301 configured to receive the training dataset from training dataset generation unit 200. Classifying device 300 uses the training dataset 20 - or, optionally, multiple training datasets 20 - to train the relation between the quantitative features and the training images in the training image data 10.
  • the classifying unit comprises or corresponds to a convolutional neural network, in some embodiments a deep convolutional neural network. That is, classifying unit 300 implements a plurality of convolutional layers in combination with a pooling layer.
  • the training dataset 20 input into classifying unit 300 corresponds to a plurality of training images having been acquired using X-ray angiography.
  • individual angiography images are used as respective training images.
  • These training images are provided with respective feature data in terms of a pixel mask that is provided for each individual angiography image, whereby each pixel is either classified as belonging to the left anterior descending artery (LAD), the left circumflex artery (LCX), the obtuse marginal branches (OM), the right coronary artery (RCA) or similar arteries, or as belonging to the background.
  • the training images may be provided with feature data comprising, for each angiography image, a single value indicating the minimal diameter of the arteries, and/or indicating that (parts of) the arteries are not visible.
  • the weights and parameters of the classifying device 300 are optimized so that for the input training dataset 20, the resulting neural network output is numerically close to the corresponding annotated feature values. That is, the optimization of the neural network minimizes on all training datasets the difference between the neural network output and the annotated feature values.
  • the comparison of the neural network output and the annotated feature values can be realized by various types of suitable metrics, such as for example L2 norm or generalized dice loss.
  • the optimization may particularly use an Adam optimizer.
  • An exemplary network structure for such a task may be an encoder-decoder neural network architecture.
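As a non-limiting sketch of one such metric, a soft (generalized) Dice loss over per-pixel class masks may be computed as follows; the array shapes and the two-class toy mask are illustrative assumptions only:

```python
import numpy as np

def soft_dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss between predicted class probabilities and a
    one-hot annotated pixel mask.

    pred, target: arrays of shape (num_classes, H, W); pred holds
    per-pixel class probabilities, target the annotated mask.
    Returns a value in [0, 1]; 0 means perfect overlap.
    """
    intersection = np.sum(pred * target, axis=(1, 2))
    cardinality = np.sum(pred + target, axis=(1, 2))
    dice_per_class = (2.0 * intersection + eps) / (cardinality + eps)
    return 1.0 - dice_per_class.mean()

# Toy 2x2 image with two classes (e.g. "LAD" vs "background"):
target = np.array([[[1, 0], [0, 0]],    # class 0 mask
                   [[0, 1], [1, 1]]])   # class 1 mask
perfect = soft_dice_loss(target.astype(float), target.astype(float))
```

During training, minimizing such a loss over all training datasets drives the network output numerically close to the annotated feature values, as described above.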
  • upon finalizing training using the training dataset 20, the classifying unit 300 is configured to receive, via input port 302, from the medical imaging modality 2, a first subset of diagnostic image data 30 obtained for a particular patient.
  • the first subset of diagnostic image data 30 may particularly comprise a plurality of acquisition images 31 that have been acquired using a pre-defined acquisition method, whereby the pre-defined acquisition method corresponds to the pre-defined acquisition method for the training image data in order to ensure that the classifying unit 300 has been trained with the proper training datasets to accurately classify the diagnostic image data 30.
  • the input to the classifying device corresponds to the plurality of acquisition images 31 in the diagnostic image data 30, each acquisition image 31 corresponding to a single two-dimensional X-ray angiography image.
  • the plurality of acquisition images 31 may also correspond to a chronological stack of multiple two-dimensional angiography images, such as respective C-arm angulations. That is, the input to the classifying device corresponds to the same diagnostic image data 30 that is presented to the user, such as the physician, for visual review.
  • At least one quantitative feature that is suitable for analyzing the diagnostic image data 30 is extracted from the diagnostic image data 30.
  • the extracted quantitative feature values and the first subset of diagnostic image data 30 comprising the one or more acquisition images 31 are then provided to computation unit 400 for further processing. It shall be understood that the first subset of diagnostic image data 30 is provided to the computation unit 400 for further processing while the acquisition of the second subset of diagnostic image data 30 is still in progress. This makes it possible to use the evaluation by computation unit 400 to adjust the image acquisition where possible and/or necessary.
  • the computation unit 400 determines, based on the first subset of diagnostic image data and the extracted quantitative features, whether an adjustment of the acquisition parameters for the image acquisition may be beneficial. In the specific example of Fig. 1 , computation unit 400 derives, for that purpose, a reliability criterion for the diagnostic information to be derived from the diagnostic image data 30 and the quantitative features.
  • the computation unit 400 processes the first subset of diagnostic image data 30 and the quantitative features derived therefrom and determines whether the reliability criterion is met or not. In the specific embodiment, this is achieved by comparing the diagnostic information that may be derived from the first subset of diagnostic image data 30 and the quantitative features to a threshold value which indicates sufficiency of the diagnostic information.
  • the computation unit 400 is then configured to adjust the adjustable image acquisition settings by outputting a corresponding termination signal to medical imaging modality 2, i.e. by adjusting the acquisition settings such that the acquisition is prematurely terminated, i.e. finished prior to its originally set termination point. That is, in response to said termination signal, medical imaging modality 2 terminates the further image acquisition, thereby avoiding unnecessary radiation and contrast agent dose to be delivered to the patient.
  • if the computation unit 400 determines that the reliability criterion is not met, i.e. that sufficient information is not yet available, the computation unit 400 will not output any termination signal to the medical imaging modality 2 and the medical imaging modality 2 will continue acquisition of a second subset of diagnostic image data.
  • the computation unit 400 may hereby terminate the acquisition procedure as soon as it is determined that sufficient diagnostic information is available.
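The termination feedback loop described above may be sketched as follows, under the assumption of a hypothetical scalar sufficiency score standing in for the classifier and computation unit:

```python
def acquisition_loop(subsets, threshold, score_fn):
    """Sketch of the termination feedback loop: evaluate each newly
    acquired subset and stop as soon as the accumulated diagnostic
    information meets the reliability criterion.

    subsets:   iterable yielding subsets of diagnostic image data
    threshold: value indicating sufficiency of diagnostic information
    score_fn:  maps the accumulated data to a diagnostic-information
               score (a stand-in for classification + evaluation)
    Returns (accumulated_data, terminated_early).
    """
    accumulated = []
    for subset in subsets:
        accumulated.extend(subset)
        if score_fn(accumulated) >= threshold:
            # Criterion met: emit termination signal to the modality.
            return accumulated, True
    # Acquisition ran to its originally set termination point.
    return accumulated, False

# Toy example: each "image" contributes a fixed information score.
data, stopped = acquisition_loop(
    subsets=[[0.2, 0.3], [0.4], [0.5]],
    threshold=0.8,
    score_fn=sum,   # hypothetical sufficiency measure
)
```

In this toy run the third subset is never acquired, mirroring how premature termination spares the patient unnecessary radiation and contrast agent dose.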
  • the adjustment of the adjustable image acquisition settings may, additionally or alternatively, comprise an adjusting of the imaging trajectory used for image acquisition.
  • evaluating the first subset of diagnostic image data 30 may comprise determining a visibility score for a vessel of interest in the individual acquisition images. If the computation unit 400 registers poor visibility, the computation unit 400 may be configured to automatically adjust the imaging trajectory to improve visibility of the vessel of interest. By adjusting the imaging trajectory in order to improve visibility, fewer acquisition images 31 are required to obtain sufficient diagnostic information, thereby optimizing the radiation dose delivered to the patient.
  • the computation unit 400 may also evaluate the first subset of diagnostic image data 30 along with the extracted quantitative features in order to determine a contrast of the vessel of interest. This makes it possible to determine whether sufficient contrast agent has been injected into the vessel of interest.
  • the amount of contrast agent necessary to provide sufficient visibility of the vessel(s) of interest may vary from patient to patient.
  • a patient having narrower vessels may need less contrast agent, whereas a patient with wide vessels may need more contrast agent in order to achieve similar visibility.
  • the computation unit 400 may be configured to adjust, as a further adjustable acquisition setting, the contrast agent injection rate based on the properties of the vessel of interest, whereby a lower rate is used for patients having narrow vessels (i.e. requiring less contrast agent) and a higher rate is used for patients having wider vessels (i.e. requiring more contrast agent). By means of this adjustment, the contrast agent dose delivered to each patient may be optimized.
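A non-limiting sketch of such a diameter-dependent injection rate adjustment is given below; the linear scaling rule and the reference diameter are illustrative assumptions, as no specific formula is prescribed above:

```python
def adjust_injection_rate(current_rate, vessel_diameter_mm,
                          reference_diameter_mm=3.0):
    """Hypothetical rule: scale the contrast agent injection rate with
    the measured vessel diameter, so that narrower vessels receive a
    lower rate and wider vessels a higher rate.

    The linear scaling and the 3.0 mm reference diameter are
    illustrative placeholders only.
    """
    scale = vessel_diameter_mm / reference_diameter_mm
    return current_rate * scale

# A narrow vessel yields a lower rate, a wide vessel a higher one:
narrow = adjust_injection_rate(4.0, vessel_diameter_mm=2.0)
wide = adjust_injection_rate(4.0, vessel_diameter_mm=4.5)
```

Any monotone mapping from vessel properties to injection rate would serve the same purpose of optimizing the contrast agent dose per patient.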
  • the above-described evaluation process may be repeated for a second subset of diagnostic image data 30 (and any subsequent subset) until the reliability criterion is met, i.e. until sufficient diagnostic information is available.
  • the computation unit 400 may be configured to adjust the contrast agent injection rate in accordance with the respective vessel properties for the patient and to further terminate the acquisition procedure as soon as it is determined that sufficient diagnostic information is available.
  • a feedback loop is implemented which allows a live adaptation of the acquisition parameters in order to optimize diagnostic image data acquisition.
  • the diagnostic image data 30, along with the extracted features, is further provided to a display unit 500.
  • the display unit 500 may particularly comprise a screen 501 for displaying information graphically and a user interface 502, such as a keyboard, a touchpad, a mouse, a touchscreen or the like configured to allow the user to provide inputs and generally operate the device.
  • the display unit 500 is configured to generate a graphical representation of the image data 30 and the extracted quantitative features and to present this information to a user on screen 501.
  • the user may then review the presented information and provide respective input thereon via the user interface 502.
  • the user input may then be used for further evaluation of the data.
  • the user input may also be returned to the trained classifying unit 300 and used, by the classifying unit, for further training.
  • Fig. 2 shows a flow chart of a method 1000 for analyzing diagnostic image data using an apparatus 1 in accordance with Fig. 1 .
  • the input unit 100 receives training image data 10 that may have been generated as explained in relation to Fig. 3.
  • the training image data may also have been generated by different means.
  • the input unit 100 provides the training image data 10 to training dataset generation unit 200.
  • the training dataset generation unit 200 receives the training image data 10 and, in step S202, extracts one or more quantitative features of the patient's vasculature, in particular of one or more vessels of interest in the patient's vasculature, from the training image data 10.
  • the training dataset generation unit 200 correlates the one or more quantitative features to the training image data 10 in step S203.
  • in step S204, the training dataset generation unit 200 generates at least one training dataset. This at least one training dataset 20 is provided to the classifying unit 300 in step S205.
  • in step S301, classifying unit 300 receives the training dataset 20 from training dataset generation unit 200.
  • in step S302, classifying unit 300 then uses the training dataset 20 for training as described in relation to Fig. 1 .
  • in step S303, the classifying unit 300 receives a first subset of diagnostic image data 30 acquired by medical imaging modality 2.
  • in step S304, the classifying unit 300 classifies the plurality of acquisition images in the first subset of diagnostic image data 30 to extract at least one quantitative feature, in particular at least one value for the at least one quantitative feature, from at least one acquisition image 31 of the diagnostic image data 30.
  • in step S305, classifying unit 300 provides the diagnostic image data 30 along with the extracted features to computation unit 400 for further processing. It shall hereby again be understood that the first subset of diagnostic image data 30 is provided to the computation unit 400 for further processing while the acquisition of a second subset of diagnostic image data 30 is still ongoing, thereby making it possible to use the evaluation by computation unit 400 to adjust the image acquisition.
  • computation unit 400 receives, in step S401, the first subset of diagnostic image data 30 along with the extracted quantitative feature, and evaluates, in step S402, the received data in order to determine whether adjustment of one or more adjustable acquisition settings may be necessary.
  • the computation unit 400 compares the visibility score to a reference value in order to determine whether the visibility of the vessel of interest is sufficient or whether it needs to be improved.
  • in step S404, the computation unit 400 determines an optimized imaging trajectory in order to improve visibility and, in step S407, generates a corresponding adjustment signal and provides said adjustment signal to medical imaging modality 2 to automatically adjust the imaging trajectory.
  • the medical imaging modality adjusts the imaging trajectory used to acquire the second subset of diagnostic image data 30. The method is then repeated in a loop starting from step S303 with the second subset of diagnostic image data 30.
  • in step S404, the computation unit 400 evaluates the first subset of diagnostic image data 30 in order to determine whether sufficient diagnostic information may be derived from the accumulation of acquisition images 31 in the first subset of diagnostic image data 30. If that is the case ("Y"), computation unit 400 generates a termination signal and provides said termination signal to the medical imaging modality 2 in step S405. This results in the termination of the image acquisition in step S406.
  • alternatively, it may be determined in step S404 that the diagnostic information that may be derived from the accumulation of acquisition images 31 in the first subset of diagnostic image data 30 is not sufficient.
  • in that case, in step S405', no termination signal is generated and the method continues with the second subset (or a subsequent subset) of diagnostic image data 30 being received and processed in steps S303 to S404.
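The branching decision logic of the evaluation steps above may be sketched as follows; the scalar scores, thresholds and returned signal names are illustrative assumptions only:

```python
def evaluate_subset(visibility_score, visibility_reference,
                    info_score, info_threshold):
    """Sketch of the per-subset decision logic: compare the visibility
    score against a reference value and the accumulated diagnostic
    information against a sufficiency threshold, and return which
    signal (if any) should be sent to the imaging modality.
    """
    if visibility_score < visibility_reference:
        # Poor visibility: send an adjustment signal so the modality
        # switches to an optimized imaging trajectory.
        return "adjust_trajectory"
    if info_score >= info_threshold:
        # Sufficient diagnostic information: send a termination signal.
        return "terminate"
    # Otherwise keep acquiring the next subset of diagnostic image data.
    return "continue"
```

For example, `evaluate_subset(0.4, 0.6, 0.0, 1.0)` would request a trajectory adjustment, while a sufficiently informative acquisition would be terminated.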
  • Fig. 3 shows a flow chart of a method S2000 for generating training image data according to an embodiment.
  • in step S2001, a simulation unit obtains at least one medical image of the patient and generates a three-dimensional geometric model of said patient's vasculature therefrom.
  • the medical image may particularly have been obtained by a medical imaging modality.
  • the medical imaging modality may correspond to medical imaging modality 2 or may be a different imaging modality.
  • in step S2002, the simulation unit further obtains at least one two-dimensional background image of the vasculature of the patient in order to accurately distinguish background from vasculature.
  • in step S2003, the simulation unit then performs a vessel identification and identifies the vessels in the vasculature. Further, in step S2004, the simulation unit uses the three-dimensional medical image and/or the two-dimensional background image to generate a fluid dynamics model of the blood flow through the patient's vasculature.
  • the fluid dynamics model may comprise or correspond to a lumped parameter model, i.e. a model in which the fluid dynamics of the vessels are approximated by a topology of discrete entities.
  • in step S2005, this model is used to simulate a contrast agent fluid flow through the patient's vasculature.
  • in step S2006, the simulation unit may optionally further receive deformation, translation and rotation data as additional information.
  • in step S2007, the simulation unit may then use the additional information in order to augment the training image data.
  • in step S2008, the training image data is output to be provided to the classifying unit 300.
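The optional augmentation by translations and rotations may be sketched as follows; integer pixel shifts and 90-degree rotations are simplifying assumptions, as actual augmentation may use deformation fields and arbitrary angles:

```python
import numpy as np

def augment(image, shift=(0, 0), quarter_turns=0):
    """Minimal augmentation sketch for simulated training images:
    apply an integer pixel translation followed by a rotation in
    90-degree steps. Illustrative only; real augmentation would use
    sub-pixel shifts, arbitrary angles and deformations.
    """
    out = np.roll(image, shift=shift, axis=(0, 1))  # translation
    return np.rot90(out, k=quarter_turns)           # rotation

# One simulated image yields many augmented training variants:
base = np.arange(9).reshape(3, 3)
variants = [augment(base, shift=(dy, dx), quarter_turns=k)
            for dy in (0, 1) for dx in (0, 1) for k in range(4)]
```

Each simulated projection thus contributes multiple entries to the training image data, enlarging the training dataset without additional acquisitions.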
  • while in the above the training data has been generated based on a simulation using a fluid dynamics model, it shall be understood that the training data may also be derived from historical clinical data of one or more patients.
  • while in the above embodiments the adjustment of the adjustable acquisition parameter concerned a change in imaging trajectory and a termination of the acquisition process, other kinds of adjustments may be made automatically on the basis of the classification of the already received diagnostic image data, such as an adjustment of the radiation dose to be delivered to a target region and/or an adjustment of the contrast agent injection rate into the vessel of interest and so on.
  • a single unit or device may fulfill the functions of several items recited in the claims.
  • the mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • Procedures like the generating of the training dataset, the training of the classifying device, the classifying of the image data, the simulation of the training image data to generate the training image data or the like that may have been explained to be performed by a single unit may also be performed by multiple units. Also, certain procedures may be performed by the same unit, rather than separate units.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium, supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
  • the invention relates to a method for analyzing diagnostic image data, comprising the steps of receiving diagnostic image data comprising a plurality of acquisition images of a vessel of interest at a trained classifying device, the diagnostic image data having been acquired using a pre-defined acquisition method, classifying the diagnostic image data to extract at least one quantitative feature of the vessel of interest from at least one acquisition image of the plurality of acquisition images, outputting the at least one quantitative feature of the vessel of interest associated with the at least one acquisition image while the acquisition of the diagnostic image data is still in progress, and adjusting one or more adjustable image acquisition settings based on the at least one quantitative feature to optimize the acquisition of the diagnostic image data.


Claims (15)

  1. A method for analyzing diagnostic image data, comprising:
    receiving diagnostic image data comprising a plurality of acquisition images of a vessel of interest at a trained classifying device, the diagnostic image data having been acquired using a pre-defined acquisition method,
    classifying the diagnostic image data to extract at least one quantitative feature of the vessel of interest from at least one acquisition image of the plurality of acquisition images, characterized by providing the at least one quantitative feature of the vessel of interest associated with the at least one acquisition image while the acquisition of the diagnostic image data is still in progress, and
    adjusting one or more adjustable image acquisition settings based on the at least one quantitative feature to optimize the acquisition of the diagnostic image data.
  2. The method according to claim 1, wherein adjusting one or more adjustable image acquisition settings comprises prematurely terminating the acquisition of the diagnostic image data if it is determined that an already acquired part of the diagnostic image data fulfils at least one pre-defined reliability criterion.
  3. The method according to claim 1, wherein adjusting one or more adjustable acquisition settings comprises:
    adjusting an image acquisition trajectory to improve the visibility of the vessel of interest in the diagnostic image data.
  4. The method according to claim 1, wherein adjusting one or more adjustable acquisition settings comprises:
    adjusting an injection rate of contrast agent into the vessel of interest during image acquisition.
  5. The method according to claim 1, further comprising:
    obtaining training image data of the vessel of interest according to the pre-defined acquisition method and extracting the at least one quantitative feature from the training image data,
    generating at least one training data set for the classifying device, the training data set comprising the training image data associated with the at least one quantitative feature, and
    training the classifying device using the at least one training data set.
  6. The method according to claim 5, wherein the training image data comprise
    simulated training image data generated by simulating an image acquisition according to the pre-defined acquisition method, wherein the simulation comprises:
    obtaining at least one three-dimensional geometric model of the vessel of interest;
    obtaining at least one two-dimensional background image for the vessel of interest; and
    simulating a contrast agent fluid dynamic through the patient's vasculature based on at least one contrast agent fluid parameter.
  7. The method according to claim 6, wherein the simulation further comprises the steps of obtaining deformation translation and rotation data, and
    augmenting the simulated training image data based on the translation and rotation data.
  8. The method according to claim 5, wherein generating the at least one training data set further comprises:
    receiving additional patient data, and
    adjusting the at least one training data set based on the additional patient data.
  9. The method according to claim 1, wherein the at least one quantitative feature comprises one or more of the following: a vessel label of a vessel in the patient's vasculature and/or a vessel length of a vessel in the patient's vasculature and/or a severity of a lesion in a vessel in the patient's vasculature and/or a vessel diameter of a vessel in the patient's vasculature and/or a visibility score for a lesion and/or a vessel in the patient's vasculature and/or a completeness score for the at least one of the plurality of acquisition images and/or a myocardial blush value.
  10. The method according to claim 1, wherein providing the at least one quantitative feature for further evaluation comprises:
    displaying the at least one quantitative feature to a user, and/or
    providing the at least one quantitative feature in a pre-defined format for automatic reporting to a reporting entity.
  11. An apparatus for analyzing diagnostic image data, comprising:
    a trained classifying device configured to receive diagnostic image data comprising a plurality of acquisition images of a vessel of interest, the diagnostic image data having been acquired using a pre-defined acquisition method,
    to classify the diagnostic image data to extract at least one quantitative feature of the vessel of interest from at least one acquisition image of the plurality of acquisition images, and
    to provide the at least one quantitative feature of the vessel of interest associated with the at least one acquisition image while the acquisition of the diagnostic image data is still in progress, and
    a calculation unit configured to adjust one or more adjustable image acquisition settings based on the at least one quantitative feature to optimize the acquisition of the diagnostic image data.
  12. The apparatus according to claim 11, further comprising:
    an input unit configured to obtain training image data of the vessel of interest according to the pre-defined acquisition method;
    a training data set generation unit configured to extract the at least one quantitative feature of the vessel of interest from the training image data, to generate at least one training data set for the classifying device, the training data set comprising the training image data associated with the at least one quantitative feature, and to provide the at least one training data set to the classifying device for training.
  13. The apparatus according to claim 11, further comprising:
    a display unit configured to generate a graphical representation of at least one acquisition image of the plurality of acquisition images and/or of the at least one quantitative feature, and
    a user interface configured to receive user inputs in response to the graphical representation.
  14. A computer program for controlling an apparatus according to any one of claims 11 to 13 which, when executed by a processing unit, is adapted to carry out the method according to any one of claims 1 to 10.
  15. A computer-readable medium having stored thereon the computer program according to claim 14.
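The training-data generation of claims 5 to 7 pairs simulated acquisition images with their ground-truth quantitative features and enlarges the set with translated and rotated variants. The sketch below is illustrative only: `simulate_acquisition` and the shift/angle values are hypothetical stand-ins for the forward simulation from a 3D vessel model, a 2D background image and a contrast-agent fluid parameter described in claim 6.

```python
import math

def simulate_acquisition(vessel_model, background, contrast_rate):
    # Stand-in for simulating an acquisition image from a 3D vessel model,
    # a 2D background image and a contrast-agent fluid parameter.
    return {"model": vessel_model, "background": background,
            "contrast_rate": contrast_rate}

def augment(image, shifts, angles):
    # Augmentation step: one variant per (translation, rotation) pair.
    return [dict(image, shift=s, angle=a) for s in shifts for a in angles]

def build_training_set(cases, contrast_rate=2.0):
    """Each case pairs a vessel model and background with its ground-truth
    quantitative feature; the output is (image variant, feature) pairs."""
    training_set = []
    for model, background, feature in cases:
        image = simulate_acquisition(model, background, contrast_rate)
        for variant in augment(image,
                               shifts=[(0, 0), (2, -1)],
                               angles=[0.0, math.pi / 18]):
            training_set.append((variant, feature))
    return training_set
```

Every augmented variant inherits the label of its source image, so two cases with two shifts and two angles yield eight labeled training pairs.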
EP20734240.3A 2019-06-28 2020-06-29 Automated coronary angiography analysis Active EP3989828B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP19183278.1A 2019-06-28 Automated coronary angiography analysis
PCT/EP2020/068273 2020-06-29 Automated coronary angiography analysis

Publications (2)

Publication Number Publication Date
EP3989828A1 EP3989828A1 (fr) 2022-05-04
EP3989828B1 true EP3989828B1 (fr) 2022-11-30

Family

ID=67184780

Family Applications (2)

Application Number Title Priority Date Filing Date
EP19183278.1A Withdrawn EP3756547A1 (fr) Automated coronary angiography analysis
EP20734240.3A Active EP3989828B1 (fr) Automated coronary angiography analysis

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP19183278.1A Withdrawn EP3756547A1 (fr) Automated coronary angiography analysis

Country Status (5)

Country Link
US (1) US20220351369A1 (fr)
EP (2) EP3756547A1 (fr)
JP (2) JP7200406B2 (fr)
CN (1) CN114041167A (fr)
WO (1) WO2020260701A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12039685B2 (en) 2020-09-23 2024-07-16 Cathworks Ltd. Methods, apparatus, and system for synchronization between a three-dimensional vascular model and an imaging device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3939003B1 (fr) 2019-03-12 2024-04-03 Systems and methods for assessing a likelihood of CTEPH and identifying characteristics indicative thereof
JP2022549604A (ja) 2019-09-18 2022-11-28 Prediction of MRI images by means of a prediction model trained by supervised learning
EP4031895A1 (fr) * 2019-09-18 2022-07-27 Bayer Aktiengesellschaft System, method and computer program product for predicting, anticipating and/or assessing tissue characteristics

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7627386B2 (en) * 2004-10-07 2009-12-01 Zonaire Medical Systems, Inc. Ultrasound imaging system parameter optimization via fuzzy logic
US8081811B2 (en) * 2007-04-12 2011-12-20 Fujifilm Corporation Method, apparatus, and program for judging image recognition results, and computer readable medium having the program stored therein
RU2556968C2 (ru) * 2009-10-06 2015-07-20 Koninklijke Philips Electronics N.V. Automatic C-arm viewing angles for treatment of structural heart disease
JP5695140B2 (ja) * 2013-07-29 2015-04-01 Toshiba Corporation Medical image diagnostic apparatus
KR102233319B1 (ko) * 2014-01-20 2021-03-29 Samsung Electronics Co., Ltd. Method of tracking a region of interest, radiation imaging apparatus, method of controlling the radiation imaging apparatus, and radiation imaging method
JP2015217170A (ja) * 2014-05-19 2015-12-07 Toshiba Corporation X-ray diagnostic apparatus
US11141123B2 (en) 2014-12-02 2021-10-12 Koninklijke Philips N.V. Fractional flow reserve determination
DE102016207367A1 (de) * 2016-04-29 2017-11-02 Siemens Healthcare Gmbh Determining scan parameters of a CT image acquisition with the aid of an external image acquisition
US10667776B2 (en) * 2016-08-11 2020-06-02 Siemens Healthcare Gmbh Classifying views of an angiographic medical imaging system
JP6903495B2 (ja) * 2017-06-12 2021-07-14 Hitachi, Ltd. X-ray CT apparatus, processing method, and program
EP3488774A1 (fr) 2017-11-23 2019-05-29 Measurement guidance for coronary flow estimation based on the Bernoulli principle
EP3624132A1 (fr) 2018-09-13 2020-03-18 Calculation of boundary conditions for virtual iFR and FFR computation based on myocardial opacification characteristics

Also Published As

Publication number Publication date
US20220351369A1 (en) 2022-11-03
JP2023033308A (ja) 2023-03-10
JP2022531989A (ja) 2022-07-12
EP3989828A1 (fr) 2022-05-04
WO2020260701A1 (fr) 2020-12-30
CN114041167A (zh) 2022-02-11
JP7200406B2 (ja) 2023-01-06
EP3756547A1 (fr) 2020-12-30

Similar Documents

Publication Publication Date Title
EP3989828B1 (fr) Automated coronary angiography analysis
US11195278B2 (en) Fractional flow reserve simulation parameter customization, calibration and/or training
KR20150122183A (ko) Method and system for determining treatments by altering patient-specific geometrical models
US10223795B2 (en) Device, system and method for segmenting an image of a subject
US20240078676A1 (en) Interactive coronary labeling using interventional x-ray images and deep learning
EP3606433B1 (fr) Mesure métrique standardisée de maladie artérielle coronarienne
US20190076105A1 (en) Hemodynamic parameters for co-registration
EP3602485B1 (fr) Surveillance d'interaction de ffr basé sur une imagerie non invasive
US11657519B2 (en) Method for deformation correction
US20210383539A1 (en) Orientation detection for 2d vessel segmentation for angio-ffr
US11918291B2 (en) Simulation of transcatheter aortic valve implantation (TAVI) induced effects on coronary flow and pressure
EP3989832B1 (fr) Vessel registration using functional information
JP2023044939A (ja) Medical image processing apparatus, method, and program
Qian Intelligent Diagnostic Imaging and Analysis

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220128

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
INTG Intention to grant announced

Effective date: 20220801

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1534059

Country of ref document: AT

Kind code of ref document: T

Effective date: 20221215

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602020006661

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20221130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230331

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230228

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1534059

Country of ref document: AT

Kind code of ref document: T

Effective date: 20221130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230330

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230301

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230627

Year of fee payment: 4

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602020006661

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20230831

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20230630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230629

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230629

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20221130

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230630

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230630