EP4009227A1 - Local spectral-covariance computation and display - Google Patents

Local spectral-covariance computation and display (original title: Calcul et affichage de covariance-spectrale locale)

Info

Publication number
EP4009227A1
EP4009227A1
Authority
EP
European Patent Office
Prior art keywords
image
local
mono
energetic
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP20210977.3A
Other languages
German (de)
English (en)
Inventor
Rafael Wiemker
Liran Goshen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to EP20210977.3A priority Critical patent/EP4009227A1/fr
Priority to EP21820564.9A priority patent/EP4256463A2/fr
Priority to PCT/EP2021/083262 priority patent/WO2022117468A2/fr
Priority to JP2023532493A priority patent/JP2023552333A/ja
Priority to US18/038,546 priority patent/US20240090849A1/en
Priority to CN202180080678.3A priority patent/CN116710970A/zh
Publication of EP4009227A1 publication Critical patent/EP4009227A1/fr
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/7425 Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/803 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2576/00 Medical imaging apparatus involving image processing or analysis

Definitions

  • the present invention generally relates to multispectral imaging, and in particular to an apparatus and a method for processing image data of an object of interest comprising a first mono-energetic image acquired at a first energy and a second mono-energetic image acquired at a second energy different from the first energy, as well as to a medical imaging system, a computer program element, and a computer readable medium.
  • multispectral imaging is becoming increasingly common in clinical practice due to the rapid rise in computer technology and an expanding literature exhibiting vast advantages over conventional single-energy imaging.
  • spectral CT, for example, generates a number of spectral channels from low to high energy levels, which are governed by the Compton and photoelectric effects to varying degrees.
  • the spectral channels can also be converted into various representations with pure Compton, pure photoelectric, and pure (pseudo/virtual) mono-energy images.
  • the rich multispectral information per voxel provided by spectral images may be difficult to condense into one view for efficient reading.
  • conventional images may often be preferred for viewing because of existing reading habits and accumulated expertise.
  • according to a first aspect of the present disclosure, there is provided an apparatus for processing image data of an object of interest comprising a first mono-energetic image acquired at a first energy and a second mono-energetic image acquired at a second energy different from the first energy.
  • the apparatus comprises an input module, a computation module, a rendering module, and an output module.
  • the input module is configured for receiving the image data of the object of interest.
  • the computation module is configured for determining local covariance matrices at a plurality of image positions in the first and second mono-energetic images. Each local covariance matrix is a matrix of local variances and local covariances between image intensities at one of the plurality of image positions in the first and second mono-energetic images.
  • the rendering module is configured for overlaying values of local variances and/or local covariances at the plurality of image positions with the first mono-energetic image and/or the second mono-energetic image.
  • the output module is configured for providing an overlaying result.
  • the image data may be acquired by a dual- or multi-energy CT scanner.
  • in dual-energy CT, an additional attenuation measurement is obtained with a second x-ray spectrum (i.e. a second "energy"), allowing the differentiation of multiple materials.
  • this allows quantification of the mass density of two or three materials in a mixture with known elemental composition.
  • the image data may be acquired by multi-parametric magnetic resonance imaging (MRI), which includes the following sequences: T1-weighted images, T2-weighted images, diffusion-weighted images (DWI), and dynamic contrast-enhanced imaging (DCEI).
  • Each local covariance matrix is a matrix of local variances and local covariances between image intensities at one of the plurality of image positions in the first and second mono-energetic images.
  • the local variances and covariances may be computed using an image patch of neighbouring pixels or voxels.
  • the "image patch" refers to a patch or group of pixels or voxels having a specific size, shape, and location within an image.
  • image patches can have a predetermined pixel width/height (e.g., 7 × 7, 9 × 9, 11 × 11, etc.) and a location for each image patch can be defined based on one or more centre pixels or voxels.
  • the local variances and covariances are positive definite numbers and reach their highest value at the location of a material transition.
  • the local maxima of the variances and/or covariances of the intensities of the multi-channel images may be used to locate material transitions.
  • the covariance matrix values characterize certain material transition types, e.g. transitions involving fat tissue, muscle tissue, air, contrast-agent tagged materials, etc.
  • Figs. 2A-2C demonstrate that the spectral local variance in the 'Photo'- and 'Scatter' images responds sensitively to colonic folds submerged in contrast-tagged stool residuals, whereas the spectral local covariance signifies in particular the tag-to-air material transitions.
  • the values of local variances and/or local covariances at the plurality of image positions may be overlaid with the first mono-energetic image and/or the second mono-energetic image to generate an overlaid presentation of the first mono-energetic image and/or the second mono-energetic image.
  • the overlaid presentation of the first mono-energetic image and/or the second mono-energetic image may be visualized on a display.
  • the local variances and/or covariances may be selected discretely for specific transition types only (e.g. transition between air and tissue), and the trace of the matrix may be used as the weight for display. Alternatively, the local variances and/or covariances may be weighted continuously for application-specific relevance.
  • values of local variances and/or local covariances may be overlaid with colour encoding over the standard slice images.
  • cues of the material transitions may be conveyed using gray-values, such as a gradual black-to-white colour scale, which ensures a familiar viewing experience as with conventional images, as well as possible deployment on widespread black-and-white monitors (see the gray-value overlay sketch after this list).
  • the absence/presence and weight of the overlay can be controlled or toggled interactively by an operator (referred to herein as "the user").
  • a further visualization option is three-dimensional (3D) rendering of the local variance magnitudes or regression values into a virtual display, which may be interactively rotatable.
  • relevant material transitions may be overlaid in colour or gray value, conveying additional spectral information.
  • the user may assess just one image type, i.e. one spectral band, rather than having to read two or even a whole series of different image representations.
  • Certain material transitions may not be discernible in the conventional image, but only in the spectral image channels. Further, since only transition lines (rather than entire areas) are marked, the standard image is not cluttered with area-wise occluding overlays, and standard reading expertise does not have to be changed.
  • the computation module is configured to compute each local covariance matrix by Gaussian convolution operations.
  • the local variances and covariances may be computed by Gaussian convolution operations, i.e. smoothing operations.
  • the Gaussian filtering may be computed for various kernel widths, corresponding to the scale space, and thus an entire vector of covariance matrices can be computed across kernel widths to obtain a covariance scale-space tensor (see the covariance-computation sketch after this list).
  • the computation module is further configured to determine a number of eigenvalues of at least one of the local covariance matrices for determining a quantity of materials between which a material transition takes place.
  • the computation module is further configured to determine a number of eigenvalues of at least one of the local covariance matrices and to classify the at least one of the local covariance matrices into a material transition between N materials based on the number of eigenvalues, where N is greater than or equal to 2 (see the eigenvalue-counting sketch after this list).
  • a single eigenvalue of a local covariance matrix may signify a two-material transition (e.g. between air and soft tissue).
  • two eigenvalues of a local covariance matrix may signify a three-material transition (e.g. among air, soft tissue, and contrast-agent tagged material).
  • the apparatus further comprises a classifier module configured for applying a pre-trained classifier to classify each of the local covariance matrices into a material transition type.
  • classifiers may include, but are not limited to, linear classifiers, maximum likelihood, random forest, support vector machine, or artificial neural network classifiers, and may be used to classify the covariance at any position into a material transition type (see the classifier-training sketch after this list).
  • the result of classification may be one or several type-specific scalar values output from each local covariance matrix or covariance scale space tensor.
  • the pre-trained classifier comprises at least one of: artificial neural networks, support-vector machines, maximum likelihood, or random forest.
  • the plurality of image positions comprise locations of a material transition.
  • Edge detection refers to a set of mathematical methods that aim to identify points in a digital image at which the image brightness changes sharply or, more formally, has discontinuities.
  • the points at which image brightness changes sharply are typically organized into a set of curved line segments termed edges.
  • Various techniques may be used for edge detection, such as Prewitt edge detection, Laplacian edge detection, Laplacian-of-Gaussian (LoG) edge detection, Canny edge detection, etc.
  • the local covariance matrices are computed only at locations of a material transition for identifying relevant multispectral material transitions. Accordingly, computational effort may be reduced (see the gradient-based masking sketch after this list).
  • the apparatus further comprises a display device configured for displaying the overlaying result.
  • the image data comprises at least one of image data acquired by spectral computed tomography (CT) or image data acquired by spectral magnetic resonance imaging (MRI).
  • a medical imaging system comprises a scanner and an apparatus according to the first aspect and any associated example.
  • the scanner is configured for scanning an object of interest to acquire image data of the object of interest.
  • the apparatus is configured for processing the image data of the object of interest.
  • according to a third aspect of the present disclosure, there is provided a computer-implemented method for processing image data of an object of interest comprising a first mono-energetic image acquired at a first energy and a second mono-energetic image acquired at a second energy different from the first energy.
  • the computer-implemented method comprises: a) receiving the image data of the object of interest; b) determining local covariance matrices at a plurality of image positions in the first and second mono-energetic images, each local covariance matrix being a matrix of local variances and local covariances between image intensities at one of the plurality of image positions; c) overlaying values of local variances and/or local covariances at the plurality of image positions with the first mono-energetic image and/or the second mono-energetic image; and d) providing an overlaying result.
  • step b) further comprises the step of computing each local covariance matrix by Gaussian convolution operations.
  • step b) further comprises determining a number of eigenvalues of at least one of the local covariance matrices for determining a quantity of materials between which a material transition takes place.
  • the method further comprises the step of applying a pre-trained classifier to classify each of the local covariance matrices into a material transition type.
  • a computer program element configured, during execution, to perform the method steps of the third aspect and any associated example.
  • a computer readable medium comprising the computer program element.
  • learning in the context of machine learning refers to the identification and training of suitable algorithms to accomplish tasks of interest.
  • learning includes, but is not restricted to, association learning, classification learning, clustering, and numeric prediction.
  • machine-learning refers to the field of the computer sciences that studies the design of computer programs able to induce patterns, regularities, or rules from past experiences to develop an appropriate response to future data, or describe the data in some meaningful way.
  • data-driven model in the context of machine learning refers to a suitable algorithm that is learnt on the basis of appropriate training data.
  • module may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logical circuit, and/or other suitable components that provide the described functionality.
  • Fig. 1 shows a flow chart of a computer-implemented method 200 according to some embodiments of the present disclosure.
  • the computer-implemented method 200 is proposed for processing image data of an object of interest comprising a first mono-energetic image acquired at a first energy and a second mono-energetic image acquired at a second energy different from the first energy.
  • the computer-implemented method 200 may be implemented as a device, module or related component in a set of logic instructions stored in a non-transitory machine- or computer-readable storage medium such as random access memory (RAM), read only memory (ROM), programmable ROM (PROM), firmware, flash memory, etc., in configurable logic such as, for example, programmable logic arrays (PLAs), field programmable gate arrays (FPGAs), complex programmable logic devices (CPLDs), in fixed-functionality hardware logic using circuit technology such as, for example, application specific integrated circuit (ASIC), complementary metal oxide semiconductor (CMOS) or transistor-transistor logic (TTL) technology, or any combination thereof.
  • computer program code to carry out operations shown in the method 200 may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA, SMALLTALK, C++, Python, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • in step 210, i.e. step a), the image data of the object of interest is received.
  • the medical image may be a two-dimensional image comprising image pixel data or a three-dimensional image comprising image voxel data.
  • the object of interest is a colon. In an example, the object of interest is a lumen of vasculature.
  • the image data may be acquired by spectral CT.
  • the basic idea of dual-energy imaging in CT is to acquire two data sets at low and high energy levels and to use the pair of data sets to deduce additional information about the patient.
  • the physical basis of dual-energy imaging comprises the two main mechanisms of interaction of X-rays with matter in the clinically relevant diagnostic energy range from 30 keV to 140 keV: photoelectric absorption and Compton scattering, each having its own functional dependence on x-ray energy. Photoelectric absorption is a rapidly decreasing function of energy, used for generating a 'Photo'-image, while Compton scatter is a gently varying function of energy, used for generating a 'Scatter'-image.
  • the image data may be acquired by multi-parametric MRI.
  • in step 220, i.e. step b), local covariance matrices are determined at a plurality of image positions in the first and second mono-energetic images.
  • Each local covariance matrix is a matrix of local variances and local covariances between image intensities at one of the plurality of image positions in the first and second mono-energetic images.
  • for local variances at each pixel or voxel location, statistics of a small patch of neighbouring pixels or voxels, also referred to as an image patch, are used.
  • Each image patch represents one or a group of pixels in a two-dimensional medical image or one or a group of voxels in a three-dimensional medical image.
  • image patches may have a predetermined pixel width/height (e.g., 7 × 7, 8 × 8, 9 × 9, etc.) and a location for each image patch may be defined based on one or more centre pixels or voxels.
  • for two spectral bands, the local covariance matrix is a 2 × 2 matrix of variances and covariances, of which three entries are independent.
  • for N spectral bands, the local covariance matrix is an N × N matrix of variances and covariances.
  • the Gaussian filter Gσ may be computed for various kernel widths σ, corresponding to the scale space, and thus an entire vector of covariance matrices may be computed for various values of σ to obtain a covariance scale space tensor.
  • the local variances and covariances are positive definite numbers, and reach their highest value at locations of a material transition.
  • the softness of the filter response is influenced by the Gaussian filter width σ.
  • the local variance is non-directional and not prone to noise.
  • the method further comprises the step of determining a number of eigenvalues of at least one of the local covariance matrices for determining a quantity of materials between which a material transition takes place.
  • two spectral bands ('Photo'- and 'Scatter'-image) yield a 2 × 2 local covariance matrix.
  • a single eigenvalue of the local covariance matrix signifies a two-material transition, while two eigenvalues signify a three-material transition.
  • in step 230, i.e. step c), values of local variances and/or local covariances at the plurality of image positions are overlaid with the first mono-energetic image and/or the second mono-energetic image.
  • the values of local variances and/or local covariances may be overlaid with colour encoding over the standard slice images.
  • Another visualization option is 3D rendering of the local variance magnitudes or regression values into a virtual display, which is interactively rotatable.
  • cues of the transitions can be conveyed using gray-values (gradual black-to-white colour, rather than colour overlays), which ensures a familiar viewing experience as with conventional images, as well as possible deployment on widespread black-and-white monitors.
  • the absence/presence and weight of the overlay can be controlled/toggled interactively by the user.
  • Figs. 2A to 2C show an example with two spectral bands ('Photo'- and 'Scatter'-image) in an abdominal spectral CT scan with contrast-agent-tagged stool residuals in the colon.
  • Fig. 2A shows an exemplary 'Photo'-image overlaid with local variance magnitudes, shown as brighter contour. This image demonstrates that the spectral local variance in the 'Photo' image responds sensitively to colonic folds submerged in contrast-tagged stool residuals.
  • Fig. 2B shows an exemplary 'Scatter'-image overlaid with local variance magnitudes, shown as brighter contour. This image demonstrates that the spectral local variance in the 'Scatter' image also responds sensitively to colonic folds submerged in contrast-tagged stool residuals.
  • Fig. 2C shows an exemplary 'Scatter'-image overlaid with local covariance magnitudes, shown as brighter contour. This image demonstrates that the spectral local covariance signifies in particular the tag-to-air material transitions.
  • the contour of colonic folds submerged in contrast-tagged stool residuals may be further highlighted by subtracting the spectral local covariance, which signifies in particular the tag-to-air material transitions, from the composite image in Fig. 3A.
  • An exemplary subtraction result is illustrated in Fig. 3B .
  • in step 240, i.e. step d), the overlaying result is provided, e.g. to a display (for example, a built-in screen, a connected monitor, or a projector) or to a file storage (for example, a hard drive or a solid-state drive).
  • the user may assess just one image type, i.e. one spectral band, rather than having to read two or even a whole series of different image representations.
  • relevant material transitions may be overlaid in colour or gray value, conveying additional spectral information. Certain material transitions may not be discernible in the conventional image, but only in the spectral image channels. Further, since only transition lines (rather than entire areas) are marked, the standard image is not cluttered with area-wise occluding overlays, and standard reading expertise does not have to be changed.
  • a pre-trained classifier may be applied to classify each of the local covariance matrices into a material transition type.
  • the classifier is a pre-trained machine-learning model.
  • the machine learning model has been pre-trained on historic patient data retrievable from image repositories from the same hospitals or other hospitals.
  • a supervised learning scheme is used wherein the historic imagery is pre-labelled by experienced clinicians. Labelling may provide data including local covariance matrices at a plurality of image positions of multiple images and the corresponding classified material transition types (e.g. transitions between air and tissue, tissue and contrast-agent tagged materials, etc.).
  • Training of the classifier may include receiving the training data and applying the classifier to the training data in one or more iterations. As a result, the pre-trained classifier is obtained, which can then be used in deployment. In deployment, new image data can be applied to the pre-trained classifier to obtain the classification result for this new data.
  • a neural-network model, also referred to as an artificial neural network (ANN), may be used as the classifier.
  • other machine learning techniques, such as support vector machines, maximum likelihood, or random forest, may be used instead of neural networks.
  • the trained classifier attempts to approximate the correlation between the local covariance matrices and the material transition types.
  • the approximation may be achieved in a learning or training process where parameters, themselves forming a high-dimensional space, are adjusted in an optimization scheme based on training data.
  • the classifier may be realized as artificial neural-network ("ANN").
  • the ANN is operable in two modes: “training mode/phase” and “deployment mode/phase”.
  • in training mode, an initial model of the ANN is trained based on a set of training data to produce a trained ANN model.
  • in deployment mode, the pre-trained ANN model is fed with new, non-training data to operate during normal use.
  • the training mode may be a one-off operation or may be continued in repeated training phases to enhance performance. All that has been said so far in relation to the two modes is applicable to any kind of machine learning algorithm and is not restricted to ANNs.
  • the ANN comprises a set of interconnected nodes organized in layers.
  • the ANN includes an output layer and an input layer.
  • the input layer may be a matrix whose size (rows and columns) matches that of the training input local covariance matrices.
  • the output layer may be a vector or matrix with size matching the size chosen for material transition types.
  • the ANN has preferably a deep learning architecture, that is, in between the output layer and input layer there is at least one, preferably two or more, hidden layers.
  • Nodes are associated with numbers, called "weights", which represent how the node responds to input from earlier nodes in a preceding layer.
  • the set of all weights defines a configuration of the ANN.
  • an initial configuration is adjusted based on the training data using a learning algorithm such as forward-backward ("FB")-propagation or other optimization schemes, or other gradient descent methods. Gradients are taken with respect to the parameters of the objective function.
  • the training mode is preferably supervised, that is, is based on annotated training data.
  • Annotated training data includes pairs of training data items. For each pair, one item is the training input data and the other item is target training data known a priori to be correctly associated with its training input data item. This association defines the annotation and is preferably provided by a human expert.
  • the training pair includes local covariance matrices as training input, associated with a material transition type as target output.
  • the output is in general different from the target.
  • the initial configuration is readjusted so as to achieve a good match between input training data and their respective target for all pairs.
  • the match is measured by way of a similarity measure which can be formulated in terms of an objective function, or cost function.
  • the aim is to adjust the parameters to incur low cost, that is, a good match.
  • the training data sets are applied to the initially configured ANN and are then processed according to a learning algorithm such as the FB-propagation algorithm mentioned before.
  • the so pre-trained ANN may then be used in the deployment phase to compute the material transition types for new data, that is, newly acquired images not present in the training data.
  • the result of classification may be one or several type-specific scalar values output from each local covariance matrix or covariance scale space tensor.
  • the local variances may be selected discretely for specific transition types only, and the trace of the matrix may be used as the weight for display.
  • the local variances and/or covariances may be weighted continuously, e.g. by regression rather than classification, for application-specific relevance.
  • Fig. 4 schematically shows an example of an apparatus 100 for processing image data of an object of interest comprising a first mono-energetic image acquired at a first energy and a second mono-energetic image acquired at a second energy different from the first energy.
  • the apparatus 100 comprises an input module 110, a computation module 120, a rendering module 130, and an output module 140.
  • Each module may be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logical circuit, and/or other suitable components that provide the described functionality.
  • the apparatus 100 may be any computing device, such as mobile devices, laptop and desktop computers, wearable computing devices, and other computing devices, suitable for processing image data.
  • the input module 110 is configured for receiving the image data of the object of interest.
  • the medical image may be a two-dimensional image comprising image pixel data or a three-dimensional image comprising image voxel data.
  • Examples of the imaging modality may include, but are not limited to, spectral CT or multi-parametric MRI.
  • the computation module 120 is configured for determining local covariance matrices at a plurality of image positions in the first and second mono-energetic images.
  • the plurality of image positions include the locations of a material transition.
  • Each local covariance matrix is a matrix of local variances and local covariances between image intensities at one of the plurality of image positions in the first and second mono-energetic images.
  • statistics of a small patch of neighbouring pixels or voxels, i.e. an image patch, may be used for the calculation.
  • the computation module 120 may be configured to compute each local covariance matrix by Gaussian convolution operations.
  • the computation module 120 may be further configured to determine a number of eigenvalues of at least one of the local covariance matrices for determining a quantity of materials between which a material transition takes place.
  • a single eigenvalue of a local covariance matrix may signify a two-material transition, e.g. between air and soft tissue.
  • two eigenvalues of a local covariance matrix may signify a three-material transition, e.g. among air, soft tissue, and contrast-agent tagged material.
  • the rendering module 130 is configured for overlaying values of local variances and/or local covariances at the plurality of image positions (e.g. in colour or gray values) with the first mono-energetic image and/or the second mono-energetic image. Exemplary overlaying results are shown in Figs. 2A-2C and Figs. 3A-3C.
  • the output module 140 is configured for providing an overlaying result, e.g. to a display (for example, a built-in screen, a connected monitor, or a projector) or to a file storage (for example, a hard drive or a solid-state drive).
  • the apparatus 100 may further comprise a classifier module configured for applying a pre-trained classifier, such as artificial neural networks, support-vector machines, maximum likelihood, or random forest to classify the local covariance matrices into a material transition type.
  • the apparatus may comprise a display device (not shown) configured for displaying an overlaying result.
  • Fig. 5 schematically shows a medical imaging system 300 according to some embodiments of the present disclosure.
  • the medical imaging system 300 comprises a scanner 310 configured to scan an object of interest to acquire image data of the object of interest.
  • the scanner 310 may be a spectral CT-scanner or an MRI scanner.
  • the medical imaging system 300 further comprises an apparatus 100 according to any one of the above-described examples for processing the image data of the object of interest.
  • the phrase "at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
  • a computer program or a computer program element is provided that is characterized by being adapted to execute the method steps of the method according to one of the preceding embodiments, on an appropriate system.
  • the computer program element might therefore be stored on a computer unit, which might also be part of an embodiment of the present invention.
  • This computing unit may be adapted to perform or induce a performing of the steps of the method described above. Moreover, it may be adapted to operate the components of the above described apparatus.
  • the computing unit can be adapted to operate automatically and/or to execute the orders of a user.
  • a computer program may be loaded into a working memory of a data processor.
  • the data processor may thus be equipped to carry out the method of the invention.
  • This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
  • the computer program element might be able to provide all necessary steps to fulfil the procedure of an exemplary embodiment of the method as described above.
  • further, a computer readable medium, such as a CD-ROM, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described by the preceding section.
  • a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
  • a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
  • inventive embodiments are presented by way of example only and, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed.
  • inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein.
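
Covariance-computation sketch (editorial illustration, not part of the patent text): a minimal NumPy/SciPy example of how the local variances and covariances described above could be obtained by Gaussian convolution operations for two co-registered mono-energetic images; the function names, the SciPy dependency, and the chosen kernel widths are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def local_covariance(img1, img2, sigma=2.0):
    """Field of 2x2 local covariance matrices for two co-registered images.

    Local means and second moments are obtained by Gaussian smoothing with
    kernel width sigma: var_i = G*(I_i*I_i) - (G*I_i)^2 and
    cov_12 = G*(I_1*I_2) - (G*I_1)*(G*I_2).
    """
    i1 = np.asarray(img1, dtype=np.float64)
    i2 = np.asarray(img2, dtype=np.float64)
    g = lambda x: gaussian_filter(x, sigma)
    m1, m2 = g(i1), g(i2)
    var1 = g(i1 * i1) - m1 * m1
    var2 = g(i2 * i2) - m2 * m2
    cov12 = g(i1 * i2) - m1 * m2
    # Shape (..., 2, 2): one symmetric covariance matrix per pixel/voxel position.
    row1 = np.stack([var1, cov12], axis=-1)
    row2 = np.stack([cov12, var2], axis=-1)
    return np.stack([row1, row2], axis=-2)


def covariance_scale_space(img1, img2, sigmas=(1.0, 2.0, 4.0)):
    """Covariance scale-space tensor: one local covariance field per kernel width."""
    return np.stack([local_covariance(img1, img2, s) for s in sigmas], axis=0)
```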
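
Eigenvalue-counting sketch (editorial illustration, not part of the patent text): one possible way to count significant eigenvalues of the local covariance matrices in order to distinguish two-material from three-material transitions; the relative significance threshold is an assumption.

```python
import numpy as np


def count_transition_materials(cov, rel_threshold=0.05):
    """Count significant eigenvalues of local covariance matrices of shape (..., N, N).

    For two spectral bands (2x2 matrices), one dominant eigenvalue suggests a
    two-material transition, two significant eigenvalues a three-material transition.
    """
    w = np.linalg.eigvalsh(cov)                      # eigenvalues, ascending order
    w = np.clip(w, 0.0, None)                        # guard against tiny negative values
    cutoff = rel_threshold * np.maximum(w[..., -1:], 1e-12)
    return np.sum(w > cutoff, axis=-1)
```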
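
Gradient-based masking sketch (editorial illustration, not part of the patent text): a simple stand-in for the Prewitt/LoG/Canny edge detection mentioned above, restricting the covariance computation to candidate material-transition locations; the Sobel filter choice and the percentile threshold are assumptions.

```python
import numpy as np
from scipy.ndimage import sobel


def transition_mask(image, percentile=95.0):
    """Boolean mask of candidate material-transition locations.

    Thresholds the gradient magnitude (Sobel filter along each axis) at a given
    percentile; local covariance matrices would then be evaluated only where the
    mask is True, reducing computational effort.
    """
    img = np.asarray(image, dtype=np.float64)
    grads = [sobel(img, axis=a) for a in range(img.ndim)]
    magnitude = np.sqrt(sum(g * g for g in grads))
    return magnitude >= np.percentile(magnitude, percentile)
```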
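
Gray-value overlay sketch (editorial illustration, not part of the patent text): a minimal way to overlay normalized local variance or covariance magnitudes onto a mono-energetic image as brighter gray-value contours, with a user-controllable weight; the normalization and default weight are assumptions.

```python
import numpy as np


def gray_value_overlay(base_image, magnitude, weight=0.6):
    """Overlay normalized variance/covariance magnitudes onto a mono-energetic image.

    The magnitudes appear as brighter gray-value contours; the overlay weight can
    be lowered or set to 0 to tone down or toggle off the overlay interactively.
    """
    base = np.asarray(base_image, dtype=np.float64)
    base = (base - base.min()) / (np.ptp(base) + 1e-12)
    mag = np.asarray(magnitude, dtype=np.float64)
    mag = mag / (mag.max() + 1e-12)
    return np.clip(base + weight * mag, 0.0, 1.0)
```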
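
Classifier-training sketch (editorial illustration, not part of the patent text): a minimal scikit-learn example of pre-training a classifier that maps local covariance matrices (or covariance scale-space tensors) to expert-annotated material transition types; the random-forest choice and its hyperparameters are assumptions, and an ANN, support vector machine, or maximum likelihood classifier could be substituted as described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def train_transition_classifier(cov_features, transition_labels):
    """Pre-train a classifier mapping local covariance matrices (or covariance
    scale-space tensors) to material-transition-type labels.

    cov_features: array of shape (n_samples, ...), flattened per sample.
    transition_labels: array of shape (n_samples,), e.g. 'air-tissue', 'tag-air'.
    """
    X = np.asarray(cov_features, dtype=np.float64).reshape(len(cov_features), -1)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, np.asarray(transition_labels))
    return clf


# Deployment: classify covariance matrices computed at new image positions.
# predicted = clf.predict(new_features.reshape(len(new_features), -1))
```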

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Fuzzy Systems (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
EP20210977.3A 2020-12-01 2020-12-01 Calcul et affichage de covariance-spectrale locale Withdrawn EP4009227A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP20210977.3A EP4009227A1 (fr) 2020-12-01 2020-12-01 Calcul et affichage de covariance-spectrale locale
EP21820564.9A EP4256463A2 (fr) 2020-12-01 2021-11-28 Calcul et affichage de déficits de variance spectrale locale ou de covariance spectrale locale pour la mise en évidence de transitions de matériau pertinentes dans une ct spectrale et rm
PCT/EP2021/083262 WO2022117468A2 (fr) 2020-12-01 2021-11-28 Calcul et affichage de déficits de variance spectrale locale ou de covariance spectrale locale pour la mise en évidence de transitions de matériau pertinentes dans une ct spectrale et rm
JP2023532493A JP2023552333A (ja) 2020-12-01 2021-11-28 スペクトルct及びmrにおける関連材料遷移を強調表示するための局所スペクトル共分散又は局所スペクトル共分散欠損の計算及び表示
US18/038,546 US20240090849A1 (en) 2020-12-01 2021-11-28 Local spectral-covariance or local spectral covariance deficits computation and display for highlighting of relevant material transitions in spectral ct and mr
CN202180080678.3A CN116710970A (zh) 2020-12-01 2021-11-28 用于在能谱ct和mr中突出相关材料转变的局部能谱协方差或局部能谱协方差亏损计算和显示

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP20210977.3A EP4009227A1 (fr) 2020-12-01 2020-12-01 Calcul et affichage de covariance-spectrale locale

Publications (1)

Publication Number Publication Date
EP4009227A1 true EP4009227A1 (fr) 2022-06-08

Family

ID=73654718

Family Applications (2)

Application Number Title Priority Date Filing Date
EP20210977.3A Withdrawn EP4009227A1 (fr) 2020-12-01 2020-12-01 Calcul et affichage de covariance-spectrale locale
EP21820564.9A Pending EP4256463A2 (fr) 2020-12-01 2021-11-28 Calcul et affichage de déficits de variance spectrale locale ou de covariance spectrale locale pour la mise en évidence de transitions de matériau pertinentes dans une ct spectrale et rm

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP21820564.9A Pending EP4256463A2 (fr) 2020-12-01 2021-11-28 Calcul et affichage de déficits de variance spectrale locale ou de covariance spectrale locale pour la mise en évidence de transitions de matériau pertinentes dans une ct spectrale et rm

Country Status (5)

Country Link
US (1) US20240090849A1 (fr)
EP (2) EP4009227A1 (fr)
JP (1) JP2023552333A (fr)
CN (1) CN116710970A (fr)
WO (1) WO2022117468A2 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116975574B (zh) * 2023-08-31 2024-04-16 国家海洋环境监测中心 一种海洋环境重金属污染评价方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FULWADHVA URVI P. ET AL: "Use of Dual-Energy CT and Iodine Maps in Evaluation of Bowel Disease", RADIOGRAPHICS., vol. 36, no. 2, 1 March 2016 (2016-03-01), US, pages 393 - 406, XP055804017, ISSN: 0271-5333, Retrieved from the Internet <URL:https://pubs.rsna.org/doi/pdf/10.1148/rg.2016150151> DOI: 10.1148/rg.2016150151 *
WIEMKER RAFAEL ET AL: "Towards reduced-preparation Spectral-CT-colonography utilizing local covariance", MEDICAL IMAGING 2020: IMAGE PROCESSING, 10 March 2020 (2020-03-10), pages 17, XP055803386, ISBN: 978-1-5106-3394-0, DOI: 10.1117/12.2549539 *

Also Published As

Publication number Publication date
WO2022117468A2 (fr) 2022-06-09
WO2022117468A3 (fr) 2022-07-14
EP4256463A2 (fr) 2023-10-11
JP2023552333A (ja) 2023-12-15
US20240090849A1 (en) 2024-03-21
CN116710970A (zh) 2023-09-05

Similar Documents

Publication Publication Date Title
US11195280B2 (en) Progressive and multi-path holistically nested networks for segmentation
Rebouças Filho et al. Novel and powerful 3D adaptive crisp active contour method applied in the segmentation of CT lung images
Middleton et al. Segmentation of magnetic resonance images using a combination of neural networks and active contour models
US20180025255A1 (en) Classification method and apparatus
Banerjee et al. Automated 3D segmentation of brain tumor using visual saliency
Atlason et al. Unsupervised brain lesion segmentation from MRI using a convolutional autoencoder
Banerjee et al. A novel GBM saliency detection model using multi-channel MRI
Soleymanpour et al. Fully automatic lung segmentation and rib suppression methods to improve nodule detection in chest radiographs
EP2936430B1 (fr) Imagerie quantitative
JP7216722B2 (ja) 診断撮像における画像特徴のアノテーション
EP3213298B1 Carte d'analyse de texture pour données d'image
US20120038649A1 (en) Method and Apparatus for Multimodal Visualization of Volume Data Sets
CN111666966A (zh) 医学成像中基于人工智能的材料分解
Lee et al. No-reference perceptual CT image quality assessment based on a self-supervised learning framework
CN117809122B (zh) 一种颅内大血管图像的处理方法、系统、电子设备及介质
EP4009227A1 (fr) Calcul et affichage de covariance-spectrale locale
US10699392B2 (en) Contrast-enhanced reproduction of spectral CT image data
Venugopal et al. A deep learning-based illumination transform for devignetting photographs of dermatological lesions
WO2021204744A1 Appareil pour générer une image augmentée d'un objet
Khowaja et al. Supervised method for blood vessel segmentation from coronary angiogram images using 7-D feature vector
Babaheidarian et al. Joint segmentation and material recognition in dual-energy ct images
EP4256512B1 (fr) Apprentissage automatique de la restoration des bords aprés suppression du contraste/substitution de matériel
EP3889896A1 (fr) Nettoyage virtuel basé sur un modèle pour une coloscopie virtuelle spectrale
US20230351731A1 (en) Feature detection based on training with repurposed images
EP4002288A1 (fr) Procédés et systèmes de rendu de représentations du système vasculaire de sujet

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20221209