WO2016173957A1 - Brain tissue classification - Google Patents

Brain tissue classification

Info

Publication number
WO2016173957A1
WO2016173957A1 (PCT/EP2016/059115)
Authority
WO
WIPO (PCT)
Prior art keywords
user
tissue classification
map
misclassification
brain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2016/059115
Other languages
English (en)
French (fr)
Inventor
Fabian Wenzel
Thomas Heiko STEHLE
Lyubomir Georgiev ZAGORCHEV
Jochen Peters
Martin Bergtholdt
Carsten Meyer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to JP2017552066A priority Critical patent/JP6596100B2/ja
Priority to RU2017141759A priority patent/RU2713707C2/ru
Priority to EP16720773.7A priority patent/EP3289563B1/en
Priority to US15/564,263 priority patent/US10331981B2/en
Priority to CN201680024874.8A priority patent/CN107567637B/zh
Publication of WO2016173957A1 publication Critical patent/WO2016173957A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room
    • A61B5/004Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • A61B5/0042Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part for the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • A61B2576/02Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • A61B2576/026Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part for the brain
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20128Atlas-based segmentation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30016Brain

Definitions

  • the invention relates to a system and a method for brain tissue classification.
  • the invention further relates to a workstation and imaging apparatus comprising the system.
  • the invention further relates to a computer program product comprising instructions for causing a processor system to perform the method.
  • Image analysis techniques help quantify brain atrophy by classifying the brain tissue voxels into different tissue classes such as Gray Matter (GM), White Matter (WM), and Cerebrospinal Fluid (CSF). Brain tissue classification is particularly useful in assessing brain atrophy, since the gray matter volume serves as a biomarker for cortical atrophy.
  • two types of tissue classification errors typically occur in the tissue classification map, namely isolated "blob-like" misclassifications, as well as over- or under-pronunciation of cortical gray matter near its border to white matter.
  • US 2002/0186882 A1 discloses a process and related apparatus for obtaining quantitative data about a 2-dimensional, 3-dimensional or other-dimensional image, e.g. for classifying and counting the number of entities an image contains.
  • Each entity comprises a structure or some other type of identifiable portion of the image having definable characteristics.
  • the entities located within an image may have a different shape, color, texture, or other definable characteristic, but still belong to the same classification.
  • entities comprising a similar color, and texture may be classified as one type while entities comprising a different color, and texture may be classified as another type.
  • An image may contain multiple entities and each entity may belong to a different class.
  • the system may quantify image data according to a set of changing criteria and derive one or more classifications for the entities in the image. Once the image data is classified, the total number of entities in the image is calculated and presented to the user.
  • Embodiments provide a way for a computer to determine what kind of entities are in an image and to count the total number of entities that can be visually identified in the image. Information utilized during a training process may be stored and applied across different images.
  • the following aspects of the invention involve a user interactively providing user feedback on an area of misclassification in the tissue classification map, with the user feedback further being indicative of a correction of the misclassification.
  • the user feedback is used to adjust a prior probability map which is used as input in the automated tissue classification technique, thereby obtaining an adjusted prior probability map.
  • the automated tissue classification technique is then re-run based on the adjusted prior probability map.
  • a first aspect of the invention provides a system for brain tissue classification, the system comprising:
  • an image data interface for accessing an image of a brain of a patient
  • a processor configured to apply an automated tissue classification technique to the image based on a prior probability map, the prior probability map being registered to the image and being indicative of a probability of a particular location in the brain belonging to a particular brain tissue class, the automated tissue classification technique providing as output a tissue classification map of the brain of the patient;
  • a user interaction subsystem comprising:
  • a display output for displaying the tissue classification map on a display
  • a user device input for receiving input commands from a user device operable by a user, wherein the input commands represent user feedback which is indicative of a) an area of misclassification in the tissue classification map and b) a correction of the misclassification;
  • wherein the processor is configured to: adjust the prior probability map based on the user feedback, thereby obtaining an adjusted prior probability map, and re-apply the automated tissue classification technique to the image based on the adjusted prior probability map.
  • a further aspect of the invention provides a workstation or imaging apparatus comprising the system.
  • a further aspect of the invention provides a method for brain tissue classification, the method comprising:
  • accessing an image of a brain of a patient; applying an automated tissue classification technique to the image based on a prior probability map, the prior probability map being registered to the image and being indicative of a probability of a particular location in the brain belonging to a particular brain tissue class, the automated tissue classification technique providing as output a tissue classification map of the brain of the patient;
  • a further aspect of the invention provides a computer program product comprising instructions for causing a processor system to perform the method.
  • the above measures involve accessing an image of a brain of a patient.
  • the image may thus represent a brain scan, and may be obtained from various imaging modalities, including but not limited to T1-weighted Magnetic Resonance Imaging (MRI).
  • An automated tissue classification technique is applied to the image based on a prior probability map.
  • prior probability maps are known per se, and may describe the likelihood of a known position in the brain belonging to one of the various tissue classes. Typically, these prior probability maps have been generated from a sample cohort of correctly classified brain scans.
  • Automated tissue classification techniques which use prior probability maps are also known per se, e.g., from the field of medical image analysis.
  • the prior probability map is registered with the image, e.g., in a manner known per se from the field of medical image registration.
  • a tissue classification map is obtained, locally classifying the brain according to brain tissue type.
  • 'brain tissue classification' is used interchangeably with 'brain tissue segmentation' as the resulting tissue classification map segments the brain into the various tissue types and thereby provides a segmentation.
  • the tissue classification map is displayed on a display and a user is enabled to provide user feedback which is indicative of an area of misclassification in the tissue classification map and which is indicative of a correction of the misclassification.
  • the user provides user feedback which is indicative of where a misclassification occurred and what the correction should be.
  • the user feedback may indicate a region to be biased towards white matter.
  • the prior probability map is then adjusted based on the user feedback, yielding an adjusted prior probability map which comprises one or more local corrections of probabilities.
  • the automated tissue classification technique is then re-run based on the adjusted prior probability map, yielding a further tissue classification map.
  • the above measures have as effect that a user is enabled to provide user feedback which is indicative of where a misclassification occurred and what the correction should be. Rather than directly correcting the tissue classification map based on the user feedback, the user feedback is used to adjust the prior probability map, and the automated tissue classification technique is then re-run based on the adjusted prior probability map.
  • the inventors have recognized that, in case of an automated tissue classification technique providing an erroneous tissue classification map, it may be advantageous to adjust the prior probability map and re-run the technique rather than to correct the tissue classification map directly. Tissue classification elsewhere in the brain may also improve, since the contrast between tissue classes can be better modelled by having more 'supervised' evidence.
  • the user interaction subsystem is configured to enable the user to indicate a point in the area of misclassification, thereby obtaining a user- indicated point;
  • the processor is configured to determine a boundary of the area of misclassification based on the user-indicated point.
  • the user may thus suffice with indicating a point which lies within the area of the misclassification. Such a user-indicated point may nevertheless enable the system to determine the (entire) area of the misclassification, namely by making use of a boundary detection technique.
  • the processor may consider the user-indicated point as a seed point in a region-growing technique, thereby obtaining the boundary of the area of misclassification.
  • other boundary detection techniques may be used, as known per se from the field of medical image analysis, including but not limited to connected component analysis and techniques based on morphological operations.
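As an illustration of this boundary-detection step, the following sketch derives the misclassified area from a single user-indicated seed point via connected component analysis. It assumes NumPy and SciPy, neither of which the patent prescribes; names and the toy label map are illustrative only.

```python
# Deriving the area of misclassification from one user-indicated point:
# take the connected component of the seed's class that contains the seed.
import numpy as np
from scipy import ndimage

def area_from_seed(label_map, seed):
    """Return a boolean mask of the connected region of the label map
    that contains the user-indicated seed point."""
    same_class = label_map == label_map[seed]   # voxels of the seed's class
    components, _ = ndimage.label(same_class)   # connected component analysis
    return components == components[seed]       # keep only the seed's component

# Toy 2D label map: 0 = CSF, 1 = GM, 2 = WM; a 2x2 "blob" of GM inside WM.
labels = np.full((8, 8), 2)
labels[2:4, 2:4] = 1                            # blob-like misclassification
mask = area_from_seed(labels, (2, 2))
print(mask.sum())                               # → 4 (the 2x2 blob)
```

A region-growing variant would start from the same seed and expand while a homogeneity criterion holds; the connected-component form above is the simplest case where the criterion is label equality.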
  • the user interaction subsystem is configured to enable the user to indicate the correction of the misclassification by manually specifying a brain tissue class, thereby obtaining a user-specified brain tissue class. It may occur that the user is able to directly determine the brain tissue class in the area of misclassification. The user is enabled to provide such user feedback, namely by directly specifying the brain tissue class.
  • the processor is configured to adjust the prior probability map by increasing, in the prior probability map, a probability of the user-specified brain tissue class in the area of misclassification. Based on the user directly specifying the brain tissue class, the probability of said brain tissue class may be increased in the prior probability map within the area of misclassification. For example, the probability may be increased to 80% or higher, 90% or higher, 95% or higher, or to substantially 100%, e.g., to 99% or higher.
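A minimal sketch of such a prior-map adjustment (the patent does not fix an implementation; function names and the toy 2D map are hypothetical): the probability of the user-specified class is raised to 99% inside the area, and the remaining mass is redistributed over the other classes in proportion to their original values.

```python
# Increasing the prior probability of the user-specified brain tissue
# class inside the corrected area, keeping the per-voxel probabilities
# normalized to 1.
import numpy as np

def boost_prior(prior, area_mask, cls, p=0.99):
    """prior: (K, H, W) per-class probability map; area_mask: (H, W) bool."""
    adjusted = prior.copy()
    K = prior.shape[0]
    others = [k for k in range(K) if k != cls]
    # Remaining probability mass (1 - p) is shared by the other classes
    # in proportion to their original values (with a guard against /0).
    rest = adjusted[others][:, area_mask]
    rest_sum = rest.sum(axis=0, keepdims=True)
    rest = np.where(rest_sum > 0, rest / rest_sum, 1.0 / (K - 1)) * (1 - p)
    adjusted[cls][area_mask] = p
    for i, k in enumerate(others):
        adjusted[k][area_mask] = rest[i]
    return adjusted

prior = np.full((3, 4, 4), 1 / 3)        # uniform CSF/GM/WM prior
mask = np.zeros((4, 4), bool); mask[1:3, 1:3] = True
adj = boost_prior(prior, mask, cls=2)    # bias the area towards class 2 (WM)
print(adj[:, 1, 1])                      # prior in the area is now ~[0.005, 0.005, 0.99]
```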
  • the user interface subsystem is configured to enable the user to indicate the correction of the misclassification by changing a probability ratio between grey matter tissue and white matter tissue.
  • the changing of the probability ratio between grey matter tissue and white matter tissue has been found to be a particularly advantageous way of providing user feedback in case of over- or under-pronunciation of cortical gray matter near its border to white matter.
  • the user may be enabled to incrementally change the probability ratio, e.g., by dragging a mouse up or down while pressing the left mouse button, by operating specific keys on a keyboard (e.g., the plus and minus keys), etc.
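Such an incremental ratio change can be sketched as follows (a hypothetical helper, not the patent's implementation): each key press or mouse movement moves a small step of the GM/WM probability mass towards one of the two classes, leaving the pair's total mass per voxel unchanged.

```python
# One incremental step of the grey-matter/white-matter probability ratio
# inside a user-marked region.
import numpy as np

def shift_gm_wm(prior_gm, prior_wm, mask, step=0.05):
    """Move `step` of the combined GM/WM probability mass towards GM
    inside `mask`; use a negative step to bias towards WM instead."""
    total = prior_gm[mask] + prior_wm[mask]          # mass shared by GM and WM
    gm = np.clip(prior_gm[mask] + step * total, 0, total)
    gm_out, wm_out = prior_gm.copy(), prior_wm.copy()
    gm_out[mask], wm_out[mask] = gm, total - gm
    return gm_out, wm_out

gm = np.full((4, 4), 0.4); wm = np.full((4, 4), 0.4)
mask = np.zeros((4, 4), bool); mask[0, 0] = True
gm2, wm2 = shift_gm_wm(gm, wm, mask, step=0.05)      # e.g. one '+' key press
print(round(gm2[0, 0], 2), round(wm2[0, 0], 2))      # → 0.44 0.36
```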
  • the user interaction subsystem is configured to enable the user to indicate the area of misclassification in the tissue classification map as displayed on the display.
  • the user is thus enabled to specifically indicate the area of misclassification, as it occurs in the tissue classification map, in the tissue classification map itself.
  • the user may use an annotation tool to draw a contour in the displayed tissue classification map.
  • the user interface subsystem is configured to:
  • the user may indicate said area in the image, which may be displayed simultaneously with the tissue classification map. For example, the user may use an annotation tool to draw a contour in the displayed image.
  • the automated tissue classification technique is based on Expectation Maximization. Automated tissue classification techniques based on Expectation Maximization, in combination with Markov Random Field regularization, have recently been shown to give the best overall performance in the academic literature. However, other automated tissue classification techniques which use prior probability maps may be used as well.
  • the invention applies to multi-dimensional image data, e.g. to two-dimensional (2D), three-dimensional (3D) or four-dimensional (4D) images, acquired by various acquisition modalities such as, but not limited to, standard X-ray Imaging, Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Ultrasound (US), Positron Emission Tomography (PET), Single Photon Emission Computed Tomography (SPECT), and Nuclear Medicine (NM).
  • Fig. 1 shows a system for brain tissue classification, in which a prior probability map is adjusted based on user feedback and the adjusted prior probability map is used to re-apply an automated tissue classification technique to a brain image;
  • Fig. 2A shows a part of an MRI image;
  • Fig. 2B shows a tissue classification map obtained from an automated tissue classification of the MRI image, with the tissue classification map containing a blob-like misclassification;
  • Fig. 3A shows a part of another MRI image;
  • Fig. 3B shows a tissue classification map obtained from an automated tissue classification of the MRI image, with the tissue classification map containing a misclassification in the form of under-pronounced cortical gray matter;
  • Fig. 4A illustrates the user providing user feedback which is indicative of the area of misclassification by indicating a point in the area of misclassification
  • Fig. 4B shows a result of a boundary of the area of misclassification being automatically determined based on the user-indicated point
  • Fig. 4C illustrates a result of the user indicating the correction of the misclassification by manually specifying a brain tissue class
  • Fig. 5 shows a method for brain tissue classification, in which a prior probability map is adjusted based on user feedback and the adjusted prior probability map is used to re-apply an automated tissue classification technique to a brain image
  • Fig. 6 shows a computer readable medium comprising instructions for causing a processor system to perform the method.
  • Fig. 1 shows a system 100 for brain tissue classification.
  • a system may be used in various medical applications, including, but not limited to, the detection and differential diagnosis of neuro-degenerative diseases.
  • the system 100 may, briefly stated, apply an automated tissue classification technique to an image of a brain based on a prior probability map, thereby obtaining a tissue classification map of the brain.
  • a user is then enabled to, using a user interaction subsystem, provide user feedback which is indicative of a) an area of misclassification in the tissue classification map and b) a correction of the misclassification.
  • the system may then adjust the prior probability map based on the user feedback to obtain an adjusted prior probability map, and re-apply the automated tissue classification technique to the image based on the adjusted prior probability map.
  • the system 100 comprises an image data interface 120 for accessing an image 022 of a brain of a patient, henceforth also referred to simply as brain image 022.
  • the image data interface 120 is shown to be connected to an external image repository 020 which comprises the brain image 022.
  • the image repository 020 may be constituted by, or be part of, a Picture Archiving and Communication System (PACS) of a Hospital Information System (HIS) to which the system 100 may be connected or in which it may be comprised. Accordingly, the system 100 may obtain access to the brain image 022.
  • the brain image 022 may be accessed from an internal data storage of the system 100.
  • the image data interface 120 may take various forms, such as a network interface to a local or wide area network, e.g., the Internet, a storage interface to an internal or external data storage, etc. It is further noted that, where applicable, a reference to the brain image 022 is to be understood as a reference to the image's image data.
  • the system 100 further comprises a processor 160 configured to apply an automated tissue classification technique to the brain image 022 based on a prior probability map 042.
  • the processor 160 is shown to receive the brain image 022 from the image data interface 120, and the prior probability map 042 from a prior probability data interface 140.
  • Said prior probability data interface 140 may enable the system 100 to access the prior probability map 042 on an external database 040, such as a PACS.
  • the system 100 may access the prior probability map 042 internally or from another source.
  • the prior probability data interface 140 may take various forms, including but not limited to a network interface to a local or wide area network, e.g., the Internet, a storage interface to an internal or external data storage, etc.
  • a tissue classification map 162 of the brain of the patient is then obtained.
  • the system 100 further comprises a display output 182 for displaying visual output of the system 100 on a display 060, with the visual output including at least the tissue classification map.
  • the display output 182 is shown to internally communicate with the processor 160, e.g., to obtain data visualizing the tissue classification map, and to provide display data 062 to the display 060.
  • the system 100 further comprises a user device input 184 for receiving input commands 082 from a user device 080 operable by a user.
  • the user device 080 may take various forms, including but not limited to a computer mouse 080, touch screen, keyboard, etc.
  • the user device input 184 may be of a type which corresponds to the type of user device 080.
  • the display output 182 and the user device input 184 may form a user interaction subsystem 180 which enables the user to interactively provide user feedback to the system 100.
  • the user feedback may be indicative of a) an area of misclassification in the tissue classification map and b) a correction of the misclassification.
  • the user may click on a part of the tissue classification map which is incorrectly classified and select a correct classification from an on-screen menu.
  • the user feedback may then be available to the system 100 in the form of user feedback data indicating, for example, coordinate(s) of the misclassification in a coordinate system associated with the tissue classification map, and data indicative of the correction.
  • the processor 160 may adjust the prior probability map based on the user feedback, thereby obtaining an adjusted prior probability map, and subsequently re-apply the automated tissue classification technique to the image based on the adjusted prior probability map.
  • the system 100 may be embodied as, or in, a single device or apparatus, such as a workstation or imaging apparatus.
  • the device or apparatus may comprise one or more microprocessors which execute appropriate software.
  • the software may have been downloaded and/or stored in a corresponding memory, e.g., a volatile memory such as RAM or a non-volatile memory such as Flash.
  • the functional units of the system may be implemented in the device or apparatus in the form of programmable logic, e.g., as a Field-Programmable Gate Array (FPGA).
  • alternatively, each functional unit of the system may be implemented in the form of a circuit. It is noted that the system 100 may also be implemented in a distributed manner, e.g., involving different devices or apparatuses. For example, the distribution may be in accordance with a client-server model.
  • Figs. 2A-3B illustrate two common types of tissue classification errors, namely isolated "blob-like" misclassifications (Figs. 2A-2B), as well as over- or under-pronunciation of cortical gray matter near its border to white matter (Figs. 3A-3B).
  • Fig. 2A shows part of an MRI image 024, and Fig. 2B shows a tissue classification map 030 obtained from an automated tissue classification of the MRI image.
  • in Figs. 2B and 3B, blue (black in a greyscale reproduction) indicates Cerebrospinal Fluid (CSF), green indicates Gray Matter (GM), and pink indicates White Matter (WM).
  • the tissue classification map 030 contains a misclassification in the form of a blob-like region having been incorrectly classified as cortical gray matter rather than white matter.
  • Fig. 3A shows a part of another MRI image 026, and Fig. 3B shows a tissue classification map 032 obtained from an automated tissue classification of the MRI image.
  • the same color (or greyscale) coding is applied as in Fig. 2B.
  • the tissue classification map 032 of Fig. 3B contains a misclassification in the form of cortical gray matter being under-pronounced at the border to white matter.
  • the above classification errors may frequently occur, e.g., for the reasons indicated in the background section, thereby yielding erroneous tissue classification maps.
  • the user may provide user feedback to the system which is indicative of a) an area of misclassification in the tissue classification map and b) a correction of the misclassification.
  • Such user feedback may take various forms.
  • Fig. 4A illustrates a particular example of such user feedback, namely the user indicating a point 070 in the area of misclassification. The example is based on the blob-like misclassification shown in Fig. 2B, with Figs. 4A-4B showing a zoomed-in part of the corresponding tissue classification map 034A.
  • the user may indicate the point 070 in various ways. For example, the user may click on a position in the displayed tissue classification map 034A with an on-screen cursor 064.
  • a boundary of the area of misclassification may then be determined, e.g., using a boundary detection technique.
  • the user-indicated point 070 may be considered as a seed point in a region-growing technique, thereby obtaining the boundary of the area of misclassification.
  • other boundary detection techniques may be used, as known per se from the field of medical image analysis, including but not limited to connected component analysis and techniques based on morphological operations.
  • the resulting boundary 072 is indicated in Fig. 4B as a dashed-line delineating the blob-like misclassification in the tissue classification map 034A from its surroundings.
  • Fig. 4C illustrates a result of the user further indicating the correction of the misclassification.
  • the user may manually specify a brain tissue class which is to be applied to the misclassified area 072.
  • the user may select a correct brain tissue class from an on-screen menu (menu not shown in Fig. 4C).
  • the correction may be shown to the user, namely by the misclassified area 072 being corrected in the displayed tissue classification map 034A to the user-specified brain tissue class.
  • the area may be classified as white matter rather than grey matter.
  • the prior probability map may then be adjusted, thereby obtaining an adjusted prior probability map.
  • the original and adjusted prior probability maps are not explicitly shown, since such probability maps are difficult to adequately visualize: each location in the map typically has a probability value for each brain tissue class, which hinders a greyscale or even color-based visualization.
  • the adjustment may take a similar form as that shown for the tissue classification map 034A, 034B in Figs. 4B and 4C. Namely, a corresponding region in the prior probability map may be adjusted by increasing the probability of the user-specified brain tissue class in the area of misclassification to substantially 100%, effectively 'overwriting' existing probability values.
  • the automated tissue classification technique may then be re-applied to the image, yielding a different, typically (much) improved tissue classification map.
  • the user may directly delineate the misclassified area in the tissue classification map.
  • the user may indicate a point in the misclassified area, with the system then assuming the misclassified area to be a predetermined area around the point.
  • alternatively, the misclassified area, which exists in the tissue classification map, may not be indicated by the user in the tissue classification map but rather elsewhere.
  • the user may learn the area of misclassification from studying the displayed tissue classification map, but may then indicate said area to the system in the image itself, e.g., by drawing a rectangle and thereby marking a region in the image.
  • a first example is the aforementioned direct specifying of the correct brain tissue class.
  • a user may rather change a probability ratio between brain tissue classes, such as the probability ratio between grey matter tissue and white matter tissue.
  • the correction is not a binary correction in class but rather a correction in probability.
  • Such probability ratio may be changed incrementally, e.g., in steps.
  • the system may or may not re-apply the automated tissue classification technique to the image after each incremental change.
  • the system may automatically propagate the probabilities surrounding the misclassified area into the misclassified area. As such, the indication of the misclassified area thereby effectively also serves for indicating the correction of the misclassification, in that it is assumed to follow from its surroundings.
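One possible reading of this propagation (the patent leaves the mechanism open) is a nearest-neighbour fill: each voxel in the misclassified area takes over the prior probabilities of the closest voxel outside it. A sketch using SciPy's Euclidean distance transform, with an illustrative toy map:

```python
# Propagating the surrounding prior probabilities into a user-indicated
# misclassified area via a nearest-outside-voxel fill.
import numpy as np
from scipy import ndimage

def fill_from_surroundings(prior, mask):
    """prior: (K, H, W) class-probability map; mask: boolean area whose
    probabilities are replaced by those of the nearest outside voxel."""
    # Feature transform: per position, indices of the nearest voxel
    # outside the area (zeros of `mask`).
    _, (iy, ix) = ndimage.distance_transform_edt(mask, return_indices=True)
    filled = prior.copy()
    filled[:, mask] = prior[:, iy[mask], ix[mask]]
    return filled

# Toy 1x4 map, two classes; columns 2-3 form the misclassified area.
prior = np.array([[[1.0, 1.0, 0.5, 0.5]],
                  [[0.0, 0.0, 0.5, 0.5]]])
mask = np.zeros((1, 4), bool); mask[0, 2:] = True
out = fill_from_surroundings(prior, mask)
print(out[0, 0], out[1, 0])    # → [1. 1. 1. 1.] [0. 0. 0. 0.]
```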
  • an example use-case may be the following.
  • a user may point and click to a location in a blob-like misclassified region and assign a correct label to the region, e.g., via a drop-down box.
  • This activity may trigger the following operations: 1) the boundary of the misclassified region may be automatically determined based on a local analysis of voxels in the labeled image, e.g., by connected component analysis, morphological operations, or region growing using the labels of the segmented image, 2) a fixed label may be assigned to voxels in the region in the form of a substantially 100% prior probability for the selected brain tissue class in the adjusted prior probability map, and 3) the automated tissue classification technique may be re-run based on the adjusted prior probability map.
  • Another example use-case may be the following.
  • the user may mark a region of interest and continuously change its probability of belonging to a particular brain tissue class in an interactive way. Once satisfied, the automated tissue classification algorithm may be re-run.
  • This use-case may involve the following operations:
  • the user may mark the region of interest with an image contouring tool.
  • For example, the user may draw a contour or use an annotation tool with a specific shape, such as a disk or a rectangle.
  • since the brain scan may be a 3D scan of which the user is shown a 2D image slice, the region may be automatically extended/propagated to 2D image slices before and behind the displayed image slice with known techniques, such as a 3D annotation tool with a spherical shape instead of a disk.
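The spherical 3D extension of a 2D disk annotation can be sketched as a ball-shaped voxel mask centred on the annotated slice; the function name and shapes below are illustrative, not prescribed by the patent.

```python
# Extending a 2D disk annotation to neighbouring slices as a 3D ball.
import numpy as np

def ball_mask(shape, center, radius):
    """Boolean 3D mask of a ball; the annotated slice gets the full disk,
    slices before and behind it get progressively smaller cross-sections."""
    zz, yy, xx = np.indices(shape)
    cz, cy, cx = center
    return (zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2

m = ball_mask((5, 5, 5), center=(2, 2, 2), radius=2)
print(m[2].sum(), m[1].sum(), m[0].sum())    # → 13 9 1
```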
  • the user may start an interactive operation that increases/decreases prior probability values for gray matter as opposed to white matter.
  • the changes in probability value may be indicated using, e.g., specific keys on the keyboard (+/-), by pressing the left mouse button and moving the mouse up/down, etc.
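The interactive increase/decrease can be sketched as shifting probability mass between the gray- and white-matter priors inside the marked region on each keypress or mouse movement. This is a minimal sketch; the step size, the two-class restriction, and the clamping strategy are assumptions:

```python
import numpy as np

def nudge_prior(prior_gm, prior_wm, roi_mask, step=0.05):
    """One '+' keypress: raise the gray-matter prior inside the ROI by
    `step` at the expense of the white-matter prior, keeping both values
    in [0, 1] and their sum unchanged so the priors remain valid."""
    gm = prior_gm.copy()
    wm = prior_wm.copy()
    delta = np.minimum(step, wm[roi_mask])          # cannot take more than wm has
    delta = np.minimum(delta, 1.0 - gm[roi_mask])   # nor exceed gm's headroom
    gm[roi_mask] += delta
    wm[roi_mask] -= delta
    return gm, wm
```

A '-' keypress would call the same function with the roles of the two maps swapped; each call corresponds to one incremental step of the interactive operation.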
  • prior probability maps may be registered to the brain scan in a pre-processing step.
  • the expectation maximization algorithm may iteratively perform two steps: (i) a so-termed M-Step, in which, given the tissue class probabilities for each voxel, an intensity model may be determined/updated for a specific brain tissue class, and (ii) a so-termed E-Step, in which, given the (updated) intensity model for a specific brain tissue class, the voxel-wise probabilities may be refined.
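The two steps above can be sketched for Gaussian intensity models, one per tissue class. This is a minimal sketch under the assumption of a single Gaussian per class and flat treatment of the image as a voxel vector; it is not the patent's specific formulation:

```python
import numpy as np

def em_tissue_classification(intensities, priors, n_iter=10, eps=1e-8):
    """Minimal EM sketch for K tissue classes with Gaussian intensity models.
    `intensities`: (N,) voxel intensities; `priors`: (K, N) registered prior
    probability maps. Returns (K, N) posterior tissue probabilities."""
    post = priors / (priors.sum(axis=0, keepdims=True) + eps)
    for _ in range(n_iter):
        # M-Step: update mean/variance of each class from the soft assignments
        w = post.sum(axis=1) + eps
        mu = (post * intensities).sum(axis=1) / w
        var = (post * (intensities - mu[:, None]) ** 2).sum(axis=1) / w + eps
        # E-Step: refine voxel-wise posteriors from priors x Gaussian likelihoods
        lik = np.exp(-0.5 * (intensities - mu[:, None]) ** 2 / var[:, None])
        lik /= np.sqrt(2.0 * np.pi * var[:, None])
        post = priors * lik
        post /= post.sum(axis=0, keepdims=True) + eps
    return post
```

Because the priors multiply the likelihoods in the E-Step, any user-driven change to the prior probability map directly steers the refined voxel-wise probabilities on the next run.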
  • a parameter or configuration of the automated tissue classification technique may be adjusted.
  • the misclassified area may have been corrected by the user, e.g., in an interactive manner as indicated with reference to Figs. 4A-4C.
  • the automated tissue classification technique may then be re-run on a region of interest that does not include the user-identified misclassified area.
  • the user-identified misclassified area may be excluded from a binary voxel mask that defines the region of interest.
  • the final tissue classification map may then be composed of the results from the automated tissue classification technique in the region of interest, together with the misclassified area as interactively corrected by the user. It will be appreciated that, in addition to the abovementioned example of excluding the manually corrected area from a region of interest, other adjustments of parameters or configurations of the automated tissue classification technique are within reach of the skilled person.
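Excluding the corrected area from the region of interest and composing the final map can be sketched as follows; `classify` is a hypothetical callable standing in for the automated tissue classification technique:

```python
import numpy as np

def rerun_and_compose(image, brain_mask, corrected_mask, corrected_labels,
                      classify):
    """Exclude the user-corrected area from the region of interest, re-run
    the automated classifier on the remaining ROI, and compose the final
    map from the automated result plus the user's corrections."""
    roi_mask = brain_mask & ~corrected_mask   # binary voxel mask minus correction
    auto_labels = classify(image)
    final = np.zeros_like(auto_labels)
    final[roi_mask] = auto_labels[roi_mask]
    final[corrected_mask] = corrected_labels[corrected_mask]
    return final
```

The manually corrected voxels are thereby guaranteed to survive the re-run unchanged, while the rest of the brain is re-classified automatically.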
  • Fig. 5 shows a method 200 for brain tissue classification.
  • the method 200 comprises, in an operation titled "ACCESSING BRAIN IMAGE", accessing 210 an image of a brain of a patient.
  • the method 200 further comprises, in an operation titled "APPLYING AUTOMATED TISSUE CLASSIFICATION", applying 220 an automated tissue classification technique to the image based on a prior probability map, the prior probability map being registered to the image and being indicative of a probability of a particular location in the brain belonging to a particular brain tissue class, the automated tissue classification technique providing as output a tissue classification map of the brain of the patient.
  • the method 200 further comprises, in an operation titled "DISPLAYING TISSUE CLASSIFICATION MAP", displaying 230 the tissue classification map on a display.
  • the method 200 further comprises, in an operation titled "RECEIVING USER FEEDBACK", receiving 240 input commands from a user device operable by a user, wherein the input commands represent user feedback which is indicative of i) an area of misclassification in the tissue classification map and ii) a correction of the misclassification.
  • the method 200 further comprises, in an operation titled "ADJUSTING PRIOR PROBABILITY MAP", adjusting 250 the prior probability map based on the user feedback, thereby obtaining an adjusted prior probability map.
  • the method 200 further comprises, in an operation titled "RE-APPLYING AUTOMATED TISSUE CLASSIFICATION", re-applying 260 the automated tissue classification technique to the image based on the adjusted prior probability map.
  • Fig. 5 shows the method 200 being performed in an iterative manner: the arrow 255 indicates that applying 220 and re-applying 260 are essentially the same operation, albeit performed at a later stage and using different input, namely different prior probability maps.
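The iterative flow of Fig. 5 can be sketched as a simple loop; all callables here are hypothetical stand-ins for operations 220-260 of the method, not part of the patent itself:

```python
def brain_tissue_classification_loop(image, prior_map, classify,
                                     get_user_feedback, adjust_prior):
    """Iterative sketch of method 200: classify, collect feedback on
    misclassifications, adjust the prior map and re-apply until the user
    accepts the result (signalled by feedback being None)."""
    tissue_map = classify(image, prior_map)            # applying 220
    while True:
        feedback = get_user_feedback(tissue_map)       # displaying 230 / receiving 240
        if feedback is None:                           # user accepts the result
            return tissue_map
        prior_map = adjust_prior(prior_map, feedback)  # adjusting 250
        tissue_map = classify(image, prior_map)        # re-applying 260 (arrow 255)
    return tissue_map
```

The arrow 255 of Fig. 5 corresponds to `classify` being invoked again inside the loop with the adjusted prior map.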
  • the method 200 may be implemented on a computer as a computer-implemented method, as dedicated hardware, or as a combination of both. As also illustrated in Fig. 6, instructions for the computer may be stored on a computer readable medium 270, e.g., in the form of a series 280 of machine-readable physical marks and/or as a series of elements having different electrical, e.g., magnetic, or optical properties or values.
  • the executable code may be stored in a transitory or non-transitory manner.
  • Examples of computer readable mediums include memory devices, optical storage devices, integrated circuits, servers, online software, etc.
  • Fig. 6 shows an optical disc 270.
  • the invention also applies to computer programs, particularly computer programs on or in a carrier, adapted to put the invention into practice.
  • the program may be in the form of source code, object code, code intermediate between source and object code such as a partially compiled form, or in any other form suitable for use in the implementation of the method according to the invention.
  • a program may have many different architectural designs.
  • a program code implementing the functionality of the method or system according to the invention may be sub-divided into one or more sub-routines. Many different ways of distributing the functionality among these sub-routines will be apparent to the skilled person.
  • the subroutines may be stored together in one executable file to form a self-contained program.
  • Such an executable file may comprise computer-executable instructions, for example, processor instructions and/or interpreter instructions (e.g. Java interpreter instructions).
  • one or more or all of the sub-routines may be stored in at least one external library file and linked with a main program either statically or dynamically, e.g. at run-time.
  • the main program contains at least one call to at least one of the sub-routines.
  • the sub-routines may also comprise function calls to each other.
  • An embodiment relating to a computer program product comprises computer-executable instructions corresponding to each processing stage of at least one of the methods set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically.
  • Another embodiment relating to a computer program product comprises computer-executable instructions corresponding to each means of at least one of the systems and/or products set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically.
  • the carrier of a computer program may be any entity or device capable of carrying the program.
  • the carrier may include a data storage, such as a ROM, for example, a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example, a hard disk.
  • the carrier may be a transmissible carrier such as an electric or optical signal, which may be conveyed via electric or optical cable or by radio or other means.
  • the carrier may be constituted by such a cable or other device or means.
  • the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or used in the performance of, the relevant method.

PCT/EP2016/059115 2015-04-30 2016-04-25 Brain tissue classification Ceased WO2016173957A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2017552066A JP6596100B2 (ja) 2015-04-30 2016-04-25 脳組織分類
RU2017141759A RU2713707C2 (ru) 2015-04-30 2016-04-25 Классификация тканей головного мозга
EP16720773.7A EP3289563B1 (en) 2015-04-30 2016-04-25 Brain tissue classification
US15/564,263 US10331981B2 (en) 2015-04-30 2016-04-25 Brain tissue classification
CN201680024874.8A CN107567637B (zh) 2015-04-30 2016-04-25 脑组织分类

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562154768P 2015-04-30 2015-04-30
US62/154,768 2015-04-30
EP15170208 2015-06-02
EP15170208.1 2015-06-02

Publications (1)

Publication Number Publication Date
WO2016173957A1 true WO2016173957A1 (en) 2016-11-03

Family

ID=53276026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/059115 Ceased WO2016173957A1 (en) 2015-04-30 2016-04-25 Brain tissue classification

Country Status (5)

Country Link
US (1) US10331981B2 (en)
EP (1) EP3289563B1 (en)
JP (1) JP6596100B2 (ja)
CN (1) CN107567637B (zh)
WO (1) WO2016173957A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3761230B1 (en) * 2018-03-01 2025-12-17 Tencent Technology (Shenzhen) Company Limited Image processing method, device, storage medium and computer program product

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150108701A (ko) * 2014-03-18 2015-09-30 삼성전자주식회사 의료 영상 내 해부학적 요소 시각화 시스템 및 방법
US10692211B2 (en) * 2017-06-20 2020-06-23 Case Western Reserve University Intra-perinodular textural transition (IPRIS): a three dimenisonal (3D) descriptor for nodule diagnosis on lung computed tomography (CT) images
US10671896B2 (en) * 2017-12-04 2020-06-02 International Business Machines Corporation Systems and user interfaces for enhancement of data utilized in machine-learning based medical image review
US10607122B2 (en) * 2017-12-04 2020-03-31 International Business Machines Corporation Systems and user interfaces for enhancement of data utilized in machine-learning based medical image review
US11741365B2 (en) 2018-05-14 2023-08-29 Tempus Labs, Inc. Generalizable and interpretable deep learning framework for predicting MSI from histopathology slide images
US10957041B2 (en) 2018-05-14 2021-03-23 Tempus Labs, Inc. Determining biomarkers from histopathology slide images
US11348661B2 (en) 2018-05-14 2022-05-31 Tempus Labs, Inc. Predicting total nucleic acid yield and dissection boundaries for histology slides
CA3287321A1 (en) * 2018-12-31 2025-11-29 Tempus Ai Inc Artificial intelligence segmentation of tissue images
US11967070B2 (en) 2019-05-13 2024-04-23 The General Hospital Corporation Systems and methods for automated image analysis
JP7296799B2 (ja) * 2019-06-28 2023-06-23 セコム株式会社 領域分割装置、領域分割方法、及び領域分割プログラム
CN111403030B (zh) * 2020-02-27 2024-02-02 合创汽车科技有限公司 心理健康监测方法、装置、计算机设备和存储介质
EP4272635B1 (en) * 2020-12-30 2025-12-17 Neurophet Inc. Method for providing diagnosis assistance information, and device performing same
EP4109463A1 (en) * 2021-06-24 2022-12-28 Siemens Healthcare GmbH Providing a second result dataset

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020186882A1 (en) 2001-04-25 2002-12-12 Cotman Carl W. Method and apparatus for generating special-purpose image analysis algorithms
US20150036900A1 (en) * 2012-02-01 2015-02-05 Koninklijke Philips N.V. Object image labeling apparatus, method and program

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4721693B2 (ja) * 2004-12-09 2011-07-13 富士フイルムRiファーマ株式会社 頭蓋内容積および局所脳構造物解析プログラム、記録媒体および頭蓋内容積および局所脳構造物解析方法
WO2008089492A2 (en) 2007-01-19 2008-07-24 Mayo Foundation For Medical Education And Research Electronic stool subtraction using quadratic regression and intelligent morphology
US8213696B2 (en) 2007-07-12 2012-07-03 Siemens Medical Solutions Usa, Inc. Tissue detection method for computer aided diagnosis and visualization in the presence of tagging
CA2716598A1 (en) * 2008-03-04 2009-09-11 Tomotherapy Incorporated Method and system for improved image segmentation
SG176860A1 (en) 2009-06-23 2012-01-30 Agency Science Tech & Res A method and system for segmenting a brain image
WO2011040473A1 (ja) * 2009-09-29 2011-04-07 大日本印刷株式会社 医用画像処理方法、装置およびプログラム
US9659364B2 (en) * 2010-03-11 2017-05-23 Koninklijke Philips N.V. Probabilistic refinement of model-based segmentation
WO2011160309A1 (zh) 2010-06-25 2011-12-29 中国科学院自动化研究所 基于鲁棒统计信息传播的多模态三维磁共振图像脑肿瘤分割方法
EP2646978A1 (en) 2010-12-01 2013-10-09 BrainLAB AG Longitudinal monitoring of pathology
CN102115741B (zh) * 2010-12-10 2013-07-17 许汉鹏 一种原位分类提取生物大分子的方法及其系统
US9888876B2 (en) * 2012-03-21 2018-02-13 The Johns Hopkins University Method of analyzing multi-sequence MRI data for analysing brain abnormalities in a subject
NZ704197A (en) * 2012-06-29 2017-03-31 Douwe Egberts Bv Aromatization of coffee
CN103366379B (zh) * 2013-07-29 2016-08-10 江苏中惠医疗科技股份有限公司 基于遗传核模糊聚类的水平集医学图像分割方法


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HARMOUCHE ROLA ET AL: "Probabilistic Multiple Sclerosis Lesion Classification Based on Modeling Regional Intensity Variability and Local Neighborhood Information", IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, IEEE SERVICE CENTER, PISCATAWAY, NJ, USA, vol. 62, no. 5, 17 April 2015 (2015-04-17), pages 1281 - 1292, XP011578784, ISSN: 0018-9294, [retrieved on 20150417], DOI: 10.1109/TBME.2014.2385635 *
KOEN VAN LEEMPUT ED - RASMUS LARSEN ET AL: "Probabilistic Brain Atlas Encoding Using Bayesian Inference", October 2006, MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION - MICCAI 2006 LECTURE NOTES IN COMPUTER SCIENCE;;LNCS, SPRINGER, BERLIN, DE, PAGE(S) 704 - 711, ISBN: 978-3-540-44707-8, XP019043529 *


Also Published As

Publication number Publication date
US20180137394A1 (en) 2018-05-17
EP3289563B1 (en) 2020-08-05
JP6596100B2 (ja) 2019-10-23
JP2018520710A (ja) 2018-08-02
CN107567637B (zh) 2022-01-25
EP3289563A1 (en) 2018-03-07
CN107567637A (zh) 2018-01-09
US10331981B2 (en) 2019-06-25

Similar Documents

Publication Publication Date Title
EP3289563B1 (en) Brain tissue classification
US10885392B2 (en) Learning annotation of objects in image
US10643331B2 (en) Multi-scale deep reinforcement machine learning for N-dimensional segmentation in medical imaging
Yazdani et al. Image segmentation methods and applications in MRI brain images
US9959486B2 (en) Voxel-level machine learning with or without cloud-based support in medical imaging
CN113256553B (zh) 用于使用深层神经网络一致地呈现医学图像的系统和方法
CN107567638B (zh) 对解剖结构的基于模型的分割
US11715208B2 (en) Image segmentation
US11182901B2 (en) Change detection in medical images
JP2016007270A (ja) 医用画像処理装置
JP6415878B2 (ja) 画像処理装置、画像処理方法及び医用画像診断装置
EP3441944A1 (en) Volume rendering of volumetric image data with interactive segmentation
JP7726404B2 (ja) 注釈のための訓練データの選択
RU2713707C2 (ru) Классификация тканей головного мозга
Prasad et al. Volumetric tumour detection using improved region grow algorithm
US20240338815A1 (en) Artifact segmentation and/or uniformity assessment of a gamma camera
EP4198884A1 (en) Method and system for processing an image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16720773

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017552066

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15564263

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017141759

Country of ref document: RU